Help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world at Capgemini
Requirements
- Strong hands-on experience with dbt (modular SQL development, testing, documentation)
- Proficiency in Snowflake (data warehousing, performance tuning, security)
- Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages
- Solid understanding of data modeling concepts (star/snowflake schemas, normalization)
- Experience with version control systems (e.g., Git) and CI/CD practices
- Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus
- 3+ years of experience in data engineering or a related field
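As a sketch of the modular dbt development described above (model, source, and column names are hypothetical, not part of this role's actual codebase):

```sql
-- models/staging/stg_orders.sql (hypothetical model)
-- A minimal modular dbt model: read from a declared source,
-- rename/select columns in a CTE, and expose a clean staging view.
with source as (

    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        order_date,
        amount
    from source

)

select * from renamed
```

In a typical dbt project this model would be paired with `unique` and `not_null` tests on `order_id` declared in a `schema.yml` file, and documented alongside the model so `dbt docs generate` picks it up.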
Responsibilities
- Design and implement scalable data models and transformation pipelines using dbt on Snowflake
- Write efficient and maintainable PL/SQL code for complex data processing and transformation tasks
- Optimize Snowflake performance through query tuning, clustering, and resource management
- Ensure data quality, integrity, and governance through testing, documentation, and monitoring
- Participate in code reviews, architecture discussions, and continuous improvement initiatives
- Maintain and enhance CI/CD pipelines for dbt projects
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions
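To illustrate the kind of Snowflake performance tuning these responsibilities involve (the table and column names here are hypothetical examples):

```sql
-- Hypothetical fact table: define a clustering key so micro-partitions
-- align with a commonly filtered column, improving partition pruning.
alter table analytics.fct_orders cluster by (order_date);

-- Check how well the table is currently clustered on that key;
-- returns a JSON summary including average clustering depth.
select system$clustering_information('analytics.fct_orders', '(order_date)');
```

Clustering keys carry a maintenance cost, so in practice they are reserved for large tables with stable, selective filter predicates rather than applied by default.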
Benefits
- Flexible work
- Healthcare coverage, including dental, vision, mental health, and well-being programs
- Paid time off and paid holidays
- Paid parental leave