Migrate existing AWS data solutions to their GCP equivalents for Capgemini.
Requirements
- Hands-on experience with both AWS and GCP data services.
- Proficiency in SQL, Python, and data transformation techniques.
- Experience with GCP tools such as BigQuery, Dataflow, Cloud Storage, and Pub/Sub.
- Familiarity with AWS services like S3, Redshift, RDS, and Glue.
- Experience with data migration tools (e.g., AWS DMS, Google Database Migration Service).
- Strong understanding of data governance, security, and compliance.
Responsibilities
- Design and implement scalable data pipelines on GCP using tools like Dataflow, Pub/Sub, BigQuery, and Cloud Storage (see the streaming pipeline sketch after this list).
- Migrate existing AWS data solutions (e.g., S3, Redshift, RDS, Glue) to GCP equivalents.
- Develop ETL/ELT workflows to transform and load data into GCP data warehouses and data lakes (see the load sketch after this list).
- Collaborate with data architects, cloud engineers, and application teams to ensure smooth migration and integration.
- Optimize data processing for performance, reliability, and cost-efficiency.
- Ensure data quality, integrity, and security throughout the migration process.
- Automate data workflows using Infrastructure as Code (e.g., Terraform) and CI/CD pipelines.
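For context, a minimal sketch of the kind of streaming pipeline described above, using the Apache Beam Python SDK that underlies Dataflow: read events from Pub/Sub, apply a transform, and write rows to BigQuery. The project, topic, bucket, and table names are hypothetical placeholders, not part of the role description.

```python
# Minimal sketch: Pub/Sub -> transform -> BigQuery, runnable on Dataflow.
# All project/topic/bucket/table names below are assumed placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message into a row matching the target BigQuery schema."""
    event = json.loads(message.decode("utf-8"))
    return {"event_id": event["id"], "payload": json.dumps(event)}


options = PipelineOptions(
    streaming=True,
    runner="DataflowRunner",
    project="example-project",                 # assumed project id
    temp_location="gs://example-bucket/tmp",   # assumed staging bucket
    region="us-central1",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")
        | "Parse" >> beam.Map(parse_event)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="event_id:STRING,payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```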
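And a minimal sketch of the ELT load step referenced above, assuming hypothetical bucket, dataset, and table names: data exported from Redshift/S3 and staged in Cloud Storage is loaded into BigQuery with the google-cloud-bigquery client.

```python
# Minimal sketch: load Parquet files staged in Cloud Storage into BigQuery.
# Bucket, dataset, and table names are assumed placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumed project id

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/redshift_export/orders/*.parquet",  # assumed staging path
    "example-project.analytics.orders",                      # assumed target table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

table = client.get_table("example-project.analytics.orders")
print(f"Loaded {table.num_rows} rows")
```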
Other
- 3+ years of experience in data engineering or cloud data development.
- Flexible work arrangements
- Healthcare coverage, including dental, vision, mental health, and well-being programs
- Financial well-being programs such as a 401(k) and an Employee Share Ownership Plan
- Paid time off and paid holidays