Migrate data and ETL workflows from Teradata to GCP-based services, optimizing performance and cost in the cloud.
Requirements
- Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.
- Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
- Experience building ETL/ELT pipelines with custom scripts (Python/Java).
- Proven ability to refactor and translate legacy logic from Teradata to GCP.
- Familiarity with CI/CD, Git, Argo CD, and DevOps practices in cloud data environments.
- GCP certification, preferably Professional Data Engineer.
- Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
Responsibilities
- Lead and execute the migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
- Analyze and map existing Teradata workloads to appropriate GCP equivalents.
- Rewrite SQL logic, scripts, and procedures as GCP-compatible equivalents (e.g., BigQuery standard SQL); see the first sketch after this list.
- Develop automated workflows for data movement and transformation using GCP-native tools and/or custom Python scripts; see the second sketch after this list.
- Optimize data storage, query performance, and costs in the cloud environment.
- Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
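
The kind of SQL rewrite described above often means replacing Teradata idioms (e.g., SEL/QUALIFY) with BigQuery standard SQL and running the result through the BigQuery client. The sketch below is illustrative only; the project, dataset, table, and column names are hypothetical placeholders, and it assumes application-default credentials are configured.

```python
# Minimal sketch: a Teradata QUALIFY/ROW_NUMBER "latest record per key" pattern
# rewritten as BigQuery standard SQL and executed with the Python client.
# All object names (my_project.sales.orders, columns) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my_project")  # assumes default credentials

query = """
SELECT order_id, customer_id, order_ts
FROM (
  SELECT
    order_id,
    customer_id,
    order_ts,
    ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) AS rn
  FROM `my_project.sales.orders`
)
WHERE rn = 1
"""

rows = client.query(query).result()  # blocks until the query finishes
for row in rows:
    print(row.order_id, row.customer_id)
```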
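For the automated data-movement workflows, one common pattern is landing Teradata exports in Cloud Storage and loading them into BigQuery with a scripted load job (or the equivalent Composer/Airflow operator). The sketch below assumes Parquet exports already exist in a bucket; the bucket, dataset, and table names are hypothetical.

```python
# Minimal sketch of a scripted GCS -> BigQuery load step for migrated data.
# Bucket, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my_project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # full refresh of the staging table
)

load_job = client.load_table_from_uri(
    "gs://my-migration-bucket/teradata_exports/orders/*.parquet",
    "my_project.sales.orders_staging",
    job_config=job_config,
)
load_job.result()  # wait for completion; raises if the load fails

table = client.get_table("my_project.sales.orders_staging")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```

In a Composer environment the same step would typically be expressed as a DAG task rather than a standalone script, with monitoring and alerting attached to the job's outcome.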
Other
- Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
- Strong analytical, troubleshooting, and communication skills.
- Experience working in the healthcare domain.
- Knowledge of data governance, security, and compliance in cloud ecosystems.