Infosys is seeking a Google Cloud (GCP) data engineer to enable digital transformation for their clients in a global delivery model.
Requirements
- GCP data engineering experience: Dataflow/Airflow, Pub/Sub/Kafka, Dataproc/Hadoop, BigQuery.
- Strong Python development skills to build reusable frameworks and enhance existing ones.
- Good experience in end-to-end implementation of data warehouses and data marts.
- Strong knowledge of, and hands-on experience with, Python and SQL.
- Knowledge of CI/CD pipelines using Terraform and Git.
- Good knowledge of Google BigQuery, using advanced SQL programming techniques to build BigQuery datasets in the ingestion and transformation layers.
- Knowledge of Airflow DAG creation, execution, and monitoring.
Responsibilities
- Research technologies independently
- Recommend appropriate solutions
- Contribute to technology-specific best practices and standards
- Apply your technical proficiency across different stages of the Software Development Life Cycle (SDLC)
Other
- Candidates must be located within commuting distance of Detroit, MI, or Dearborn, MI, or be willing to relocate to the area.
- This position may require travel within the US.
- Candidates authorized to work for any employer in the United States without employer-based visa sponsorship are welcome to apply.
- Infosys is unable to provide immigration sponsorship for this role at this time.
- Ability to work in teams in a diverse, multi-stakeholder environment comprising Business and Technology teams.