Infosys is seeking a Google Cloud data engineer to enable digital transformation for clients in a global delivery model.
Requirements
- Experience working with GCP data engineering services – Dataflow/Airflow, Pub/Sub/Kafka, Dataproc/Hadoop, BigQuery
- ETL development experience with a strong SQL background, plus languages and tools such as Python/R, Scala, Java, Hive, Spark, and Kafka
- Strong knowledge of Python program development
- Knowledge of CI/CD pipelines using Terraform and Git
- Good knowledge of Google BigQuery, using advanced SQL programming techniques to build BigQuery datasets in the ingestion and transformation layers
- Experience in relational modeling, dimensional modeling, and modeling of unstructured data
- Knowledge of Airflow DAG creation, execution, and monitoring
Responsibilities
- Enable digital transformation for clients in a global delivery model
- Research technologies independently and recommend appropriate solutions
- Contribute to technology-specific best practices and standards
- Interface with key stakeholders and apply technical proficiency across different stages of the Software Development Life Cycle
- Implement data warehouse and data marts
- Develop reusable frameworks and enhance existing frameworks using Python
- Work on end-to-end implementation of data engineering projects
Other
- Bachelor’s degree or foreign equivalent required from an accredited institution
- At least 4 years of Information Technology experience
- Ability to work in teams in a diverse, multi-stakeholder environment comprising Business and Technology teams
- Experience and desire to work in a global delivery environment
- The job may entail extensive travel
- The job may also entail sitting as well as working at a computer for extended periods of time
- Candidates should be able to communicate effectively by telephone, email, and face-to-face