Ingredion is looking to build out a robust, resilient, and rationalized hybrid data ecosystem on Azure and Google Cloud, guided by its analytics governance guidelines as well as industry best practices.
Requirements
- Demonstrated programming skills in Python, SQL, and at least one ETL tool (such as Google Cloud Dataflow, Azure Data Factory, Airflow, DataStage, Talend, or Informatica).
- Ability to analyze and profile data.
- A deep understanding of data engineering, platform engineering, data modeling, data management, and DevOps concepts.
- Ability to easily understand and debug complex data pipelines and logic to ingest, process, and store data.
- Familiarity with Google Cloud Platform (GCP) and/or Microsoft Azure; familiarity with both is preferred.
Responsibilities
- Demonstrate understanding of Azure tools such as Data Factory, Analysis Services, Databricks, Power Apps, Power BI, Purview, etc.
- Demonstrate understanding of Google Cloud tools such as BigQuery, Pub/Sub, Dataflow, Cloud Composer, AI Platform, Dataplex, etc.
- Identify ingestion, storage, and processing needs in the current analytics ecosystem, and develop or modify frameworks to automate and parametrize these functions.
- Design and build a data catalog that includes business, technical, and operational metadata.
- Work closely across teams and participate in project initiatives, making recommendations, troubleshooting technical issues, and refining processes.
Other
- Progress toward a bachelor's or master's degree in Computer Science, Mathematics, Engineering, or a related field.
- A methodical and detail-oriented way of thinking.
- Ability to quickly learn new tools and technologies, with strong communication and listening skills.
- Proven ability to deliver against agreed-upon deadlines.
- Ability to work collaboratively with peers and others.