Design and build data pipelines to transform data into a usable format and load it into databases or data warehouses for analysis and reporting.
Requirements
- Experience in ETL development using Informatica IICS/IDMC (CDI), Teradata, AWS, and CI/CD pipelines.
- Proficiency in Python and SQL scripting.
- Ability to test ETL/ingestion pipelines and validate EDW target data.
- Experience with cloud platforms (AWS, Snowflake) and data warehousing.
Responsibilities
- Design, develop, and optimize ETL workflows using Informatica IDMC, AWS, BTEQ, etc.
- Migrate legacy SAS jobs to modern cloud-based ETL platforms.
- Integrate data from diverse source systems into Teradata, Oracle, Snowflake, AWS, etc.
- Build scalable data pipelines and enhance existing ETL/data ingestion jobs.
- Document processes, workflows, and technical specifications for ongoing support.
- Perform data cleansing, transformation, and validation to ensure high-quality datasets.
Other
- Coordinate with business users, analysts, and developers to gather requirements and deliver solutions.
- Excellent communication, problem-solving, and stakeholder engagement skills.
- Familiarity with Agile/Kanban methodologies and tools such as Jira.
- Informatica Certification
- Base SAS Certification