Design, develop, and manage complex Airflow DAGs for data workflows, ETL/ELT pipelines, and task automation to support end-to-end data solutions.
Requirements
- 3+ years of hands-on experience with Apache Airflow in production environments
- Strong programming skills in Python
- Solid understanding of data engineering concepts, including ETL/ELT processes, workflow orchestration, and data validation
- Experience with SQL and relational databases
- Familiarity with cloud platforms and their storage and compute services
- Knowledge of containerization tools such as Docker and of orchestration with Kubernetes (a plus)
Responsibilities
- Design, develop, and manage complex Airflow DAGs for data workflows, ETL/ELT pipelines, and task automation
- Integrate Airflow with various data sources and targets
- Optimize and monitor pipeline performance and troubleshoot failures in a timely manner
- Implement best practices for code quality, testing, version control, and deployment
- Maintain documentation and provide technical mentorship to junior engineers
- Collaborate with data engineers, analysts, and platform teams to support end-to-end data solutions
Other
- Full-time employment required
- Applications received more than 120 days after the posting date will not be considered