Build and maintain ELT pipelines using Apache Airflow, dbt, and Snowflake in an AWS cloud environment.
Requirements
- Strong SQL and Snowflake expertise, including performance tuning and data modeling
- Proficient in Python for scripting, automation, and working with REST APIs
- Experience with Apache Airflow for orchestration and workflow monitoring
- Hands-on with dbt for modular, version-controlled data transformations
- Solid experience with AWS services (e.g., S3, Lambda, IAM, CloudWatch) in data engineering workflows
- Experience integrating and processing data from REST APIs (see the extraction sketch after this list)
- Understanding of data quality, governance, and cloud-native troubleshooting
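For illustration, a minimal sketch of the kind of REST-to-S3 extraction this role involves. The endpoint URL, bucket, and key prefix below are hypothetical placeholders, not part of this posting:

```python
import json
from datetime import datetime, timezone

import boto3
import requests

# Hypothetical endpoint and bucket -- placeholders for illustration only.
API_URL = "https://api.example.com/v1/orders"
BUCKET = "my-data-lake-raw"

def extract_to_s3() -> str:
    """Pull a page of records from a REST API and land it in S3 as JSON."""
    resp = requests.get(API_URL, params={"page_size": 100}, timeout=30)
    resp.raise_for_status()  # fail loudly on 4xx/5xx responses

    # A date-partitioned key layout keeps the raw zone organized and queryable.
    now = datetime.now(timezone.utc)
    key = f"raw/orders/dt={now:%Y-%m-%d}/orders_{now:%H%M%S}.json"

    s3 = boto3.client("s3")
    s3.put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(resp.json()),
        ServerSideEncryption="AES256",  # encrypt at rest by default
    )
    return key

if __name__ == "__main__":
    print(f"Landed payload at s3://{BUCKET}/{extract_to_s3()}")
```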
Responsibilities
- Build and maintain ELT pipelines using Apache Airflow, dbt, and Snowflake (a minimal DAG sketch follows this list)
- Deploy container-based services in AWS and set up monitoring for them
- Develop complex dbt models, including incremental models, snapshots, and documentation (see the incremental-model sketch after this list)
- Implement dbt tests for data validation and quality checks
- Manage dbt projects with Git, including implementing CI/CD processes from scratch
- Utilize AWS S3 for data storage, applying best practices for organization and security
- Use Amazon Redshift for data warehousing and performance optimization
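A rough sketch of the orchestration pattern named above (extract, then let dbt transform inside the warehouse), assuming Airflow 2.4+ with the BashOperator. The DAG ID, script path, dbt project directory, and schedule are illustrative placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative DAG: land raw data, then let dbt transform it in the warehouse.
with DAG(
    dag_id="elt_orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_to_s3",
        bash_command="python /opt/pipelines/extract_orders.py",  # placeholder path
    )

    # `dbt build` runs models, tests, and snapshots in dependency order.
    transform = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/analytics && dbt build --target prod",
    )

    extract >> transform  # transform only after the raw data has landed
```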
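And a sketch of one incremental pattern mentioned above. dbt models are most often SQL with Jinja; to stay in a single language here, this shows the equivalent as a dbt Python model running on Snowpark for Snowflake (supported in dbt Core 1.3+). The model and column names (stg_orders, order_id, updated_at) are hypothetical:

```python
import snowflake.snowpark.functions as F

def model(dbt, session):
    # Materialize incrementally, deduplicating on a hypothetical unique key.
    dbt.config(materialized="incremental", unique_key="order_id")

    orders = dbt.ref("stg_orders")  # upstream staging model (placeholder name)

    if dbt.is_incremental:
        # On incremental runs, only process rows newer than the high-water mark
        # already present in this model's table.
        high_water = (
            session.table(f"{dbt.this}")
            .agg(F.max("updated_at"))
            .collect()[0][0]
        )
        if high_water is not None:
            orders = orders.filter(F.col("updated_at") > F.lit(high_water))

    return orders
```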
Other
- Strong communicator; client-facing individual contributor
- 100% hands-on with the skills listed above