Angi is looking for a Senior Data Engineer to architect, build, and maintain the data infrastructure that powers analytics, business intelligence, and operational decision-making across the company. The role is responsible for delivering scalable, reliable, and well-governed data solutions in a modern cloud environment.
Requirements
- Deep expertise with dbt and modern ELT patterns.
- Strong SQL skills and programming experience in Python, Go, or Kotlin.
- Experience with containerized deployments, CI/CD, and infrastructure-as-code.
- Proficiency with AWS services relevant to data engineering.
- Hands-on experience with modern data lake/lakehouse architectures.
- Experience developing within BI tools such as Looker, including LookML.
- Proven experience in data modeling and warehouse design.
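To make the SQL and ELT requirements above concrete: the core of an incremental load (the pattern behind a dbt incremental model) is an idempotent merge/upsert. Here is a minimal, hypothetical sketch in plain Python against SQLite; the table and column names are invented for illustration, not part of the role.

```python
import sqlite3

# Hypothetical incremental-merge sketch: insert new keys, update
# existing ones, so re-running a batch is idempotent.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")

def merge_batch(rows):
    # Upsert each (id, name) pair via SQLite's ON CONFLICT clause.
    conn.executemany(
        """
        INSERT INTO dim_customer (id, name) VALUES (?, ?)
        ON CONFLICT(id) DO UPDATE SET name = excluded.name
        """,
        rows,
    )

merge_batch([(1, "Ada"), (2, "Grace")])        # initial load
merge_batch([(2, "Grace H."), (3, "Edsger")])  # incremental batch

print(conn.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall())
# [(1, 'Ada'), (2, 'Grace H.'), (3, 'Edsger')]
```

In dbt the same merge is generated from an `is_incremental()` model config rather than written by hand; the sketch only shows the underlying SQL semantics.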
Responsibilities
- Design and implement scalable data models and pipelines using industry-standard techniques (Kimball, Data Vault, etc.).
- Build, optimize, and maintain ELT workflows leveraging dbt and cloud-based ETL/ELT platforms (Fivetran, Stitch, etc.).
- Develop orchestration workflows using tools like Airflow or Dagster.
- Write efficient SQL and develop in programming languages such as Python, Go, or Kotlin.
- Implement DevOps practices including containerization (Docker), CI/CD pipelines (GitLab), and infrastructure-as-code (Terraform, Helm, EKS).
- Work with AWS services (S3, IAM, AWS CLI, Glue) and data lake/lakehouse frameworks (Apache Iceberg, Glue Catalog).
- Integrate with and optimize cloud data warehouse solutions (Snowflake, Redshift, Trino).
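The orchestration responsibility above comes down to running pipeline tasks in dependency order. A minimal sketch of that idea in pure Python (the task names and dependency graph are hypothetical, and this is the concept behind Airflow/Dagster scheduling, not their APIs):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it
# depends on, mirroring how an orchestrator schedules work.
pipeline = {
    "extract_orders": set(),
    "extract_customers": set(),
    "stage_raw": {"extract_orders", "extract_customers"},
    "dbt_build": {"stage_raw"},
    "refresh_looker": {"dbt_build"},
}

# static_order() yields a valid execution order: every task appears
# only after all of its dependencies.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

In a real orchestrator, each node would also carry retries, schedules, and alerting; the dependency graph is the shared core.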
Other
- Collaborate closely with stakeholders to understand requirements and deliver solutions that meet business needs.
- Operate autonomously: create work items, prioritize tasks, and track progress independently.
- Communicate effectively with both technical and non-technical audiences, with strong stakeholder management.
- Bring strong problem-solving skills and the ability to work independently.