AirOps is looking for a Data Engineer to design and maintain the high-scale data infrastructure that powers the AirOps platform, ensuring the data behind our brand visibility insights is accurate, reliable, and ready for analysis.
Requirements
- Strong fluency in Python and SQL
- Experience with modern data modeling tools such as dbt
- Experience with data warehouses and OLAP databases (e.g., Redshift, Snowflake, BigQuery, ClickHouse)
- Proven ability to design and maintain production-grade data pipelines in cloud environments (AWS, GCP, or similar)
- Familiarity with orchestration frameworks (Airflow, Dagster, Prefect)
Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines for ingesting and transforming large volumes of data
- Implement automated data validation, monitoring, and alerting to ensure quality and reliability
- Integrate diverse internal and external data sources into unified, queryable datasets
- Optimize storage and query performance for analytical workloads
- Collaborate with data scientists to productionize ML models and ensure they run reliably at scale
- Work with product and engineering teams to meet data needs for new features and insights
- Maintain cost efficiency and operational excellence in cloud environments
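To give a flavor of the validation work described above, here is a minimal, hypothetical sketch of a row-level data-quality check of the kind an automated validation step might run before loading data; all field names are invented for illustration, and in practice checks like these would live in dbt tests or an orchestrator task:

```python
def validate_rows(rows, required_fields):
    """Split rows into (valid, errors); rows missing any required field are flagged.

    Illustrative only -- field names and structure are hypothetical.
    """
    valid, errors = [], []
    for i, row in enumerate(rows):
        # Treat None and empty string as missing values
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            errors.append({"row": i, "missing": missing})
        else:
            valid.append(row)
    return valid, errors

# Example: the second row fails validation because "brand" is empty
rows = [
    {"brand": "Acme", "mentions": 12},
    {"brand": "", "mentions": 3},
]
valid, errors = validate_rows(rows, ["brand", "mentions"])
```

A real pipeline would route the flagged rows to an alerting or quarantine path rather than silently dropping them.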
Other
- 4+ years of experience in data engineering, ideally in AI, SaaS, or data-intensive products
- Comfort operating in fast-paced, ambiguous environments where you ship quickly and iterate
- Strong collaboration with data scientists, product managers, and engineers
- Pride in operational excellence and care for the quality of the data you deliver
- Equity in a fast-growing startup