AirOps is scaling its data platform to power insights into AI search visibility and content performance, driving customer-facing analytics and product features.
Requirements
- Expert SQL and Python with deep experience building production pipelines at scale
- Hands-on with dbt and a workflow manager such as Airflow or Prefect
- Strong background in dimensional and event-driven data modeling, including building a company-wide metrics layer
- Experience with Snowflake or BigQuery, plus Postgres for transactional use cases
- Track record building data products for analytics and customer reporting
- Cloud experience on AWS or GCP, plus infrastructure as code such as Terraform
- Domain experience in SEO, content analytics, or growth experimentation is a plus
Responsibilities
- Design, build, and operate batch and streaming pipelines that ingest data from crawlers, partner APIs, product analytics, and CRM systems
- Define and maintain company-wide models for content entities, search queries, rankings, AI agent answers, engagement, and revenue attribution
- Implement workflow orchestration with Airflow or Prefect, dbt-based transformations, version control, and automated testing
- Set SLAs, add tests and data contracts, monitor lineage and freshness, and lead root-cause analysis
- Operate Snowflake or BigQuery alongside Postgres with strong performance tuning, cost management, and partitioning strategies
- Deliver clear, documented metrics datasets that power dashboards, experiments, and product activation
- Partner with Product and Customer teams to define tracking plans and measure content impact across on-site and off-site channels
Other
- 5+ years in data engineering, including 2+ years leading projects
- Clear communicator with a bias for action
- Embodies our values: Extreme Ownership, Quality, and Curiosity and Play