Thatch is building and scaling its data infrastructure, models, and pipelines to support growth, revenue dashboards, core analytics, and predictive modeling.
Requirements
- Strong software engineering skills (Python preferred) with experience in modular pipelines and tooling.
- Advanced SQL and hands-on experience with dbt.
- Familiarity with orchestration tools (Airflow, Dagster, Prefect) and modern data warehouses (Redshift, Snowflake, BigQuery).
- Experience with CI/CD, data testing (e.g., dbt tests, Great Expectations), and data observability.
- Experience leading or contributing to a cloud data warehouse migration (e.g., Redshift to Snowflake or BigQuery), including dbt model portability, cost/performance tuning, and cross-team coordination.
Responsibilities
- Own and evolve Thatch’s modern data stack (Redshift, dbt, Fivetran, Dagster/Airflow, Metabase, Hex).
- Build reliable ELT pipelines that integrate internal systems and third-party APIs, with strong observability at scale.
- Model core business logic (e.g., plan years, employer funding, engagement metrics) into scalable dbt models.
- Establish trust in data through monitoring, testing, and CI/CD best practices.
- Collaborate cross-functionally to support analytics, experimentation, and strategic decision-making.
Other
- 3–5+ years of experience in data engineering, analytics engineering, or backend engineering with a data focus.
- Comfortable partnering cross-functionally with product, ops, and analytics.
- Prior experience building data platforms or analytics foundations in a fast-paced startup.
- Previous work in the healthcare or fintech industries.
- We interview rigorously for integrity, talent, and drive.