Sigma is looking to build the data foundation that powers critical insights across Engineering and Tech Operations.
Requirements
- Strong experience working with APIs and building pipelines into cloud platforms (e.g., Snowflake, Databricks)
- Expertise in SQL and dbt; fluency in at least one programming language (e.g., Python, R, Scala)
- Experience implementing data governance frameworks that scale
- Experience with modern orchestrators (e.g., Astronomer, Airflow, Dagster)
- Experience building scalable ML systems such as recommendation engines, search, or machine translation
- Experience with a modern data stack: Fivetran, dbt, Snowflake, Sigma, Hightouch, Metaplane
- Experience with major cloud providers: AWS, GCP, Azure
Responsibilities
- Design, build, and maintain core data models and visualizations in Sigma to support Engineering & Tech Operations initiatives
- Architect and manage production data pipelines in Snowflake, and govern how their outputs are consumed in Sigma
- Build foundational data assets for Tech Operations, including Support insights and internal telemetry
- Create observability datasets from Sigma’s cloud infrastructure platforms (AWS, GCP, Azure)
- Partner with the infrastructure engineering team to ensure high availability of all key data assets
- Build internal data products and enable self-service usage across Tech Operations
- Identify and execute high-impact data projects in ambiguous environments
Other
- 3+ years of experience in a Data Engineering role
- Startup experience
- A drive to continuously learn about the evolving data ecosystem, and to share those learnings
- Comfort collaborating across Engineering, Product, and GTM teams
Benefits
- Flexible time off policy: take the time off you need
- Paid bonding time for all new parents
- Traditional and Roth 401k
- Commuter and FSA benefits
- Lunch program
- Dog-friendly office
- In-office work environment in all our offices in SF, NYC, and London