Sanity is seeking a Data Engineer to scale and evolve its data infrastructure to enable better data-driven decisions for its B2B SaaS platform.
Requirements
- Deep expertise in SQL, Python, and Node.js/TypeScript for data engineering workflows
- Production experience with workflow orchestration tools like Airflow, and customer data platforms like RudderStack, ideally in a B2B SaaS environment
- Proven experience integrating and maintaining data flows with CRM systems like Salesforce, Marketo, or HubSpot
- Track record of building reliable data infrastructure that supports rapid business growth and evolving analytics needs
- Experience implementing data quality frameworks and monitoring systems to ensure reliable data delivery to stakeholders
- Experience with Google Cloud Platform and BigQuery
- Experience with product analytics tools like Amplitude, Mixpanel, or PostHog
Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines so data is efficiently processed, transformed, and made available across the company
- Collaborate with engineering teams to implement and scale product telemetry across our product surfaces
- Develop and maintain data models in BigQuery that balance performance, cost, and usability
- Establish best practices for data ingestion, transformation, and orchestration, ensuring reliability and efficiency
- Orchestrate data workflows to reduce manual effort, improve efficiency, and maintain high data quality standards
- Build and maintain comprehensive monitoring, alerting, and logging systems for all data pipelines and infrastructure
- Implement SLAs/SLOs for critical data pipelines and establish incident response procedures
- Work closely with data analysts, engineers, and other internal stakeholders to understand their data needs and design robust pipelines that support data-driven decision-making
Other
- Remote in Europe or North America (East Coast/ET)
- 4+ years of experience building data pipelines at scale
- Proactive mindset with attention to detail, particularly in maintaining comprehensive documentation and data lineage
- Strong communication skills with demonstrated ability to collaborate effectively across US and European time zones