Shape the future of healthcare data through scalable, high-impact solutions. Architect and implement robust data infrastructure that powers internal operations and client-facing products.
Requirements
- Strong experience with data pipeline architecture, including data lakes, message queuing, and stream processing.
- Advanced SQL and data modeling skills for both structured and unstructured datasets.
- Proven ability to design and implement scalable data transformation processes and data governance frameworks.
- Experience with big data technologies such as Apache Spark, Kafka, Airflow, and Iceberg; cloud platforms such as AWS; and SQL and NoSQL databases.
Responsibilities
- Architect, design, and own the strategy for scalable data platforms and pipelines.
- Build and maintain robust ETL/ELT processes to integrate large, complex datasets from multiple sources.
- Implement and enforce data governance, engineering best practices, and observability standards across the organization.
- Create data tools to support analytics, data science, and product development.
- Solve complex data challenges using modern big data technologies and cloud-based solutions.
- Mentor and guide other engineers in developing scalable, high-quality data pipelines.
Other
- Collaborate across Product, Sales, Clinical, and Executive teams to translate business requirements into technical solutions.
- Maintain best practices for data governance and pipeline management.
- Mission-driven professional who thrives in a fast-paced environment, enjoys solving complex data challenges, and is passionate about using data to improve healthcare outcomes.
- Mentor peers, establish engineering standards, and contribute to building a collaborative, high-performing engineering culture.
- Strong problem-solving, analytical, project management, and organizational skills.
- Excellent communication and collaboration skills.