Brightline is looking to advance its business-critical data platform by hiring a Senior Data Engineer to design, build, and maintain robust data pipelines and infrastructure that power analytics, reporting, and decision-making across the organization.
Requirements
- Expert-level proficiency in SQL and data warehousing using Snowflake.
- Deep experience with dbt for data modeling, testing, and documentation.
- Deep experience with version control (GitHub) and CI/CD practices.
- Some experience with Python and APIs.
- Ability to design efficient data systems that meet real-world business needs.
- Experience with tools such as Segment, Amplitude, or Iterable.
- Familiarity with AI/ML engineering practices.
Responsibilities
- Own and maintain Brightline’s modern data platform, ensuring data integrity, scalability, and security across all systems.
- Monitor and improve the performance of our Snowflake data warehouse, Stitch ingestion layer, dbt transformation layer, and Tableau analytics layer.
- Build data pipelines and APIs that connect data systems and enable new business capabilities.
- Implement best practices in data engineering, including CI/CD workflows, data testing, and documentation.
- Reduce unneeded complexity and implement solutions to minimize Snowflake storage and consumption costs.
- Unlock new data platform capabilities and enable more powerful analytics across the business.
- Design and implement scalable dbt models and ELT pipelines that serve analytics, reporting, and potential AI/ML use cases.
Other
- Commitment to improving behavioral health care and supporting families in accessing affordable, high-quality care.
- Comfortable working in ambiguous or evolving business contexts and providing data clarity.
- Strong ability to communicate with both technical and business audiences.
- Skilled at breaking down complex systems and communicating architecture, trade-offs, and outcomes.
- Marketing analytics and technology experience.