The Enterprise Data Warehouse (EDW) team designs, builds, and operates enterprise-scale data platforms and data products, with an emphasis on scalability, performance, cost efficiency, and analytical correctness.
Requirements
- Strong hands-on experience with cloud data warehouses (BigQuery strongly preferred)
- Advanced SQL expertise and strong working knowledge of Python
- Proven experience designing enterprise-grade data models, including curated/Analytical Data Store (ADS) datasets
- Experience building and supporting semantic layers (AtScale preferred, but not required)
- Experience with universal or shared semantic modeling across multiple domains or products
- Experience with CI/CD and orchestration frameworks, including Jenkins and Airflow
- Strong understanding of performance tuning and cost optimization techniques in large data environments
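As a rough illustration of the cost-optimization side of the role, the sketch below estimates the on-demand cost of a BigQuery query from its bytes processed. It is a simplification (the per-TiB rate is passed in as a parameter rather than hard-coded, since pricing varies by edition and region, and the monthly free tier is ignored); consult current GCP pricing for real figures.

```python
TIB = 1024 ** 4  # one tebibyte, the unit BigQuery on-demand pricing is quoted in


def estimate_on_demand_cost(bytes_processed: int, price_per_tib: float = 6.25) -> float:
    """Rough on-demand query cost in USD.

    price_per_tib is an assumed list price and should be checked against
    current GCP pricing; this also ignores the monthly free-tier allowance.
    """
    return bytes_processed / TIB * price_per_tib
```

A helper like this is mainly useful for sanity-checking a dry run's `total_bytes_processed` before a large query is dispatched.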
Responsibilities
- Design, build, and support large-scale backend data pipelines on cloud data platforms (GCP/BigQuery)
- Lead development and evolution of semantic layers, including universal/reusable semantic models that support multiple business domains and analytics tools
- Design and maintain Analytical Data Store (ADS) data structures optimized for analytics, reporting, and advanced modeling
- Partner with multiple EDW product teams to ensure consistent data modeling, metric definitions, and access patterns
- Drive platform reliability, scalability, and operational excellence across data products
- Lead query performance tuning and cost optimization efforts (slot usage, partitioning, clustering, aggregates, workload management)
- Support and modernize data orchestration frameworks
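To make the partition-pruning responsibility above concrete, here is a minimal, self-contained sketch of why date partitioning cuts both scan time and cost: a filtered query only reads the partitions inside its date range, while an unfiltered query reads the whole table. The table name, partition count, and per-partition size are hypothetical.

```python
from datetime import date, timedelta

GIB = 1024 ** 3

# Hypothetical daily-partitioned ADS table: 365 partitions of ~2 GiB each.
partitions = {date(2024, 1, 1) + timedelta(days=i): 2 * GIB for i in range(365)}


def bytes_scanned(partitions, start=None, end=None):
    """Bytes a partitioned query would scan: a filter on the partition
    column prunes partitions outside [start, end]; with no filter,
    every partition is read."""
    return sum(
        size
        for day, size in partitions.items()
        if (start is None or day >= start) and (end is None or day <= end)
    )


full = bytes_scanned(partitions)                                  # full table scan
week = bytes_scanned(partitions, date(2024, 6, 1), date(2024, 6, 7))  # 7 partitions
```

Under these assumed sizes, the unfiltered scan reads 730 GiB while the one-week query reads 14 GiB, which is the kind of ratio that partition-filter requirements and clustering decisions are meant to lock in.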
Other
- Must be eighteen years of age or older.
- Must be legally permitted to work in the United States.
- Knowledge, skills, and abilities typically acquired through completion of a bachelor's degree program, or an equivalent degree, in a field of study related to the job
- 3+ years of experience
- No travel required.