Mass General Brigham needs to establish a data foundation for a suite of new analytic products to ensure data consistency and growth potential. This role involves delivering the underlying data assets and pipelines for that foundation.
Requirements
- Experience in data engineering, with a focus on building and maintaining data infrastructure and pipelines
- 5-7 years of data warehousing development in large reporting environments required
- 3-5 years of experience developing data pipelines using Snowflake features (Snowpipe, SnowSQL, Snowsight, Streams) required
- Hands-on development experience with ETL/ELT tools, such as dbt, Fivetran, or Informatica, required
- Experience working in an Agile software development environment required
- Working knowledge of cloud computing platforms such as AWS, GCP, or Azure.
- Experience with enterprise database solutions in cloud or on-premises environments
Responsibilities
- Design, develop, and implement data pipelines and ETL/ELT code to support business requirements.
- Work on cross-functional teams delivering enterprise solutions for internal and external clients.
- Assume ownership for delivering code revisions and enhancements from design through development and production installation.
- Maintain and optimize various components of the data pipeline architecture.
- Become a subject matter expert for internal and external data products.
- Ensure design solutions can scale and meet technical standards and performance benchmarks.
- Identify inefficient processes and develop recommendations and design solutions.
Other
- Bachelor's degree in a related field of study required
- Equivalent experience may be considered in lieu of a degree
- Hybrid flexible working model required, with weekly onsite work at Assembly Row or local MGB sites
- Business and team needs determine in-office days
- Remote working days require a stable, secure, quiet, and compliant work area