The Travelers Data Engineering team constructs pipelines that contextualize data and make it easily accessible across the entire enterprise, growing and transforming the analytics landscape.
Requirements
- Proficient use of data tools, techniques, and manipulation methods, including cloud platforms and programming languages, and an understanding of data engineering practices.
- Experience with ETL development in data lakes and data warehouses (preferably Snowflake).
- Demonstrated hands-on experience within the last two years building data pipelines and reusable components using AWS services, PySpark, and Snowflake.
- Exposure to Databricks Unity Catalog and UniForm is a plus.
Responsibilities
- Build and operationalize complex data solutions, correct problems, apply transformations, and recommend data cleansing/quality solutions.
- Design data solutions.
- Analyze data sources to determine their value and recommend data to include in analytical processes.
- Incorporate core data management competencies, including data governance, data security, and data quality.
- Collaborate within and across teams to support delivery and educate end users on data products/analytic environment.
- Perform data and system analysis, assessment, and resolution for defects and incidents of moderate complexity, and correct them as appropriate.
- Test data movement, transformation code, and data components.
Other
- Bachelor's degree in a STEM-related field or equivalent.
- Six years of related experience.
- Strong verbal and written communication skills with the ability to interact with team members and business partners.
- Leadership - Intermediate leadership skills with a proven track record of self-motivation in identifying personal growth opportunities.
- Applicants must be authorized to work for ANY employer in the U.S.