The company is looking to extract, transform, clean, and move data and metadata into a data warehouse, data mart, or operational data store in order to provide accurate, timely, and actionable information that supports critical corporate decisions.
Requirements
- 8+ years of data engineering experience using ETL tools such as Informatica or DataStage, or ELT tools such as DBT
- 1–4 years of Snowflake experience preferred
- 1+ years of experience developing or deploying data solutions on a cloud platform such as AWS, Google Cloud, or Azure preferred
- 5–6 years of demonstrated experience designing dimensional or relational data models
- 6–8 years of strong knowledge of source-to-target mappings
- 6–8 years of strong relational database knowledge
- 4–6 years of experience with Python, shell scripting, or similar languages is essential
Responsibilities
- Maps source system data to data warehouse models or other file formats (source-to-target mappings)
- Creates ETL processes / data pipelines; responsible for creating and maintaining high-level design documents
- Develops complex ETL processes
- Creates scripts when necessary
- Follows ETL standards, goals, and objectives
- Tests complex ETL processes / data pipelines at the unit, integration, system, and user acceptance levels
- Troubleshoots data problems
Other
- Bachelor’s degree in Computer Science, MIS, or a related field with 8 years of experience; a Master’s degree with 6 years of experience; or an equivalent combination of education and experience is required
- Travel: Travels out of town to attend seminars, workshops, etc. (1–2 times per year)
- Work Schedule: Standard work schedule (40 hours per week)
- Strong verbal and written communication skills
- Ability to effectively provide work direction to team members and delegate tasks