HealthStream aims to improve healthcare outcomes and empower healthcare professionals by streamlining everyday tasks and improving performance, engagement, and safety through innovative data solutions.
Requirements
- Three or more years’ experience with data warehousing and ETL tools/frameworks.
- Three or more years’ working experience with programming languages (e.g., Python, Java, SQL), data modeling, and database management systems.
- Two or more years’ experience working with big data technologies such as Kafka, Spark, and Hive.
- At least one year of experience developing data processes on AWS or Azure.
- Proficient with Git version control using the CLI.
- Proficient in dbt using Jinja templates (see the sketch after this list).
- Proficient in understanding data pipelines built from SQL statements.
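To make the dbt/Jinja requirement concrete, here is a minimal sketch of a dbt model that uses Jinja templating. The model, source, and column names (stg_course_completions, lms, course_completions, and so on) are hypothetical and chosen only for illustration.

```sql
-- models/staging/stg_course_completions.sql  (hypothetical model and source names)
-- A dbt model is plain SQL wrapped in Jinja: the config() call and the
-- is_incremental() block are rendered by dbt before the SQL runs.
{{ config(materialized='incremental', unique_key='completion_id') }}

select
    completion_id,
    learner_id,
    course_id,
    completed_at
from {{ source('lms', 'course_completions') }}

{% if is_incremental() %}
  -- on incremental runs, only pull rows newer than what is already loaded
  where completed_at > (select max(completed_at) from {{ this }})
{% endif %}
```

A model like this would typically be built with `dbt run --select stg_course_completions`.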
Responsibilities
- Design and develop new SQL queries and tune existing ones as needed.
- Design, develop, and implement automated solutions to extract data from different sources within HealthStream.
- Design, develop, and implement automated solutions to ingest data into HealthStream’s data lake.
- Design, develop, and implement automated solutions to transform data within HealthStream’s data lake (see the SQL sketch after this list).
- Participate in the on-call rotation to troubleshoot and resolve data issues.
- Investigate and resolve data issues as needed.
- Evaluate and implement new data related technologies.
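As a rough illustration of the ingest-and-transform responsibilities above, the following Spark SQL sketch deduplicates raw ingested events and lands the result in a curated zone of a data lake. Spark SQL is assumed only because Spark and Hive appear in the requirements; the raw and curated schema and table names are hypothetical.

```sql
-- Hypothetical Spark SQL transform: deduplicate raw ingested events and
-- write the result as Parquet into a curated zone of the data lake.
CREATE TABLE curated.learner_events
USING PARQUET
AS
SELECT event_id,
       learner_id,
       event_type,
       event_ts
FROM (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY event_id
                              ORDER BY event_ts DESC) AS rn
    FROM raw.learner_events
) ranked
WHERE rn = 1;   -- keep only the latest copy of each event
```

Window-function deduplication like this is a common pattern when raw ingestion can deliver the same event more than once.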
Other
- Must be methodical and able to establish priorities.
- Proficient in SDLC methodologies such as Agile and Kanban.
- Strong collaboration and communication skills for working with cross-functional teams.
- Excellent problem-solving skills with the ability to troubleshoot and resolve complex technical issues.
- Experience developing and supporting mission-critical applications that run 24 hours a day, 7 days a week.