The Data Engineer role at Global Healthcare Exchange (GHX) supports product and technology initiatives by designing and delivering data solutions. The role includes general application development and collaboration with Product and Engineering teams to design solutions and enable new data capabilities.
Requirements
- Strong, demonstrable development skills in Python, PySpark, and SQL
- Experience with a diverse set of Amazon Web Services (AWS) offerings such as SNS/SQS, S3, Glue, Lambda, and API Gateway
- Experience developing in Snowflake (see the sketch after this list)
- Knowledge of data governance, API security, and best practices for cloud-based systems
- Thorough understanding of, and support for, Agile development methodologies
- Ability to design, collect, and analyze large datasets
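To illustrate how these skills typically come together, the sketch below shows a PySpark job that reads files landed in S3, applies a light transformation, and writes the result to Snowflake through the Spark-Snowflake connector. The bucket, schema, table, and connection details are hypothetical placeholders, and the connector package must be available on the cluster; treat this as a minimal sketch rather than a description of GHX's actual pipeline.

```python
# Minimal PySpark sketch: read raw data from S3, apply a simple transform,
# and write to Snowflake via the Spark-Snowflake connector.
# All names (bucket, schema, table, credentials) are hypothetical placeholders.
import os

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-to-snowflake")  # hypothetical job name
    .getOrCreate()
)

# Read newline-delimited JSON landed in S3 (assumes the cluster already has
# S3 access, e.g. via an instance profile or a Glue job role).
raw = spark.read.json("s3://example-landing-bucket/orders/")  # hypothetical path

# Light transformation: keep a few columns and add a load timestamp.
orders = (
    raw.select("order_id", "supplier_id", "amount")
       .withColumn("loaded_at", F.current_timestamp())
)

# Snowflake connection options; in practice these would come from a secrets
# manager rather than environment variables.
sf_options = {
    "sfURL": os.environ["SNOWFLAKE_URL"],
    "sfUser": os.environ["SNOWFLAKE_USER"],
    "sfPassword": os.environ["SNOWFLAKE_PASSWORD"],
    "sfDatabase": "ANALYTICS",   # hypothetical
    "sfSchema": "STAGING",       # hypothetical
    "sfWarehouse": "LOAD_WH",    # hypothetical
}

(
    orders.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "ORDERS_STG")  # hypothetical target table
    .mode("append")
    .save()
)
```

In an AWS Glue job, the same logic would typically run under the Glue job role, with credentials pulled from AWS Secrets Manager instead of environment variables.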
Responsibilities
- Lead and contribute to backend and ETL development for our data platform using Python and SQL
- Integrate and optimize data flows between AWS and Snowflake for application, analytics, and reporting use cases
- Implement and manage data quality, security, and monitoring frameworks
- Develop and maintain infrastructure-as-code using tools such as CloudFormation or CDK (see the sketch after this list)
- Contribute to DevOps practices for CI/CD pipeline automation, version control, and deployment
- Provide architectural guidance and development/build standards for the team
- Troubleshoot and resolve issues in APIs, data pipelines, and infrastructure
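As a concrete, hypothetical example of the infrastructure-as-code responsibility above, the sketch below uses the AWS CDK (v2) for Python to declare an S3 landing bucket and a Lambda ingestion function. Stack, resource, and asset names are illustrative only; this is a minimal sketch, not a description of GHX's infrastructure.

```python
# Minimal AWS CDK (v2, Python) sketch: an S3 landing bucket plus a Lambda
# function with read access to it. Resource and asset names are hypothetical.
from aws_cdk import App, Duration, Stack
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_s3 as s3
from constructs import Construct


class IngestionStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Bucket where upstream systems land raw files.
        landing_bucket = s3.Bucket(self, "LandingBucket", versioned=True)

        # Lambda that validates and forwards newly landed objects.
        ingest_fn = _lambda.Function(
            self,
            "IngestFunction",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="ingest.handler",                 # hypothetical module.function
            code=_lambda.Code.from_asset("lambda/"),  # hypothetical asset directory
            timeout=Duration.minutes(1),
            environment={"LANDING_BUCKET": landing_bucket.bucket_name},
        )

        # Grant the function read access to the landing bucket.
        landing_bucket.grant_read(ingest_fn)


app = App()
IngestionStack(app, "IngestionStack")
app.synth()
```

Synthesizing the stack (`cdk synth`) emits the equivalent CloudFormation template, which keeps the two tooling options in the bullet above interchangeable.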
Other
- Bachelor's degree in Computer Science, Mathematics, or a related field
- 5+ years of data engineering experience building business intelligence applications
- Ability to communicate technical concepts and designs to cross-functional and offshore teams
- Demonstrated organizational, prioritization, and time management skills
- Ability and willingness to travel nationally to remote offices and partners approximately 10% of the time