Gradient AI uses AI-powered solutions to help the Group Health and P&C insurance industry predict risk more accurately, improve profitability, and automate underwriting and claims.
Requirements
- Strong proficiency in Python and SQL within a professional environment.
- Hands-on experience with big data tools such as Apache Spark (PySpark), Databricks, Snowflake, or similar platforms.
- Skilled in using data orchestration frameworks such as Airflow, Dagster, or Prefect.
- Comfortable working in cloud computing environments (preferably AWS) and on Linux systems.
- Knowledge of healthcare data standards and a solid understanding of healthcare data privacy and security regulations (such as HIPAA) are highly desirable.
- Experience serving as a technical lead, setting coding standards, and mentoring other engineers is strongly preferred.
- Experience working with and visualizing health and/or medical data, along with Insurtech industry exposure, is a plus.
Responsibilities
- Own the technical implementation process for new customers, from ingestion to deployment, ensuring accuracy, consistency, and performance with an eye for scalable and repeatable processes.
- Build and maintain infrastructure for the extraction, transformation, and loading (ETL) of data from a variety of sources using SQL, AWS, and healthcare-specific big data technologies and analytics platforms.
- Develop new tools to quickly extract, process, and validate client data from different sources and platforms.
- Collaborate with data scientists to transform large volumes of health-related and bioinformatics data into modeling-ready formats, prioritizing data quality, integrity, and reliability in healthcare applications.
- Apply health and bioinformatics expertise to design data pipelines that translate complex medical concepts into actionable requirements.
- Use Airflow to orchestrate ETL pipelines, ensuring the efficient, reliable movement of healthcare data across systems (see the illustrative sketch after this list).
- Work closely with our engineering and client teams to ensure smooth data integration and top-notch customer support.
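For context, here is a minimal sketch of the kind of Airflow-orchestrated ETL pipeline referenced above, assuming Airflow 2.4+. The DAG id, task names, and callables (extract_claims, transform_claims, load_claims) are hypothetical placeholders for illustration only, not an actual Gradient AI pipeline.

```python
# Minimal, hypothetical Airflow DAG sketching an extract -> transform -> load
# flow for healthcare data. All names below are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_claims(**context):
    # Placeholder: pull raw claims files from a source system.
    print("extracting raw claims data")


def transform_claims(**context):
    # Placeholder: normalize and validate records into a modeling-ready format.
    print("transforming claims data")


def load_claims(**context):
    # Placeholder: load the transformed data into the analytics platform.
    print("loading claims data")


with DAG(
    dag_id="claims_etl_example",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ scheduling syntax
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_claims)
    transform = PythonOperator(task_id="transform", python_callable=transform_claims)
    load = PythonOperator(task_id="load", python_callable=load_claims)

    # Run the three steps in sequence.
    extract >> transform >> load
```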
Other
- BS in Computer Science, Bioinformatics, or another quantitative discipline
- 5+ years of experience implementing, managing, or optimizing data solutions in a professional setting
- 3+ years of experience using data orchestration frameworks such as Airflow, Dagster, or Prefect
- Ability to work in a fully remote environment
- Strong team-oriented mindset and ability to work collaboratively with engineering and client teams