The company is looking to clean and reconcile critical insurance producer data across 70 operating units, ensuring data integrity and compliance.
Requirements
- 4–6 years of experience in data engineering, ETL development, or data analytics.
- Proven experience building cloud-based data pipelines and data reconciliation solutions.
- Prior work with Databricks or similar big data platforms (Spark, Hadoop).
- Experience integrating multiple source systems into a centralized data platform.
- Nice-to-Have: Insurance domain knowledge and familiarity with APS and Guidewire systems.
Responsibilities
- Build ETL pipelines and reconciliation logic for APS ↔ GWPC (Guidewire PolicyCenter) data (see the sketch after this list).
- Work in a Databricks-heavy environment with SQL, Python, Delta Lake, and cloud platforms (Azure preferred).
- Collaborate with business teams to ensure data accuracy and governance.
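To illustrate the kind of reconciliation logic this role involves, here is a minimal PySpark/Delta Lake sketch of one cross-system check. The table names (`aps.producers`, `gwpc.producers`, `recon.producer_exceptions`), the join key `producer_id`, and the compared attribute `license_number` are all assumptions for illustration; the actual APS and GWPC schemas are not specified in this posting.

```python
# Minimal reconciliation sketch between two producer tables (assumed names/schemas).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("producer-reconciliation").getOrCreate()

# Hypothetical Delta tables holding producer records from each source system.
aps = spark.read.table("aps.producers")
gwpc = spark.read.table("gwpc.producers")

# Full outer join on a shared business key to surface records missing on either
# side, plus attribute-level mismatches (here, license number) on matched records.
joined = aps.alias("a").join(
    gwpc.alias("g"),
    on=F.col("a.producer_id") == F.col("g.producer_id"),
    how="full_outer",
)

report = joined.select(
    F.coalesce(F.col("a.producer_id"), F.col("g.producer_id")).alias("producer_id"),
    F.when(F.col("a.producer_id").isNull(), "missing_in_aps")
     .when(F.col("g.producer_id").isNull(), "missing_in_gwpc")
     .when(F.col("a.license_number") != F.col("g.license_number"), "license_mismatch")
     .otherwise("matched")
     .alias("status"),
)

# Persist exceptions to a Delta table so business teams can review and remediate.
(report.filter(F.col("status") != "matched")
       .write.format("delta")
       .mode("overwrite")
       .saveAsTable("recon.producer_exceptions"))
```

In practice, checks like this would be parameterized per operating unit and attribute, but the pattern of join, compare, and publish exceptions is the core of the reconciliation work described above.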
Other
- 100% remote opportunity.
- Background check required.
- Drug screen required.
- Face-to-face interview not required.
- Candidate must be authorized to work without sponsorship.