Everest needs to design, build, and manage a modern data and analytics platform to support its global insurance business, requiring expertise in cutting-edge data technologies and architectural best practices.
Requirements
- Deep experience in Azure and Databricks
- In-depth experience with Medallion/Delta Lake architecture, ML/AI, and Azure cloud services, including Azure data technologies such as ADLS, ADF, Fabric, SQL, Synapse, and Power BI
- Strong experience implementing IaC automation, DevOps, RBAC/ABAC, MLOps, etc.
- Strong experience with orchestration tools such as Airflow and event-driven data integration tools such as Kafka
- Strong experience with programming languages and tools such as Python, PySpark, and SQL
- Proven success building scalable platforms in complex, global environments
- Experience deploying pipelines via Azure DevOps with code review and branching strategies
Responsibilities
- Architect and manage Everest’s Azure-based data platform using Databricks and cloud-native tools
- Define and enforce data architecture standards, blueprints, and best practices
- Lead adoption of emerging technologies through PoCs and pilot implementations
- Translate business and technical requirements into architectural blueprints, roadmaps, and solution designs that adhere to enterprise standards
- Provide technology leadership to delivery teams to build and operate data products effectively
- Design and develop reusable components including data ingestion for structured and unstructured data, data transformation, logging, and observability
- Drive MLOps and self-service analytics frameworks for model lifecycle management
Other
- Hybrid role based in our Warren, NJ headquarters (3 days onsite / 2 remote)
- Strong communication, leadership, and stakeholder engagement skills
- Certifications in Azure and Databricks preferred
- Insurance industry knowledge (especially P&C) is a strong plus
- All colleagues are held accountable for upholding and supporting our values and behaviors across the company