Zinnia is the leading technology platform for accelerating life and annuities growth. With innovative enterprise solutions and data insights, Zinnia simplifies the experience of buying, selling, and administering insurance products, all of which enables more people to protect their financial futures.
Requirements
- Strong expertise in data integration, transformation, and orchestration using Spark, Hive, and Airflow.
- Proficiency in Lakehouse platforms (Delta Lake, Apache Hudi, Apache Iceberg) and data warehousing concepts.
- Familiarity with cloud-based data environments (AWS, GCP, or Azure).
- In-depth understanding of scalable data pipelines, distributed computing, and modern data architectures.
- Programming proficiency in Scala or Python.
- Knowledge of data quality and governance principles and experience implementing them within the Big Data lifecycle.
Responsibilities
- Build the next-gen data infrastructure for Zinnia using Lakehouse frameworks, Apache Airflow, Apache Spark, and Hive.
- Design, build, and optimize data workflows across real-time, nearline, and offline data ecosystems.
- Leverage Lakehouse platforms (Delta Lake, Hudi, Iceberg) to enable unified batch and streaming pipelines.
- Collaborate with stakeholders and cross-functional teams to understand business requirements and translate them into scalable, data-driven technical solutions.
- Provide technical expertise in troubleshooting and resolving complex, distributed data-related issues.
- Stay up to date with Big Data, cloud, and Lakehouse trends, recommending best practices for data engineering and integration.
- Mentor and guide junior engineers, fostering a culture of innovation, automation, and continuous learning.
Other
- 10+ years of experience in Big Data engineering or a similar role, including proven leadership and project management.
- Excellent communication, leadership, and interpersonal skills, with the ability to collaborate across teams.
- Proven ability to adapt to changing priorities, manage multiple projects simultaneously, and deliver results in a fast-paced environment.
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Competitive compensation.