The Hartford, an insurance company, is seeking a Principal AI Data Engineer to modernize its data infrastructure and build the next generation of data pipelines that power its AI and machine learning initiatives.
Requirements
- Technical expertise in LLMs, AI platforms, prompt engineering, LLM optimization, Retrieval-Augmented Generation (RAG) architectures, and vector database technologies (e.g., Vertex AI, Postgres, OpenSearch, Pinecone).
- Strong experience with GCP (including Vertex AI) or AWS is required.
- Experience working in multi-cloud environments.
- Experience with LangChain, AI agents, Vertex AI, and the Google agent ecosystem.
- Strong experience designing and developing complex data ecosystems on a next-generation cloud technology stack spanning AWS or GCP and Snowflake.
- Mastery-level data engineering and architecture skills, including deep expertise in data architecture patterns, data warehouses, data integration, data lakes, data domains, data products, business intelligence, and cloud technology capabilities.
- 3+ years of AI/ML experience.
Responsibilities
- Serve as a mastery-level AI data engineering leader responsible for a large portfolio (e.g., a line of business) or for work spanning multiple LOBs.
- Lead the implementation of end-to-end AI data pipelines that bring structured, semi-structured, and unstructured data together to support AI and agentic solutions.
- Real-Time Data Streaming: Design, build, and maintain scalable real-time data pipelines for efficient ingestion, processing, and delivery.
- Drive best practices in AI data engineering by establishing standardized processes, promoting cutting-edge technologies, and ensuring data quality and compliance across the enterprise.
- Data and Analytics Management: Oversee the design, development, and maintenance of data pipelines, data warehouses, data lakes, and reporting systems.
- Apply expertise in data engineering practices and knowledge of AI technologies, and lead cross-functional teams.
- Drive Efficiency and Productivity: Identify and champion developer productivity improvements across the end-to-end data management lifecycle.
Other
- Candidates must be authorized to work in the US without company sponsorship.
- 12 years of experience in data engineering, data management, and building large-scale data ecosystems.
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Exceptional presentation and verbal/written communication skills; must be able to communicate effectively at all levels across the organization.
- Ability to lead successfully in a lean, agile, and fast-paced organization, leveraging Scaled Agile principles and ways of working.