Enhance customer and agent experiences by designing, building, and deploying solutions that leverage large language models (LLMs), Generative AI, and natural language processing (NLP). Develop intelligent automation and AI-powered services, supporting teams in the responsible and effective use of AI technologies.
Requirements
- Experience building and deploying LLMs, transformers, or GenAI tools.
- Hands-on knowledge of Python and tools like TensorFlow, PyTorch, or scikit-learn.
- Hands-on experience with Hugging Face, OpenAI APIs, AWS Bedrock, LangChain, or other GenAI platforms.
- Experience building AI solutions in AWS, Azure, or GCP using SageMaker, Azure ML, or Vertex AI.
- Familiar with data engineering tools like Apache Spark, Kafka, and Airflow.
- Experience working with modern data stack technologies such as Snowflake, Redshift, Delta Lake, S3, Azure Data Lake, and BigQuery.
- Some experience with DevOps practices such as CI/CD pipelines, Docker, and Kubernetes.
Responsibilities
- Ingest and preprocess diverse file types: PDFs, scanned images, emails (EML/MSG), audio, video, and structured/unstructured text from content management systems.
- Apply OCR (Optical Character Recognition) and speech-to-text models to extract meaningful data from documents and media.
- Use LangChain and LangGraph to orchestrate agentic workflows for parsing and reasoning across multimodal inputs.
- Build AI pipelines that classify, extract, and validate key entities (e.g., policy numbers, claim dates, insured parties) from documents.
- Integrate LLMs via Bedrock or Hugging Face to summarize, interpret, and flag anomalies in claims and underwriting documents.
- Implement retrieval-augmented generation (RAG) using vector databases to ground LLM responses in enterprise knowledge.
- Train and fine-tune models in SageMaker using custom datasets and embeddings.
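The entity-extraction responsibility above can be illustrated with a deliberately simple sketch. This is a hedged example, not this team's pipeline: a production system would use an LLM or a named-entity-recognition model, and the `POL-` policy-number format and ISO date format are assumptions made purely for illustration. The sketch shows the extract-then-validate shape of such a pipeline.

```python
import re
from datetime import date

# Hypothetical formats, assumed for illustration only:
# policy numbers like "POL-123456", claim dates in ISO form (YYYY-MM-DD).
POLICY_RE = re.compile(r"\bPOL-\d{6}\b")
DATE_RE = re.compile(r"\b(\d{4})-(\d{2})-(\d{2})\b")

def extract_entities(text: str) -> dict:
    """Extract and validate key entities from a claims document."""
    policies = POLICY_RE.findall(text)
    dates = []
    for y, m, d in DATE_RE.findall(text):
        try:
            # Constructing a date object validates it; impossible
            # dates such as 2024-13-40 raise ValueError and are skipped.
            dates.append(date(int(y), int(m), int(d)))
        except ValueError:
            pass
    return {"policy_numbers": policies, "claim_dates": dates}

if __name__ == "__main__":
    doc = "Claim filed 2024-06-02 against policy POL-482913; follow-up 2024-13-40."
    print(extract_entities(doc))
```

In a real pipeline the regex stage would typically be a first pass, with an LLM handling the fuzzier cases (handwritten dates, OCR noise) that rigid patterns miss.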
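The retrieval-augmented generation step above can also be sketched in miniature. This is a hedged illustration, not a reference implementation: it uses a toy in-memory "vector store" and a stand-in `embed` function based on term frequencies, where a real system would call an embedding model (for example, one served via Bedrock or Hugging Face) and a managed vector database.

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Stand-in embedding: a bag-of-words term-frequency vector.
    # A production pipeline would call a real embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list) -> str:
    # Ground the LLM by prepending retrieved context to the question.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs, k=2))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

if __name__ == "__main__":
    kb = [
        "Policy 12345 was issued to Jane Doe on 2023-01-15.",
        "Claim 98765 concerns water damage reported on 2024-06-02.",
        "Underwriting guidelines require proof of prior coverage.",
    ]
    print(build_prompt("When was claim 98765 reported?", kb))
```

The design point is the grounding step: instead of asking the model to answer from its parametric memory, the prompt carries the retrieved enterprise documents, which reduces hallucination and keeps answers auditable.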
Other
- Success in this role requires creativity, a passion for learning, and the ability to clearly explain complex ideas.
- You’ll join a team that values curiosity, open communication, and a strong commitment to responsible AI, working together to create smart, impactful tools that make a real difference.
- Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field, or equivalent work experience.
- 8+ years in software, data, or AI engineering, with 3+ years working directly with AI/ML models.
- Comfortable working with teams to understand business needs and explain how AI can help.