Optum is looking to automate and improve medical coding processes at scale within its SaaS platform by integrating cutting-edge machine learning models, including large language models (LLMs).
Requirements
- 7+ years of experience building and operating machine learning systems in production environments, with proven impact
- 5+ years of hands-on experience working with Azure cloud services
- Solid software engineering fundamentals, including production-grade coding in Python and/or C#, testing, and performance profiling
- Expertise in supervised learning, feature engineering, evaluation methodologies, and deep learning or gradient boosting techniques
- Experience with privacy, security, and responsible AI practices, including GDPR, CCPA, PII handling, and fairness considerations
- Proficiency in data engineering tasks such as ETL/ELT, SQL, distributed processing (e.g., Spark), and feature pipeline management
- Knowledge of LLM operations, including prompt engineering, retrieval-augmented generation, fine-tuning, and safety measures
- Expertise in MLOps best practices, including CI/CD pipelines, containerization, Kubernetes, model registry, and monitoring
- Experience designing and analyzing A/B tests, defining guardrail metrics, and applying causal inference techniques
Responsibilities
- Lead end-to-end machine learning projects, including problem definition, data strategy, feature engineering, modeling, evaluation, deployment, and ongoing monitoring
- Architect scalable training and inference systems with clear SLAs, high observability, and cost efficiency
- Establish rigorous experimentation protocols, including offline evaluation, A/B testing, and causal analysis to validate model performance
- Drive MLOps excellence by implementing reproducible pipelines, model registry governance, automated retraining, and drift detection
- Collaborate with product and design teams to translate ambiguous business goals into measurable ML problems, defining success metrics and attribution models
- Partner with data engineering teams to develop feature pipelines and data contracts, and to ensure data quality and parity between online and offline environments
- Design, develop, and deploy AI-powered solutions that translate business needs into scalable applications, enhancing workflows and decision-making processes
Other
- Strong communication skills with the ability to translate complex technical concepts to non-technical stakeholders
- Mentor engineering teams, conduct code and design reviews, and set standards for reliability, documentation, and testing
- Communicate trade-offs, insights, and results effectively to both technical and non-technical stakeholders, influencing product roadmaps and priorities
- BS/MS/PhD in Computer Science, Engineering, Statistics, or a related field, or 4+ years of equivalent practical experience
- Open-source contributions, publications, or patents are a plus