Develop and evaluate scalable safeguards for foundation models, with a focus on large language and multi-modal models (LLMs/LMMs), to design, deploy, and monitor trustworthy AI systems across a broad range of Oracle products.
Requirements
- Ph.D. in Computer Science, Machine Learning, NLP, or a related field, with publications in top-tier AI/ML conferences or journals
- Hands-on experience with LLMs including fine-tuning, evaluation, and prompt engineering
- Demonstrated expertise in building or evaluating Responsible AI systems (e.g., fairness, safety, interpretability)
- Proficiency in Python and ML/DL frameworks such as PyTorch or TensorFlow
- Strong understanding of model evaluation techniques and metrics related to bias, robustness, and toxicity
- Experience with RLHF (Reinforcement Learning from Human Feedback) or other alignment methods
- Open-source contributions in the AI/ML community
Responsibilities
- Conduct cutting-edge research and development in Responsible AI, including fairness, robustness, explainability, and safety for generative models
- Design and implement safeguards, red teaming pipelines, and bias mitigation strategies for LLMs and other foundation models
- Contribute to the fine-tuning and alignment of LLMs using techniques such as prompt engineering, instruction tuning, and RLHF/DPO
- Define and implement rigorous evaluation protocols (e.g., bias audits, toxicity analysis, robustness benchmarks)
- Collaborate cross-functionally with product, policy, legal, and engineering teams to ensure Responsible AI principles are embedded throughout the model lifecycle
- Publish in top-tier venues (e.g., NeurIPS, ICML, ICLR, ACL, CVPR) and represent the company in academic and industry forums
Other
- US hiring range: $120,100 to $251,600 USD per year
- May be eligible for bonus, equity, and compensation deferral
- Comprehensive benefits package including medical, dental, and vision insurance
- 11 paid holidays and paid sick leave