Datadog's ML Observability team builds cutting-edge tools to monitor, explain, and improve AI systems in production, particularly those powered by Large Language Models (LLMs) and generative AI. The Staff Engineer will lead the development of new features and foundational capabilities within Datadog's LLM Observability product, shaping product direction and solving open-ended problems in a fast-moving AI landscape.
Requirements
- Deep understanding of distributed systems and scalable backend architectures
- Hands-on experience building and shipping LLM-powered or GenAI applications
- Understanding of model internals, inference pipelines, evaluation techniques, and prompt engineering
- Experience with observability tools and platforms
Responsibilities
- Drive the design and implementation of LLM observability features
- Ideate, prototype, and scale new product features to provide insights and drive improvements for generative AI systems
- Develop and extend tools for tracing, evaluating, and debugging LLMs
- Influence architecture decisions and mentor engineers to build resilient, high-performance systems
- Stay current with industry trends and advancements in machine learning and observability, driving innovation within the team
Other
- Work cross-functionally with other engineering teams, product, UX, and applied science to iterate quickly and find product-market fit
- Stay close to customer pain points and use those insights to guide product and engineering priorities
- Thrive in ambiguous, fast-changing spaces and bring a product-oriented mindset
- Shape the next generation of AI observability tools from the ground up
- Communicate clearly, think rigorously, and take pride in clean, maintainable code