Visa is looking to industrialize AI and operationalize the delivery of AI and decision intelligence to ensure ongoing business value. This role centers on building a high-performance, low-latency model inference engine and ensuring its rock-solid reliability through a best-in-class observability platform.
Requirements
- Strong programming proficiency in at least one of the following languages: Rust, C++, Go, or Java.
- Experience with performance optimization.
- Hands-on experience with containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Familiarity with observability tools (e.g., Prometheus, Grafana, OpenTelemetry) and cloud platforms (AWS, GCP, Azure); a small instrumentation sketch follows this list.
- Knowledge of network programming and of RPC frameworks such as gRPC with Protocol Buffers.
- Experience working on large-scale distributed systems.
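To make the observability requirement above concrete, here is a minimal, illustrative Go sketch (not code from this team: the endpoint, metric name, and bucket layout are assumptions) that instruments a stubbed inference handler with a Prometheus latency histogram via the prometheus/client_golang library:

```go
package main

import (
	"math/rand"
	"net/http"
	"time"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promauto"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

// Histogram of per-request inference latency. The metric name and
// bucket layout here are illustrative assumptions, not a real schema.
var inferenceLatency = promauto.NewHistogram(prometheus.HistogramOpts{
	Name:    "model_inference_latency_seconds",
	Help:    "Latency of model inference requests.",
	Buckets: prometheus.ExponentialBuckets(0.001, 2, 12), // 1 ms .. ~4 s
})

// predictHandler stands in for a real low-latency inference call.
func predictHandler(w http.ResponseWriter, r *http.Request) {
	start := time.Now()
	time.Sleep(time.Duration(rand.Intn(20)) * time.Millisecond) // simulated model work
	inferenceLatency.Observe(time.Since(start).Seconds())
	w.Write([]byte("ok\n"))
}

func main() {
	http.HandleFunc("/predict", predictHandler)
	http.Handle("/metrics", promhttp.Handler()) // endpoint scraped by Prometheus
	http.ListenAndServe(":8080", nil)
}
```

A dashboard tool such as Grafana would then chart this histogram's quantiles to surface latency regressions.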
Responsibilities
- Own the full lifecycle of our machine learning systems: design, build, and maintain everything from the low-latency inference engine to the observability platform that supports it.
- Develop and optimize high-performance, mission-critical services using languages like Rust, Python, and Go.
- Enhance the reliability and visibility of our MLOps ecosystem by building and scaling solutions for monitoring, logging, and tracing; see the tracing sketch after this list.
- Collaborate closely with data scientists and ML engineers to deploy, scale, and troubleshoot machine learning models in production.
- Write clean, high-quality, and well-tested code, and participate in code reviews to raise the bar for the entire team.
- Diagnose and resolve performance bottlenecks and system failures in our production environment.
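For the tracing side of these responsibilities, a similarly hedged Go sketch (the service, span, and attribute names are invented for illustration) shows how OpenTelemetry can wrap an inference call in a span so that production latency and failures become diagnosable:

```go
package main

import (
	"context"
	"log"
	"time"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/attribute"
	"go.opentelemetry.io/otel/exporters/stdout/stdouttrace"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

func main() {
	// Print spans to stdout; a production deployment would use an OTLP
	// exporter pointed at a collector instead.
	exp, err := stdouttrace.New(stdouttrace.WithPrettyPrint())
	if err != nil {
		log.Fatal(err)
	}
	tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exp))
	defer func() { _ = tp.Shutdown(context.Background()) }()
	otel.SetTracerProvider(tp)

	tracer := otel.Tracer("inference-engine") // instrumentation scope name (assumed)

	// Wrap a stubbed inference call in a span so its latency and
	// metadata become visible to the tracing backend.
	_, span := tracer.Start(context.Background(), "model.predict")
	span.SetAttributes(attribute.String("model.name", "example-model")) // hypothetical attribute
	time.Sleep(5 * time.Millisecond)                                    // stand-in for real model work
	span.End()
}
```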
Other
- 2 or more years of work experience with a Bachelor’s Degree or an Advanced Degree (e.g., Master’s, MBA, JD, MD).
- Travel 5-10% of the time.
- Work in an office setting, with the ability to sit and stand at a desk, communicate in person and by telephone, and frequently operate standard office equipment.
- Must be willing to work in a hybrid position based in Austin, TX, with at least 50% office presence.
- Must be eligible to work in the United States.