eBay builds and operates core data pipeline infrastructure to process and manage the high-velocity flow of data within its marketplace, enabling near real-time buyer experiences, seller insights, and a data-driven commerce business.
Requirements
- Strong proficiency in Java and common design patterns.
- Hands-on experience with streaming and messaging technologies such as Apache Kafka, Flink, and Pulsar.
- Familiarity with monitoring and observability tools such as Grafana, Prometheus, and the ELK stack.
- Experience with Kafka and Flink cluster operations is a significant plus.
- Strong knowledge of Kubernetes and containerized environments.
- Familiarity with databases such as Oracle, MySQL, and Redis.
- Deep understanding of distributed system design principles, including high availability, scalability, and fault tolerance.
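The streaming and messaging requirements above revolve around producer/consumer semantics: publishers emit events to named topics, and subscribers receive them asynchronously. As an illustrative sketch only (not eBay's actual stack, and with hypothetical class and method names), here is a minimal in-memory publish/subscribe broker in plain Java; systems like Apache Kafka, Flink, and Pulsar layer partitioning, replication, and durable storage on top of these basic semantics.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;

/**
 * Minimal in-memory publish/subscribe broker. Illustrative only:
 * production messaging platforms add partitioning, replication,
 * consumer groups, and durable logs on top of this core idea.
 */
public class TinyBroker {
    // topic name -> list of subscriber callbacks
    private final Map<String, List<Consumer<String>>> subscribers = new ConcurrentHashMap<>();

    /** Register a callback to receive every message published to a topic. */
    public void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new CopyOnWriteArrayList<>()).add(handler);
    }

    /** Deliver a message to all current subscribers of the topic. */
    public void publish(String topic, String message) {
        subscribers.getOrDefault(topic, List.of()).forEach(h -> h.accept(message));
    }

    public static void main(String[] args) {
        TinyBroker broker = new TinyBroker();
        StringBuilder received = new StringBuilder();
        broker.subscribe("orders", received::append);
        broker.publish("orders", "order-1;");
        broker.publish("orders", "order-2;");
        System.out.println(received); // order-1;order-2;
    }
}
```

The thread-safe collections (`ConcurrentHashMap`, `CopyOnWriteArrayList`) hint at why the role emphasizes distributed-system design: even this toy broker must handle concurrent publishers and subscribers safely.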
Responsibilities
- Build, operate, and continuously optimize eBay’s messaging and streaming platform, delivering reliability, scalability, and high performance at global scale.
- Develop and implement new functionalities on the platform and automation tools to boost system resilience and improve developer efficiency.
- Troubleshoot and resolve complex production issues with a focus on minimizing downtime and maintaining business continuity.
- Strengthen system monitoring, logging, and alerting to ensure proactive detection and resolution of problems.
- Create and maintain comprehensive documentation, including system designs, operational runbooks, and best practices to support long-term platform health.
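The monitoring and alerting responsibility above boils down to tracking a health signal and firing proactively when it degrades. As a minimal sketch under assumed parameters (the class name, window size, and threshold are hypothetical, not from the posting), here is a sliding-window error-rate monitor in Java; in production this signal would typically come from Prometheus metrics with alert rules and Grafana dashboards.

```java
import java.util.ArrayDeque;
import java.util.Deque;

/**
 * Sliding-window error-rate monitor: tracks the outcomes of the last
 * N requests and reports whether the failure rate has reached a
 * threshold. Illustrative only; real alerting would be driven by
 * Prometheus alert rules rather than in-process bookkeeping.
 */
public class ErrorRateMonitor {
    private final int windowSize;
    private final double threshold;                          // e.g. 0.5 = alert at 50% failures
    private final Deque<Boolean> outcomes = new ArrayDeque<>(); // true = failure
    private int failures = 0;

    public ErrorRateMonitor(int windowSize, double threshold) {
        this.windowSize = windowSize;
        this.threshold = threshold;
    }

    /** Record one request outcome; return true if the alert should fire. */
    public boolean record(boolean failed) {
        outcomes.addLast(failed);
        if (failed) failures++;
        // Evict the oldest outcome once the window is full.
        if (outcomes.size() > windowSize && outcomes.removeFirst()) failures--;
        return (double) failures / outcomes.size() >= threshold;
    }

    public static void main(String[] args) {
        ErrorRateMonitor monitor = new ErrorRateMonitor(4, 0.5);
        System.out.println(monitor.record(false)); // 0/1 -> false
        System.out.println(monitor.record(true));  // 1/2 -> true
    }
}
```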
Other
- A Master’s degree in Computer Science or a related field (or equivalent experience).
- Proven problem-solving skills and expertise in troubleshooting production issues.