Amplitude is tackling hard infrastructure challenges for its data analytics platform: designing for extreme throughput, optimizing systems for millisecond latencies, and building resilient systems that achieve near-zero downtime. The company also aims to increase the value of customer data by powering more workflows outside of Amplitude and by building a next-generation analytics experience that delivers instant product insights.
Requirements
- Strong computer science fundamentals (data structures, algorithms, software design).
- Solid programming skills in at least one modern language (Java, Go, or Python preferred).
- An eagerness to learn about distributed systems, large-scale data processing, and data products.
- Prior internship or project experience with backend systems, data pipelines, or cloud services.
- Familiarity with data tools like dbt, Temporal, or Apache Iceberg.
- Experience working with cloud data warehouses (Snowflake, BigQuery, Redshift).
Responsibilities
- Contribute to backend and data product features that integrate with cloud data warehouses.
- Design and build systems that are reliable, performant, and scalable as we grow.
- Collaborate with Product, Design, and other engineers to bring ideas from concept to implementation.
- Participate in code reviews and design discussions to grow your skills and share knowledge.
- Learn modern data engineering tools and best practices while contributing to real production systems.
Other
- A degree in Computer Science or related technical field, or equivalent practical experience.
- A passion for solving challenging technical problems and working collaboratively.
- Contributions to team projects, open-source code, or technical communities.