Airbnb is looking for a Data Warehouse Infrastructure Engineer to design and build the next-generation big data compute platform that powers data ETL, analytics, and machine learning at Airbnb.
Requirements
- Proficiency in big data technologies such as Hadoop, YARN, Spark, Presto/Trino, Hive, and AWS EMR (see the sketch after this list).
- Strong programming skills in languages such as Java and Scala.
- Extensive experience in designing, building, and maintaining scalable, fault-tolerant distributed systems.
- Demonstrated expertise in multi-threaded and concurrent programming.
- Familiarity with database systems, both SQL and NoSQL.
- Capacity to troubleshoot and resolve complex data infrastructure problems.
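As a rough illustration of the kind of Spark and Scala work the stack above involves, here is a minimal batch ETL sketch; the application name, S3 paths, and column names (listing_id, event_count) are hypothetical placeholders, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, count, lit}

// Minimal illustrative Spark batch ETL job. The app name, paths, and
// column names are hypothetical placeholders.
object ListingEventCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("listing-event-counts")
      .getOrCreate()

    // Read raw event data (Parquet assumed), aggregate events per listing,
    // and write the result back to a warehouse location.
    spark.read.parquet("s3://example-bucket/raw/events/")
      .groupBy(col("listing_id"))
      .agg(count(lit(1)).as("event_count"))
      .write
      .mode("overwrite")
      .parquet("s3://example-bucket/warehouse/listing_event_counts/")

    spark.stop()
  }
}
```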
Responsibilities
- Design and build the next-generation big data compute platform that powers data ETL, analytics, and machine learning at Airbnb.
- Operate, manage, and improve the reliability, performance, observability, and cost efficiency of the platform.
- Write maintainable, self-documenting code and perform code reviews.
- Work on and contribute to open-source software, and have industry impact.
Other
- 10+ years of experience working with data infrastructure, with a focus on big data technologies.
- Proven ability to collaborate with other teams to define system requirements, identify potential solutions, and test and integrate systems.
- Strong communication skills, both written and verbal.
- Ability to work effectively in a team environment.
- US - Remote Eligible