The team is responsible for discovering insights from large amounts of data, and its infrastructure demands innovative ideas to improve performance and ease of use. This role focuses on the challenges of building and maintaining a large-scale analytics infrastructure.
Requirements
- Proficiency with distributed compute & storage technologies (e.g., Spark, Flink, HDFS)
- Proficiency in designing ETL pipelines and using workflow automation services (e.g., Airflow)
- At least 2 years of programming experience in Python, Java, Swift, or C++
- Experience designing and developing production-level software
Responsibilities
- Build high-throughput data ingestion and real-time analytics pipelines.
- Design APIs and build REST services to deliver insights to stakeholders.
- Instrument operational telemetry and build dashboards to monitor system health.
- Provide meaningful insights to teams and influence decisions across Apple on a broad range of products.
- Write production-level software.
- Provision, install, configure, operate, and maintain the team's analytics infrastructure.
- Conduct research and development to enable continued innovation and progress within the infrastructure.
Other
- Excellent programming and problem-solving skills
- Self-motivated and able to work independently
- Strong analytical thinking
- Excellent spoken and written communication skills
- Experience driving cross-functional projects with diverse sets of stakeholders