Apple is looking to personalize the user experience by developing intelligent systems that understand location context and help users achieve their goals, wherever they are. This involves inferring device patterns using various data sources and applying machine learning to deliver rich contextual intelligence within resource constraints.
Requirements
- Machine learning algorithms: Strong grasp of supervised/unsupervised learning, regression, classification, clustering, and model evaluation techniques (see the sketch after this list).
- Data processing: Skilled in working with large, noisy datasets.
- Experience with libraries like NumPy, pandas, scikit-learn, and PyTorch or TensorFlow.
- Experience shipping production software for mobile and/or other resource-constrained devices.
- Ability to create, analyze, and modify software functionality, ideally in C++/Obj-C/Swift codebases.
- Hands-on experience with applied probability, statistics, and empirical and/or ML algorithms.
- Relevant background includes classical estimation, signal processing, and/or training supervised ML models.
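As a purely illustrative sketch of the supervised-learning and model-evaluation workflow referenced above, the snippet below fits a scikit-learn classifier on synthetic noisy data and evaluates it with cross-validation and a held-out report. The dataset, model choice, and hyperparameters are all hypothetical and exist only to show the kind of workflow involved.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Synthetic "noisy" features: two classes with overlapping distributions.
n = 1000
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(n // 2, 4)),
    rng.normal(loc=0.8, scale=1.2, size=(n // 2, 4)),
])
y = np.concatenate([np.zeros(n // 2), np.ones(n // 2)])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)

# Model evaluation: cross-validation on the training split, then a held-out report.
cv_scores = cross_val_score(clf, X_train, y_train, cv=5)
print(f"5-fold CV accuracy: {cv_scores.mean():.3f} +/- {cv_scores.std():.3f}")

clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```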
Responsibilities
- Design, build, and evaluate production ML systems that infer a device's patterns from data such as GPS, Wi-Fi, and accelerometer signals, as well as higher-level semantic signals, combining estimation techniques with machine learning (see the sketch after this list).
- Test and refine your work, use it yourself, track metrics, and iterate for quality.
- Identify behavioral patterns.
- Shape new and enhanced user experiences by collaborating closely with teams across sensing, Siri, and apps.
- Take ownership of complex, end-to-end problems.
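The sketch below is a hypothetical, self-contained illustration of the estimation-plus-ML combination described in the first bullet: a classical smoothing step denoises a simulated accelerometer magnitude, windowed statistics become features, and a small supervised model labels each window (for example, still vs. moving). The signal model, window size, and labels are invented for illustration only and do not describe any actual system.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)

# Simulated accelerometer magnitude: alternating quiet and high-activity segments.
fs, seconds = 50, 120                                      # 50 Hz for two minutes
t = np.arange(fs * seconds) / fs
activity = (np.sin(2 * np.pi * t / 30) > 0).astype(float)  # hypothetical motion state
signal = 1.0 + activity * np.abs(np.sin(2 * np.pi * 2.0 * t)) + rng.normal(0, 0.2, t.size)

# Classical estimation step: exponential smoothing to suppress sensor noise.
alpha = 0.1
smoothed = np.empty_like(signal)
smoothed[0] = signal[0]
for i in range(1, signal.size):
    smoothed[i] = alpha * signal[i] + (1 - alpha) * smoothed[i - 1]

# Feature extraction over 2-second windows: mean and variance of the smoothed signal.
win = 2 * fs
n_win = signal.size // win
feats = np.array([[smoothed[i * win:(i + 1) * win].mean(),
                   smoothed[i * win:(i + 1) * win].var()] for i in range(n_win)])
labels = np.array([activity[i * win:(i + 1) * win].mean() > 0.5
                   for i in range(n_win)]).astype(int)

# ML step: a small supervised classifier over the estimated features.
X_tr, X_te, y_tr, y_te = train_test_split(feats, labels, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {accuracy_score(y_te, model.predict(X_te)):.2f}")
```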
Other
- Laser focus on customer impact and product experience.
- Excellent verbal and written communication.
- You succeed in collaborative environments and are comfortable with what will sometimes feel like a high degree of uncertainty.
- You can innovate within tight memory, CPU, and schedule constraints, and deliver on time.
- These constraints motivate you and ignite your creativity.