McDonald's aims to deliver fast, easy, and personalized experiences to its 65M+ daily customers by using technology, including AI, robotics, and emerging tech, to digitize the Golden Arches and improve its data product lifecycle, standards, and practices.
Requirements
- 10+ years of hands-on experience in data engineering, specifically with the AWS and GCP backend tech stacks, including but not limited to S3, Redshift, Glue, Lambda, GCS, BigQuery, Cloud Functions, Cloud Run, etc.
- 8+ years of technically focused experience in related technologies and languages (Web Services, Java, XML, S3, NoSQL, Kafka, Spark, etc.)
- Hands-on experience with event-, workflow-, and application-driven data architecture, as well as API and microservices management
- Working knowledge of cloud-based data systems, including AWS and GCP
- Seasoned technical leader with hands-on development and deployment experience and insight into the newest technology and development trends
- 7+ years of experience with DevOps and data reliability engineering
- 7+ years of hands-on experience with data modeling, ETL / ELT development, and data integration techniques (including real-time data integration and processing)
Responsibilities
- Builds and maintains relevant, reliable data products that support business needs; develops and implements new technology solutions as needed to drive ongoing improvement, with data reliability and observability in view
- Collaborates with Enterprise Architecture to launch scalable and reliable data solutions, support system integration efforts, and create operational efficiencies through automation and process improvements
- Participates in new software development engineering; helps define the business rules that determine data quality, assists the product owner in writing test scripts that validate those rules, and performs detailed, rigorous testing to ensure data quality
- Develops a solid understanding of the technical details of data domains and a clear understanding of the business problems being solved
- Owns engineering modules and functionalities and supports them throughout the data development lifecycle
- Designs and develops end-to-end data pipelines and ETL processes to extract, transform, and load data from various sources into cloud data storage solutions (e.g., S3, Redshift, GCS, BigQuery)
- Implements and maintains scalable data architectures that support efficient data storage, retrieval, and processing
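To illustrate the kind of pipeline work these responsibilities describe, here is a minimal, self-contained extract-transform-load sketch. Everything in it is an assumption for illustration: the order fields, the embedded CSV source, the quality rule, and the in-memory sink are hypothetical, and a production pipeline would instead read from S3/GCS and load into Redshift or BigQuery.

```python
import csv
import io

# Hypothetical raw order data standing in for a source extract;
# a real pipeline would pull from S3/GCS or an operational database.
RAW_CSV = """order_id,store_id,total
1001,ST-01,12.50
1002,ST-02,
1003,ST-01,8.75
"""

def extract(raw: str) -> list[dict]:
    """Parse CSV text into row dicts (the 'extract' stage)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Apply a business rule (total must be present) and cast
    fields to typed values (the 'transform' stage)."""
    out = []
    for row in rows:
        if not row["total"]:
            continue  # data-quality rule: reject incomplete records
        out.append({
            "order_id": int(row["order_id"]),
            "store_id": row["store_id"],
            "total": float(row["total"]),
        })
    return out

def load(rows: list[dict], sink: list) -> None:
    """Append to an in-memory sink; a real 'load' stage would write
    to Redshift or BigQuery via a COPY job or client library."""
    sink.extend(rows)

warehouse: list[dict] = []
load(transform(extract(RAW_CSV)), warehouse)
print(len(warehouse))  # rows surviving the quality rule
```

Separating the three stages behind small functions keeps each one independently testable, which matches the posting's emphasis on rigorous testing of business rules.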
Other
- BS in Computer Science or related field is REQUIRED
- Ability to drive continuous data management quality (i.e. timeliness, completeness, accuracy) through defined and governed principles
- Ability to perform extensive data analysis (comparing multiple datasets) using a variety of tools
- Strong communication skills, with the ability to explain complex technical concepts and align the organization on decisions with stakeholders at all levels of expertise
- Active coach and mentor whose goal is to grow and maximize team members’ potential
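The data management quality dimensions named above (timeliness, completeness, accuracy) can each be expressed as a simple metric over a dataset. The sketch below is illustrative only: the record fields, the 24-hour freshness window, and the non-negative-total rule are assumptions, not rules from the posting.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical records; field names are illustrative.
records = [
    {"id": 1, "total": 12.5,
     "loaded_at": datetime.now(timezone.utc)},
    {"id": 2, "total": None,
     "loaded_at": datetime.now(timezone.utc) - timedelta(hours=30)},
]

def completeness(rows, field):
    """Fraction of rows with a non-null value for `field`."""
    return sum(r[field] is not None for r in rows) / len(rows)

def timeliness(rows, max_age=timedelta(hours=24)):
    """Fraction of rows loaded within the freshness window."""
    now = datetime.now(timezone.utc)
    return sum(now - r["loaded_at"] <= max_age for r in rows) / len(rows)

def accuracy(rows):
    """Fraction of non-null rows passing a range rule (total >= 0)."""
    valid = [r for r in rows if r["total"] is not None]
    return sum(r["total"] >= 0 for r in valid) / len(valid) if valid else 1.0
```

Computing these as fractions makes it easy to gate pipelines on governed thresholds (e.g., fail a load when completeness drops below an agreed level).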