Adtalem is looking to scale its data platform to meet ever-growing business needs by implementing data solutions that power strategic and tactical business decisions and support analytics and artificial intelligence (AI) operations.
Requirements
- 2+ years of experience with Google Cloud services such as BigQuery, Composer, GCS, Datastream, and Dataflow
- 6+ years of experience delivering data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
- Expert knowledge of SQL and Python programming
- Experience using Airflow as a workflow management tool and building custom operators to connect to, extract, and ingest data as needed (see the sketch after this list)
- Experience tuning queries for performance and scalability
- Experience with real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
- Experience working in an agile environment
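To make the Airflow requirement concrete, here is a minimal sketch of the kind of custom operator it describes: pulling JSON from a source API into GCS for a downstream BigQuery load. This is illustrative only, not part of the requirements; the endpoint, bucket, and operator name are hypothetical, and it assumes the apache-airflow-providers-google package is installed.

```python
import json

import requests
from airflow.models import BaseOperator
from airflow.providers.google.cloud.hooks.gcs import GCSHook


class ApiToGCSOperator(BaseOperator):
    """Illustrative operator: extract JSON records from an HTTP API into GCS."""

    def __init__(self, endpoint, bucket, object_name,
                 gcp_conn_id="google_cloud_default", **kwargs):
        super().__init__(**kwargs)
        self.endpoint = endpoint  # hypothetical source API URL
        self.bucket = bucket
        self.object_name = object_name
        self.gcp_conn_id = gcp_conn_id

    def execute(self, context):
        # Pull source data; a production operator would add auth, paging, retries.
        response = requests.get(self.endpoint, timeout=60)
        response.raise_for_status()
        records = response.json()

        # Write newline-delimited JSON, the format BigQuery loads natively.
        GCSHook(gcp_conn_id=self.gcp_conn_id).upload(
            bucket_name=self.bucket,
            object_name=self.object_name,
            data="\n".join(json.dumps(r) for r in records),
            mime_type="application/json",
        )
        self.log.info("Wrote %d records to gs://%s/%s",
                      len(records), self.bucket, self.object_name)
```

In a DAG, a task like this would typically feed a GCSToBigQueryOperator load step.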
Responsibilities
- Work closely with business, IT, analyst, and data science groups to collect business requirements.
- Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound.
- Model data by applying business logic and building objects in the semantic layer of the data platform.
- Optimize data pipelines for performance, scalability, and reliability.
- Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
- Develop REST APIs to expose data to other teams within the company (see the sketch after this list).
- Troubleshoot and resolve data engineering issues as they arise.
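As a hedged sketch of the REST API responsibility above, the snippet below exposes a parameterized BigQuery query through a minimal FastAPI endpoint. FastAPI and the `analytics.enrollments` table are assumptions chosen for illustration, not a statement of Adtalem's actual stack or schema.

```python
from fastapi import FastAPI
from google.cloud import bigquery

app = FastAPI()
bq = bigquery.Client()  # uses application-default credentials


@app.get("/enrollments/{term}")
def enrollments_by_program(term: str):
    # Query parameters guard against SQL injection; the table is hypothetical.
    job = bq.query(
        """
        SELECT program, COUNT(*) AS students
        FROM `analytics.enrollments`
        WHERE term = @term
        GROUP BY program
        """,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("term", "STRING", term)
            ]
        ),
    )
    return [dict(row) for row in job.result()]
```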
Other
- Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
- Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
- Excellent organizational, prioritization, and analytical abilities
- Proven experience with incremental execution, demonstrated through successful launches
- Excellent problem-solving and critical-thinking skills to recognize and understand complex data issues affecting the business