Dentsu is looking to solve data warehousing and engineering problems using Azure Data Factory and other Azure tools.
Requirements
- 9+ years of experience in data warehousing/engineering.
- 3+ years of experience in Azure Data Factory architecture and implementation (migration or new implementation).
- Experience with ADF components: Pipelines, Datasets, Linked Services, Integration Runtime, Data Flows, and Triggers.
- Proven experience in building and managing global data warehouse solutions, integrating data from multiple countries and ensuring localization and compliance.
- Experience with the Azure tool stack.
- Experience with Python, PySpark, Kafka, and Kinesis for data processing and scripting.
- Familiarity with Azure Synapse Analytics, Azure Data Lake, and Azure Key Vault.
Responsibilities
- You will architect, design, and build cloud solutions using Azure Data Factory.
- You will design and build global data warehouse solutions, ensuring data consistency, quality, and compliance across international data sources.
- You will develop and optimize ETL/ELT workflows and their CI/CD processes using ADF Pipelines, Data Flows, Linked Services, Integration Runtimes, and Triggers.
- You will use PySpark, Kafka, Kinesis, and Python for data transformation, cleansing, and enrichment tasks within Azure Synapse or Databricks environments.
- You will collaborate with cross-functional teams to define data architecture standards, governance, and best practices.
- You will provide technical leadership and mentorship to junior engineers.
- You will ensure performance tuning, monitoring, and troubleshooting of data pipelines and workflows.
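The transformation, cleansing, and enrichment work described above can be sketched in plain Python; at scale the same logic would be expressed as PySpark DataFrame operations in Synapse or Databricks. All names here (`clean_records`, the field names `id` and `country`) are illustrative, not taken from any actual Dentsu pipeline.

```python
from datetime import datetime, timezone

def clean_records(records):
    """Deduplicate by id, drop rows missing required fields, and enrich
    each surviving row with a load timestamp and a normalized country
    code -- a typical cleansing/enrichment step for a global warehouse."""
    seen = set()
    cleaned = []
    for row in records:
        if row.get("id") is None or row.get("country") is None:
            continue  # drop incomplete rows
        if row["id"] in seen:
            continue  # deduplicate on primary key
        seen.add(row["id"])
        cleaned.append({
            **row,
            # normalize country codes for cross-country consistency
            "country": row["country"].strip().upper(),
            # enrichment: record when the row was loaded
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return cleaned

raw = [
    {"id": 1, "country": " us ", "amount": 10.0},
    {"id": 1, "country": "us", "amount": 10.0},   # duplicate id
    {"id": 2, "country": None, "amount": 5.0},    # missing country
    {"id": 3, "country": "de", "amount": 7.5},
]
print([r["country"] for r in clean_records(raw)])  # → ['US', 'DE']
```

In PySpark the same steps would map to `dropna`, `dropDuplicates`, and `withColumn` calls, which Spark can distribute across the cluster.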
Other
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Travel requirements not specified.
- Must be eligible to work in the USA.
- Dentsu is committed to providing equal employment opportunities to all applicants and employees.
- Dentsu is committed to providing reasonable accommodation to, among others, individuals with disabilities and disabled veterans.