Dentsu is looking to deliver sophisticated data integration solutions on time and within budget, build efficient real-time and batch data processing pipelines, and strengthen client relationships through high-quality technical solutions.
Requirements
- Experience with cloud data platforms (AWS, GCP, Azure) and big data technologies
- 4+ years of experience with ETL tools (Talend, Informatica, SSIS, or DataStage)
- 4+ years of experience with database technologies (SQL Server, Oracle, or other major RDBMS)
- Advanced knowledge of ETL architecture, processes, and best practices
- Expert-level SQL and database programming skills
- Strong experience designing and implementing real-time data ingestion systems using APIs
- Proficiency with file transfer protocols and security mechanisms (SFTP, PGP encryption)
Responsibilities
- Own end-to-end technical architecture and delivery of data engineering solutions
- Design and implement scalable ETL architectures and data pipelines
- Develop and optimize SQL procedures and ETL processes for batch and real-time data processing
- Perform data modeling and schema design for efficient data processing
- Create robust API integrations for real-time data ingestion from various sources
- Review code and engineering solutions for performance, scalability, and compliance with standards
- Create comprehensive technical documentation for data pipelines and integrations
Other
- Lead a team of data engineers in a matrix organization
- Provide mentoring in ETL best practices and data engineering methodologies
- Estimate work effort and assist Project Managers with task planning and resource allocation
- Collaborate directly with clients on technical requirements and solution design
- Support the professional development of data engineers across the organization