Exadel is looking to deliver enterprise-scale, cloud-native data engineering solutions for its Fortune 500 clients. The role calls for leadership in technical strategy, architecture, and delivery, as well as for driving engineering best practices and automation.
Requirements
- 8+ years of experience in Data Engineering, Big Data, or Cloud Data delivery, with 2–4+ years in technical leadership or team lead roles
- Proven track record in:
  - Delivering enterprise-scale data platforms, pipelines, and transformations
  - Leading small teams or squads within consulting or enterprise project settings
- Strong technical skills in:
  - Data ingestion frameworks (e.g., Kafka, Event Hubs, Pub/Sub)
  - ETL/ELT tools (e.g., dbt, Spark, ADF, Glue)
  - Data storage platforms (e.g., Delta Lake, Snowflake, BigQuery, Synapse)
- Programming: Python, SQL, and optionally Scala or Java
Responsibilities
- Contribute to the execution of the Data Engineering strategy and operating model, supporting client delivery, innovation, and engineering best practices
- Lead and mentor a small, high-performing team of Data Engineers, DevOps Engineers, and Cloud Engineers (team size: ~3–8) within projects or a functional group
- Lead the hands-on design and delivery of enterprise-scale, cloud-native data engineering solutions across:
  - Data Ingestion (batch, streaming, real-time)
  - Data Transformation (ETL/ELT, orchestration, compute frameworks)
  - Data Storage (Data Lakes, Lakehouse, Warehouses, NoSQL)
  - Data Consumption (APIs, BI tools, AI/ML platforms)
- Advocate for and implement modern data engineering patterns such as Data Mesh principles, DataOps, Event-Driven Architectures, and Serverless Pipelines
- Collaborate closely with Architects, Data Governance, AI/ML, and Consulting teams to ensure solutions are aligned with security, performance, and business goals
- Drive the adoption of engineering best practices in coding, DevOps, CI/CD, and DataOps across project teams
- Support the evaluation and application of modern tools and frameworks, including:
  - Apache Spark, Databricks, Snowflake, BigQuery, Synapse
  - dbt, Kafka, Airflow, Fivetran, Informatica, Matillion
  - Cloud platforms (Azure, AWS, GCP)
  - Infrastructure as Code (Terraform, Pulumi)
Other
- Foster a culture of technical excellence, collaboration, and continuous improvement within engineering teams
- Strong communicator and collaborator, able to engage both technical teams and business stakeholders
- Committed to mentoring peers, sharing knowledge, and continuously improving delivery capability
- Upper-Intermediate English level