
Senior Data Architect & Engineer - Azure Databricks Platform

Corteva

$126,610 - $158,260
Sep 12, 2025
Des Moines, IA, USA

Corteva Agriscience is seeking a Data Architect & Engineer to lead the design, development, and implementation of scalable data models and pipelines in Azure Databricks. This role is instrumental in building a high-performance enterprise data lakehouse supporting commercial, production, and finance domains, which will serve as the foundation for data-driven decisions, advanced analytics, and AI model development across the organization.

Requirements

  • Hands-on experience with Databricks on Azure, including Delta Lake tables and Unity Catalog.
  • Proven expertise in data modeling (dimensional) and pipeline development (batch and streaming) for cross-functional enterprise data.
  • Proven experience with big data environments and familiarity with modern data formats (e.g., Parquet, Avro) and open table formats (e.g., Delta Lake, Apache Iceberg); see the brief sketch after this list.
  • Proficient in SQL, PySpark, dbt and Kafka for data engineering and transformation workflows.
  • Deep understanding of Azure ecosystem (e.g., ADLS Gen2, Synapse, ADF, Key Vault).
  • Experience with version control and CI/CD practices for data projects.
  • Strong knowledge of modern enterprise data architectures (including data warehouses, data lakehouses, data fabric, and data mesh) with an understanding of their trade-offs.
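
For illustration only, and not part of the original posting: a minimal PySpark sketch of the kind of Delta Lake and Unity Catalog work the requirements above describe. It assumes a Databricks or Spark environment with Delta Lake enabled; the storage path, catalog, schema, table, and column names are hypothetical placeholders.

  # Minimal sketch, assuming a Databricks/Spark session with Delta Lake configured.
  # The path, table, and column names below are hypothetical placeholders.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.getOrCreate()  # on Databricks, a session is already provided

  # Read raw Parquet files landed in ADLS Gen2 (illustrative abfss:// path).
  orders_raw = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/orders/")

  # Light batch transformation, then persist as a Delta table registered in Unity Catalog.
  (orders_raw
      .withColumn("order_date", F.to_date("order_ts"))
      .write.format("delta")
      .mode("overwrite")
      .saveAsTable("main.commercial.orders_bronze"))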

Responsibilities

  • Design scalable and maintainable data models across commercial, production, and finance domains.
  • Define and enforce enterprise-wide data architecture standards, naming conventions, and data modeling best practices.
  • Build and optimize data pipelines in Databricks using PySpark, SQL, Delta Lake, and Delta Live Tables.
  • Implement data transformation logic (ELT) to curate clean, trusted, and high-performance data layers (see the sketch after this list).
  • Develop data products using Unity Catalog, Alation, Databricks Asset Bundles, and GitLab CI/CD workflows.
  • Ensure query optimization, data quality, and high availability of data pipelines.
  • Manage and orchestrate workflows using Databricks Workflows, Azure Data Factory, or equivalent tools.
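
As above, for illustration only: a short ELT-style curation sketch of the kind of pipeline work listed in the responsibilities. It assumes a Databricks/Spark environment with Delta Lake; table names, columns, and the quality rule are hypothetical.

  # Minimal ELT sketch, assuming a Databricks/Spark session with Delta Lake.
  # Table names, column names, and the quality filter are hypothetical placeholders.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.getOrCreate()

  # Curate a trusted "silver" layer from a raw "bronze" Delta table:
  # deduplicate, standardize types, and drop rows failing a basic quality check.
  bronze = spark.read.table("main.commercial.orders_bronze")

  silver = (bronze
      .dropDuplicates(["order_id"])
      .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
      .filter(F.col("amount").isNotNull()))

  (silver.write.format("delta")
      .mode("overwrite")
      .option("overwriteSchema", "true")
      .saveAsTable("main.commercial.orders_silver"))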

Other

  • 5+ years of experience in data engineering, data architecture, or enterprise analytics roles.
  • Excellent collaboration and communication skills across business and technical teams.
  • Comfortable in agile, fast-paced, and highly accountable environments.
  • Able to translate complex data problems into practical, scalable solutions.
  • Background in data integration for commercial, operations, and financial domains.