
Data Engineer - Copperworks

Steel Dynamics, Inc

Salary not specified
Sep 20, 2025
New Haven, IN, US

SDI Lafarga Copperworks is looking to evolve its modern data stack to move data from diverse sources into trusted, analytics-ready models that power decision-making across the business. This includes migrating from a hybrid infrastructure to a more cloud-centric setup built on Microsoft Fabric.

Requirements

  • Hands-on with orchestration tools (Dagster preferred; Airflow/Prefect acceptable).
  • Proficiency with a modeling framework like dbt or sqlmesh (tests, snapshots, macros).
  • Intermediate Python (data access, transformations, packaging/venv, type-safe code, unit tests).
  • SQL expertise (advanced T-SQL): window functions, performance tuning, query plans, indexing strategies.
  • Experience with Spark SQL or similar query engines; strong comfort with DuckDB (or willingness to ramp quickly).
  • Azure Data Lake (ADLS Gen2) and Data Factory for ingestion/orchestration.
  • Working knowledge of Microsoft Fabric and Power BI semantic modeling (dimensional design, DAX measures).
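For candidates gauging the SQL bar above, a minimal sketch of a window function ranking rows within groups. Python's stdlib sqlite3 stands in here for T-SQL or DuckDB (the table name and data are illustrative, not part of the role):

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("East", 100.0), ("East", 250.0), ("West", 75.0), ("West", 300.0)],
)

# Rank orders by amount within each region using a window function.
rows = conn.execute(
    """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY region, rnk
    """
).fetchall()

for region, amount, rnk in rows:
    print(region, amount, rnk)  # e.g. East 250.0 1
```

The same `RANK() OVER (PARTITION BY ... ORDER BY ...)` shape carries over to T-SQL, DuckDB, and Spark SQL with only minor dialect differences.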

Responsibilities

  • Design, build, and maintain ELT pipelines in Dagster/Python with robust scheduling, observability, and alerting.
  • Develop modular, tested data models in dbt (sources, staging, marts), including incremental strategies and documentation.
  • Implement performant transformations using T-SQL and DuckDB (or Spark SQL equivalents) for analytics at scale.
  • Ingest and orchestrate data flows with Azure Data Factory and Azure Data Lake; manage datasets and cost/performance.
  • Build and maintain Power BI semantic models (star schemas, relationships, calculation groups/measures), optimizing for refresh and query performance.
  • Leverage Microsoft Fabric for end-to-end analytics workflows, governance, and distribution.
  • Manage integrations with external APIs/applications such as our Process AI platform and Salesforce CRM.
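To illustrate the sources → staging → marts layering the dbt responsibilities describe, a minimal sketch using stdlib sqlite3 as a stand-in for the warehouse (all table names and data are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "Source": raw events landed by an ingestion job (illustrative data).
conn.execute("CREATE TABLE raw_sales (sold_at TEXT, sku TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO raw_sales VALUES (?, ?, ?)",
    [("2025-09-01", "A1", 2), ("2025-09-01", "A1", 3), ("2025-09-02", "B2", 1)],
)

# "Staging": a typed/renamed view over the source (dbt's stg_ layer).
conn.execute("""
    CREATE VIEW stg_sales AS
    SELECT sold_at AS sale_date, sku, qty FROM raw_sales
""")

# "Mart": an aggregated model ready for BI (dbt's mart layer).
conn.execute("""
    CREATE TABLE mart_daily_sales AS
    SELECT sale_date, sku, SUM(qty) AS total_qty
    FROM stg_sales
    GROUP BY sale_date, sku
""")

marts = conn.execute(
    "SELECT sale_date, sku, total_qty FROM mart_daily_sales ORDER BY sale_date, sku"
).fetchall()
print(marts)
```

In dbt each layer would be its own model file with tests and documentation; the point here is only the progression from raw data to a curated, aggregated table.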

Other

  • 3-5 years of professional data engineering experience in a production environment.
  • Linux/Bash skills; ability to work in WSL Ubuntu.
  • API/application integrations experience (REST/JSON, OAuth2/keys, OData).
  • Version control with Git and collaborative workflows (PRs, code reviews).
  • Strong communication, documentation, and stakeholder partnership skills.
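For the REST/JSON and OAuth2 integration skills listed above, a minimal sketch of building an authenticated JSON request with stdlib urllib. The endpoint and token are hypothetical, and no request is actually sent:

```python
import json
import urllib.request

# Hypothetical endpoint and token; purely illustrative.
API_URL = "https://api.example.com/v1/records"
API_TOKEN = "demo-token"

payload = json.dumps({"source": "crm", "limit": 10}).encode("utf-8")

# Build a POST request carrying a bearer token (as in OAuth2) and a JSON body.
req = urllib.request.Request(
    API_URL,
    data=payload,
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.get_method(), req.get_full_url())
print(req.get_header("Authorization"))
```

A real integration would send the request with `urllib.request.urlopen(req)` (or a client library) and handle pagination, retries, and token refresh.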