tastytrade is looking for a Data Engineer to join its Analytics team, own data pipeline architecture, and influence the development of its data platform. The role involves designing systems, making technical decisions, and shipping and iterating on data solutions that support business processes and customer interaction analysis.
Requirements
- Expert SQL skills—you think in queries, optimize ruthlessly
- Real dbt experience—not just exposure, actual production work
- Solid Python—production code, not scripts
- Hands-on Airflow (or Prefect/Dagster)
- Cloud data warehouse experience (Snowflake, BigQuery, Redshift)
- Proficiency with version control systems (Git/GitHub) and command line interfaces
- Kubernetes or infrastructure-as-code experience
Responsibilities
- Design and build scalable data pipelines using dbt, Python, Airflow, and Snowflake to support various business processes (such as our executive reporting dashboards)
- Own data quality and observability—you're the one who cares about data being right
- Make architectural decisions: schema design, tool selection, trade-offs
- Write ELT code using modern data engineering practices (Git, automated testing and deployments)
- Debug production data issues when pipelines break (and they will)
- Process, clean, and validate data from a variety of external sources and internal data pipelines
- Contribute to data visualization and reporting efforts in Tableau
Other
- Hybrid: 4 days/week in office
- Upskill those around you through code review, design collaboration, and knowledge sharing
- Communicate clearly with engineers and non-engineers alike
- Financial services background
- Experience with agile development practices and the project lifecycle