Job Board



Senior Data Engineer

Quilt Software

Salary not specified
Dec 29, 2025
Remote, US

Quilt Software is looking for a Senior Data Engineer to design, build, and optimize their data platforms so teams across the company can make fast, reliable, data-driven decisions. The goal is to build scalable data solutions that power analytics, reporting, and data products.

Requirements

  • Strong hands-on experience with Databricks (or a very similar cloud data platform) including cluster management, jobs, and notebooks.
  • Advanced experience with Apache Spark for batch and/or streaming data processing.
  • Expert-level SQL skills (complex joins, window functions, query optimization).
  • Strong Python skills for data engineering (e.g., PySpark, data processing libraries, scripting).
  • Proven experience in data modeling and designing schemas for analytics and reporting.
  • Experience building and maintaining data pipelines in a cloud environment (AWS, Azure, or GCP).
  • Strong understanding of data warehousing concepts, ETL/ELT best practices, and data lifecycle.
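The SQL requirement above calls out window functions specifically. As a minimal, self-contained illustration (using Python's built-in sqlite3 rather than Databricks SQL, and a hypothetical orders table), a running total per region looks like this:

```python
import sqlite3

# Hypothetical table of daily order totals, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, day TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("east", "2025-01-01", 100.0), ("east", "2025-01-02", 150.0),
     ("west", "2025-01-01", 80.0), ("west", "2025-01-02", 60.0)],
)

# Window function: running total per region, ordered by day.
rows = conn.execute("""
    SELECT region, day, total,
           SUM(total) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM orders
    ORDER BY region, day
""").fetchall()

for r in rows:
    print(r)
```

On a Databricks platform the same OVER (PARTITION BY ... ORDER BY ...) pattern would run via Spark SQL or `pyspark.sql.Window` against warehouse tables rather than an in-memory SQLite database.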

Responsibilities

  • Design, build, maintain, and optimize ETL/ELT pipelines on Databricks and Spark.
  • Integrate data from multiple internal and external sources into a centralized data platform.
  • Design and maintain robust data models (e.g., star/snowflake schemas, data vault, dimensional models) to support analytics and self-service BI.
  • Implement data quality checks, validation frameworks, and monitoring.
  • Tune queries and jobs for performance and cost efficiency in Databricks and downstream systems.
  • Contribute to and refine our data governance, security, and access control practices.
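The data quality bullet above can be sketched in plain Python. The record shape, field names, and rules here are hypothetical; in this role the equivalent checks would typically be expressed as Spark DataFrame filters or an expectations framework on Databricks:

```python
def check_batch(records):
    """Return a list of (index, reason) pairs for records failing basic rules."""
    failures = []
    for i, rec in enumerate(records):
        # Rule 1: every record must carry a non-null identifier.
        if rec.get("order_id") is None:
            failures.append((i, "missing order_id"))
        # Rule 2: totals must be numeric and non-negative.
        if not isinstance(rec.get("total"), (int, float)) or rec["total"] < 0:
            failures.append((i, "invalid total"))
    return failures

batch = [
    {"order_id": 1, "total": 99.5},
    {"order_id": None, "total": 10.0},
    {"order_id": 3, "total": -5},
]
print(check_batch(batch))  # → [(1, 'missing order_id'), (2, 'invalid total')]
```

Returning structured failure reasons (rather than just dropping bad rows) is what makes the monitoring half of the bullet possible: the failures can be counted, logged, and alerted on per pipeline run.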

Other

  • 7+ years of professional experience as a Data Engineer, Software Engineer, or similar role.
  • Excellent communication skills and the ability to collaborate with technical and non-technical stakeholders.
  • A product mindset: you think about the end users of data and build with usability in mind.
  • A bias for automation, reliability, and scalability over one-off solutions.
  • Comfort with ambiguity, ownership of complex problems, and a desire to continuously improve the data ecosystem.