
Senior Data Engineer

Sezzle

$5,000 - $9,000
Nov 13, 2025
Remote, US

Sezzle is experiencing rapid growth in data generation and consumption, requiring efficient and scalable solutions to empower the business, engineers, and the rest of the organization to analyze large volumes of data quickly. The company aims to improve its existing data ecosystem and explore new tooling and technologies to meet these demands.

Requirements

  • Deep expertise with AWS Redshift or similar products, including performance tuning, table design, and workload management.
  • Strong hands-on experience with ETL/ELT frameworks, especially DBT, AWS DMS, and similar tools.
  • Proficiency in SQL (advanced level) and at least one programming language such as Python, Scala, or Java.
  • Experience building and maintaining AWS-based data platforms, including S3, Lambda, Glue, or EMR.
  • Track record of designing scalable, fault-tolerant data pipelines using modern orchestration tools (Airflow, Dagster, Prefect, etc.), processing 100 GB to 1 TB of new data per day.
  • Strong understanding of data modeling, distributed systems, and warehouse/lake design patterns.
  • Familiarity with streaming technologies (Kafka, Kinesis, Flink, Spark Streaming).
  • Knowledge of lakehouse architectures and modern stacks such as Snowflake, Databricks, Iceberg, or Delta Lake.

Responsibilities

  • Design, build, and optimize large-scale, high-performance data pipelines to support analytics, product insights, and operational workflows.
  • Architect and evolve Sezzle’s data ecosystem, driving improvements in reliability, scalability, and efficiency.
  • Lead development of ETL/ELT workflows using Redshift, DBT, AWS DMS, and related modern data tooling.
  • Evaluate and integrate new technologies, guiding the evolution of Sezzle’s data stack and infrastructure.
  • Optimize Redshift and warehouse performance, including query tuning, modeling improvements, and cost management.
  • Partner with cross-functional teams (engineering, analytics, product, finance, risk) to gather and refine requirements and deliver robust, high-quality datasets.

Other

  • 9+ years of experience in data engineering, with a strong track record of production-grade systems.
  • Ability to work in a fast-paced, collaborative environment with excellent communication and documentation skills.
  • Prior experience in high-growth, data-intensive fintech or similar regulated environments.
  • Experience leading data platform migrations, warehouse re-architectures, or large-scale performance overhauls.
  • Enthusiasm for automation, CI/CD for data, and infrastructure as code (Terraform, CloudFormation).