Job Board

Get Jobs Tailored to Your Resume

Filtr uses AI to scan 1000+ jobs and find postings that perfectly match your resume

Sr. Software Engineer - Data, Popeyes

Popeyes

Salary not specified
Oct 17, 2025
Miami, FL, US

Popeyes runs on trusted, well-modeled data. We’re hiring a Senior Data Engineer to own the data warehouse end-to-end, blending modern Lakehouse practices (medallion) with classic Kimball dimensional modeling. You’ll partner closely with analytics and business stakeholders to anticipate needs, model clean and flexible data, and ensure our teams can make fast, confident decisions.

Requirements

  • 6+ years in data engineering with a focus on dimensional modeling and production warehouses.
  • Deep dbt expertise (models, snapshots, macros, tests, docs) and strong Kimball fundamentals (grain, conformed dimensions, surrogate keys, SCD Type 1/2, late-/early-arriving data); a minimal snapshot sketch follows this list.
  • Advanced SQL and performance tuning in a cloud warehouse (Snowflake).
  • Experience owning a warehouse or major domain: SLAs, backlog, stakeholder comms, and incident prevention/response.
  • Pipeline orchestration experience (dbt Cloud/Airflow/Prefect/Glue) and CI practices for data.
  • Python for data tooling; familiarity with AWS Glue/Spark, streaming/CDC, and metrics/semantic layers.
  • Core: SQL, dbt (Core with Snowflake), Git
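
As a rough illustration of the Kimball-and-dbt work called for above, here is a minimal dbt snapshot sketch that captures Type 2 history for a dimension. The source and column names (pos, restaurants, restaurant_id, updated_at) are hypothetical and not taken from this posting.

    {% snapshot restaurants_snapshot %}

    {{
        config(
            target_schema='snapshots',
            unique_key='restaurant_id',
            strategy='timestamp',
            updated_at='updated_at'
        )
    }}

    -- Capture Type 2 history: dbt adds dbt_valid_from / dbt_valid_to
    -- so downstream models can rebuild the dimension as of any date.
    select
        restaurant_id,   -- natural key from the source system
        restaurant_name,
        city,
        state,
        updated_at       -- column the timestamp strategy watches
    from {{ source('pos', 'restaurants') }}

    {% endsnapshot %}

Downstream dimension models would typically add a surrogate key on top of this history (for example with the dbt_utils generate_surrogate_key macro) and expose dbt_valid_from/dbt_valid_to as effective-date columns.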

Responsibilities

  • Own the warehouse: set modeling standards, review PRs, manage environments, and drive a roadmap for conformed dimensions and high-quality facts.
  • Model the business (Kimball-first): define grains, conformed dimensions, SCD1/2, surrogate keys, late-arriving/early-arriving data, bridge tables, and audit patterns.
  • Build in dbt: develop models, snapshots, macros, seeds, tests (unique/not null/relationships/accepted values), tags, and exposures; maintain docs and lineage (illustrative sketches of several of these patterns follow this list).
  • Medallion + marts: structure bronze/silver/gold layers and publish well-governed marts for analytics and operational use.
  • Orchestrate & harden pipelines: run dbt Core against Snowflake via CI/CD (CircleCI/GitHub Actions) and/or Airflow/Glue; use Snowflake Tasks (and Streams where appropriate) for scheduled runs and dependency management; keep SLAs/freshness green.
  • Performance & cost: tune materializations (table/view/incremental/ephemeral), clustering/partitioning, MERGE strategies, and dependency graphs.
  • Data quality & governance: enforce data contracts, freshness checks, observability/alerting, documentation, and access controls (PII handling).
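
To make the materialization and MERGE points above concrete, here is a minimal sketch of an incremental dbt model targeting Snowflake; the model and column names (fct_orders, stg_orders, order_id, loaded_at) are hypothetical.

    -- models/marts/fct_orders.sql (hypothetical)
    {{
        config(
            materialized='incremental',
            incremental_strategy='merge',
            unique_key='order_id',
            cluster_by=['order_date']
        )
    }}

    select
        order_id,
        restaurant_id,
        order_date,
        total_amount,
        loaded_at
    from {{ ref('stg_orders') }}

    {% if is_incremental() %}
      -- On incremental runs, only pull rows newer than what is already in
      -- the target; with unique_key set, dbt compiles this into a Snowflake MERGE.
      where loaded_at > (select max(loaded_at) from {{ this }})
    {% endif %}

With this configuration, cluster_by asks Snowflake to cluster the target table on order_date, one common lever for the query-pruning and cost work described above.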
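
On the orchestration side, a scheduled Snowflake Task paired with a Stream might look roughly like the following; the database, schema, table, and warehouse names are hypothetical, and a real deployment would add error handling and a task DAG.

    -- A stream over a raw landing table plus an hourly task that merges
    -- new rows into a silver-layer table (all object names hypothetical).
    create or replace stream raw.orders_stream on table raw.orders;

    create or replace task silver.load_orders
      warehouse = transforming_wh
      schedule  = 'USING CRON 0 * * * * America/New_York'   -- top of every hour
      when system$stream_has_data('raw.orders_stream')
    as
      merge into silver.orders t
      using raw.orders_stream s
        on t.order_id = s.order_id
      when matched then update set
        t.total_amount = s.total_amount,
        t.updated_at   = s.updated_at
      when not matched then insert (order_id, total_amount, updated_at)
        values (s.order_id, s.total_amount, s.updated_at);

    -- Tasks are created suspended; resume to start the schedule.
    alter task silver.load_orders resume;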
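
Finally, as one small example of the data-quality checks mentioned above, a dbt "singular" test is just a SQL file under tests/ that fails if it returns any rows; the model and column names here are hypothetical.

    -- tests/assert_fct_orders_order_id_unique.sql (hypothetical)
    -- Fails the build if any order_id appears more than once in the fact table.
    select
        order_id,
        count(*) as n_rows
    from {{ ref('fct_orders') }}
    group by order_id
    having count(*) > 1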

Other

  • Partner with the business: work with Analytics, Ops, Marketing, and Restaurant Tech to translate questions into durable dimensional models and standard metrics.
  • A genuine passion for data with curiosity, product thinking, and pride in clean, well-modeled tables.
  • This position is based in Miami, FL and is in-office 5 days a week.
  • Accommodation is available for applicants with disabilities upon request.