Xponential Fitness is looking to design and evolve its enterprise data architecture, building scalable, secure, and high-performing data infrastructure that powers real-time analytics, AI/ML capabilities, and strategic decision-making across the organization.
Requirements
- Expertise in ELT/ETL design, real-time streaming, data modeling, and orchestration frameworks
- Hands-on experience with scalable compute (e.g., container-based workloads), relational and non-relational storage, caching systems, and infrastructure automation tools
- Proficiency with tools such as Snowflake, dbt, Apache Airflow, and Fivetran, plus CI/CD orchestration via GitHub Actions or CodePipeline
- Strong skills in SQL and Python; experienced with CI/CD workflows and infrastructure as code (IaC)
- Familiarity with graph-based data modeling and platforms such as Neo4j and Amazon Neptune for relationship-driven use cases
- Experience implementing log aggregation, container monitoring, and data pipeline observability using tools such as CloudWatch, Sumo Logic, Sentry, or New Relic
- Experience partnering with AI/ML teams to design pipelines that support model development, training, and deployment
Responsibilities
- Design and implement resilient, cloud-native data architectures supporting both batch and real-time pipelines
- Lead the ingestion, transformation, and orchestration of data via Fivetran, Apache Airflow, and Python-based ELT/ETL
- Optimize compute, storage, and processing layers to ensure scalable, secure, and cost-effective data operations
- Integrate modern container orchestration, caching, and task automation approaches to support data enrichment, transformation, and delivery at scale
- Collaborate with AI engineers to enable end-to-end MLOps, feature engineering pipelines, and training data provisioning
- Define and enforce data governance policies including lineage, metadata management, and data quality rules
- Instrument data workflows with CloudWatch, Kinesis Firehose, Sumo Logic, Sentry, and New Relic for real-time visibility
Other
- 10+ years of experience in data engineering, cloud architecture, or big data infrastructure
- 5+ years in a senior or leadership capacity with a track record of building scalable data platforms
- Proven ability to lead complex cross-functional initiatives and influence architectural decisions across technology and business teams
- Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field
- Mentor data engineers, promoting best practices in scalable design, modular pipeline development, and IaC
- Lead architecture reviews and cross-functional design sessions across data, application, and security teams
- Translate technical decisions into business impact narratives for leadership and stakeholders