Truckstop is seeking to strengthen and scale its modern data platform by building and optimizing pipelines, models, and infrastructure to power analytics, product intelligence, and customer-facing solutions.
Requirements
- Strong experience with Snowflake, SQL, and dbt in a production environment.
- Solid understanding of Terraform and infrastructure-as-code practices.
- Proficiency in Python for data processing, scripting, and automation.
- Experience implementing and maintaining ELT pipelines and data integrations (see the illustrative sketch after this list).
- Familiarity with Postgres or other relational databases.
- Hands-on experience with BI or analytics tools.
- Experience with Airbyte, Matillion, or similar ETL/ELT platforms is highly valued.
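To give a concrete, purely illustrative sense of the Python and ELT work referenced above, the sketch below extracts rows from a Postgres source and lands them in a Snowflake staging table, leaving transformation to dbt downstream. It assumes the psycopg2 and snowflake-connector-python packages; the hostnames, credentials, and table names (loads, RAW.APP.loads_staging) are hypothetical placeholders, not Truckstop's actual systems or code.

```python
"""Illustrative ELT sketch only: extract from Postgres, load raw rows into
Snowflake, and leave transformation to dbt models downstream. All connection
details and table names are placeholders."""

import psycopg2                    # assumed Postgres driver
import snowflake.connector         # assumed Snowflake connector

# Extract: pull recently updated rows from an operational Postgres table.
src = psycopg2.connect(host="pg.example.internal", dbname="app",
                       user="reader", password="***")
with src, src.cursor() as cur:
    cur.execute(
        "SELECT id, status, updated_at FROM loads "
        "WHERE updated_at > now() - interval '1 day'"
    )
    rows = cur.fetchall()

# Load: land the raw rows in a Snowflake staging table (the Snowflake
# connector autocommits by default, so no explicit commit is needed).
snow = snowflake.connector.connect(account="xy12345", user="loader",
                                   password="***", warehouse="LOAD_WH",
                                   database="RAW", schema="APP")
cur = snow.cursor()
cur.executemany(
    "INSERT INTO loads_staging (id, status, updated_at) VALUES (%s, %s, %s)",
    rows,
)
cur.close()
snow.close()
```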
Responsibilities
- Design, build, and maintain scalable ELT pipelines and data models with Snowflake, dbt, and SQL.
- Develop data infrastructure and platform components using Terraform, Python, and modern orchestration tools (illustrated in the sketch after this list).
- Work closely with engineering, analytics, and product teams to ensure data quality, reliability, and availability.
- Optimize ingestion, transformation, and storage patterns across Postgres and other relational systems.
- Partner with BI and analytics teams to enable self-service reporting in Domo (or other BI tools such as Metabase, Tableau, or Power BI).
- Manage and enhance data integration workflows using Airbyte and Matillion.
- Drive architectural improvements around data governance, observability, automation, and scaling.
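As one hedged example of how these responsibilities might fit together, the sketch below uses Apache Airflow (one possible "modern orchestration tool"; the posting does not name a specific one) to run a raw load step and then trigger dbt models. The DAG id, schedule, and shell commands are hypothetical placeholders, not a description of Truckstop's actual pipelines.

```python
"""Illustrative orchestration sketch only (assumes a recent Airflow 2.x):
load raw data, then run dbt transformations. IDs and commands are placeholders."""

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",                  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # run once per day
    catchup=False,
) as dag:
    # Land raw data in the warehouse (could be an Airbyte sync, a Matillion
    # job, or a custom Python extract like the one sketched earlier).
    load_raw = BashOperator(task_id="load_raw",
                            bash_command="python load_raw.py")

    # Transform in-warehouse with dbt once the raw layer is available.
    run_dbt = BashOperator(task_id="run_dbt", bash_command="dbt run")

    # Raw load must finish before dbt models run.
    load_raw >> run_dbt
```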
Other
- Excellent communication skills and the ability to work cross-functionally.
- Background in supply chain, freight, or logistics (bonus).
- Bachelor's degree or equivalent experience (not explicitly stated, but typically expected).
- Comfort working in a fast-moving environment.
- Commitment to company principles such as "assume positive intent," "have each other's back," and "be your authentic self."