At InterWorks, you'll help clients turn their data into something powerful, useful, and even beautiful by designing robust data systems, unifying messy data sources, and building pipelines that fuel meaningful insights.
Requirements
- Solid SQL skills (and the curiosity to keep leveling up)
- Strong experience with ETL/ELT workflows (GUI tools or code-based—either works!)
- A clear understanding of data modeling best practices
- Deep understanding of data quality, governance, and observability principles and practices
- Working knowledge of DevOps concepts
- Experience with CI/CD pipelines
- Experience with cloud platforms (AWS, Azure, GCP)
Responsibilities
- Build modern, scalable data pipelines that keep the data flowing—and keep our clients happy
- Design cloud-native infrastructure and automation that supports analytics, AI, and machine learning
- Unify and wrangle data from all kinds of sources: SQL databases, APIs, spreadsheets, cloud storage, and more
- Develop ETL/ELT frameworks that improve code quality and make things easier for your teammates
- Apply strong data modeling principles to support everything from dashboards to data science
- Collaborate closely with other InterWorkers and client teams to understand what they really need
- Write clear documentation, contribute to design decisions, and share what you learn
Additional Qualifications
- 5+ years of professional experience in a data engineering or technical consulting role
- Flexibility and comfort in fast-changing environments
- Excellent communication skills—you can explain tech to humans
- A passion for delivering smart, thoughtful, client-centered solutions
- A love for learning new tools, frameworks, and approaches