Tech Holding is looking to build and maintain scalable, end-to-end data pipelines that deliver high-quality solutions to clients.
Requirements
- 5+ years of data engineering experience
- Experience with multiple cloud platforms (AWS, Azure, GCP)
- Experience with dbt (data build tool)
- Advanced SQL knowledge and experience with relational databases, plus working familiarity with a variety of other database systems
- Experience with big data tools: Hadoop, Spark, Kafka
- Experience with Data Modeling
- Basic understanding of machine learning concepts
Responsibilities
- Develop and maintain scalable data pipelines
- Collaborate with business teams to improve data models that feed business intelligence tools
- Implement processes and systems to monitor data quality, ensuring production data is always accurate and available
- Perform data analysis to troubleshoot data-related issues and assist in their resolution
- Work closely with business units and engineering teams to develop a strategy for long-term data platform architecture
- Implement data lake/data warehouse on various cloud providers (AWS, Azure, GCP)
- Produce detailed documentation related to the data pipeline, including design, development, and deployment details
Other
- Bachelor’s Degree in Computer Science or equivalent work experience
- Must be available to work core hours aligned with Central Standard Time (CST)
- Remote Work Opportunities
- Flexible Work Hours
- Unlimited PTO