Les Schwab is looking for a Data Engineer II to design, develop, and manage data resources to support reporting, business intelligence, analytics, data science projects, and operational applications. This role is critical for providing accurate, timely, and high-quality data to feed data products and support downstream applications.
Requirements
- Advanced data modeling skills
- Proficient in writing advanced SQL, including analytical (window) functions
- Proficient in query performance tuning
- Proficient in Data Analysis techniques for testing and troubleshooting
- Proficient in ETL process development
- Proficiency and demonstrated experience with a programming language such as Python, R, JavaScript, Java, Go, or similar; advanced stored-procedure or function development in T-SQL, Oracle PL/SQL, or an equivalent is also acceptable
- Proficiency and demonstrated professional experience with Snowflake, Kafka, Ngrok, Docker, Streamlit, and Kinesis Data Streams
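To illustrate the kind of analytical-function SQL the requirements above refer to, here is a minimal, self-contained sketch using Python's built-in sqlite3 module (SQLite 3.25+ supports window functions). The table and column names are hypothetical, not Les Schwab's actual schema:

```python
import sqlite3

# Hypothetical data set: rank each store's sales within its region
# using an analytical (window) function.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, store TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("West", "Bend", 1200.0),
        ("West", "Redmond", 950.0),
        ("East", "Boise", 1100.0),
    ],
)

# RANK() OVER (PARTITION BY ... ORDER BY ...) partitions rows by region
# and ranks stores by sales amount within each partition.
rows = conn.execute(
    """
    SELECT region, store, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rank_in_region
    FROM sales
    ORDER BY region, rank_in_region
    """
).fetchall()

for row in rows:
    print(row)
```

The same window-function pattern (RANK, ROW_NUMBER, SUM OVER, LAG/LEAD) applies directly in Snowflake, T-SQL, and Oracle.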
Responsibilities
- Build and deliver ad hoc data sets to support business analysis, data analysis, analytics, proofs-of-concept, and other use cases
- Create complex SQL scripts and queries in support of reporting and analytics applications
- Use appropriate technologies and methods to automate data preparation and data movement using Les Schwab's standard data stores, tools, and platforms
- Monitor and troubleshoot manual and automated data preparation and data movement processes
- Solve technical problems, develop workarounds, and resolve operational issues
- Collaborate with business systems analysts, data analysts, business stakeholders, and analytics practitioners to understand data product and downstream system data requirements
- Create data models for relational and dimensional database schemas, for use cases ranging from targeted reporting solutions to support for downstream applications
Other
- Applicants must be currently authorized to work in the United States on a full-time basis. This position is not eligible for visa sponsorship.
- Bachelor’s degree (BS or BA) in a STEM-related discipline, with a major in Computer Science, Information Management, Database Development and Analysis, or an equivalent discipline; equivalent experience with appropriate time-in-role also accepted
- 4+ years of experience with data warehouse technical architectures, ETL/ELT, reporting/analytics tools, and scripting
- Experience with AWS services, including S3 and AWS Data Pipeline, and with cloud-based data warehouses
- Flexible remote arrangement (work 1-2 days/week from home)