Spectrum is looking to implement and enhance data services, modernize its data architecture and automation, and drive innovation and efficiency through transformative data solutions.
Requirements
- 2+ years of ETL experience
- 2+ years of experience with SQL, shell scripting, Python, and JSON
- 2+ years of data integration experience
- 2+ years of data visualization experience
- 2+ years of data/software engineering experience
- Intermediate knowledge of SQL and Python; working knowledge of YAML where applicable
- Intermediate knowledge of data tools, including querying, scripting, analysis, ETL, and command-line interfaces
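To make the combination of skills above concrete, here is a minimal sketch of the kind of ETL scripting the role involves, using only Python's standard library. The `orders` table, its columns, and the in-memory SQLite source are illustrative stand-ins, not Spectrum's actual systems.

```python
import json
import sqlite3

# Hypothetical minimal ETL step: extract rows from a SQL source,
# transform them in Python, and load the result as JSON.
# Table and column names are illustrative only.

def run_etl(conn: sqlite3.Connection) -> str:
    # Set up a stand-in source table with sample rows
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 24.5)])
    # Extract: query the source table
    rows = conn.execute("SELECT id, amount FROM orders ORDER BY id").fetchall()
    # Transform: reshape tuples into dictionaries
    records = [{"id": r[0], "amount": r[1]} for r in rows]
    # Load: serialize to JSON (a stand-in for a real downstream target)
    return json.dumps(records)

result = run_etl(sqlite3.connect(":memory:"))
print(result)
```

In a production pipeline the load step would write to a warehouse table or message queue rather than returning a string, but the extract/transform/load shape is the same.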
Responsibilities
- Implement data services, structure/models and movement infrastructures
- Troubleshoot data job failures and implement preventative solutions, including error handling, data manipulation, and I/O processing
- Apply database concepts and practices, including data definition and query languages
- Modify, enhance, and influence requirements and architecture specifications of data warehousing systems and/or processing infrastructure
- Collaborate with stakeholders to design, implement, and support end-to-end data solutions across multiple platforms, environments, domains, and locations
- Work closely with internal staff, vendors, consultants, and external partners to quickly identify and resolve data integration issues
- Independently develop reporting processes, programs, and solutions following established standards
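The error-handling responsibility above can be sketched as a small, self-contained example: rather than letting one malformed record fail an entire data job, validate each record and quarantine the bad ones. The record shape and rules here are assumptions for illustration, not Spectrum's actual pipeline logic.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("data-job")

# Hypothetical preventative error handling in a data job: isolate
# malformed records instead of failing the whole run.

def process(records):
    good, bad = [], []
    for rec in records:
        try:
            # Data manipulation step; raises on malformed input
            good.append({"id": int(rec["id"]), "amount": float(rec["amount"])})
        except (KeyError, ValueError, TypeError) as exc:
            log.warning("skipping bad record %r: %s", rec, exc)
            bad.append(rec)  # quarantine for later inspection
    return good, bad

good, bad = process([{"id": "1", "amount": "9.99"}, {"id": "x"}])
```

The quarantined records would typically be written to a dead-letter location so the failure can be diagnosed without blocking downstream consumers.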
Other
- Ability to work lawfully in the U.S. without employment-based immigration sponsorship, now or in the future.
- Ability to perform in-depth and independent research and analysis with some oversight from the manager
- Intermediate knowledge of cloud-based infrastructures and services relevant to data processing, data automation and data platform/processing troubleshooting
- Knowledge of Spectrum’s product data sources and data pipelines
- Ability to apply appropriate level of data maintenance, data quality control and validation of code, tools, and models as directed
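The data quality control and validation duties listed above can be illustrated with a small rule-based check. The specific rules (non-empty result, non-null amounts, unique ids) are hypothetical examples, not Spectrum's actual validation suite.

```python
# Hypothetical data quality checks over a list of record dictionaries.
# Each rule appends its name to the failure list when violated.

def quality_checks(rows):
    """Return the names of the quality rules that failed for these rows."""
    failures = []
    if not rows:
        failures.append("non_empty")
    if any(r.get("amount") is None for r in rows):
        failures.append("amount_not_null")
    if len({r.get("id") for r in rows}) != len(rows):
        failures.append("id_unique")
    return failures

print(quality_checks([{"id": 1, "amount": 5.0}, {"id": 1, "amount": None}]))
```

A real pipeline would run checks like these after each load and gate promotion of the data on an empty failure list.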