The company is looking to develop a QA automation framework to validate the integrity, quality, and performance of data pipelines within the Workplace Investing (WI) Data Engineering space.
Requirements
- Strong SQL skills plus Java and Python development experience
- Experience as a Software Engineer in test automation of RESTful services (Cucumber, JUnit, Selenium, RestAssured, Git, Gradle, Maven)
- Experience as a Data Engineer working with current Big Data technologies and tools (Snowflake, Airflow, AWS, Azure)
- Knowledge of data pipeline testing will be a distinct advantage
- Knowledge of CI/CD Pipelines
- Experience with ETL data pipeline testing on Data Warehouse platforms; exposure to the iCEDQ test tool preferred (a minimal example follows this list)
- Proven experience with the software development process, including analysis, design, coding, system and user testing, problem resolution, and planning
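As a rough illustration of the kind of ETL pipeline check this role involves, the sketch below reconciles row counts between a staging table and its warehouse target using JUnit and JDBC. The connection URLs, credentials, and table names (STG_TRADES, DW_TRADES) are hypothetical placeholders; the actual schemas, frameworks, and connection handling are not specified in this posting.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class PipelineRowCountReconciliationTest {

    // Hypothetical connection details; in practice these would come from config or a secrets store.
    private static final String SOURCE_URL = "jdbc:postgresql://source-host:5432/staging";
    private static final String TARGET_URL = "jdbc:snowflake://account.snowflakecomputing.com/";

    // Runs a COUNT(*) against the given table and returns the result.
    private long countRows(String jdbcUrl, String table) throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl, "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    @Test
    void targetRowCountMatchesSource() throws Exception {
        long sourceCount = countRows(SOURCE_URL, "STG_TRADES");
        long targetCount = countRows(TARGET_URL, "DW_TRADES");
        assertEquals(sourceCount, targetCount,
                "Row count drift between staging and warehouse after the load");
    }
}
```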
Responsibilities
- Develop a QA automation framework to validate the integrity, quality, and performance of the data pipelines (see the sketch after this list)
- Perform analysis, development, and DevOps work for the cloud data platform
- Build an end-to-end test strategy
- Determine exit criteria
- Contribute to our automation roadmap
- Watch for opportunities to bring in outside practices and keep WI Quality Engineering at the forefront of quality engineering innovation
- Work with your Scrum team to move quality to the beginning of the process and coordinate quality activities throughout the epic life cycle
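For the service-level side of such a framework, a test along the lines of the sketch below could assert on pipeline health exposed over REST, using RestAssured with JUnit as listed in the requirements. The host and the /pipelines/{id}/metrics endpoint are assumptions for illustration only; the actual service contracts are not described in this posting.

```java
import org.junit.jupiter.api.Test;

import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.greaterThan;

class PipelineMetricsApiTest {

    @Test
    void completedPipelineReportsHealthyMetrics() {
        given()
            .baseUri("https://data-platform.example.com")   // hypothetical host
            .basePath("/pipelines/daily-positions/metrics") // hypothetical endpoint
        .when()
            .get()
        .then()
            .statusCode(200)
            // Assert the load finished and no rows were rejected by quality rules.
            .body("status", equalTo("COMPLETED"))
            .body("rowsLoaded", greaterThan(0))
            .body("rejectedRows", equalTo(0));
    }
}
```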
Other
- 6-9 years of Software Development experience or proven relevant experience desired
- 2+ years of Big Data, Data Lake, or Data Warehouse experience desired
- 6+ years of Software Quality Assurance experience or equivalent relevant experience desired
- Bachelor’s Degree or higher
- Experience in retirement plan administration or financial services a plus