The company is looking to optimize its data architecture to support its next generation of products and data initiatives.
Requirements
- Expert-level skills in writing and optimizing SQL queries
- Expert with Big Data technologies such as AWS services, including Lambda, Glue, and workflow orchestration
- Expert in data wrangling with R and Python, with experience in RStudio Connect app development
- Expert with creating visualizations in Tableau
- Expert in operating very large data warehouses or data lakes such as Amazon Redshift and Snowflake
- Knowledge of GCP and SDLC is strongly desired
Responsibilities
- Improve data quality: Implement methods to improve data reliability and quality.
- Convert raw data: Collect, manage, and convert raw data into usable information that data scientists and business analysts can use to interpret clinical data.
- Develop database architecture: Develop and maintain the database architecture and data processing systems.
- Test and maintain data infrastructure: Architect, build, test, and maintain the data infrastructure using cloud-based systems.
- Implement AI/ML models: Implement AI/ML models and assist Data Scientists or Statisticians with model fitting and other optimizations related to data quality.
- Collaborate: Work under the guidance of a lead engineer, contribute to best practices, and collaborate on complex problem-solving projects.
Other
- Excellent verbal and written communication skills
- Minimum of 3 years of work experience with ETL, Data Modeling, Data Architecture, or Analytics
- Bachelor’s degree in Computer Science, Mathematics, Statistics, or a related discipline required; advanced degree strongly preferred
- Ability to work a Flex/Hybrid schedule with 3 days per week on-site