Quizlet is looking to design and deliver AI-powered learning tools that scale across the world and unlock human potential. To support that mission, we are hiring an engineer to build tools and models that improve our data warehouse's quality and development efficiency, solve performance challenges in jobs and queries, bring the software development lifecycle to data, and collaborate with cross-functional teams to meet a broad range of data needs.
Requirements
- Master’s degree or U.S. equivalent in Mathematics, Management Science, Computer Science, or a related field, plus 3 years of professional experience as a Business Intelligence Engineer, Data Engineer, or related occupation/position/job performing data modeling to address scalability and performance.
- Utilizing SQL, including window functions and sub-queries, to synthesize, visualize, and communicate ideas
- Creating data standards for internal and cross-functional teams
- Utilizing SQL and database platforms (including MySQL, Redshift, or Snowflake)
- Programming in database languages, including SQL
- Developing BI and analytics dashboards and creating data pipelines using Airflow
- Defining operational metrics and working with teams in an agile format to integrate and validate product logging that captures the behaviors those metrics measure, for tracking and improving business performance
- Using Python to manipulate data and generate advanced statistical analyses, including time-series regressions or ARIMA models (an illustrative sketch follows this list).
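To make the SQL and Python items above concrete, here is a minimal sketch: a window-function query computing a 7-day moving average, followed by an ARIMA fit of the kind referenced in the last bullet. The `daily_active_users` table, the metric, and the model order are hypothetical examples, not Quizlet's actual warehouse or code; the Python assumes pandas, NumPy, and statsmodels.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# A window-function query of the kind referenced above: a 7-day moving
# average over a hypothetical daily_active_users table.
ROLLING_DAU_SQL = """
SELECT
    activity_date,
    dau,
    AVG(dau) OVER (
        ORDER BY activity_date
        ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
    ) AS dau_7d_avg
FROM daily_active_users
ORDER BY activity_date;
"""

# Synthetic stand-in for the query result so the example runs as-is.
rng = np.random.default_rng(0)
dates = pd.date_range("2024-01-01", periods=120, freq="D")
dau = pd.Series(10_000 + rng.normal(0, 250, len(dates)).cumsum(), index=dates)

# Time-series analysis of the kind referenced above: fit an ARIMA(1, 1, 1)
# model to the metric and forecast the next two weeks.
fit = ARIMA(dau, order=(1, 1, 1)).fit()
print(fit.forecast(steps=14))
```

Running this prints a 14-day forecast; in practice the series would come from executing `ROLLING_DAU_SQL` against the warehouse (for example via `pandas.read_sql`) rather than from synthetic data.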
Responsibilities
- Build tools and models for the data warehouse to increase data quality and development efficiency.
- Identify and solve performance challenges with jobs and queries.
- Design tools and tests for product engineers to ensure data quality at the time it is produced.
- Automate and enhance the software development lifecycle of our data processes with dbt (data build tool), Airflow, and other tools.
- Implement data pipelines and data ingestion best practices.
- Provide recommendations on data model and query best practices.
- Bring the software development lifecycle to data by porting metrics queries to dbt and automating data quality monitoring (a minimal sketch follows this list).
- Collaborate with cross-functional teams (Product Management, Data Analytics, and Finance) to design systems that fit a broad range of data needs.
- Prepare data governance practices for external scrutiny.
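As referenced above, here is a minimal sketch of automated data quality monitoring expressed as an Airflow DAG (assuming Airflow 2.4+ for the `schedule` argument). The DAG id, table name, and check are hypothetical placeholders, not an actual Quizlet pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def check_raw_events_rows():
    # Placeholder for a real warehouse query, e.g.
    #   SELECT COUNT(*) FROM raw.events WHERE loaded_at >= CURRENT_DATE
    # executed through an Airflow connection to Snowflake/Redshift.
    row_count = 0
    if row_count == 0:
        # Failing the task surfaces the problem through Airflow's alerting
        # instead of letting bad data flow downstream.
        raise ValueError("raw.events received no rows for the latest load")


with DAG(
    dag_id="data_quality_monitoring",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=10)},
):
    PythonOperator(
        task_id="check_raw_events_rows",
        python_callable=check_raw_events_rows,
    )
```

The same pattern extends to dbt: built-in dbt tests such as `not_null` and `unique` can run as a downstream task, so metrics queries ported to dbt are validated on every load.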
Other
- Position is 100% remote & reports to HQ in San Francisco, CA.
- Applicants must reference ad code GKDC.