SharkNinja is building and maintaining a modern data ecosystem to support analytics initiatives across the business.
Requirements
- Experience using SQL in academic projects or prior internships.
- Strong programming foundation (Python, Java, or similar); Python preferred.
- Curiosity about cloud data platforms (AWS) and an eagerness to learn modern data engineering tools.
- Previous exposure (through coursework or projects) to data modeling, ETL concepts, or big data tools is a plus.
- Familiarity with Git or version control systems is desirable.
Responsibilities
- Data Pipelines & ETL/ELT: Help design, build, and test batch and streaming pipelines that move data from diverse sources into our cloud data platforms.
- Data Quality & Testing: Implement validation checks and assist with monitoring systems that ensure accuracy, completeness, and trust in our data.
- Automation & Scripting: Write Python and SQL code to automate ingestion, transformation, and reporting tasks.
- Analytics Enablement: Support analysts and data scientists by preparing clean, well-modeled datasets that unlock new insights.
- Learning Modern Tools: Gain exposure to technologies such as Dagster, dbt, Snowflake, and ELT platforms like Fivetran.
- Collaboration: Work closely with teammates to translate business requirements into data solutions, while documenting and sharing your work clearly.
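To give a concrete flavor of the data quality and scripting work described above, here is a minimal sketch of a row-level validation check in plain Python. The schema, field names, and rules are purely illustrative, not an actual SharkNinja pipeline:

```python
# Minimal data-quality check: flag problem records before loading downstream.
# The schema and rules below are illustrative examples only.

def validate_row(row: dict) -> list[str]:
    """Return a list of problems found in one record (empty list = clean)."""
    problems = []
    if not row.get("order_id"):
        problems.append("missing order_id")
    qty = row.get("quantity")
    if not isinstance(qty, int) or qty <= 0:
        problems.append("quantity must be a positive integer")
    return problems

rows = [
    {"order_id": "A-100", "quantity": 2},
    {"order_id": "", "quantity": 0},
]

# Map each record to its list of problems, then keep only the clean ones.
report = {r.get("order_id") or "<blank>": validate_row(r) for r in rows}
clean = [oid for oid, probs in report.items() if not probs]
print(clean)  # IDs of rows that passed every check
```

In a real pipeline, checks like these would typically run inside an orchestration or transformation framework (such as the Dagster and dbt tools mentioned below) rather than as a standalone script.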
Other
- Current student in their third year or beyond of a bachelor's program, or currently enrolled in a master's or doctorate program.
- Must be able to work a full-time, 40-hour-per-week schedule with 2 days per week onsite in Needham, MA.
- Pursuing a STEM degree program.
- Ability to explain technical ideas clearly and collaborate with cross-functional teammates.
- Analytical thinker with strong problem-solving skills and attention to detail.