The company is looking for help building and maintaining scalable data pipelines, ensuring the smooth flow of data from source to destination, and contributing to the development of its data architecture.
Requirements
- Familiarity with SQL and relational databases.
- Basic understanding of data structures, algorithms, and software development principles.
- Knowledge of data technologies (e.g., Snowflake, Azure Data Factory) is a plus.
- Experience with cloud platforms (AWS, Azure, Salesforce).
- Understanding of ETL scripts and data validation checks.
- Knowledge of data modeling and database design.
- Experience with data pipelines and workflows.
Responsibilities
- Assist in designing, developing, and maintaining data pipelines to collect, process, and store large datasets.
- Help optimize and troubleshoot data workflows to ensure accuracy, efficiency, and scalability.
- Work with cloud platforms (AWS, Azure, Salesforce) to implement and maintain data infrastructure.
- Ensure data quality by writing and maintaining robust ETL scripts and data validation checks.
- Participate in data modeling and database design for efficient data storage and retrieval.
- Assist in documenting data processes and maintaining clear records of project progress.
- Collaborate with team members and other stakeholders, including leadership, to understand data needs and provide data solutions that align with business objectives.
Other
- Currently pursuing a degree in Computer Science, Data Engineering, or a related field.
- Excellent communication and collaboration skills, including the ability to work effectively with leadership and cross-functional teams.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong analytical and problem-solving skills.
- Location: On-site in Sioux Falls, SD (no hybrid or remote option).