At Ripple, we are improving the global financial system and creating greater economic fairness and opportunity for more people, in more places around the world, through crypto solutions for financial institutions, businesses, governments, and developers.
Requirements
- Coursework or previous internship experience with software engineering, ideally involving data-oriented applications (Python, Java, or other programming languages)
- Experience building ETL and ELT data pipelines; real-time pipelines are a big plus
- Experience writing SQL queries against data warehouses such as Redshift or BigQuery
- Knowledge of building REST API endpoints
- Exposure to Hadoop and NoSQL databases such as HBase and Cassandra is a plus
- Exposure to CI/CD and workflow orchestration (via Airflow or a similar tool) is a plus
- Experience with Terraform or similar tools is a huge plus
Responsibilities
- Collaborate with teams to define and maintain a comprehensive data dictionary for consistent data definitions across systems
- Implement data quality frameworks to ensure accuracy, completeness, and reliability of data
- Establish and carry out data governance policies to maintain compliance, security, and proper data usage
- Design and promote best practices for data modeling, pipelines, and infrastructure to support scalable and maintainable solutions
- Partner with stakeholders to ensure data structures and processes align with business objectives and analytical needs
Other
- Currently enrolled in an undergraduate, graduate, or PhD program, preferably in a science or quantitative field
- Available to work for 12 weeks during Summer 2026, beginning in May or June
- Intent to return to your degree program after the completion of the internship
- Excellent written and verbal communication skills, strong attention to detail, and a commitment to excellence