Jobgether's partner company is seeking to design, build, and optimize high-performance data platforms that support critical business insights and decision-making in the United States.
Requirements
- Strong proficiency in SQL and Python, plus at least one additional scripting language.
- Proven experience designing and building data warehouses, data lakes, and large-scale ETL/ELT pipelines.
- Knowledge of cloud data platforms such as Snowflake or AWS Redshift.
- Expertise in data modeling, architecture, and design patterns.
- Experience with industry-leading ETL tools (Informatica, IBM DataStage, SAP BODS) and ELT tools (dbt, Fivetran, AWS Glue).
- Familiarity with version control systems (e.g., Git via GitLab) and CI/CD processes.
- Understanding of SDLC, Agile methodology, and data integration best practices.
Responsibilities
- Design, develop, and maintain data warehouses, data lakes, and large-scale ETL/ELT pipelines.
- Build solutions using industry-leading ETL tools and modern cloud-based data platforms such as Snowflake and AWS.
- Develop and optimize SQL scripts, stored procedures, and Python-based data processing solutions.
- Ensure data quality, integrity, and performance across multiple systems and pipelines.
- Apply data modeling principles to create logical and physical data models for high-volume environments.
- Implement version control, CI/CD processes, and Agile practices for data projects.
- Monitor and tune pipeline performance and database objects to achieve optimal efficiency.
Other
- Excellent problem-solving, analytical, and communication skills.
- Experience in financial services or other regulated industries is a plus.
- Remote-friendly, full-time position.
- Competitive hourly rate of $70–$72.
- Opportunity to work on large-scale, impactful data initiatives.