Manage and transform complex data pipelines, enabling data-driven insights for innovative fraud detection and analytics solutions.
Requirements
- 5+ years of experience as a Data/Analytics Engineer working with dbt, including semi-structured data, custom macros, incremental models, job scheduling, and streaming data.
- Hands-on experience with Snowflake or other cloud data warehouses, including clones, pipes, external stages, and query optimization.
- Proficiency in Terraform, Python, and AWS cloud environments.
- Advanced SQL skills, including CTEs and window functions.
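As a minimal illustration of the CTE and window-function skills listed above (not part of the role description itself), the sketch below runs a query with a CTE and `ROW_NUMBER()` against an in-memory SQLite database; the `payments` table and its columns are hypothetical, and a SQLite build with window-function support (3.25+) is assumed.

```python
import sqlite3

# Hypothetical sample data: payments per user.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE payments (user_id INTEGER, amount REAL, paid_at TEXT);
INSERT INTO payments VALUES
  (1, 10.0, '2024-01-01'),
  (1, 25.0, '2024-01-03'),
  (2, 40.0, '2024-01-02');
""")

query = """
WITH ranked AS (                      -- CTE
  SELECT user_id,
         amount,
         ROW_NUMBER() OVER (         -- window function
           PARTITION BY user_id
           ORDER BY paid_at DESC
         ) AS rn
  FROM payments
)
SELECT user_id, amount FROM ranked WHERE rn = 1 ORDER BY user_id;
"""
# Latest payment per user.
rows = conn.execute(query).fetchall()
print(rows)  # → [(1, 25.0), (2, 40.0)]
```

The same pattern (deduplicating to the most recent row per key) is a common building block in dbt models over event-style data.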
Responsibilities
- Manage daily operations of data pipelines using dbt.
- Implement and optimize changes in Snowflake based on product development and research needs.
- Monitor pipeline performance and costs, proactively addressing inefficiencies.
- Troubleshoot and resolve data, system, and performance issues across pipelines.
- Collaborate with data science, product, and engineering teams to design and implement scalable solutions.
- Maintain high-quality, reliable, and well-documented data workflows.
Other
- Strong problem-solving, analytical, and collaboration skills in a fast-paced, remote work environment.
- Bachelor’s degree in Data Science, Analytics, Computer Science, Information Systems, Mathematics, or equivalent experience.