Kapitus is looking for an engineer to design, build, and maintain its data infrastructure, ensuring high-quality data is available for analytics and decision-making.
Requirements
- Strong proficiency in Snowflake, including experience with data modeling and performance optimization.
- Strong experience with DBT (Core or Cloud) in a production environment.
- Proficiency in SQL – writing complex queries, window functions, CTEs, etc. (see the sketch after this list).
- Familiarity with Git and collaborative development workflows.
- Solid understanding of RDBMS principles and experience with AWS RDS.
- Familiarity with data preparation and analytics using Alteryx.
- Knowledge of cloud platforms, particularly AWS, including services such as S3, EventBridge, and Lambda.
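As a rough illustration of the SQL fluency this role calls for, here is a minimal sketch combining a CTE with a window function. All table and column names (loan_payments, borrower_id, and so on) are hypothetical, not actual Kapitus schema.

```sql
-- Hypothetical example: use a CTE plus the ROW_NUMBER() window
-- function to keep only the most recent payment per borrower.
WITH ranked_payments AS (
    SELECT
        borrower_id,
        payment_id,
        payment_amount,
        payment_date,
        ROW_NUMBER() OVER (
            PARTITION BY borrower_id
            ORDER BY payment_date DESC
        ) AS recency_rank
    FROM loan_payments
)
SELECT
    borrower_id,
    payment_id,
    payment_amount,
    payment_date
FROM ranked_payments
WHERE recency_rank = 1;
```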
Responsibilities
- Design, develop, and maintain data pipelines to ensure the smooth flow of data into Snowflake and other data storage solutions.
- Implement ELT and ETL processes using tools such as DBT and Apache Airflow for data transformation and orchestration (a sample DBT model follows this list).
- Optimize and monitor data workflows to improve efficiency and performance.
- Manage and maintain RDBMS databases, focusing on AWS RDS, ensuring data reliability and security.
- Ensure data quality and integrity by implementing best practices in data governance.
- Document data architecture, processes, and workflows to support team knowledge sharing.
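For context on the DBT work described above, a DBT model is simply a SELECT statement saved as a file; DBT compiles it and materializes the result in Snowflake. The sketch below is a minimal, hypothetical model: the upstream model name stg_payments and all column names are assumptions, not Kapitus code.

```sql
-- models/fct_daily_payments.sql (hypothetical DBT model)
-- Aggregates staged payments to one row per borrower per day.
-- {{ ref('stg_payments') }} declares a dependency on an upstream
-- model, letting DBT build models in the correct order.
{{ config(materialized='table') }}

SELECT
    borrower_id,
    DATE_TRUNC('day', payment_date) AS payment_day,
    COUNT(*)                        AS payment_count,
    SUM(payment_amount)             AS total_paid
FROM {{ ref('stg_payments') }}
GROUP BY 1, 2
```

In a production project, a model like this would typically be paired with schema tests (for example, not_null and unique) to support the data-quality and governance goals listed above.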
Other
- Collaborate with data analysts and stakeholders to understand data requirements and deliver effective solutions.
- Excellent problem-solving skills and a proactive approach to challenges.
- Strong communication skills, with the ability to collaborate with both technical and non-technical teams.
- Consideration will be given to qualified remote candidates residing in states where Kapitus and/or one of its subsidiaries has an established physical presence.