The business problem Talan is looking to solve is defining the data architecture and ensuring the scalability, reliability, and performance of its client's data infrastructure, while collaborating with cross-functional teams to deliver business value from data.
Requirements
- Solid understanding of ETL/ELT, data modeling, and SQL/NoSQL systems.
- Familiarity with cloud platforms (AWS, Azure, GCP) and big data tools (e.g., Spark, Hadoop).
- Experience with Python, SQL scripting, and API integrations.
- Ability to evaluate multiple implementation paths and recommend optimal solutions.
- Dataiku certification or advanced expertise in automation, APIs, and deployment.
- Experience with data migration or modernization projects.
- Knowledge of platforms like Snowflake, Azure, or Hadoop.
Responsibilities
- Design, develop, and implement data pipelines and workflows in Dataiku to support data ingestion, transformation, and processing.
- Define and maintain data architecture standards, best practices, and governance policies.
- Collaborate with data engineers, analysts, and business stakeholders to translate business requirements into technical solutions.
- Ensure data quality, integrity, and security throughout the data lifecycle.
- Optimize data pipelines for scalability, performance, and cost-effectiveness.
- Support migration and integration of data from multiple sources into a unified data platform.
- Provide technical leadership and mentoring to junior data team members.
Other
- 5+ years of experience in a similar role.
- Strong communication skills and the ability to mentor and constructively challenge peers.
- Previous experience in a financial institution.
- We offer sponsorship under specific treaty-based U.S. work visas, including TN and E-2 (for eligible Canadian, Mexican, and French citizens).
- 15 days of paid vacation per year at hire, rising to 27 with seniority (untaken vacation days are cashed out annually).