Job Board

Get Jobs Tailored to Your Resume

Filtr uses AI to scan 1,000+ jobs and find postings that closely match your resume.


Data Engineer

AssetMark

$126,000 - $140,000
Dec 30, 2025
Charlotte, NC, US

AssetMark is looking to ensure data reliability and scalability across its Azure and Snowflake data stack while proactively integrating Generative AI techniques into its engineering workflows and data products.

Requirements

  • Proven hands-on experience building and deploying data solutions on Microsoft Azure.
  • Deep expertise with Snowflake architecture, optimization, and advanced SQL features.
  • Strong proficiency in Python for data manipulation, scripting, and pipeline automation (a minimal loading pattern is sketched after this list).
  • Solid experience with data modeling techniques (Dimensional, 3NF, or Data Vault) and developing complex ETL/ELT workflows.
  • Experience with modern data transformation tools like dbt (Data Build Tool) and orchestration tools (e.g., Azure Data Factory, Airflow).
  • Prior exposure to Data Observability platforms (e.g., Monte Carlo, Collibra).
  • Familiarity with Generative AI (GenAI) concepts or hands-on use of LLM coding assistants (e.g., Copilot) to improve engineering efficiency.
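
Several of the requirements above (Python pipeline automation, orchestration, loading into Snowflake) come together in a small extract-and-load step. Below is a minimal sketch of that pattern, assuming the snowflake-connector-python and pandas packages; the table, schema, warehouse, and credential names are hypothetical, and this is illustrative rather than AssetMark's actual pipeline.

```python
"""Minimal ELT load sketch: pandas -> Snowflake.

All names (LOAD_WH, RAW_POSITIONS, etc.) are hypothetical.
"""
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def load_positions(csv_path: str) -> int:
    """Extract a CSV of positions, apply a light transform, and load it
    into a raw Snowflake table for dbt to model downstream."""
    df = pd.read_csv(csv_path, parse_dates=["as_of_date"])
    # Light, idempotent cleanup only; heavier transforms belong in dbt models.
    df["symbol"] = df["symbol"].str.upper().str.strip()

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",   # hypothetical warehouse
        database="RAW",        # hypothetical database
        schema="POSITIONS",    # hypothetical schema
    )
    try:
        # write_pandas stages the frame and issues a bulk COPY INTO;
        # the target table is assumed to already exist.
        _, _, nrows, _ = write_pandas(conn, df, "RAW_POSITIONS")
        return nrows
    finally:
        conn.close()


if __name__ == "__main__":
    print(f"loaded {load_positions('positions.csv')} rows")
```

In a stack like the one described here, the loaded raw table would feed dbt models, and an orchestrator such as Azure Data Factory or Airflow would schedule the script.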

Responsibilities

  • Design, build, and optimize highly scalable and fault-tolerant ELT/ETL pipelines using Python, SQL, and dbt to integrate complex financial datasets from diverse sources into Snowflake (hosted on Azure).
  • Own the data infrastructure on Azure, including leveraging services like Azure Data Factory and Azure Synapse, with expertise in setting up and managing data flows into Snowflake.
  • Lead the design and implementation of dimensional (Kimball), Inmon, and/or Data Vault models within the data warehouse to support advanced analytics and reporting.
  • Conduct performance tuning for complex SQL queries and data pipelines within Snowflake to ensure low latency and cost-efficient compute usage.
  • Champion software engineering best practices, including robust unit/integration testing, automated data validation, and maintaining resilient CI/CD pipelines (e.g., using Azure DevOps or GitHub Actions).
  • Implement advanced data quality frameworks and observability solutions (e.g., Monte Carlo) to automatically monitor data freshness, volume, distribution, and schema health, proactively preventing data downtime (a hand-rolled version of such a check is sketched after this list).
  • Establish and maintain comprehensive data lineage documentation and tooling to provide transparency and ensure compliance across the data transformation layer.
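
To make the observability responsibility concrete, the sketch below hand-rolls freshness and volume checks with plain SQL via snowflake-connector-python. Table names and thresholds are hypothetical; a platform like Monte Carlo would replace checks like these with managed monitors, so treat this as a minimal sketch of the underlying idea only.

```python
"""Minimal data-observability sketch: freshness and volume checks run
against Snowflake with plain SQL. All table names and thresholds are
hypothetical."""
import os

import snowflake.connector

FRESHNESS_SQL = """
    SELECT DATEDIFF('hour', MAX(loaded_at), CURRENT_TIMESTAMP())
    FROM RAW.POSITIONS.RAW_POSITIONS
"""
VOLUME_SQL = """
    SELECT COUNT(*)
    FROM RAW.POSITIONS.RAW_POSITIONS
    WHERE loaded_at >= DATEADD('day', -1, CURRENT_TIMESTAMP())
"""


def run_checks(max_staleness_hours: int = 6,
               min_daily_rows: int = 1000) -> list[str]:
    """Return a list of human-readable failures (empty list = healthy)."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    failures = []
    try:
        cur = conn.cursor()
        # Freshness: hours since the most recent load.
        staleness = cur.execute(FRESHNESS_SQL).fetchone()[0]
        if staleness is None or staleness > max_staleness_hours:
            failures.append(f"freshness: last load {staleness} h ago")
        # Volume: rows landed in the trailing 24 hours.
        rows = cur.execute(VOLUME_SQL).fetchone()[0]
        if rows < min_daily_rows:
            failures.append(f"volume: only {rows} rows in last 24 h")
    finally:
        conn.close()
    return failures
```

A scheduler would run run_checks() after each load and alert on a non-empty failure list.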

Other

  • 3 - 7 years of professional experience in a Data Engineering, Software Engineering, or similar role.
  • Experience working in the financial services or wealth management domain.
  • Candidates must be legally authorized to work in the US to be considered.
  • We are unable to provide visa sponsorship for this position.
  • Alignment with our value of Leading with Heart: truly making a difference in the lives of others, including teammates, clients, investors, and communities.