AssetMark seeks to transform raw, governed data into reliable, accessible, high-quality data products for business intelligence (BI) and reporting.
Requirements
- Technical Stack: Expert proficiency in advanced SQL for complex query writing and optimization.
- Modeling Expertise: Strong experience implementing Dimensional Modeling (Kimball) concepts.
- Tooling: Proven hands-on experience using dbt (Data Build Tool) in a production environment.
- Data Warehouse: Experience working with a modern cloud data warehouse like Snowflake (Azure knowledge is a plus).
- Programming: Foundational scripting ability in Python (for automation, utility scripts, or advanced dbt logic).
- BI Tools: Experience with BI tools (Power BI, Tableau) from a data preparation/modeling standpoint.
- Governance: Familiarity with data governance practices and concepts like semantic layers.
Responsibilities
- Modeling & Architecture: Design, develop, and maintain analytical data models (e.g., Star Schemas, Dimensional Models) in Snowflake to support business reporting, BI tools (e.g., Power BI), and strategic decision-making.
- Transformation Logic (dbt): Own the entire dbt lifecycle. Write, test, and deploy complex transformation logic in modular SQL, ensuring efficiency, reusability, and adherence to established coding standards.
- Data Curation: Collaborate with data engineers to transition raw data feeds (Bronze/Silver layers) into clean, certified, and aggregated datasets (Gold layer), making them the canonical source of truth for business metrics.
- Performance Tuning: Optimize SQL queries and dbt models for performance and efficiency within the Snowflake environment, managing compute usage and costs.
- Testing & Validation: Implement rigorous data quality testing using dbt tests and other validation frameworks to ensure data accuracy, completeness, and freshness before publishing to business users.
- Data Lineage & Documentation: Proactively generate and maintain comprehensive documentation, data dictionaries, and data lineage tracking for all production data models, promoting data literacy and trust across the organization.
- CI/CD Integration: Integrate dbt workflows into the existing CI/CD pipelines (e.g., Azure DevOps) to automate testing and deployment processes, ensuring continuous delivery and model reliability.
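As a rough sketch of the modular dbt work described above (the model name, source model, and columns here are hypothetical illustrations, not AssetMark's actual project):

```sql
-- models/marts/fct_orders.sql (hypothetical Gold-layer fact model)
-- Aggregates a Silver-layer staging model into a certified, reporting-ready table.
{{ config(materialized='table') }}

with orders as (
    -- ref() resolves the upstream model at compile time and records lineage
    select * from {{ ref('stg_orders') }}
)

select
    order_id,
    customer_id,
    order_date,
    sum(amount) as total_amount
from orders
group by order_id, customer_id, order_date
```

In a companion `schema.yml`, built-in dbt tests such as `unique` and `not_null` on `order_id` would implement the data-quality validation mentioned above, and `dbt docs generate` would produce the documentation and lineage graph for the model.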
Other
- Experience: 3+ years of professional experience as an Analytics Engineer, BI Engineer, or Data Analyst focused on data modeling and transformation.
- Education: Not specified; the stated requirement is 3+ years of relevant experience.
- Travel: Not specified; candidates must be able to accommodate a hybrid work schedule and be located near the Charlotte, NC office.
- Visa Requirements: Candidates must be legally authorized to work in the US to be considered. AssetMark is unable to provide visa sponsorship for this position.
- Soft Skills: Ability to collaborate with data engineers, business stakeholders, and data analysts, and to communicate technical information to non-technical stakeholders.