Goosehead Insurance is looking to ensure that the company's data ecosystem is robust, secure, and capable of supporting its data-driven ambitions.
Requirements
- Hands-on experience with various data architecture patterns, including data warehouse, data mesh, medallion architecture, and data lakehouse to meet modern analytics needs.
- Proficient in data modeling techniques and ERD development, including conceptual, logical, and physical models, as well as snowflake and star schemas, to support robust data structuring and optimization.
- Skilled in leveraging industry-standard tools such as Erwin, SqlDBM, Lucidchart, and ER/Studio for designing and documenting scalable data solutions.
- Advanced expertise in SQL and Python, with proven capabilities in complex data extraction, transformation, and querying.
- Extensive hands-on experience with leading data platforms, including Snowflake, to architect and implement high-performance solutions.
- Strong understanding of modern data integration patterns, including APIs, event-driven architecture, streaming, and ELT/ETL workflows.
- Experienced in implementing comprehensive data governance frameworks and deploying data catalog solutions to enhance data quality, security, and discoverability.
Responsibilities
- Define and implement the enterprise data strategy aligned with long-term business objectives.
- Design scalable, secure, and flexible data architecture enabling data-driven decision-making.
- Architect and deploy end-to-end data solutions, including data lakes, data warehouses, lakehouses, integration pipelines, and modeling frameworks.
- Create and optimize data ingestion, transformation, and integration processes using industry-standard ETL/ELT tools to support both real-time and batch data flows.
- Oversee enterprise-wide data integration, ensuring seamless interoperability across systems, applications, and departments.
- Establish and enforce data governance policies, ensuring data integrity, security, privacy, and compliance with regulatory standards.
- Utilize programming languages such as Python and SQL to design and implement scalable data pipelines.
Other
- Bachelor’s degree in Computer Science, Information Systems, Data Science, or a related field required.
- 5+ years of experience in data management, including 2+ years as a Data Architect, delivering innovative solutions and driving transformative initiatives.
- Excellent communication skills, with the ability to present technical solutions effectively to both technical and non-technical audiences.
- Proven experience leading cross-functional teams and driving data initiatives.
- Ability to work closely with IT teams, data engineers, data scientists, and analysts to ensure consistency and alignment.