Zendesk’s Enterprise Architecture team is looking to build digital data systems and platforms, and needs an Enterprise Data Architect to guide this effort and provide strategic direction.
Requirements
- Demonstrable experience with enterprise-wide architectures, integration, and data warehousing; high proficiency in relational database design, data streams, data modeling, and SQL
- 10+ years of data architecture experience, including leading MDM roadmaps, defining business cases, and driving implementation efforts
- Deep understanding of master data management principles, data governance frameworks, and industry best practices
- Experience creating ETL and reverse-ETL patterns and services to handle data from various sources, formats, and use cases
- Experience with streaming, batch, and micro-batch data processing and workflows
- Experience building and implementing “citizen integrator” (user self-service) operating models and solutions using iPaaS or related technologies
- Enterprise data management experience spanning analytics and data science, BI, MDM, governance, data quality, security, and issue management, enabling effective engagement across these areas
Responsibilities
- Contribute to and make recommendations for Zendesk’s data ecosystem, with a focus on a multi-cloud architecture capable of supporting various data types — structured, semi-structured, and unstructured — at petabyte scale
- Lead the creation of overall master data solution architectures, system landscapes (blueprints), governance workflows, and systems integration and implementation standards/patterns
- Collaborate with business stakeholders and technical teams to understand business processes, master data management requirements and challenges, and maturity levels, and to identify opportunities to design a best-of-breed MDM solution
- Produce relevant solution design artifacts including but not limited to technical estimates, Reference Architecture blueprints, conceptual system designs, sequence diagrams, and data models.
- Evaluate and support data solutions within and across a variety of data tools such as GCP Airflow, AWS Lambda, Python, Cloud Functions, dbt, Astronomer.io, and Workato
- Design and lead best-in-class practices for planning and implementing data movement and integration patterns across our ecosystem, defining target architectures and tools
- Provide leadership and standard methodologies, favoring native connectors over third-party options when integrating our various workflows, applications, and data
Other
- 10+ years of experience
- Ability to drive the practical evolution and innovation of infrastructure, processes, products, and services by influencing decision makers, implementers, stewards, and owners on the direction to take
- Hybrid work schedule, with some days working remotely and some days in the office
- Bachelor's degree or equivalent experience
- Travel may be required