Develop high-performance data solutions that enable analytics, reporting, and downstream consumption across the business.
Requirements
- Advanced proficiency in Python and SQL, with the ability to design efficient data transformations and workflows
- Experience designing and implementing large-scale ETL pipelines and working with event-driven architectures (e.g., Kafka, Event Hub)
- Strong knowledge of data warehousing concepts, dimensional modeling, and modern data architectures (e.g., data lakes, lakehouses)
- Hands-on experience with Azure data services (e.g., Data Factory, Databricks, Synapse, Azure Functions)
- Extensive experience working with a major cloud platform (preferably Azure), including data architecture and platform design considerations
- Solid understanding of DevOps principles, version control (e.g., Git), and CI/CD practices for data systems
- Experience with Infrastructure as Code (IaC) tools such as Terraform, Bicep, or ARM templates for provisioning cloud infrastructure
Responsibilities
- Design and implement scalable, cloud-native data pipelines
- Build and maintain multi-dimensional data models optimized for performance, cost, and maintainability
- Integrate upstream data sources such as ERP, CRM, and external APIs into core data systems with attention to data integrity and lineage
- Optimize data flows for storage, processing, and performance using tools and best practices from the Azure ecosystem
- Participate in a weekly on-call operational rotation shared across Senior and Principal Engineers
- Own resolution of production issues during coverage weeks, including root cause analysis and stakeholder communication
- Complete both complex and foundational data tasks when supporting operations, ensuring platform reliability and agility
Other
- 4 or more years of experience in data engineering, software engineering, or similar technical roles with a strong data focus
- Ability to balance technical execution with operational ownership, including working independently and prioritizing effectively during on-call weeks
- Contribute to team documentation, reusability practices, and internal knowledge sharing
- Influence decisions around tooling, pipeline design, and architectural trade-offs
- Partner with product managers, analysts, and engineering peers to understand business needs and deliver well-architected data solutions
- Provide peer mentorship and participate in code and design reviews to elevate team standards