The company is seeking an experienced Data Architect/Data Engineer to develop and manage complex data pipelines, data lakes, and lakehouses, with a focus on Azure cloud data architecture.
Requirements
- 15+ years of experience in Data Warehousing, Data Engineering, and Analytics.
- 8+ years of experience in cloud data architecture, including 4+ years specializing in Azure.
- 5-7 years of hands-on experience with Azure Databricks, including end-to-end delivery of at least one large program.
- High proficiency in developing complex ETL data pipelines using Databricks with PySpark.
- Experience in Unity Catalog is preferred.
- Knowledge of integrating with Power BI.
- Hands-on experience with current technologies and delivery methodologies.
Responsibilities
- Develop and manage complex data pipelines, data lakes, and lakehouses.
- Develop complex ETL data pipelines using Databricks with PySpark.
- Manage and maintain Data Lake & Lakehouse, including CDC (Change Data Capture) and SCD (Slowly Changing Dimensions).
- Integrate with Power BI and define approaches for data self-service.
- Develop progressive information management solutions and support the end-to-end software development life cycle (SDLC).
- Align application development with business needs.
- Design and implement scalable data solutions that meet business requirements.
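The CDC and SCD work described above is typically implemented in Databricks as a Delta Lake MERGE; as a simplified illustration only (pure Python standing in for the PySpark/Delta MERGE, with hypothetical field names), a Slowly Changing Dimension Type 2 update applied to a stream of change records might look like:

```python
from datetime import date

def apply_scd2(dimension, changes, today):
    """Apply CDC change records to an SCD Type 2 dimension.

    dimension: list of dicts with keys: key, value, valid_from, valid_to, current
    changes:   list of dicts with keys: key, value (latest snapshot per key)
    """
    current_rows = {r["key"]: r for r in dimension if r["current"]}
    for change in changes:
        existing = current_rows.get(change["key"])
        if existing is not None and existing["value"] == change["value"]:
            continue  # attribute unchanged: keep the current row as-is
        if existing is not None:
            # Expire the superseded version of the row
            existing["current"] = False
            existing["valid_to"] = today
        # Insert the new current version with an open validity window
        dimension.append({
            "key": change["key"],
            "value": change["value"],
            "valid_from": today,
            "valid_to": None,
            "current": True,
        })
    return dimension
```

In a real Databricks pipeline this row-by-row logic would be expressed declaratively as a Delta Lake `MERGE INTO` with matched/not-matched clauses, operating on DataFrames rather than Python dicts.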
Other
- Onsite 3-4 days per week.
- Exceptional analytical and problem-solving skills, with a methodical approach to complex challenges.
- Excellent communication and presentation skills, with a collaborative, team-oriented work style.
- Ability to articulate clear objectives and define qualitative/quantitative measures of success.
- Strong understanding of cloud data architecture principles and best practices.