The company needs to build and maintain scalable data and AI workflows, and is seeking an expert to design and implement solutions on Azure, work with Databricks, and set up Git-based workflows and CI/CD automation.
Requirements
- Solid knowledge of the Databricks ecosystem: architecture, Delta Lake, and Delta Live Tables
- Strong Python skills (OOP, testing, clean code) with experience in advanced data processing, preferably PySpark (see the sketch after this list)
- Hands-on experience with API development and integration using FastAPI (or Flask/Django)
- Practical experience with Azure services: Apps, Containers, Storage, SQL
- Familiarity with DevOps practices: automation-first mindset, CI/CD pipelines, and deployment automation with Databricks Asset Bundles (DAB)
- Experience with Git and working in agile, Scrum-based teams
- Knowledge of Docker and Terraform/ARM is a plus
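
To give a flavor of the Databricks/PySpark work referenced above, here is a minimal, illustrative sketch of a Delta Lake transformation. The table names (`raw.events`, `curated.events`) and columns are hypothetical, not part of the role description.

```python
# Minimal, illustrative sketch only; table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # available as `spark` in Databricks notebooks

# Read a raw Delta table, apply a simple cleanup, and write a curated Delta table.
events = spark.read.table("raw.events")
curated = (
    events
    .filter(F.col("event_ts").isNotNull())             # drop rows with no timestamp
    .withColumn("event_date", F.to_date("event_ts"))   # derive a date column for downstream use
)
curated.write.format("delta").mode("overwrite").saveAsTable("curated.events")
```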
Responsibilities
- Build and maintain APIs and microservices (FastAPI/Django/Flask) supporting data and AI workflows, as illustrated in the sketch after this list
- Design and implement scalable solutions on Azure (Apps, Containers, Storage, SQL)
- Work with Databricks (PySpark, Delta Lake, Delta Live Tables) to process and integrate data
- Implement Git-based workflows, testing, and CI/CD automation (GitHub Actions/Azure DevOps)
- Apply DevOps-first practices with automation and deployment using Databricks Asset Bundles (DAB)
- Ensure clean code, thorough testing, and high engineering standards
- Set up monitoring, logging, and alerting (Azure Monitor, Log Analytics, Cost Management)
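
As an illustration of the API work listed above, a minimal FastAPI endpoint might look like the sketch below; the route, response model, and hard-coded response are hypothetical examples only.

```python
# Minimal, illustrative sketch only; the endpoint and response model are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PipelineStatus(BaseModel):
    name: str
    healthy: bool

@app.get("/pipelines/{name}", response_model=PipelineStatus)
def get_pipeline_status(name: str) -> PipelineStatus:
    # A real service would query Databricks or Azure Monitor; hard-coded for illustration.
    return PipelineStatus(name=name, healthy=True)
```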
Other
- B2B contract
- 100% remote work
- Dedicated certification budget
- Annual evaluation meetings to define an individual development path
- Benefits package