Develop and maintain web applications; build RESTful APIs; design workflows; manage data ingestion, storage, and governance; deploy applications; ensure version control and code quality; integrate with BI tools; optimize performance; and ensure data security.
Requirements
- React.js, TypeScript, HTML5, CSS3
- Python (PySpark), Node.js
- Databricks expertise: Delta Lake, Unity Catalog, Databricks SQL, Workflows, MLflow
- Big Data & Cloud: Apache Spark, Hadoop, Azure/AWS
- Visualization: Power BI, Tableau
- Other: Git, CI/CD, Agile methodologies
- Databricks Certified Data Engineer Associate
Responsibilities
- Develop and maintain web applications using React.js and backend technologies such as Python and Node.js.
- Build RESTful APIs and ensure seamless integration between front-end and back-end components.
- Design workflows using Databricks (Delta Lake, Unity Catalog, Lakehouse architecture).
- Manage data ingestion, storage, and governance, leveraging Databricks SQL and MLflow.
- Deploy applications on Azure and implement CI/CD pipelines.
- Ensure version control and code quality using Git/GitHub.
- Integrate Databricks with BI tools like Power BI or Tableau for dashboards and reporting.
Other
- Bachelor's or Master's degree in Computer Science or a related field.
- 5+ years of experience in full stack development and data engineering.
- Work in Agile teams, participate in code reviews, and mentor junior developers.
- Azure Fundamentals (AZ-900)
- SAFe Agile Practitioner