ELEVI is looking for a team player to build, deploy, and manage data and AI solutions using the Databricks platform.
Requirements
- Specialization in the Databricks platform for building, deploying, and managing data and AI solutions
- Experience building ETL pipelines, managing data pipelines, and working with large datasets using tools like Spark, Python, and SQL
- Experience with technologies like Delta Lake, Delta Live Tables, and Databricks Workflows
Responsibilities
- Write and maintain code on an Extract-Transform-Load (ETL) platform to ensure data is transformed into the formats defined by IC ITE initiatives.
- Interface with external teams and systems, using protocols such as HTTP and SFTP, to collect data efficiently.
- Enhance the ETL platform with features that shorten timelines for future data integration efforts.
- Develop and maintain software, ensuring seamless integration into a fully functional system.
- Collaborate with external teams to validate data ingest processes.
- Provide comprehensive documentation covering system architecture, development, and any enhancements made throughout the process.
Other
- 5+ years of professional experience with a Bachelor's degree or equivalent
- Active TS/SCI clearance with CI Polygraph