Transform existing ETL logic to AWS, Databricks, or equivalent cloud platforms, and implement new services or enhance existing services and components using both object-oriented and functional programming.
Requirements
- Experience with AWS, Java/Python, and Big Data
- Experience with Snowflake and Databricks
- Experience in data engineering, with a focus on cloud-based solutions
- Hands-on experience in system design, application development, testing, and operational stability
Responsibilities
- Acquire and manage data from various sources and storage systems
- Transform existing ETL logic to AWS, Databricks, or equivalent cloud platforms
- Implement new or enhance existing services and components using both object-oriented and functional programming
- Establish and enforce guidelines to ensure consistency, quality, and completeness of data assets
- Execute standard software solutions, design, development, and technical troubleshooting
- Write secure, high-quality code in at least one programming language with limited guidance
- Design, develop, code, and troubleshoot with consideration of upstream and downstream systems and their technical implications
Other
- Must be local or from a neighboring state
- Independent consultants only
- 5 days onsite
- Formal training or certification in data engineering concepts