The company is looking for a Hadoop Developer to work with big data technologies and implement solutions in a production environment.
Requirements
- 5+ years of experience with data warehousing architectural approaches.
- Minimum of 4 years in big data, with a sound understanding of and hands-on experience with the Hadoop ecosystem (Cloudera).
- Must have experience with Big Data technologies such as Hadoop, Hive, Spark, Python, and Scala.
- Experience in Python and Unix shell scripting.
- Experience with a scheduling tool such as Autosys.
- Sound knowledge of relational databases (SQL) and experience with large SQL-based systems.
- Strong working knowledge of distributed systems.
Responsibilities
- Understand and explore the constantly evolving tools within the Hadoop ecosystem and apply them appropriately to the problems at hand.
- Experience working with Big Data implementations in a production environment.
- Experience in query optimization and performance tuning of complex SQL queries.
- Benchmark and debug critical issues with algorithms and software as they arise.
- Coordinate and facilitate routines to support delivery of technology solutions, e.g. kick-offs, status reviews, stakeholder meetings, change controls, and tollgates.
- Plan and coordinate delivery and dependencies across multiple technology teams.
- Facilitate dependency management, risk management, and impediment removal for the defined deliverables.
Other
- Understanding of Agile methodologies and technologies.
- Excellent understanding of client-service models and customer orientation in service delivery.
- Ability to grasp the 'big picture' for a solution by considering all potential options in the impacted area.
- Aptitude for understanding and adapting to new technologies.
- Ability to work collaboratively with teammates to achieve a shared mission.