The company is looking for experienced professionals to solve data analysis problems.
Requirements
- Hadoop stack of technologies (Hadoop, Spark, HBase, Hive, Pig, Sqoop, Scala, Flume, HDFS, MapReduce)
- Python & Kafka
- Database concepts, data design, data modeling, and ETL
- Teradata & Informatica
- Machine Learning Models and Artificial Intelligence
- Data components, data processing & data analytics on AWS
- Data modeling tools such as Erwin
Responsibilities
- Working hands-on with the Hadoop stack of technologies (Hadoop, Spark, HBase, Hive, Pig, Sqoop, Scala, Flume, HDFS, MapReduce)
- Developing with Python & Kafka
- Analyzing, designing, and coding ETL programs that involve data pre-processing, extraction, ingestion, quality checks, normalization & loading
- Delivering projects using Agile methodology
- Working with Jira
- Working in client-facing roles, using good communication & thought-leadership skills to coordinate deliverables across the SDLC
- Performing data modeling and ETL development
Other
- 6+ years of experience in data analysis
- Good communication & thought-leadership skills
- Experience in client-facing roles
- Experience with Agile methodology
- Good understanding of the SDLC