Location – Pune / Chennai
Experience – 6 to 9 years
Requirements
Technology: Big Data (Hadoop, Hive, Spark, Scala, PL/SQL / NoSQL databases)
Knowledge / Experience (Must):
- 6-9 years of experience in Hadoop / big data technologies.
- Experience with Spark / Storm / Kafka or equivalent streaming / batch processing and event-based messaging.
- Experience with relational and NoSQL database integration and data distribution principles.
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr).
- Experience with API development and use of JSON / XML / hypermedia data formats.
- Strong development / automation skills.
- Experience with all aspects of DevOps (source control, continuous integration, deployments, etc.).
- 5+ years of hands-on experience as a Scala developer (with a prior Java background).
Knowledge / Experience (Preferred):
- Experience in Core Banking functionality for generating various hand-offs.
- Experience with containerization and related technologies (e.g. Docker, Kubernetes).
- Comprehensive knowledge of the principles of software engineering and data analytics.
- Knowledge of agile (Scrum) development methodology is a plus.
- Cloudera / Hortonworks / AWS EMR / S3 experience is a plus.
Qualifications:
- Strong academic record, ideally with a bachelor's degree in an engineering, mathematical, or scientific discipline.
- Strong communication skills.
- Self-motivated.
- Willingness to learn.
- Excellent planning and organizational skills.