Duration : 6 months (possible extension)
Skill Required : Digital : Google Data Engineering
Experience Range in Required Skills : 8 to 10 Years
Job Description : GCP Data Engineer with PySpark
1 Development in a programming language on GCP
2 Testing & Validation
3 Debugging & troubleshooting
Essential Skills : (Must-Have)
1 Good hands-on knowledge of GCP
2 Should have worked on data migration projects from on-premises to cloud
3 Should have knowledge of Cloud Storage, BigQuery, and cluster management
4 Sound programming knowledge of PySpark & SQL for processing large volumes of semi-structured & unstructured data (see the illustrative sketch after this list)
5 Ability to design end-to-end data pipelines
6 Knowledge of Avro and Parquet formats
7 Knowledge of working with the Hadoop big data platform and ecosystem
8 Strong debugging and troubleshooting capabilities.
9 Experience guiding the technical team to attain delivery milestones
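As an illustration of the kind of work items 3, 4 and 6 above describe, here is a minimal PySpark sketch that reads Parquet data from Cloud Storage, applies a simple transformation, and loads the result into BigQuery via the spark-bigquery connector on a Dataproc cluster. All bucket, dataset, table and column names are hypothetical placeholders, and the sketch assumes the connector is available on the cluster.

```python
# Minimal sketch: Cloud Storage (Parquet) -> transform -> BigQuery.
# All names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery-example").getOrCreate()

# Read semi-structured input landed in Cloud Storage as Parquet.
events = spark.read.parquet("gs://example-landing-bucket/events/")

# Example transformation: keep valid records and aggregate per day.
daily = (
    events
    .filter(F.col("event_id").isNotNull())          # hypothetical column
    .groupBy(F.to_date("event_ts").alias("event_date"))
    .count()
)

# Load the result into BigQuery through the spark-bigquery connector,
# staging the data in a temporary Cloud Storage bucket.
(
    daily.write.format("bigquery")
    .option("table", "example_dataset.daily_event_counts")  # hypothetical dataset.table
    .option("temporaryGcsBucket", "example-temp-bucket")     # hypothetical staging bucket
    .mode("overwrite")
    .save()
)

spark.stop()
```

In practice a job of this shape would typically be submitted to the cluster with gcloud dataproc jobs submit pyspark and parameterised per environment.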
Good-to-Have :
1. Knowledge of Jira, Agile, Sonar, TeamCity & CI/CD.
2. Any exposure to / experience with an international banking client, multi-vendor or multi-geography teams.
3. Knowledge of Dataproc and PySpark.
Details of The Role (For Candidate Briefing)
Reports to : Scrum Master / Technical Architect
Unique Selling Proposition (USP) of The Role : Chance to work in a team implementing solutions for one of the world's top investment banks, using the latest tools & technologies.
Details of The Project : A project within a large portfolio for one of the leading European Investment Banks
NOTE : Flexible on the rate.
GCP Data Engineer • KA, India