Job Role: GCP Data Engineer
Location: Pan India
Experience: 5+ years
Type of Hire: Permanent
Required Technical Skill Set: GCP, PySpark, Python, HDFS, Hadoop, SQL
Must Have Skills :
- Good hands-on experience with GCP
- Should have worked on data migration projects from on-premises to cloud
- Should have knowledge of Cloud Storage, BigQuery, and cluster management
- Sound programming knowledge of PySpark and SQL for processing large volumes of semi-structured and unstructured data
- Ability to design end-to-end data pipelines
- Knowledge of Avro and Parquet file formats
- Knowledge of the Hadoop big data platform and ecosystem
- Strong debugging and troubleshooting capabilities
- Experience guiding the technical team to attain delivery milestones
Good-to-Have :
1. Knowledge of Jira, Agile, Sonar, TeamCity, and CI/CD
2. Exposure to or experience with international banking clients, multi-vendor setups, or multi-geography teams
3. Knowledge of Dataproc and PySpark