Job Description: Big Data / Cloud Engineer
Position Summary:
We are looking for candidates with hands-on experience in Big Data on Google Cloud Platform (GCP).
Qualifications
- 4-7 years of IT experience preferred.
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, and GCS (at least four of these services).
- Good to have: knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions.
- Strong experience in Big Data technologies (Hadoop, Sqoop, Hive, and Spark), including DevOps.
- Strong hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services such as Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have: knowledge of GCP services such as App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of customers' workloads into GCP, providing guidance on cloud adoption models, service integrations, recommendations to overcome blockers, and technical roadmaps for GCP implementations.
- Experience building technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer on GCP and become a trusted advisor to multiple teams.
- Ability to obtain required GCP technical certifications.
Mandatory Skills
GCP, Dataproc, Dataflow, Pub/Sub, BigQuery, Python / PySpark
Immediate joiners are preferred.