- Strong knowledge of GCP services such as Cloud Storage, BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub, Airflow, and DAGs
- Experience in data and analytics, including cloud technologies
- Experience in the Finance/Revenue domain will be an added advantage
- Experience with GCP migration activities will be an added advantage
- Experience in the SDLC, with emphasis on specifying, building, and testing mission-critical business applications

Technical Experience:
- Should have worked on Hadoop/Big Data projects, with good SQL, Hive, and BigQuery experience
- Should be comfortable with Git and Jenkins CI/CD
- Should be proficient in Python, Hadoop, and Spark
- Strong knowledge of GCP services, especially BigQuery, and of data warehouse concepts
- Designing, implementing, and maintaining data infrastructure and pipelines on the Google Cloud Platform (GCP)
Professional Attributes:
- Strong analytical and interpersonal skills
- Must possess impeccable communication skills, both verbal and written
- Proficient in identifying, analyzing, and solving problems
- Client-facing experience
Educational Qualification:
- Bachelor's degree is a must
Skills Required:
GCP, BigQuery, Python, Hadoop, Spark, SQL