Role : Data Engineer / Data Developer (GCP)
Experience : 2–6 Years
Location : Bangalore and Pune
Required Skills & Experience
Strong hands-on experience as a Data Engineer / Data Developer
Experience working within an agile, multidisciplinary DevOps team
Knowledge of Hadoop, with experience in NiFi / Kafka
Expertise in:
Python
Dataflow
Pub/Sub
BigQuery
Solid experience with / knowledge of GCP components, including:
GCS (Google Cloud Storage)
BigQuery
Airflow
Cloud SQL
Pub/Sub / Kafka
Dataflow
Google Cloud SDK
Understanding of Terraform and Shell scripting
Experience working with at least one RDBMS
GCP Data Engineer certification is an added advantage
Responsibilities
Collaborate and interface with stakeholders to assess technical requirements and business impact
Work effectively within an agile DevOps environment
Migrate and re-engineer existing on-premises workloads to GCP / AWS
Understand business requirements and deliver scalable, real-time technical solutions
Use project tools such as JIRA, Confluence, and Git
Develop Python / Shell scripts to automate operations and server management
Build and maintain tools for monitoring, alerting, trending, and analysis
Define, document, test, and execute operational procedures
Maintain detailed documentation of current and future system configurations, processes, and policies
Please send your CV to