Job Description:
We are looking for an experienced GCP Big Data Engineer with strong expertise in building and managing large-scale data processing systems on Google Cloud Platform. The ideal candidate should have hands-on experience with BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and related GCP data services.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and architectures on GCP.
- Optimize and manage data ingestion, transformation, and storage solutions.
- Collaborate with data scientists and analysts to ensure efficient data access.
- Implement data governance, security, and performance optimization best practices.
- Work with cross-functional teams to deliver end-to-end data solutions.
Technical Skills:
- Expertise in GCP data services (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Composer, etc.).
- Proficiency in Python, SQL, and Spark.
- Experience with ETL design, data modeling, and data warehousing.
- Knowledge of CI/CD pipelines and version control systems (Git).
- Understanding of data security, compliance, and access control.
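For orientation, the sketch below shows the kind of pipeline this role involves: a minimal streaming Dataflow job written with the Apache Beam Python SDK that reads JSON events from Pub/Sub and appends them to a BigQuery table. It is an illustration only; the project, bucket, topic, and table names are placeholders, not references to any actual environment.

    # Minimal sketch: Pub/Sub -> transform -> BigQuery, runnable on Dataflow.
    # All resource names below are placeholders.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        options = PipelineOptions(
            project="my-project",            # placeholder GCP project ID
            region="us-central1",
            runner="DataflowRunner",         # use "DirectRunner" for local testing
            temp_location="gs://my-bucket/tmp",
            streaming=True,
        )

        with beam.Pipeline(options=options) as p:
            (
                p
                # Read raw messages (bytes) from a Pub/Sub topic.
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/events")
                # Decode and parse each message as a JSON record.
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                # Append records to an existing BigQuery table.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="my-project:analytics.events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()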