Job Title: Data Engineer – Google Cloud Platform (GCP)
Job Summary
We are seeking a skilled and motivated Data Engineer with hands-on experience building scalable data pipelines and cloud-native data solutions on Google Cloud Platform. The ideal candidate will be proficient in GCP services such as Pub/Sub, Dataflow, Cloud Storage, and BigQuery, with a foundational understanding of AI/ML workflows using Vertex AI.
Key Responsibilities
Design, develop, and optimize robust data ingestion pipelines using GCP services such as Pub/Sub, Dataflow, and Cloud Storage.
Architect and manage scalable BigQuery data warehouses to support analytics, reporting, and business intelligence needs.
Collaborate with data scientists and ML engineers to support AI/ML workflows on Vertex AI, including model training and deployment.
Ensure data quality, reliability, and performance across all pipeline components.
Work closely with cross-functional teams to understand data requirements and deliver efficient solutions.
Maintain documentation and contribute to best practices in cloud data engineering.
Required Skills & Qualifications
3–6 years of experience in data engineering, with strong exposure to GCP.
Proficiency in GCP services: Pub/Sub, Dataflow, Cloud Storage, and BigQuery.
Solid understanding of data modeling, ETL/ELT processes, and performance optimization.
Experience with Python, SQL, and cloud-native development practices.
Familiarity with CI/CD pipelines and version control (e.g., Git).
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
Secondary Skills (Interview-Ready Knowledge)
Basic understanding of AI/ML workflows and tools within Vertex AI.
Ability to discuss model lifecycle, deployment strategies, and integration with data pipelines.
Awareness of MLOps principles and cloud-based ML orchestration.
Data Engineer • Hosur, Tamil Nadu, India