Job Title: Data Engineer
We are seeking a skilled and motivated Data Engineer with hands-on experience building scalable data pipelines. The ideal candidate will be proficient in GCP services such as Pub/Sub, Dataflow, Cloud Storage, and BigQuery.
Key Responsibilities:
- Design and develop robust data ingestion pipelines using GCP services such as Pub/Sub, Dataflow, and Cloud Storage.
- Architect and manage scalable BigQuery data warehouses to support analytics and reporting needs.
- Collaborate with data scientists to support AI/ML workflows using Vertex AI.
- Ensure data quality, reliability, and performance across all pipeline components.
- Work closely with cross-functional teams to understand data requirements.
Required Skills & Qualifications:
- 3-6 years of experience in data engineering with strong exposure to GCP.
- Proficiency in GCP services: Pub/Sub, Dataflow, Cloud Storage, and BigQuery.
- Solid understanding of data modeling, ETL processes, and performance optimization.
- Experience with Python, SQL, and cloud-native development practices.
- Familiarity with CI/CD pipelines and version control.
Secondary Skills (Interview-Ready Knowledge):
- Basic understanding of AI/ML workflows within Vertex AI.
- Ability to discuss model lifecycle and deployment strategies.