Job Title: Data Engineer – Cloud Architect
About the Role
We're seeking a seasoned professional with expertise in designing and developing scalable data solutions on cloud platforms. The ideal candidate will have hands-on experience with GCP services such as Pub/Sub, Dataflow, Cloud Storage, and BigQuery, as well as a solid understanding of AI/ML workflows using Vertex AI.
Main Responsibilities:
- Data Ingestion Pipeline Development: Design, develop, and optimize robust data ingestion pipelines using GCP services such as Pub/Sub, Dataflow, and Cloud Storage.
- BigQuery Data Warehouse Architecture: Architect and manage scalable BigQuery data warehouses to support analytics, reporting, and business intelligence needs.
- AI/ML Workflow Support: Collaborate with data scientists and ML engineers to support AI/ML workflows using Vertex AI, including model training and deployment.
- Data Quality and Performance: Ensure data quality, reliability, and performance across all pipeline components.
- Team Collaboration: Work closely with cross-functional teams to understand data requirements and deliver efficient solutions.
Requirements
- 3–6 years of experience in data engineering, with strong exposure to GCP.
- Proficiency in GCP services: Pub/Sub, Dataflow, Cloud Storage, and BigQuery.
- Solid understanding of data modeling, ETL/ELT processes, and performance optimization.
- Experience with Python, SQL, and cloud-native development practices.
- Familiarity with CI/CD pipelines and version control (e.g., Git).
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Secondary Skills
- Basic understanding of AI/ML workflows and tools within Vertex AI.
- Ability to discuss model lifecycle, deployment strategies, and integration with data pipelines.
- Awareness of MLOps principles and cloud-based ML orchestration.