Job Summary :
We are seeking a highly skilled and experienced GCP Data Engineer to join our dynamic data engineering team. The ideal candidate will have a strong background in Google Cloud Platform (GCP), Airflow, Python, and SQL, with a proven track record of building scalable data pipelines and solutions in cloud environments.
Key Responsibilities :
Design, develop, and maintain scalable and robust data pipelines on GCP using Cloud Dataflow, BigQuery, Pub/Sub, and Cloud Composer (Airflow); a minimal orchestration sketch appears after this list.
Build and optimize ETL workflows to ingest, transform, and store large volumes of structured and unstructured data.
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality data solutions.
Implement data quality checks, monitoring, and alerting mechanisms.
Ensure data security, compliance, and governance best practices are followed.
Optimize performance and cost-efficiency of data pipelines and storage solutions.
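To make the Composer (Airflow) expectation above concrete, the following is a minimal sketch of the kind of DAG this role builds and maintains: a daily load from Cloud Storage into a BigQuery staging table followed by a SQL transform into a curated table. It assumes Airflow 2.x with the Google provider package installed; every resource name in it (bucket, project, datasets, tables) is a hypothetical placeholder, not part of this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

# All resource names below are hypothetical placeholders.
with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Ingest: load the day's newline-delimited JSON files from Cloud Storage
    # into a BigQuery staging table, replacing any previous load for the day.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing-bucket",
        source_objects=["events/{{ ds }}/*.json"],
        destination_project_dataset_table="example-project.staging.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Transform: rebuild the curated table from the staged rows with a SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `example-project.curated.events` AS "
                    "SELECT DISTINCT * FROM `example-project.staging.events`"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform  # staging load must finish before the transform runs
```

The load-then-transform split mirrors the ETL and optimization responsibilities above: raw data lands in a staging table first, so the curated table can be rebuilt or backfilled without re-reading Cloud Storage.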
Required Skills :
Strong experience with GCP services : BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer (Airflow), etc.
Proficiency in Python for scripting and data manipulation.
Advanced SQL skills for data querying (a short query sketch appears at the end of this posting).
Experience with Apache Airflow for orchestration of data workflows.
Familiarity with CI/CD pipelines and version control (e.g., Git).
Strong problem-solving skills and ability to work in a fast-paced environment.
Preferred Qualifications :
GCP Professional Data Engineer certification.
Experience with data modeling and data warehousing concepts.
Exposure to other cloud platforms (AWS, Azure) is a plus.
Knowledge of containerization (Docker, Kubernetes) is an advantage.
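As a brief illustration of the SQL and data quality work named above, here is a hedged sketch using the google-cloud-bigquery Python client: it runs one aggregate query and fails loudly if a key column contains NULLs. The project, table, and column names are hypothetical placeholders.

```python
from google.cloud import bigquery

# Hypothetical table and key column; substitute your own names.
TABLE = "example-project.curated.events"
KEY_COLUMN = "event_id"

def count_null_keys(client: bigquery.Client, table: str, key: str) -> int:
    """Return how many rows have a NULL value in the given key column."""
    sql = f"SELECT COUNT(*) AS null_keys FROM `{table}` WHERE {key} IS NULL"
    row = next(iter(client.query(sql).result()))  # single aggregate row
    return row.null_keys

if __name__ == "__main__":
    client = bigquery.Client()  # authenticates via Application Default Credentials
    nulls = count_null_keys(client, TABLE, KEY_COLUMN)
    if nulls:
        raise SystemExit(f"Quality check failed: {nulls} rows with NULL {KEY_COLUMN}")
    print("Quality check passed")
```

A check like this can run as a final task in the same DAG, turning a silent data defect into a visible pipeline failure that triggers alerting.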