Job Description:
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
○ Dataflow for real-time and batch data processing
○ Cloud Functions for lightweight serverless compute
○ BigQuery for data warehousing and analytics
○ Cloud Composer for orchestration of data workflows (based on Apache Airflow)
○ Google Cloud Storage (GCS) for managing data at scale
○ IAM for access control and security
○ Cloud Run for containerized applications
Required Skills:
- 7–10 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience working with version control systems like GitHub and knowledge of CI/CD practices.
- Experience with Apache Spark, Kafka, Redis, FastAPI, Airflow, and GCP Composer DAGs.
- Strong experience in SQL with at least one enterprise database (SQL Server, …).
- Experience in data migrations from on-premise data sources to cloud platforms.
Important Note (Please Read Before Applying)
🚫 Do NOT apply if:
- You have less than 7 years or more than 14 years of experience
- You do not have hands-on GCP experience
- You do not have hands-on Python experience
- You are on a notice period longer than 15 days
- You are looking for remote only (the role is hybrid in Hyderabad)
- You are a fresher or from an unrelated background (e.g., support, testing only, non-IT roles)
✅ Apply ONLY if you meet ALL criteria above. Random/irrelevant applications will not be processed.