Required Technical Skill Set - GCP, Python, SQL, BigQuery, ETL Data Pipelines
Location of Requirement - Chennai, Kochi, Bangalore, Hyderabad, Pune
Must-Have Competencies
Python, Pandas, NumPy
SQL, data warehousing concepts
GCP services (BigQuery, Cloud Run, Pub/Sub, Cloud Storage, Spanner, Cloud Composer, Dataflow, Cloud Functions)
Docker, Kubernetes, GitHub
Good Communication skills
Good-to-Have
PySpark, Data Modeling, Dataproc, Terraform
Google Cloud Professional Data Engineer certification
Responsibilities of / Expectations from the Role
1 Design, develop, and manage robust, scalable ETL/ELT pipelines on GCP using tools such as Dataflow, Cloud Run, BigQuery, Pub/Sub, and Cloud Composer (a minimal illustrative sketch follows this list).
2 Implement data ingestion workflows from various structured and unstructured data sources.
3 Build and maintain data warehouses and data lakes using BigQuery, Cloud Storage, and other GCP services.
4 Optimize data models and schema design for efficient querying and storage.
5 Apply ETL/ELT tools and data modeling concepts across pipeline development.
6 Monitor and troubleshoot data pipeline performance and failures.
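For illustration only, below is a minimal sketch of the kind of Composer-orchestrated ELT flow that items 1-3 describe: a Cloud Storage extract loaded into a BigQuery staging table, then transformed in place. It assumes the apache-airflow-providers-google package that ships with Cloud Composer; every project, bucket, dataset, and table name is a hypothetical placeholder, not part of the actual program.

"""Illustrative sketch only; all resource names below are hypothetical."""
import pendulum
from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="fa_daily_elt",  # hypothetical DAG name
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    # Ingest raw extracts landed in Cloud Storage into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_extracts",
        bucket="example-landing-bucket",  # hypothetical bucket
        source_objects=["fa/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.fa_raw",
        source_format="CSV",
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform inside BigQuery (ELT): rebuild the reporting table from staging.
    transform = BigQueryInsertJobOperator(
        task_id="build_reporting_table",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example-project.reporting.fa_daily` AS
                    SELECT * FROM `example-project.staging.fa_raw`
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform

Keeping the transform step inside BigQuery (ELT rather than ETL) avoids moving data out of the warehouse and lets the query engine do the heavy lifting.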
Reporting To Which Role - GCP Developer
Size of the Team Reporting to this Role, if any - 200
Unique Selling Proposition (USP) of the Role - Long-term program; learning emerging cloud technologies
Details of the Project (a short briefing on the project may be attached to this document for candidate briefing and may be shared with external stakeholders such as job agencies):
Migration of F&A data from Oracle to GCP and setup of the analytical platform for reporting.
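As a rough illustration of the migration pattern, not the project's actual design, the sketch below loads Oracle table extracts, assumed to have already been exported to Cloud Storage as CSV, into BigQuery using the google-cloud-bigquery client. All project, bucket, and table names are hypothetical.

"""Illustrative sketch only; resource names are hypothetical placeholders."""
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # assume each extract carries a header row
    autodetect=True,      # infer the schema from the extract
    write_disposition="WRITE_TRUNCATE",
)

# One set of extract files per Oracle table; GL_LEDGER is a hypothetical F&A table.
load_job = client.load_table_from_uri(
    "gs://example-migration-bucket/oracle/GL_LEDGER/*.csv",
    "example-project.fa_analytics.gl_ledger",
    job_config=job_config,
)
load_job.result()  # block until the load completes, raising on failure
print(f"Loaded {load_job.output_rows} rows into fa_analytics.gl_ledger")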