Experience: 4-7 years
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL (Informatica) workflows using GCP-native tools and services.
- Build and optimize data warehouses using BigQuery or other cloud DWHs (e.g., Snowflake on GCP).
- Write complex and efficient SQL queries for data transformation, analysis, and reporting.
- Collaborate with analysts, data scientists, and business stakeholders to understand data needs and deliver reliable solutions.
- Implement data governance, security, and monitoring best practices across GCP projects.
- Tune queries and optimize performance of large-scale datasets.
- Automate workflows using Cloud Composer (Airflow) or similar orchestration tools.
Required Skills & Qualifications:
- 3+ years of experience in a data engineering or data platform role.
- Expert-level SQL skills, with the ability to write optimized, scalable, and complex queries.
- Strong experience with an ETL tool (Informatica).
- Strong hands-on experience with Google Cloud Platform services, especially BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Composer.
- Experience with data modeling (star/snowflake schemas), partitioning, clustering, and performance tuning in a data warehouse.
- Familiarity with modern ELT tools such as dbt, Fivetran, or Cloud Data Fusion.
- Experience with Python or a similar scripting language for data engineering tasks.
- Solid understanding of data governance, privacy, and security in cloud environments.
Thanks & Regards
Prashant Awasthi
Vastika Inc.
1200 West Walnut Hill Lane, Suite #2200
Irving, TX 75038
E-mail: Pawasthi@vastika.com
Cell: 469-359-1422