Talent.com
Data Engineer with GCP, Databricks & PySpark Expertise

Synechron, Hinjawadi, Maharashtra, India
Job description

Job Summary

Synechron is seeking a skilled Data Engineer experienced in Google Cloud Platform (GCP), Databricks, PySpark, and SQL. In this role, you will design, develop, and maintain scalable data pipelines and workflows to enable advanced analytics and business intelligence solutions. You will work within a collaborative environment to integrate diverse data sources, optimize data processing workflows, and ensure data quality and availability. Your contributions will support strategic decision-making and enhance the organization’s data-driven initiatives.

Software Requirements

Required Skills:

Hands-on experience with GCP services, specifically BigQuery, Cloud Storage, and Composer for data pipeline orchestration

Proficiency in the Databricks platform with PySpark for building and optimizing large-scale ETL/ELT processes

Expertise in writing and tuning complex SQL queries for data transformation, aggregation, and reporting on large datasets

Experience integrating data from multiple sources such as APIs, cloud storage, and databases into a central data warehouse

Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer for scheduling, monitoring, and managing data jobs

Knowledge of version control systems (Git), CI/CD practices, and Agile development methodologies
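The SQL skill described above often comes down to patterns like keeping only the latest record per key during a transformation step. The sketch below is a hypothetical illustration using SQLite (so it runs anywhere); the table, columns, and values are invented, but the ROW_NUMBER() deduplication pattern carries over directly to BigQuery.

```python
import sqlite3

# Hypothetical example: keep only the most recent record per customer,
# a common deduplication step in ETL pipelines. SQLite stands in for
# BigQuery here; the window-function pattern is the same.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id INTEGER, amount REAL, updated_at TEXT);
INSERT INTO orders VALUES
  (1, 10.0, '2024-01-01'),
  (1, 25.0, '2024-02-01'),
  (2, 40.0, '2024-01-15');
""")
rows = conn.execute("""
  SELECT customer_id, amount
  FROM (
    SELECT customer_id, amount,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY updated_at DESC) AS rn
    FROM orders
  )
  WHERE rn = 1
  ORDER BY customer_id
""").fetchall()
print(rows)  # [(1, 25.0), (2, 40.0)]
```

Only the most recent row per customer survives; tuning such queries on large datasets typically means checking that the partitioning column is clustered or partitioned on in the warehouse.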

Preferred Skills:

Experience with other cloud platforms (AWS, Azure) or additional GCP services (Dataflow, Pub/Sub)

Knowledge of data modeling and data governance best practices

Familiarity with containerization tools like Docker or Kubernetes

Overall Responsibilities

Design, develop, and maintain scalable data pipelines using GCP, Databricks, and associated tools

Write efficient, well-documented SQL queries to support data transformation, data quality, and reporting needs

Integrate data from diverse sources, including APIs, cloud storage, and databases, to create a reliable central data repository

Develop automated workflows and schedules for data processing tasks utilizing Composer or Airflow

Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions

Monitor, troubleshoot, and optimize data pipelines for performance, scalability, and reliability

Maintain data security, privacy standards, and documentation compliance

Stay informed about emerging data engineering technologies and apply them effectively to improve workflows
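The automated-workflow responsibility above boils down to dependency-ordered task execution, which is the core idea behind a Composer or Airflow DAG. The following is not Airflow code, just a minimal stdlib sketch with hypothetical task names showing how a scheduler resolves upstream dependencies before running each task.

```python
from graphlib import TopologicalSorter

# Minimal stdlib sketch (not Airflow itself) of dependency-ordered task
# execution, the core idea behind a Composer/Airflow DAG. Task names
# and bodies are hypothetical placeholders.
log = []

def extract():
    log.append("extract")    # e.g. pull raw files from Cloud Storage

def transform():
    log.append("transform")  # e.g. clean and join with PySpark

def load():
    log.append("load")       # e.g. write results to BigQuery

# Each key maps to its upstream dependencies: transform waits on
# extract, and load waits on transform.
dag = {"transform": {"extract"}, "load": {"transform"}}
tasks = {"extract": extract, "transform": transform, "load": load}

# static_order() yields tasks only after all their dependencies.
for name in TopologicalSorter(dag).static_order():
    tasks[name]()

print(log)  # ['extract', 'transform', 'load']
```

In Airflow the same graph would be declared with operators and `>>` dependencies, and the scheduler, retries, and monitoring come for free.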

Technical Skills (By Category)

Programming Languages:

Required: PySpark (Python in Databricks), SQL

Preferred: Python, Java, or Scala for custom data processing

Databases / Data Management:

Required: BigQuery, relational databases, large-scale data transformation and querying

Preferred: Data cataloging and governance tools

Cloud Technologies:

Required: GCP services including BigQuery, Cloud Storage, Composer

Preferred: Experience with other cloud services (AWS, Azure)

Frameworks and Libraries:

Required: Databricks with PySpark, Airflow or Cloud Composer

Preferred: Data processing frameworks such as Apache Beam, Dataflow

Development Tools and Methodologies:

Version control using Git

CI/CD pipelines for automated deployment and testing

Agile development practices

Security & Compliance:

Knowledge of data security best practices, access controls, and data privacy regulations

Experience Requirements

Minimum of 3 years of professional experience in data engineering or a related role

Proven expertise in designing and implementing large-scale data pipelines using GCP and Databricks

Hands-on experience with complex SQL query development and optimization

Working knowledge of workflow orchestration tools such as Airflow or Cloud Composer

Experience processing data from multiple sources, including APIs and cloud storage solutions

Experience in an Agile environment preferred

Candidates with strong data pipeline experience on other cloud platforms who are willing to adapt and learn GCP services may be considered.

Day-to-Day Activities

Develop, test, and deploy data pipelines that facilitate analytics, reporting, and data science initiatives

Collaborate with cross-functional teams during sprint planning, stand-ups, and code reviews

Monitor scheduled jobs for successful execution, troubleshoot failures, and optimize performance

Document processes, workflows, and data sources in compliance with organizational standards

Continuously review pipeline performance, implement improvements, and ensure robustness

Participate in scalable architecture design discussions and recommend best practices

Qualifications

Bachelor’s degree in Computer Science, Data Science, Information Technology, or equivalent field

At least 3 years of experience in data engineering, data architecture, or related roles

Demonstrated expertise with GCP, Databricks, SQL, and workflow orchestration tools

Certifications (preferred):

GCP certifications such as Professional Data Engineer or equivalent

Databricks Data Engineer certification

Professional Competencies

Critical thinking and effective problem-solving skills related to large-scale data processing

Strong collaboration abilities across multidisciplinary teams and stakeholders

Excellent communication skills with the ability to translate technical details into clear insights

Adaptability to evolving technologies and project requirements

Ability to prioritize tasks, manage time efficiently, and deliver on deadlines

Innovative mindset with a focus on continuous learning and process improvement

Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture, promoting equality, diversity, and an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants of all backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and abilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.

All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
