Job Title : Data Engineer (AWS + Databricks + PySpark)
Experience : 5–12 years
Location : Pan India
Notice Period : Immediate joiners only
Job Description :
We are looking for a Data Engineer with strong hands-on experience in AWS, Databricks, PySpark, and SQL. The role involves designing, building, and maintaining data pipelines, ETL processes, and cloud-based data solutions.
Required Skills :
AWS (S3, Glue, Lambda, EMR, EC2)
Databricks
PySpark
SQL (must-have)
ETL / ELT development
Data pipeline design and optimization
Data warehousing concepts
Good to Have :
Python scripting
CI/CD tools (Git, Jenkins)
Delta Lake or Lakehouse knowledge