We are looking for an experienced PySpark AWS Cloud Designer / Developer with a strong background in data integration and ETL. The ideal candidate will have:
- Over 5 years’ experience designing and developing ETL integration patterns using Python and Spark.
- 3+ years working with PySpark for data processing (experience with Databricks or the broader Apache Spark ecosystem is a plus).
- Proven ability to build and maintain data integration pipelines, including data migrations such as DB2 to Amazon S3 (see the sketch after this list).
- Experience handling structured, semi-structured, and unstructured data in formats such as CSV, XML, HTML, SQL, and JSON.
- Strong analytical and database skills (complex queries, query optimization, debugging, indexes).
- Familiarity with source control (Git, Bitbucket) and CI tools (Jenkins).
- Ability to support enterprise data platforms and collaborate with cross-functional teams.
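To illustrate the kind of pipeline this role involves, below is a minimal PySpark sketch of a DB2-to-S3 migration. It is an assumption-laden example, not a prescribed solution: the JDBC URL, credentials, table, partition column, and bucket name are hypothetical placeholders, and it assumes the IBM DB2 JDBC driver and the hadoop-aws / S3A connector are available on the cluster classpath.

    from pyspark.sql import SparkSession

    # Minimal sketch: read a DB2 table over JDBC and land it in S3 as Parquet.
    # All connection details below are placeholders, not real endpoints.
    spark = (
        SparkSession.builder
        .appName("db2-to-s3-migration")
        .getOrCreate()
    )

    source_df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:db2://db2-host:50000/SAMPLEDB")  # hypothetical host/database
        .option("driver", "com.ibm.db2.jcc.DB2Driver")        # IBM DB2 JDBC driver class
        .option("dbtable", "SCHEMA.ORDERS")                   # hypothetical source table
        .option("user", "db2_user")
        .option("password", "db2_password")
        .load()
    )

    # Write partitioned Parquet to S3; bucket and partition column are placeholders.
    (
        source_df.write.mode("overwrite")
        .partitionBy("ORDER_DATE")
        .parquet("s3a://example-data-lake/raw/orders/")
    )

In practice such a job would also handle incremental extracts, schema validation, and credential management (for example via AWS Secrets Manager), which are omitted here for brevity.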
Years of Experience: 7-12 years.
Location: Bengaluru, Pune.