Job Description
Role Summary
The incumbent should have:
- Hands-on experience in Python for data prototyping and in delivering data-related projects using AWS technologies such as S3, Lambda, Glue, and PySpark
- Working knowledge of the AWS environment (S3, CLI, SageMaker)
- The incumbent would be responsible for understanding and developing ETL data prototypes using Python, and would work towards creating a positive and innovation-friendly environment.
- Programming: Unix shell scripting, Python, PySpark, MSSQL
- Design and develop ETL rules in AWS Glue (see the sketch below)
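As an illustration of the Glue work listed above, here is a minimal sketch of a PySpark-based AWS Glue ETL job. It assumes source data already registered in the Glue Data Catalog; the database, table, column, and S3 bucket names are placeholders for illustration only, not part of this posting.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap: resolve job arguments and create contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw records from the Glue Data Catalog (placeholder database/table names).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="policies_raw"
)

# Example ETL rule: rename and cast columns before writing to the curated layer.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("policy_id", "string", "policy_id", "string"),
        ("premium", "string", "premium", "double"),
    ],
)

# Write the transformed data to S3 as Parquet (placeholder bucket/prefix).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/policies/"},
    format="parquet",
)

job.commit()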
Core Responsibilities
- Strong experience in delivering projects using Python.
- Exposure to working in a global environment, having delivered at least 1-2 projects in Python.
- Delivery collaboration and coordination with multiple business partners.
- Must have good experience in leading projects.
- Good to have: knowledge of the insurance industry, with proven experience across multiple clients.
- Experience implementing the developed methodology on the cloud, using AWS services such as S3, Lambda, Glue, PySpark, SQS, and DynamoDB.
Qualification & Experience
- Bachelor of Technology (B.Tech) / Master of Computer Applications (MCA) or equivalent qualification.
- 5-8 years of relevant experience as a Data Engineer.
Job Category:
IT - Digital Development
Posting End Date: 29/09/2025