Role: AWS Data Engineer
Job Location: Chennai, Pune
Experience Requirement: 5+ years
Required Technical Skills:
Strong knowledge of AWS Glue, AWS Redshift, SQL, and ETL. Good knowledge of and experience with PySpark for building complex transformation logic.
AWS Data Engineer
Primary: SQL, ETL, DWH; Secondary: AWS Glue, Airflow
Must-Have
- Good knowledge of SQL and ETL
- A minimum of 3+ years' experience with and understanding of Python core concepts, and experience implementing data pipeline frameworks using PySpark and AWS
- Works well independently as well as within a team
- Good knowledge of working with various AWS services, including S3, Glue, DMS, and Redshift
- Proactive and organized, with excellent analytical and problem-solving skills
- Flexible and willing to learn; a can-do attitude is key
- Strong verbal and written communication skills
Good-to-Have
Good knowledge of SQL and ETL; understanding of Python core concepts and experience implementing data pipeline frameworks using PySpark and AWS. Good knowledge of working with various AWS services, including S3, Glue, DMS, and Redshift.
Responsibility: AWS Data Engineer
PySpark / Python / SQL / ETL
A minimum of 3+ years' experience with and understanding of Python core concepts, and experience implementing data pipeline frameworks using PySpark and AWS
Good knowledge of SQL and ETL, as well as of working with various AWS services, including S3, Glue, DMS, and Redshift