Talent.com
AWS Big Data: PySpark, SQL, Spark, Airflow

ScaleneWorks · Bengaluru, Karnataka, India
30+ days ago
Job description

Job Title: AWS Big Data Engineer

About the Role:

We are looking for a highly skilled AWS Big Data Engineer with extensive experience in PySpark, SQL, Spark, and Airflow. The ideal candidate will have a strong coding background and a deep understanding of big data technologies. You will be responsible for designing, developing, and maintaining big data solutions on AWS, ensuring data is processed efficiently and reliably.

Key Responsibilities:

Design, develop, and maintain big data solutions using AWS services.

Write and optimize complex SQL queries.

Develop and manage data pipelines using PySpark and Spark.

Implement and manage workflows using Apache Airflow.

Collaborate with data scientists and analysts to understand data requirements.

Ensure data quality and integrity throughout the data lifecycle.

Troubleshoot and resolve issues related to big data processing.

Stay updated with the latest trends and best practices in big data technologies.

Key Skills and Qualifications:

Proven experience with AWS big data services.

Strong coding experience in PySpark and SQL.

Expertise in Spark and Airflow.

Excellent problem-solving and analytical skills.

Ability to work independently and as part of a team.

Strong communication and documentation skills.

Relevant certifications (e.g., AWS Certified Big Data - Specialty) are a plus.
