Position : AWS Data Engineer
Experience : 4 - 8 Years
Education : B.E. / B.Tech from IIT or Tier I / II colleges
Job Summary :
We are seeking a highly skilled AWS Data Engineer with 4 - 8 years of experience to enable data-driven operations and decision-making in an agile, product-centric environment. The ideal candidate will have extensive hands-on experience with Python and big data platforms on AWS, including Redshift, Glue, and Lambda. You will be responsible for building robust data pipelines, managing data warehouses and data lakes, and integrating diverse data sources to create a seamless flow of information. This role requires a strong technical background, excellent communication skills, and a customer-focused mindset to drive business success.
Key Responsibilities :
Pipeline & Integration :
- Design, build, and maintain scalable and reliable data pipelines using AWS services like Glue and Lambda.
- Utilize workflow management tools such as Apache Airflow, Luigi, or Azkaban to orchestrate complex data flows.
- Integrate data from various sources into a centralized Data Lake and Data Warehouse.
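As a purely illustrative sketch (not part of the role description), the extract-transform-load pattern behind these pipeline duties can be shown in plain Python. Here the standard-library `sqlite3` module stands in for a warehouse such as Redshift, and all record, table, and column names are invented for the example:

```python
import json
import sqlite3

# Toy "source" records, standing in for data arriving from S3 or an API.
# All field names here are invented for illustration.
raw_events = [
    '{"user_id": 1, "amount": "19.99", "currency": "USD"}',
    '{"user_id": 2, "amount": "5.00", "currency": "USD"}',
    '{"user_id": 1, "amount": "bad"}',  # malformed record, filtered out below
]

def transform(line):
    """Parse and validate one raw record; return None for rows that fail."""
    try:
        rec = json.loads(line)
        return (int(rec["user_id"]), float(rec["amount"]), rec.get("currency", "USD"))
    except (ValueError, KeyError):
        return None

# In-memory SQLite stands in for the centralized warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (user_id INTEGER, amount REAL, currency TEXT)")

# Extract + transform: keep only rows that parsed cleanly.
rows = [r for r in (transform(line) for line in raw_events) if r is not None]

# Load: bulk-insert the validated rows.
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

# A warehouse-side aggregation of the kind such pipelines feed.
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))
```

In a production AWS setting the extract and load steps would be handled by Glue jobs or Lambda functions and the query would run against Redshift; the validate-then-load shape stays the same.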
Big Data Development :
- Write high-quality, efficient code in Python and PySpark to process and transform large-scale datasets using the Spark engine.
- Develop and optimize SQL queries for data manipulation and analysis within the data warehouse.
Platform Management :
- Manage and optimize big data platforms on AWS, including Redshift for data warehousing.
- Ensure the data infrastructure is performant, secure, and scalable to meet growing business needs.
Required Skills & Qualifications :
Core Experience :
- 4-8 years of total experience in a data engineering role.
- Hands-on experience with Python coding is a must.
- Proven experience with data engineering, data integration, and data pipeline development.
Technical Proficiency :
- Hands-on experience with AWS big data platforms such as Redshift, Glue, and Lambda.
- Proficiency in writing code using the Spark engine with Python and PySpark.
- Expertise in SQL.
- Experience with data pipeline and workflow management tools like Azkaban, Luigi, or Apache Airflow.
Professional Attributes :
- A business-focused, customer- and service-minded approach.
- Strong consultative and management skills.
- Excellent communication and interpersonal skills.
Preferred Skills :
- Certification in a cloud platform (AWS or GCP).
- Experience with Data Lakes and Data Warehouses.

(ref : hirist.tech)