About the Client :
Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, it has revenue of $1.8B and 35,000+ associates worldwide. The company specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media.
Our client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.
Job Title : AWS Data Engineer
Key Skills : PySpark, Airflow, AWS S3, EKS, CI/CD, SQL, Shell scripting
Job Locations : Bangalore
Experience : 5 – 7 Years
Budget : 13 LPA
Education Qualification : Any Graduation
Job Description : Key Responsibilities
- Design, develop, and optimize ETL/ELT data pipelines using PySpark and Airflow.
- Build scalable data ingestion and processing workflows on AWS (S3, Glue, EMR, Lambda, EKS, IAM).
- Implement containerized data processing jobs on Amazon EKS and manage Kubernetes deployments.
- Develop and maintain CI/CD pipelines to support automated deployment of data workflows.
- Optimize performance of distributed data processing systems (Spark jobs, SQL queries, cluster tuning).
- Write efficient SQL queries for transformations, analytics, and data validation.
- Automate workflows using Shell scripting and Infrastructure-as-Code (IaC) tools.
- Collaborate with data architects, analysts, and business stakeholders to deliver high-quality data solutions.
- Ensure data integrity, security, and compliance with best practices and company standards.
- Monitor, troubleshoot, and improve data pipelines and cloud resources.
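To illustrate the kind of SQL-based data validation the role calls for, here is a minimal sketch using Python's built-in sqlite3 module. The table and column names (`orders`, `order_id`, `amount`, `region`) are hypothetical; in a real pipeline comparable checks would typically run through Spark SQL or Glue against data in S3 rather than a local SQLite database:

```python
import sqlite3

# Hypothetical orders table; in production this data would live in S3/Glue
# and the same queries would run via Spark SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 120.0, "EU"), (2, 80.0, "US"), (2, 80.0, "US"), (3, None, "EU")],
)

# Validation 1: flag duplicate order IDs.
dupes = conn.execute(
    "SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1"
).fetchall()

# Validation 2: count rows with missing amounts.
nulls = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE amount IS NULL"
).fetchone()[0]

print(dupes)  # duplicate order IDs, e.g. [(2,)]
print(nulls)  # number of NULL amounts, e.g. 1
```

In a pipeline, results like these would typically gate downstream tasks (for example, failing an Airflow task when a validation query returns violations).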
Required Skills & Qualifications
- Strong hands-on experience with PySpark for large-scale data processing.
- Solid experience with Apache Airflow for orchestration and workflow management.
- Expertise in AWS cloud services, including S3, Glue, Lambda, IAM, EMR, EKS, CloudWatch, and KMS.
- Proficiency with SQL (query optimization, analytical queries).
- Strong Shell scripting experience for automation.
- Experience working with CI/CD pipelines (GitLab CI, GitHub Actions, CodePipeline, or similar).
- Understanding of distributed systems, data modeling, and big data concepts.
- Experience with Docker and Kubernetes (EKS preferred).
- Familiarity with version control (Git) and DevOps best practices.
Interested candidates, please share your CV to vamsi.v@people-prime.com