Perficient - Data Engineer - ETL/AWS
Perficient India Private Limited
Chennai
29 days ago
Job description
Key Responsibilities:
Design, build, and maintain scalable ETL pipelines and data workflows using AWS Glue, Spark, and Python
Implement data ingestion from structured and unstructured data sources into an AWS data lake or data warehouse
Work with large-scale datasets to ensure efficient data transformation and loading across various AWS storage layers (S3, Redshift, RDS)
Write complex SQL queries for data validation, transformation, and reporting
Develop and maintain metadata, data lineage, and logging for data quality and traceability
Optimize data workflows for performance, scalability, and cost efficiency
Collaborate with Data Scientists, Analysts, and Business teams to understand data needs and deliver robust solutions
Ensure data governance, security, and compliance in cloud data pipelines
Perform unit and integration testing, and support deployment in lower and production environments
Contribute to best practices in cloud data architecture, DevOps, and CI/CD automation for data pipelines
Technical Skills:
AWS & Data Engineering:
5+ years of experience in Data Engineering or related roles
Strong hands-on experience with AWS Glue (Jobs, Crawlers, Workflows, Dynamic Frames)
Experience with AWS services: S3, Redshift, Lambda, Athena, Step Functions, CloudWatch, IAM
Strong experience in Apache Spark (PySpark preferred) for distributed data processing
Python & Scripting:
Advanced skills in Python (data manipulation, scripting, exception handling, performance tuning)
Strong in SQL: writing complex queries, stored procedures, and query optimization
Data Management & Workflow:
Experience in building and orchestrating ETL/ELT pipelines
Familiarity with schema design, data partitioning, compression formats (Parquet, ORC, Avro)
Hands-on with data cataloging and metadata management
Additional Skills (Nice to Have):
Experience with Airflow, AWS Step Functions, or other orchestration tools
Knowledge of DevOps for data: CI/CD pipelines using tools like Git, Jenkins, and CodePipeline
Exposure to data warehousing concepts and experience with Redshift or Snowflake
Experience working in Agile Scrum environments
Understanding of data security, privacy, and compliance standards (GDPR, HIPAA, etc.)
Soft Skills:
Strong communication and collaboration skills, with experience in stakeholder interaction
Excellent analytical thinking and problem-solving abilities
Ability to work independently and within a cross-functional team
Detail-oriented, with a commitment to delivering high-quality, reliable solutions
(ref: hirist.tech)