Data Engineer
India, Remote
Full Time / Contract
# of Positions: 3

Job Summary
The Data Engineer will be responsible for designing, developing, and optimizing scalable data pipelines and cloud-based data solutions. This role requires strong Python programming skills, expertise in ETL/ELT processes, and deep hands-on experience with AWS cloud services such as S3, Glue, Lambda, Redshift, Kinesis, and DynamoDB. The ideal candidate will excel at building serverless data architectures, designing efficient data models, and ensuring robust pipeline performance through monitoring and optimization. The position is remote within India and open to both full-time and contract engagements.

Key Skills

Data Engineering & Pipelines
- ETL/ELT pipeline development using Python
- Data extraction, transformation, and loading into data lakes/warehouses
- Workflow orchestration and automation
- Real-time and batch data processing

AWS Cloud Expertise
- S3, Glue, Lambda, Redshift, Aurora
- Kinesis (real-time streaming)
- DynamoDB (NoSQL databases)
- AWS serverless architecture design
- Integration of AWS managed services for data workflows

Serverless & Automation
- Building event-driven pipelines using AWS Lambda
- Serverless compute optimization
- Automated triggering and orchestration of data processes

Data Modeling
- Designing schemas for relational databases (OLTP/OLAP)
- NoSQL data modeling and storage optimization
- Understanding of normalization, partitioning, and indexing

Monitoring & Optimization
- AWS monitoring tools (CloudWatch, CloudTrail)
- Pipeline performance tuning
- Cost optimization across cloud resources

Collaboration & Communication
- Cross-functional collaboration with analysts, data scientists, and engineering teams
- Strong problem-solving, documentation, and communication skills

Minimum Qualifications
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field (preferred).
- Strong hands-on experience with Python for data engineering tasks.
- Deep understanding of the AWS cloud services used for data storage, streaming, ETL, and serverless compute.
- Experience designing and implementing ETL/ELT pipelines.
- Solid understanding of relational and NoSQL data modeling concepts.

Professional Experience Requirements
- Proven experience building scalable data pipelines using Python.
- Experience extracting and transforming data from diverse sources into cloud-based environments.
- Hands-on experience with AWS services such as S3, Glue ETL, Lambda, Redshift/Aurora, Kinesis, and DynamoDB.
- Demonstrated ability to design and implement serverless architectures.
- Experience creating event-driven workflows using Lambda and other AWS triggers.
- Strong experience in monitoring, debugging, and optimizing cloud-based data workflows.
- Experience collaborating across data science, analytics, and engineering teams to deliver complete data solutions.
Data Engineer • Bhubaneswar, Odisha, India