Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Lead Consultant-Data Engineer, AWS+Python!
Responsibilities
- Design and deploy scalable, highly available, and fault-tolerant data processes using AWS data services (Glue, Lambda, Step Functions, Redshift)
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms (see the first sketch after this list)
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost
- Develop application programs using big data technologies such as Apache Hadoop and Apache Spark, together with appropriate AWS cloud services
- Build data pipelines by implementing ETL (Extract-Transform-Load) processes (see the second sketch after this list)
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs
- Analyse requirements / user stories in business meetings, assess the impact of requirements on different platforms / applications, and convert business requirements into technical requirements
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and / or potential problems
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability and improved security
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work as before
- Coordinate with release management and other supporting teams to deploy changes to the production environment
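
To give a concrete flavour of the security work described above, here is a minimal, illustrative sketch (not Genpact code) that enforces default KMS encryption on an S3 bucket using boto3; the bucket name and key alias are hypothetical placeholders, and AWS credentials are assumed to be configured:

```python
# Illustrative only: enforce default server-side encryption on an S3 bucket.
# Bucket name and KMS key alias below are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket="example-data-lake-bucket",  # hypothetical bucket
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-data-key",  # hypothetical key alias
                },
                "BucketKeyEnabled": True,  # reduces per-object KMS request costs
            }
        ]
    },
)
```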
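And here is a minimal PySpark sketch of the kind of ETL pipeline the role involves. It runs locally as written; on AWS the same logic would typically run inside a Glue job with s3:// paths in place of the local ones. The file paths and column names are assumptions for illustration only:

```python
# Minimal ETL sketch: extract raw CSV, transform it, load partitioned Parquet.
# Paths and column names (order_ts, amount) are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw CSV data (an s3:// path would be used on AWS).
raw = spark.read.option("header", True).csv("data/raw_orders.csv")

# Transform: cast types, drop invalid rows, derive a partition column.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet, a common layout for querying via
# Redshift Spectrum or Athena.
clean.write.mode("overwrite").partitionBy("order_date").parquet("data/curated_orders")

spark.stop()
```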
Qualifications we seek in you!
Minimum Qualifications
- Experience in designing and implementing data pipelines, building data applications, and performing data migration on AWS
- Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift
- Experience with Databricks is an added advantage
- Strong experience in Python and SQL
- Strong understanding of security principles and best practices for cloud-based environments
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment
- Strong communication and collaboration skills to work effectively with cross-functional teams
Preferred Qualifications / Skills
- Master’s Degree in Computer Science, Electronics, or Electrical Engineering
- AWS Data Engineering and cloud certifications; Databricks certifications
- Experience working with Oracle ERP
- Experience with multiple data integration technologies and cloud platforms
- Knowledge of Change & Incident Management processes