What we do:
GMG is a global well-being company retailing, distributing and manufacturing a portfolio of leading international and home-grown brands across the sport, food and health sectors. Its vision is to inspire people to win in ways that make the world better. Today, GMG’s investments span four key verticals: GMG Sports, GMG Food, GMG Health, and GMG Consumer Goods. Under the ownership and management of the Baker family, it has become a leading global company, affiliated with the world’s most successful and respected brands in the well-being sector. Working across the Middle East, North Africa, and Asia, GMG has introduced more than 120 brands into its markets.
What will you do:
We are seeking a highly skilled Data Engineer specializing in AWS and Databricks. The ideal candidate will design, build, and maintain scalable data pipelines, ensuring efficient data ingestion, processing, and integration from multiple sources. This role requires expertise in AWS services, PySpark, SQL, and Databricks, along with strong optimization, security, and cost management skills.
Roles and Responsibilities:
Data Engineering & Pipeline Development
Cloud & Infrastructure Management
Testing & CI/CD Best Practices
Optimization & Security
Functional/Technical Competencies:
Knowledge of AWS Glue, PySpark, SQL, Athena, Lambda, SNS, and S3
Knowledge of Databricks: Cluster setup, Notebooks, Libraries, CI/CD, Optimization
Data Processing: Event stream ingestion and batch processing
Testing: Writing unit test cases and integration tests
Security & Governance: AWS/Databricks governance standards and best practices
Performance Optimization: Query tuning, cluster performance improvements, cost reduction
Strong problem-solving and analytical skills
Ability to work in a fast-paced, cloud-based data environment
Excellent collaboration and communication skills
Strong attention to detail and commitment to best practices
Educational Qualification:
Bachelor's degree in Computer Science or Computer Engineering
Certification in Data Engineering and Analytics
Experience:
Minimum 6 years' experience in data engineering (core development/design), including 3+ years in AWS with command of AWS Glue, PySpark, SQL, Athena, Lambda, SNS, and S3, and 1+ year in Databricks.
Data Engineer • Aurangabad, Maharashtra, India