Developer – Data Engineering & Cloud Analytics

GeakMinds, Inc - Delhi, IN
Job description

Role Overview:

The developer will be responsible for building, maintaining, and optimizing large-scale data pipelines and analytics solutions using Snowflake, Kafka, Splunk, Airflow, AWS, Apache Iceberg, and Presto. The candidate will bring hands-on development skills and collaborate with architects, analysts, and DevOps teams to deliver reliable, efficient, and scalable data services.

Key Responsibilities

  • Develop and maintain ETL/ELT pipelines using Airflow, orchestrating data movement across Snowflake, AWS, Iceberg, and Kafka systems.
  • Implement and optimize real-time data ingestion and streaming solutions using Apache Kafka, ensuring high throughput and fault tolerance.
  • Integrate Apache Iceberg and Presto for interactive analytics on large-scale data lakes.
  • Write SQL, Python, and/or Scala code for complex data transformations, metric calculations, and business logic deployment.
  • Collaborate with data architects to evolve data models and ensure alignment with enterprise best practices.
  • Utilize Splunk for operational monitoring, log analysis, and incident troubleshooting within data workflows.
  • Deploy and manage infrastructure on AWS (S3, EC2, Glue, Lambda, IAM), focusing on automation, scalability, and security.
  • Document pipelines, produce clear runbooks, and share technical knowledge with team members.
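To illustrate the transformation work described above (Python code for metric calculations inside a pipeline task), here is a minimal, hypothetical sketch. The record fields ("customer_id", "amount", "status") and the function name are assumptions for illustration only; they are not part of the role description or any real GeakMinds pipeline.

```python
from collections import defaultdict

def completed_revenue_by_customer(events):
    """Aggregate revenue per customer over completed orders.

    A sketch of the kind of metric-calculation step an Airflow
    task might wrap; the event schema here is hypothetical.
    """
    totals = defaultdict(float)
    for event in events:
        # Only count events whose (assumed) status field is "completed".
        if event.get("status") == "completed":
            totals[event["customer_id"]] += event["amount"]
    return dict(totals)

events = [
    {"customer_id": "c1", "amount": 120.0, "status": "completed"},
    {"customer_id": "c1", "amount": 30.0, "status": "cancelled"},
    {"customer_id": "c2", "amount": 75.5, "status": "completed"},
]
print(completed_revenue_by_customer(events))  # {'c1': 120.0, 'c2': 75.5}
```

In a production pipeline this logic would more likely be pushed down into Snowflake or Presto SQL; a pure-Python version is shown only because it is self-contained.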

Required Skills & Experience

  • 4 to 6 years of hands-on development experience with modern data stack components: Snowflake, Apache Kafka, Airflow, and AWS.
  • Strong working knowledge of scalable SQL (preferably Snowflake, Presto) and scripting (Python/Scala).
  • Experience implementing data lake solutions with Apache Iceberg.
  • Familiarity with Splunk for monitoring and event management.
  • Proven history of building, deploying, and troubleshooting ETL/ELT data flows and real-time streaming jobs.
  • Knowledge of IAM, networking, and security concepts on AWS.
Preferred Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or related field.
  • Experience in cloud-native data warehousing, cost optimization, and compliance.
  • Certifications in AWS, Snowflake, or other relevant technologies.

This role is ideal for candidates who enjoy end-to-end work on cloud-native analytics platforms, working with cutting-edge data streaming and lakehouse technologies in production environments.
