Data Engineer - AWS

Patch Infotech Private Limited • Delhi, IN
26 days ago
Job type
  • Remote
Job description

Position: Data Engineer

Experience: 5-8 Years

Location: Work From Home

Job Summary

We are seeking a skilled Data Engineer with 5-8 years of experience to join our remote team. The ideal candidate will have extensive experience with AWS Glue and a strong background in building and maintaining robust data pipelines. You will be responsible for designing, developing, and optimizing our data infrastructure to support business intelligence and analytics. This role requires proficiency in Python, SQL, Kafka, and Apache Airflow, along with a commitment to implementing DataOps best practices for continuous delivery.

Key Responsibilities

ETL & Data Integration:

  • Design, build, and maintain scalable data pipelines using AWS Glue for data integration and transformation.
  • Utilize Python for scripting, automation, and custom data processing tasks.

Workflow & Orchestration:

  • Orchestrate complex data workflows using Apache Airflow to ensure efficient and reliable data delivery.
  • Implement and manage real-time data streaming and processing using Kafka.

Data Management & Quality:

  • Use advanced SQL skills to query, manage, and optimize relational databases.
  • Ensure data quality and accuracy across all data pipelines and systems.

DataOps & Collaboration:

  • Apply DataOps tools and methodologies for continuous integration and delivery (CI/CD) in data engineering.
  • Collaborate with cross-functional teams to understand data requirements and deliver on business objectives.

Required Skills

Core Experience:

  • 5-8 years of experience as a Data Engineer.

Technical Proficiency:

  • Extensive hands-on experience with AWS Glue.
  • Strong proficiency in Apache Airflow for workflow orchestration.
  • In-depth knowledge of Kafka for real-time data streaming.
  • Advanced SQL skills.
  • Proficiency in Python for scripting and automation.

Databases:

  • Familiarity with SAP HANA for data storage and management.

Preferred Skills

  • Knowledge of Snowflake for cloud-based data warehousing.
  • Experience with other AWS data services like Redshift, S3, and Athena.
  • Familiarity with big data technologies such as Hadoop, Spark, and Hive.
  • Experience with DataOps tools and methodologies.

(ref: hirist.tech)
