Data Engineer - Big Data Technologies

Talent Socio • Hyderabad
30+ days ago
Job description

Role Description:

The candidate will be responsible for translating client requirements into design and for architecting and implementing cloud and non-cloud big data solutions for clients. The role requires a hands-on technologist with expertise in big data solution architecture and a strong programming background in Java / Scala / Python, with experience in creating data ingestion pipelines for streaming and batch datasets, building ETL / ELT data pipelines using distributed computing frameworks such as Spark, Storm, or Flink, orchestrating data pipelines, and setting up a secure big data platform. The candidate is also required to have hands-on knowledge of at least one of the AWS, GCP, or Azure clouds:

  • Provide technical leadership and play a hands-on implementation role in data engineering, including data ingestion, data access, modeling, data processing, visualization, design, and implementation.
  • Lead a team to deliver high-quality big data solutions, either on-premise or in the cloud. Manage functional and non-functional scope and quality.
  • Help establish standard data practices such as governance, and address other non-functional concerns such as data security, privacy, and quality.
  • Manage and provide technical leadership for data program implementations based on requirements, using agile methodologies.
  • Participate in workshops with clients and align client stakeholders to optimal solutions.
  • Provide consulting, soft skills, thought leadership, mentorship, etc.
  • People management, contributing to hiring and capability building.

Experience and Competencies:
  • 3+ years of experience in Big Data technologies, 1+ years of expertise in data-related cloud services (AWS / Azure / GCP), and delivery of at least 1 project as an architect.
  • Mandatory to have knowledge of big data architecture patterns and experience in delivering end-to-end big data solutions, either on-premise or in the cloud.
  • Expert in the Hadoop ecosystem, with one or more distributions such as Cloudera or cloud-specific distributions.
  • Expert in programming languages such as Java / Scala; Python is good to have.
  • Expert in one or more big data ingestion tools (Sqoop, Flume, NiFi, etc.) and distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub, etc.); knowledge of traditional tools such as Informatica and Talend is good to have.
  • Expert in distributed data processing frameworks such as Spark (Core, Streaming, SQL), Storm, or Flink.
  • Should have worked on MPP-style query engines such as Impala, Presto, or Athena.
  • Should have worked on NoSQL solutions such as MongoDB, Cassandra, or HBase, or on cloud-based NoSQL offerings such as DynamoDB or Bigtable.
  • Should have a good understanding of how to set up big data cluster security: authorization / authentication, security for data at rest, and data in transit.
  • Should have a basic understanding of how to set up and manage monitoring and alerting for a big data cluster.
  • Should have worked on orchestration tools such as Oozie, Airflow, Control-M, or similar.
  • Should have worked on performance tuning, optimization, and data security.

(ref: hirist.tech)
