Module Lead - Data Fabric

Confidential · Mumbai
9 days ago
Job description
  • As a Module Lead in the Data Fabric POD, you will be responsible for producing and implementing functional software solutions
  • You will work with upper management to define software requirements and take the lead on operational and technical projects
  • You will work on a data management and science platform that provides Data as a Service (DaaS) and Insight as a Service (IaaS) to internal employees and external stakeholders
  • You are technology-agnostic, eager to learn, and love working with data and drawing insights from it
  • You have excellent organization and problem-solving skills and are looking to build the tools of the future
  • You have exceptional communication skills and leadership skills and the ability to make quick decisions
  • Educational Qualifications: B.Tech / B.E. in Computers

Your responsibilities

    • Breaking down work and orchestrating the development of components for each sprint.
    • Identifying risks and forming contingency plans to mitigate them.
    • Liaising with team members, management, and clients to ensure projects are completed to standard.
    • Inventing new approaches to detecting existing fraud. You will also stay ahead of the game by predicting future fraud techniques and building solutions to prevent them.
    • Developing Zero Defect Software that is secured, instrumented, and resilient.
    • Creating design artifacts before implementation.
    • Developing Test Cases before or in parallel with implementation.
    • Ensuring software developed passes static code analysis, performance, and load test.
    • Developing various kinds of components (such as UI Components, APIs, Business Components, Image Processing, etc.) that define the IDfy platforms, which drive cutting-edge Fraud Detection and Analytics.
    • Developing software using Agile Methodology and tools that support the same.
    • Skills Required: Airflow, ETL, ETL pipeline design, Spark, Hadoop, Hive, System Architecture

      Requirements :

    • Know-how of Apache Beam, ClickHouse, Grafana, InfluxDB, Elixir, BigQuery, and Logstash.
    • An understanding of Product Development Methodologies.
    • Strong understanding of relational databases and SQL, with hands-on OLAP experience.
    • Experience in creating data ingestion and ETL (Extract, Transform, Load) pipelines (Apache Beam or Apache Airflow experience is a plus).
    • Strong design skills in defining API Data Contracts / OOAD / Microservices / Data Models.
    • Experience with Time Series DBs (we use InfluxDB) and Alerting / Anomaly Detection Frameworks.
    • Visualization layers: Metabase, Power BI, Tableau.
    • Experience in developing software in the Cloud such as GCP / AWS.
    • A passion for exploring new technologies and expressing yourself through technical blogs.
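For candidates new to the ETL requirement above, the three-stage shape such pipelines follow can be pictured with a minimal, dependency-free sketch. This uses plain Python and made-up records rather than Beam or Airflow, purely to illustrate the extract → transform → load structure; none of the names below belong to the actual IDfy stack.

```python
def extract(rows):
    """Pull raw records from a source (here: an in-memory list)."""
    return list(rows)

def transform(records):
    """Clean the data: drop incomplete records and normalise names."""
    return [
        {"id": r["id"], "name": r["name"].strip().title()}
        for r in records
        if r.get("id") is not None and r.get("name")
    ]

def load(records, sink):
    """Write cleaned records to a destination (here: a dict keyed by id)."""
    for r in records:
        sink[r["id"]] = r
    return sink

# Hypothetical input: one valid record, one missing its id.
source = [{"id": 1, "name": " alice "}, {"id": None, "name": "bob"}]
warehouse = load(transform(extract(source)), {})
```

In Airflow each stage would typically become a task in a DAG, and in Beam a step in a pipeline, but the data flow is the same.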
