Data Engineering Lead / ETL

Karix, Bengaluru, Karnataka, India
Job description

Job Overview:

  • Responsible for developing, deploying, monitoring, and maintaining ETL jobs and all data engineering and pipeline activities.
  • Good knowledge of database (DB) activities, with the ability to support DB solutions.
  • Proven expertise in writing SQL queries.

Responsibilities and Duties

  • Design and build enterprise ETL procedures using any ETL tool, preferably Pentaho Data Integration (Pentaho DI).
  • Provide accurate work estimates and manage efforts across multiple lines of work.
  • Design and develop exception-handling and data cleansing / standardization procedures.
  • Gather requirements from various stakeholders for ETL automation.
  • Design and create the data extraction, transformation, and load functions.
  • Perform data modelling of complex, large data sets.
  • Test and validate all data flows, prepare all ETL processes according to business requirements, and incorporate those requirements into design specifications.
  • Knowledge of and experience with current data engineering and pipeline solutions such as PySpark, Apache Iceberg, and Apache Airflow (a minimal sketch follows this list).
  • Good knowledge of DB activities, providing solutions for DB maintenance tasks such as installation, backup, purging, and data retention.
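For illustration only: a minimal PySpark sketch of an extract-transform-load flow that writes cleansed data into an Apache Iceberg table. It assumes Spark 3.x with the Iceberg Spark runtime on the classpath; the catalog name ("demo"), warehouse path, input path, column names, and table name are hypothetical.

    # Minimal sketch, assuming Spark 3.x with the Apache Iceberg Spark runtime on the classpath.
    # The catalog name ("demo"), warehouse path, input path, column names, and table name are hypothetical.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("iceberg-etl-sketch")
        # Register a Hadoop-backed Iceberg catalog named "demo"
        .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.demo.type", "hadoop")
        .config("spark.sql.catalog.demo.warehouse", "hdfs:///warehouse/iceberg")
        .getOrCreate()
    )

    # Extract: read raw files landed by an upstream process
    raw = spark.read.option("header", True).csv("hdfs:///landing/orders/")

    # Transform: basic cleansing / standardization (deduplicate, drop rows missing key columns)
    clean = raw.dropDuplicates(["order_id"]).na.drop(subset=["order_id", "amount"])

    # Load: create (or replace) the Iceberg table with the cleansed data
    clean.writeTo("demo.sales.orders").createOrReplace()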
Qualification and Experience

  • B.E. / B.Tech / MCA
  • 10 years of experience in the design and development of large-scale enterprise ETL solutions
  • Experience in any ETL tool, primarily Pentaho DI
  • Good knowledge of and experience with any database, including writing SQL queries.
Knowledge and Skills

  • Experience in full lifecycle software development and production support for DWH systems
  • Experience in data analysis, modelling (logical and physical data models) and design specific to a DWH / BI environment (normalized and multi-dimensional modelling)
  • Exposure to developing ETL packages and jobs using Spoon
  • Exposure to scheduling Pentaho ETL jobs in crontab (i.e. via Kitchen); see the scheduling sketch after this list
  • Exposure to Hadoop, Hive, and Pig
  • Experience in SQL scripting for relational databases such as MySQL, PostgreSQL, etc.
  • Experience with data loading tools such as Flume and Sqoop
  • Knowledge of workflow schedulers such as Oozie
  • Knowledge of migrating existing data flows to Big Data platforms
  • Experience with any open-source BI tool is an added advantage
  • Experience with any database is also an added advantage.
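For illustration of the scheduling items above: a Pentaho job can be run from crontab via Kitchen, or driven from a workflow scheduler. Below is a minimal Airflow sketch (Airflow 2.4+ assumed; Airflow is named earlier in this posting) that invokes Kitchen through a BashOperator. The installation path, job file, DAG name, and schedule are hypothetical, and an equivalent crontab line is shown in the comments.

    # Minimal sketch, assuming Airflow 2.4+ and Pentaho DI installed under /opt/pentaho.
    # An equivalent (hypothetical) crontab entry would be:
    #   0 2 * * * /opt/pentaho/data-integration/kitchen.sh -file=/etl/jobs/load_dwh.kjb -level=Basic
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="pentaho_nightly_load",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="0 2 * * *",               # run daily at 02:00
        catchup=False,
    ) as dag:
        run_kitchen = BashOperator(
            task_id="run_kitchen_job",
            # Kitchen is Pentaho DI's command-line runner for .kjb job files
            bash_command=(
                "/opt/pentaho/data-integration/kitchen.sh "
                "-file=/etl/jobs/load_dwh.kjb -level=Basic"
            ),
        )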
Why join us?

  • Impactful Work : Play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry.
  • Tremendous Growth Opportunities : Be part of a rapidly growing company in the telecom and CPaaS space, with opportunities for professional development.
  • Innovative Environment : Work alongside a world-class team in a challenging and fun environment, where innovation is celebrated.
  • Tanla is an equal opportunity employer. We champion diversity and are committed to creating an inclusive environment for all employees.

    www.tanla.com
