Talent.com
Tricog Health - Data Engineer - SQL / ETL

Tricog Health India Private Limited, Bangalore
15 days ago
Job description

Responsibilities:

  • Design, develop, and optimize complex SQL queries to transform and extract insights from large datasets.
  • Build and maintain scalable ETL pipelines using Python to ingest, process, and load data from various sources into our data lake and data warehouse.
  • Implement best practices for data governance, data quality, and data security.
  • Work with massively parallel processing (MPP) data warehouses like BigQuery, Redshift, and Snowflake to manage and analyze large datasets efficiently.
  • Collaborate with data scientists, analysts, and other stakeholders to understand their data needs and provide robust data solutions.
  • Set up and build data sources for dashboards to minimize repetitive ad-hoc data requests.
  • Monitor and troubleshoot data pipelines to ensure reliability and performance.
  • Document data workflows, ETL processes, and system architectures.
  • Stay current with the latest data engineering technologies and methodologies to continuously improve our data infrastructure.
  • Work within AWS or GCP environments; experience with multi-cloud environments is preferred.
  • Develop and maintain data models using dbt (data build tool) to ensure consistency, quality, and efficiency in data transformations.
  • Evaluate proof of concepts with new data solutions to improve existing architecture.
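
As a loose illustration of the extract-transform-load pattern the responsibilities above describe (a minimal sketch only: the table, field names, and sample data are hypothetical, and the standard-library sqlite3 module stands in for an MPP warehouse such as BigQuery, Redshift, or Snowflake):

```python
# Minimal ETL sketch in pure Python. All names and data are illustrative;
# sqlite3 stands in for the warehouse layer.
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (inlined here for illustration).
raw = io.StringIO("patient_id,heart_rate\np1,72\np2,\np3,110\n")
rows = list(csv.DictReader(raw))

# Transform: drop incomplete records and cast types.
clean = [
    {"patient_id": r["patient_id"], "heart_rate": int(r["heart_rate"])}
    for r in rows
    if r["heart_rate"]  # skip rows with a missing measurement
]

# Load: write the cleaned rows into a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vitals (patient_id TEXT, heart_rate INTEGER)")
conn.executemany("INSERT INTO vitals VALUES (:patient_id, :heart_rate)", clean)

loaded = conn.execute("SELECT COUNT(*) FROM vitals").fetchone()[0]
```

In practice each stage would be a separate task in an orchestrator such as Airflow or Dagster, with the transform layer typically expressed as dbt models rather than inline Python.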

Requirements:

  • Proven experience in writing and optimizing complex SQL queries.
  • Strong proficiency in Python for ETL pipeline development.
  • Experience with Docker containerization and Kubernetes for pipeline orchestration.
  • Experience with an MPP data warehouse such as BigQuery, Redshift, or Snowflake.
  • Experience working with cloud platforms such as AWS or GCP; multi-cloud experience is a plus.
  • Experience with data flow orchestration tools like Apache Airflow or Dagster.
  • Experience with data modeling and transformation using dbt (data build tool).
  • Solid understanding of data warehousing concepts, data modeling, and data architecture.
  • Knowledge of data governance, data quality, and data security best practices.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills, with the ability to work effectively in a team environment.
  • Experience with data catalogs and data lineage tools would be a plus.
(ref: hirist.tech)
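
Several of the requirements above center on data quality and governance. One common form this takes is a quality gate that runs before a batch is loaded; the sketch below shows the idea in plain Python (the rule, threshold, and record names are illustrative assumptions, not part of the posting):

```python
# A small data-quality gate of the kind a pipeline might run before loading.
# Rules and thresholds here are illustrative only.
def check_quality(rows, required_fields, max_null_ratio=0.1):
    """Return a list of human-readable violations for a batch of records."""
    violations = []
    if not rows:
        return ["batch is empty"]
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        ratio = nulls / len(rows)
        if ratio > max_null_ratio:
            violations.append(
                f"{field}: {ratio:.0%} null exceeds {max_null_ratio:.0%}"
            )
    return violations

batch = [
    {"patient_id": "p1", "heart_rate": 72},
    {"patient_id": "p2", "heart_rate": None},
    {"patient_id": "p3", "heart_rate": 110},
]
issues = check_quality(batch, ["patient_id", "heart_rate"])
```

A pipeline would typically fail or quarantine the batch when `issues` is non-empty; dedicated tools (dbt tests, Great Expectations) express the same checks declaratively.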
