Data Architect

R Systems · Delhi, Delhi, IN
26 days ago
Job description

Job Title: Data Engineering Architect

Experience: 10-16 Years

Location: Pune & Noida

Work Mode: Hybrid - Full time

Key Responsibilities

  • Data Migration & Modernization
  • Lead the migration of data pipelines, models, and workloads to Redshift.
  • Design and implement landing, staging, and curated data zones to support scalable ingestion and consumption patterns.
  • Evaluate and recommend tools and frameworks for migration, including file formats, ingestion tools, and orchestration.
  • Design and build robust ETL / ELT pipelines using Python, SQL, and orchestration tools.
  • Support both batch and streaming pipelines, with real-time processing via RudderStack or Spark Structured Streaming.
  • Build modular, reusable, and testable pipeline components that handle high volume and ensure data integrity.
  • Define and implement data modeling strategies (star, snowflake, normalization / denormalization) for analytics and BI layers.
  • Implement strategies for data versioning, late-arriving data, and slowly changing dimensions.
  • Implement automated data validation and anomaly detection (using tools like dbt tests, Great Expectations, or custom checks).
  • Build logging and alerting into pipelines to monitor SLA adherence, data freshness, and pipeline health.
  • Contribute to data governance initiatives including metadata tracking, data lineage, and access control.

Required Skills & Experience

  • 10+ years in data engineering roles with increasing responsibility.
  • Proven experience leading data migration or re-platforming projects.
  • Strong command of Python and SQL for data pipeline development.
  • Experience working with dbt models.
  • Hands-on experience with modern data platforms such as PostgreSQL and Redshift.
  • Proficient in building streaming pipelines with tools such as Kafka and RudderStack.
  • Deep understanding of data modeling, partitioning, indexing, and query optimization.
  • Expertise with Apache Airflow for ETL orchestration.
  • Comfortable working with large datasets, resolving performance bottlenecks, and optimizing table structures.
  • Experience in designing data validation frameworks and implementing data quality (DQ) rules.
  • Strong understanding of GitHub and code migration techniques.
  • Familiarity with reporting tools such as Tableau and Power BI.
  • Knowledge of the financial domain, preferably loans.
