ETL Architect

Bengaluru / Bangalore, India

BETSOL is a cloud-first digital transformation and data management company offering products and IT services to enterprises in over 40 countries. The BETSOL team holds several engineering patents, has been recognized with industry awards, and maintains a net promoter score that is twice the industry average.

BETSOL's open source backup and recovery product line, Zmanda (Zmanda.com), delivers up to 50% savings in total cost of ownership (TCO) and best-in-class performance.

BETSOL Global IT Services (BETSOL.com) builds and supports end-to-end enterprise solutions, reducing time-to-market for its customers.

BETSOL offices are set against the vibrant backdrops of Broomfield, Colorado and Bangalore, India.

We take pride in being an employee-centric organization, offering comprehensive health insurance, competitive salaries, 401K, volunteer programs, and scholarship opportunities. Office amenities include a fitness center, cafe, and recreational facilities.

Learn more at betsol.com

Job Description

Position Overview

We are seeking an experienced ETL Architect to design, implement, and maintain robust, configuration-driven ETL frameworks on SQL Server with SSIS and Azure Data Factory. This hands-on technical leadership role requires a seasoned professional who can architect scalable, reusable ETL solutions, establish best practices, and ensure high-performance data integration across enterprise systems. You'll be responsible for developing ETL strategy, creating templatized frameworks, and actively participating in development and troubleshooting while ensuring data integrity and operational excellence.

Key Responsibilities

ETL Architecture & Framework Design:

  • Design and implement configuration-driven, metadata-based ETL frameworks on SQL Server with SSIS and Azure Data Factory
  • Develop reusable, templatized ETL patterns and components to minimize custom development
  • Architect scalable solutions that support multiple data sources and targets
  • Create framework architectures that enable parameter-driven, dynamic ETL execution (a minimal sketch follows this list)
  • Establish design patterns for batch processing, incremental loads, and real-time data integration
  • Design modular, maintainable code structures that promote reusability across projects
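
For illustration only (not part of the role's requirements), a configuration-driven framework of this kind is often backed by a metadata table that a master SSIS package or an ADF Lookup activity reads at run time to decide what to load and how; the table and column names below are hypothetical.

    -- Hypothetical metadata table driving a reusable, parameter-driven load pattern
    CREATE TABLE etl.PackageConfig (
        ConfigId        INT IDENTITY(1,1) PRIMARY KEY,
        SourceSystem    NVARCHAR(100) NOT NULL,   -- e.g. 'CRM', 'ERP'
        SourceObject    NVARCHAR(256) NOT NULL,   -- table, view, or file to extract
        TargetTable     NVARCHAR(256) NOT NULL,   -- destination in the warehouse
        LoadType        NVARCHAR(20)  NOT NULL,   -- 'FULL' or 'INCREMENTAL'
        WatermarkColumn NVARCHAR(128) NULL,       -- used when LoadType = 'INCREMENTAL'
        IsEnabled       BIT           NOT NULL DEFAULT 1
    );

    -- A master package or pipeline reads the enabled rows and passes each one
    -- as parameters to a single templatized child package or pipeline.
    SELECT SourceSystem, SourceObject, TargetTable, LoadType, WatermarkColumn
    FROM etl.PackageConfig
    WHERE IsEnabled = 1;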

ETL Strategy & Standards:

  • Define and implement enterprise ETL strategy and roadmap
  • Establish coding standards, naming conventions, and development best practices
  • Create standardized logging frameworks with comprehensive audit trails and execution metrics
  • Design robust error handling mechanisms with alerting and notification capabilities
  • Implement restart / rerun capabilities to resume processing from the point of failure (see the sketch after this list)
  • Develop data validation and reconciliation frameworks to ensure data integrity
  • Define SLAs and performance benchmarks for ETL processes
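
Purely to illustrate the logging and restart points above (the table and status values are hypothetical, not an existing framework), an execution log can capture per-step audit detail, and a rerun can skip the steps that already succeeded for a given batch.

    -- Hypothetical execution log supporting audit trails, metrics, and restartability
    CREATE TABLE etl.ExecutionLog (
        LogId       BIGINT IDENTITY(1,1) PRIMARY KEY,
        BatchId     UNIQUEIDENTIFIER NOT NULL,
        StepName    NVARCHAR(200)    NOT NULL,
        Status      NVARCHAR(20)     NOT NULL,   -- 'RUNNING', 'SUCCEEDED', 'FAILED'
        RowsRead    BIGINT           NULL,
        RowsWritten BIGINT           NULL,
        StartedAt   DATETIME2        NOT NULL DEFAULT SYSUTCDATETIME(),
        EndedAt     DATETIME2        NULL,
        ErrorText   NVARCHAR(MAX)    NULL
    );

    -- On rerun, resume from the point of failure: assuming every step writes a
    -- 'RUNNING' row when it starts, only steps that never succeeded are replayed.
    DECLARE @BatchId UNIQUEIDENTIFIER = NEWID();  -- in practice, the batch being rerun
    SELECT StepName
    FROM etl.ExecutionLog
    WHERE BatchId = @BatchId
      AND Status <> 'SUCCEEDED';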

Data Integrity & Quality Assurance:

  • Design and implement data quality checks and validation rules within ETL pipelines
  • Ensure data accuracy, completeness, and consistency for downstream systems
  • Establish data lineage and impact analysis capabilities
  • Implement reconciliation processes between source and target systems (a sample check follows this list)
  • Create exception handling frameworks for data anomalies and business rule violations
  • Monitor data quality metrics and implement corrective measures
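
As a simple illustration of the reconciliation idea (table and column names here are hypothetical), comparing row counts and a control total between the staged source data and the warehouse target flags loads that do not tie out.

    -- Hypothetical source-to-target reconciliation on row count and an amount total
    WITH src AS (
        SELECT COUNT_BIG(*) AS RowCnt, SUM(Amount) AS TotalAmount
        FROM stg.SalesOrder            -- staged copy of the source extract
    ),
    tgt AS (
        SELECT COUNT_BIG(*) AS RowCnt, SUM(Amount) AS TotalAmount
        FROM dw.FactSalesOrder         -- warehouse target for the same load
    )
    SELECT src.RowCnt      AS SourceRows,
           tgt.RowCnt      AS TargetRows,
           src.TotalAmount AS SourceAmount,
           tgt.TotalAmount AS TargetAmount,
           CASE WHEN src.RowCnt = tgt.RowCnt AND src.TotalAmount = tgt.TotalAmount
                THEN 'MATCH' ELSE 'MISMATCH' END AS ReconStatus
    FROM src CROSS JOIN tgt;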

Performance Optimization:

  • Architect high-performance ETL solutions optimized for large-volume data processing
  • Analyze and tune SQL queries, SSIS packages, and ADF pipelines for optimal performance
  • Implement parallel processing strategies and optimize resource utilization
  • Design efficient indexing, partitioning, and data distribution strategies (see the partitioning sketch after this list)
  • Monitor ETL execution metrics and proactively address performance bottlenecks
  • Optimize data movement patterns and minimize data transformation overhead
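
To make the partitioning point concrete (a generic sketch, not a prescribed design), monthly range partitioning on the load date lets very large fact tables be loaded, queried, and aged out one partition at a time instead of row by row.

    -- Hypothetical monthly partitioning for a large fact table
    CREATE PARTITION FUNCTION pf_MonthlyDate (DATE)
    AS RANGE RIGHT FOR VALUES ('2024-01-01', '2024-02-01', '2024-03-01');

    CREATE PARTITION SCHEME ps_MonthlyDate
    AS PARTITION pf_MonthlyDate ALL TO ([PRIMARY]);

    CREATE TABLE dw.FactSales (
        SaleDate   DATE          NOT NULL,
        CustomerId INT           NOT NULL,
        Amount     DECIMAL(18,2) NOT NULL
    ) ON ps_MonthlyDate (SaleDate);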

Dependency Management & Orchestration:

  • Identify, document, and resolve cross-pipeline and cross-system dependencies
  • Design workflow orchestration and job scheduling strategies
  • Implement dependency management frameworks for complex ETL workflows
  • Coordinate data loads across multiple systems ensuring proper sequencing
  • Establish monitoring and alerting for dependency failures and SLA breaches

Hands-On Development & Troubleshooting:

  • Actively develop complex ETL packages, pipelines, and stored procedures
  • Troubleshoot production issues and implement timely resolutions
  • Perform root cause analysis for ETL failures and data discrepancies
  • Develop data fixes and correction scripts for production issues
  • Conduct code reviews and provide technical guidance to the development team
  • Create proof-of-concepts for new technologies and design patterns

Technical Leadership & Collaboration:

  • Mentor and guide ETL developers and data engineers
  • Collaborate with data architects, DBAs, and business analysts
  • Participate in architecture review boards and technical design sessions
  • Document technical specifications, framework architecture, and operational procedures
  • Provide technical estimates and effort sizing for ETL initiatives
  • Evaluate and recommend ETL tools, technologies, and best practices

Required Qualifications

Experience:

  • 8+ years of experience in ETL development and data integration
  • 4+ years in ETL architecture or technical leadership roles
  • Proven track record designing and implementing configuration-driven ETL frameworks
  • Extensive hands-on experience with SQL Server, SSIS, and Azure Data Factory
  • Strong background in enterprise-scale data warehouse and ETL implementations

Technical Expertise:

  • Expert-level proficiency in SQL Server and T-SQL (stored procedures, functions, complex queries)
  • Deep expertise in SSIS (SQL Server Integration Services), including:
      • Package development, configurations, and deployment
      • Control flow and data flow components
      • Custom components and script tasks
      • Package execution, logging, and error handling
      • SSIS Catalog (SSISDB) and project deployment model
  • Strong experience with Azure Data Factory, including:
      • Pipeline development and orchestration
      • Mapping data flows and data flow activities
      • Parameterization and dynamic pipelines
      • Triggers, linked services, and integration runtimes
      • Monitoring and alerting
  • Advanced SQL performance tuning and optimization skills
  • Experience with metadata-driven and configuration-based ETL frameworks
  • Strong understanding of data warehousing concepts (dimensional modeling, star / snowflake schemas, SCDs)
  • Proficiency in version control systems (Git, Azure DevOps)
  • Knowledge of PowerShell or Python scripting for automation

Architecture & Design:

  • Proven ability to design scalable, enterprise-grade ETL architectures
  • Strong understanding of ETL design patterns and best practices
  • Experience with incremental loading strategies such as CDC, watermarking, and delta detection (a watermark-based sketch follows this list)
  • Knowledge of data integration patterns (batch, micro-batch, streaming)
  • Understanding of data governance, security, and compliance requirements
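
One way to picture watermark-based delta detection (the table and column names are hypothetical): the timestamp of the last successful load is stored per table, only newer rows are extracted on the next run, and the watermark is advanced after the load commits.

    -- Hypothetical high-watermark incremental extract
    DECLARE @LastWatermark DATETIME2 =
        (SELECT LastLoadedAt FROM etl.Watermark WHERE TableName = 'dbo.Customer');

    SELECT c.CustomerId, c.Name, c.ModifiedAt
    FROM dbo.Customer AS c
    WHERE c.ModifiedAt > @LastWatermark;   -- only rows changed since the last run

    -- After a successful load, advance the watermark to the newest value processed
    UPDATE etl.Watermark
    SET LastLoadedAt = (SELECT MAX(ModifiedAt) FROM dbo.Customer)
    WHERE TableName = 'dbo.Customer';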

Core Competencies:

  • Exceptional problem-solving and analytical skills
  • Strong troubleshooting abilities with a systematic approach to issue resolution
  • Excellent debugging skills for complex data integration scenarios
  • Ability to quickly diagnose and resolve production issues
  • Detail-oriented with focus on data accuracy and integrity
  • Self-motivated with the ability to work independently
  • Strong communication skills for technical and non-technical audiences

Technical Environment

  • Platforms: SQL Server 2016+, Azure Data Factory, Azure Synapse Analytics
  • Tools: SSIS, SSMS, Visual Studio, Azure DevOps
  • Languages: T-SQL, PowerShell, Python (preferred)
  • Version Control: Git, Azure Repos
  • Monitoring: Azure Monitor, SQL Server Agent, custom logging frameworks

Key Success Metrics

  • ETL framework reusability and adoption rate
  • Reduction in development time through templatization
  • ETL process reliability and success rate
  • Data quality and integrity metrics
  • Performance improvement of ETL pipelines
  • Time-to-resolution for production issues
  • Reduction in manual interventions and data fixes
  • Team productivity and code quality improvements

Qualifications

  • BE / B.Tech in Information Technology, Computer Science, or a related field
  • Microsoft certifications (MCSA: SQL Server, Azure Data Engineer Associate)

Additional Information

This position is expected to work EST hours.

Skills Required

Git, T-SQL, Azure Data Factory, PowerShell, SQL Server, SSIS, Python, Azure DevOps
