Senior Data Engineer
Location: Coimbatore
Experience: 3–5 Years
Work Mode: On-site
About the Role
We are seeking a Senior Data Engineer who will take end-to-end ownership of designing, building, and scaling data pipelines, integrations, and analytics layers for our Enterprise AI Operational Platform.
The role involves integrating data from construction-focused enterprise tools such as Sage 300, Paperless, WorkMax, Procore, Kojo, CRM platforms, and other financial, procurement, and field applications to create a unified operational data ecosystem.
You will architect scalable systems, build automated ETL/ELT workflows, standardize data governance, and enable real-time insights, predictive analytics, and automated operational intelligence across the organization.
This position suits someone who brings strong technical depth, system architecture capability, and hands-on experience building large-scale data systems using Microsoft Azure and/or Snowflake.
Key Responsibilities
Data Architecture & System Design
- Architect modular, scalable, and secure data platforms.
- Design centralized data models to support analytics, forecasting, AI, and automation workloads.
- Define data governance standards including lineage, quality, and retention.
- Produce HLD/LLD architecture diagrams, integration maps, and technical documentation.
- Build frameworks that support complex cross-system workflows.
Integrations Across Enterprise Tools (Sage 300, Paperless, WorkMax, Procore, Kojo, CRM, etc.)
Lead integrations across construction and operations software, including:
- Sage 300 (Financials, Job Costing, AP/AR, GL)
- Paperless (Document workflows, invoice automation)
- WorkMax (Timecards, units, production, field data)
- Procore (Project management, RFIs, submittals, schedules)
- Kojo (Procurement, materials management)
- CRM systems (Sales pipeline, customer records)
- Develop connectors using REST APIs, GraphQL, webhooks, secure file transfer, and database links.
- Standardize schemas across tools to create a single source of truth.
- Ensure secure and reliable data ingestion from on-prem and cloud systems.
- Implement robust incremental sync, batch loads, and real-time event-based ingestion.
ETL/ELT Development & Data Pipelines
- Build automated ETL/ELT workflows using Airflow, NiFi, Azure Data Factory, or custom frameworks.
- Ensure pipelines are fault-tolerant, versioned, and optimized for performance.
- Handle complex transformations, reconciliations, and multi-system data validation.
- Develop end-to-end data ingestion frameworks supporting large structured and unstructured datasets.
- Deploy pipelines into production with CI/CD practices.
Data Warehousing, Modeling & Real-Time Reporting
- Build enterprise-wide data warehouse layers using Snowflake (preferred), Azure SQL, or cloud data stores.
- Maintain real-time operational data models for job cost, purchasing, productivity, CRM, safety, compliance, and field activity.
- Create unified analytical datasets for dashboards and business intelligence.
- Support executive and departmental reporting by eliminating manual spreadsheets and fragmented data silos.
- Ensure high data accuracy, reliability, and performance at scale.
Predictive Analytics & AI Enablement
- Prepare AI-ready datasets for forecasting, anomaly detection, productivity modeling, and operational risk scoring.
- Collaborate with data scientists and AI engineers to build and deploy ML pipelines.
- Develop streaming and near-real-time data flows to support AI-driven decision automation.
- Integrate predictive insights back into operational systems (alerts, workflows, dashboards).
Automation & Operational Workflows
- Automate KPI tracking, threshold-based alerts, and data validation rules.
- Build automated workflows for RFIs, CORs, contracts, vendor documents, and field reporting.
- Implement monitoring frameworks for pipeline health, error handling, retry logic, and SLA compliance.
Documentation & Knowledge Sharing
- Create detailed documentation for data models, transformations, integrations, and pipelines.
- Conduct internal training and handover sessions for engineering, analytics, and operations teams.
- Maintain continuous improvement of data standards and reusable components.
Skills & Experience Required
Core Experience
- 3–5 years of hands-on Data Engineering / Data Platform development.
- Strong expertise in API-based integrations, especially with enterprise applications.
- Practical experience with:
  - Python, FastAPI, Node.js
  - ETL frameworks: Airflow / NiFi / Azure Data Factory
  - Snowflake (strongly preferred) or Azure-based data warehousing
  - SQL + cloud databases
  - Azure Functions, Storage, Data Lake environments
Data Architecture & Modeling
- Strong data modeling principles (Kimball, Data Vault, or custom patterns).
- Experience building scalable data lakes, warehouse schemas, and analytical data marts.
Tools & Platforms
- Dashboarding tools: Power BI / Tableau
- Experience working with large structured, semi-structured, and unstructured datasets.
- Background in financial or field operations workflows (preferred).
Preferred Knowledge
- Working with industry tools such as Sage 300, Procore, WorkMax, Kojo, and Paperless.
- Experience in building ML/AI data pipelines.
- Exposure to Azure DevOps, CI/CD, and cloud deployment best practices.
Key Competencies
- Strong system design and architecture thinking.
- High ownership, accountability, and delivery focus.
- Excellent analytical and problem-solving abilities.
- Clear communication and documentation skills.
- Ability to work with leadership, operations, finance, field teams, and engineering.
- Commitment to data accuracy, reliability, and operational performance.
What We Offer
- Opportunity to build a next-generation Enterprise AI Data Platform.
- Career growth into Data Platform Lead / Engineering Manager roles.
- Medical insurance and employee benefits.
- Supportive leadership, high visibility, and impactful responsibilities.
- Competitive compensation aligned with expertise and contribution.