Company Description
DataMoo AI, a Chennai-based start-up and the AI Unit of Matvik Solutions Pvt Ltd, specializes in developing innovative products in the field of Artificial Intelligence. The company collaborates with highly skilled doctorates and industry experts to deliver cutting-edge solutions in areas such as conversational AI, natural language processing, cloud data analytics, machine learning, and image and video analytics.
This role is primarily focused on designing data models, writing data transformation code, and operating pipelines in Azure Databricks, with regular stakeholder interaction across Investments, Investor Relations, Financial Planning & Analysis, and Accounting.
Role and Responsibilities
o Build and maintain end-to-end data pipelines in Azure Databricks:
§ Upstream data ingestion from Yardi, Salesforce, Argus, and other systems.
§ Transformation and enrichment logic using Delta Lake, SQL, and (where needed) Python.
§ Downstream data delivery into curated tables, data marts, and business intelligence layers.
o Implement job orchestration, scheduling, and monitoring to ensure reliable daily and monthly runs.
o Design and implement logical and physical data models for:
§ Investment performance, investor and partner views, and allocation/billing related to corporate financial planning and analysis.
o Create and maintain data marts and reporting-ready layers optimized for consumption by Domo and other tools.
o Apply dimensional modeling and best practices for keys, hierarchies, and slowly changing dimensions.
o Optimize queries, tables, and pipelines for performance and cost (cluster configuration, partitioning, indexing, caching where appropriate).
o Implement data quality checks, validations, and alerts; work with business analysts and business intelligence developers to resolve issues.
o Contribute to the design of the overall data architecture, including bronze/silver/gold (medallion) patterns, naming standards, and governance.
o Work closely with the Product Owner / Senior Business Analyst to align data models to business definitions and the glossary.
o Provide curated datasets and views to Business Intelligence Developers for dashboards and reports.
o Support system and user acceptance testing by investigating data issues and providing fixes and explanations.
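In production, the bronze/silver pipelines above would be written with PySpark and Delta Lake on Azure Databricks. Purely as an illustration of the medallion flow, here is a minimal stdlib-Python sketch; all field names, source tags, and cleaning rules are hypothetical:

```python
# Sketch of a bronze -> silver medallion flow. On Databricks this would
# be PySpark writing Delta tables; plain Python stands in here, and all
# field names and rules are illustrative assumptions.

def ingest_bronze(raw_rows):
    """Land raw source records as-is, tagging each with its source system."""
    return [dict(row, _source=row.get("_source", "yardi")) for row in raw_rows]

def transform_silver(bronze_rows):
    """Clean and standardize: drop rows missing a business key, cast amounts."""
    silver = []
    for row in bronze_rows:
        if not row.get("property_id"):      # reject records without a key
            continue
        silver.append({
            "property_id": row["property_id"],
            "amount": round(float(row["amount"]), 2),
            "source": row["_source"],
        })
    return silver

raw = [
    {"property_id": "P-100", "amount": "1250.50"},
    {"property_id": None, "amount": "99.0"},        # dropped in silver
]
curated = transform_silver(ingest_bronze(raw))
```

A gold layer would then aggregate these curated rows into reporting-ready marts for tools such as Domo.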
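On Databricks, slowly changing dimensions are usually maintained with a Delta `MERGE` statement. To illustrate just the Type 2 mechanics (close the superseded row, open a new current row), here is a hedged stdlib-Python sketch with hypothetical fields:

```python
# Sketch of SCD Type 2: when a tracked attribute changes, the current
# dimension row is closed (is_current=False, valid_to set) and a new
# current row is opened. Field names ("investor_id", "region") are
# illustrative only.

def apply_scd2(dimension, incoming, as_of):
    """Apply incoming records to a list of versioned dimension rows."""
    for new in incoming:
        current = next((r for r in dimension
                        if r["investor_id"] == new["investor_id"]
                        and r["is_current"]), None)
        if current and current["region"] == new["region"]:
            continue                      # no change, nothing to do
        if current:                       # close the superseded row
            current["is_current"] = False
            current["valid_to"] = as_of
        dimension.append({
            "investor_id": new["investor_id"],
            "region": new["region"],
            "valid_from": as_of,
            "valid_to": None,
            "is_current": True,
        })
    return dimension

dim = [{"investor_id": 1, "region": "APAC",
        "valid_from": "2023-01-01", "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"investor_id": 1, "region": "EMEA"}], "2024-06-01")
```

The history-preserving shape shown here is what lets investor and partner views be reconstructed as of any past date.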
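The data quality checks mentioned above are often expressed as small rule functions run over each batch, with any failure raising an alert. A minimal sketch, where the rule names, columns, and thresholds are all hypothetical:

```python
# Sketch of batch-level data quality checks: each rule returns a list of
# failure messages; in a real pipeline any failure would trigger an alert.
# Columns and thresholds are illustrative assumptions.

def check_not_null(rows, column):
    return [f"row {i}: {column} is null"
            for i, r in enumerate(rows) if r.get(column) is None]

def check_positive(rows, column):
    return [f"row {i}: {column} must be > 0"
            for i, r in enumerate(rows)
            if r.get(column) is not None and r[column] <= 0]

def run_checks(rows):
    failures = []
    failures += check_not_null(rows, "investor_id")
    failures += check_positive(rows, "commitment")
    return failures

batch = [
    {"investor_id": 1, "commitment": 500000},
    {"investor_id": None, "commitment": -10},
]
issues = run_checks(batch)
```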
Skills and Experience
o Strong data engineering skills :
§ Building ingestion, transformation, and delivery pipelines.
§ Writing and maintaining production-grade data transformation code.
o Strong data modeling skills (dimensional and relational).
o Strong SQL skills, including tuning and optimization.
o Ability to design scalable, maintainable data architectures that serve multiple reporting and analytics needs.
o Problem-solving mindset with focus on reliability, performance, and data quality.
o Hands-on experience with Azure Databricks, including:
§ Delta Lake tables, notebooks, jobs, and cluster management.
o Python experience is a plus but not required.
o Experience with Unity Catalog or similar tools for data governance and lineage.
o Experience with Databricks GenAI capabilities for building agents that extract and process information from structured and unstructured data.
o Use of Git or similar for source control and code management.
o Exposure to Domo or other business intelligence tools is helpful for understanding downstream needs.
o Typically 4–8+ years in data engineering or data modeling roles.
o Proven experience building and running production data pipelines, data marts, and data warehouses or data lakes.
o Prior work with financial, investment, or property data is strongly preferred.
o Background in real estate, private equity, asset management, or financial services is a strong advantage.
Drop your CV to reachus@datamoo.ai
Data Engineer • Bikaner, IN