The Enterprise Data Architect / Data Modeler is a senior technical leader responsible for defining, designing and governing the organization’s enterprise data architecture. You will ensure that data is modeled, structured and integrated across the organization to support analytics and AI/ML initiatives at scale. You will work closely with business stakeholders and technology teams to translate data requirements into robust logical and physical data models, and establish enterprise-wide data standards that ensure consistency, quality and security across the group.
Roles & Responsibilities
Design and implement enterprise data models (conceptual, logical and physical) to support analytical, transactional and AI/ML workloads (see the modeling sketch after this list).
Define and maintain data architecture standards, modeling conventions, integration patterns and best practices across the group.
Lead the design and optimization of data warehouse and data lake architectures.
Model and manage structured, semi-structured and unstructured data using relational and NoSQL databases.
Develop data integration strategies (ETL/ELT) and partner with data engineering teams to build reliable pipelines.
Optimize database design for performance, partitioning, indexing, and query efficiency.
Partner with data governance, security and compliance teams to implement data security, privacy, lineage, cataloging and access control frameworks.
Collaborate with data scientists, BI developers and data engineers to ensure the architecture enables advanced analytics, AI/ML pipelines and self-service BI.
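To give a concrete flavor of the modeling and indexing work described above, here is a minimal star-schema sketch in Python using SQLAlchemy. The table names, columns, indexes and the in-memory SQLite engine are illustrative assumptions for this sketch, not part of the role's actual data model.

```python
# Minimal star-schema sketch (assumed example tables, not a prescribed model).
from sqlalchemy import (
    create_engine, Column, Integer, String, Date, Numeric, ForeignKey, Index,
)
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class DimCustomer(Base):
    """Dimension table holding descriptive customer attributes."""
    __tablename__ = "dim_customer"
    customer_key = Column(Integer, primary_key=True)
    customer_name = Column(String(100), nullable=False)
    segment = Column(String(50))


class FactSales(Base):
    """Fact table holding measures at the order-line grain."""
    __tablename__ = "fact_sales"
    sales_key = Column(Integer, primary_key=True)
    customer_key = Column(Integer, ForeignKey("dim_customer.customer_key"), nullable=False)
    order_date = Column(Date, nullable=False)
    amount = Column(Numeric(12, 2), nullable=False)
    # Indexes on common filter/join columns to keep analytical queries efficient.
    __table_args__ = (
        Index("ix_fact_sales_order_date", "order_date"),
        Index("ix_fact_sales_customer_key", "customer_key"),
    )


if __name__ == "__main__":
    # In-memory SQLite is used only so the sketch runs anywhere;
    # a real deployment would target PostgreSQL, Snowflake, etc.
    engine = create_engine("sqlite:///:memory:")
    Base.metadata.create_all(engine)
```

The fact/dimension split and the explicit indexes stand in for the broader physical modeling and performance-tuning responsibilities listed above.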
Qualifications, Experience, and Skills
QUALIFICATIONS
Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or related field.
Master’s degree and certifications in cloud data architecture (AWS, Azure, GCP) or enterprise architecture frameworks (TOGAF, DAMA-DMBOK) are a plus.
EXPERIENCE
7+ years of progressive experience in data architecture, data engineering or database design.
Proven experience designing enterprise-scale data models and leading architecture design.
Hands-on expertise with relational databases (PostgreSQL, MySQL, SQL Server, Oracle) and NoSQL technologies (MongoDB, Cassandra, DynamoDB, Cosmos DB).
Exposure to graph databases (Neo4j, Amazon Neptune) is a plus.
Experience with cloud data platforms such as Snowflake, BigQuery, Redshift or Azure Synapse.
Experience with ETL/ELT tools such as Informatica, Talend, dbt or Apache NiFi.
Experience designing real-time streaming pipelines with Kafka, Spark Streaming or Flink (see the streaming sketch after this list).
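As a rough illustration of the streaming experience mentioned above, the sketch below reads JSON order events from Kafka with PySpark Structured Streaming. The broker address, topic name and schema are assumptions for the example, and the spark-sql-kafka connector package must be available on the Spark classpath (e.g. via --packages).

```python
# Minimal Kafka-to-console streaming sketch (assumed broker, topic and schema).
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Assumed event schema for the example.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker address
       .option("subscribe", "orders")                         # assumed topic name
       .load())

# Kafka delivers bytes; cast the value and parse the JSON payload.
orders = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("o"))
          .select("o.*"))

query = (orders.writeStream
         .format("console")      # swap for a lake/warehouse sink in practice
         .outputMode("append")
         .option("truncate", "false")
         .start())
query.awaitTermination()
```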
TECHNICAL AND INTERPERSONAL SKILLS
Expert in data modeling tools (Erwin, ER/Studio, SQL Power Architect).
Advanced SQL skills for schema design, optimization and performance tuning.
Proficiency in Python or Scala for scripting and pipeline prototyping.
Knowledge of orchestration frameworks (Apache Airflow, Luigi, Prefect); see the DAG sketch after this list.
Familiarity with containerization (Docker, Kubernetes) and DevOps & CI/CD practices.
Understanding of data governance, security and compliance frameworks.
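For the orchestration-framework knowledge listed above, here is a minimal Apache Airflow DAG sketch. The DAG id, schedule and the placeholder task callables are assumptions for illustration rather than a real pipeline.

```python
# Minimal Airflow ELT DAG sketch (placeholder tasks, assumed schedule).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extract: pull source data")       # placeholder step


def transform():
    print("transform: apply modeling rules")  # placeholder step


def load():
    print("load: write to the warehouse")     # placeholder step


with DAG(
    dag_id="daily_elt",              # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```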
Enterprise Architect • Vapi, Gujarat, India