Lead data modelling discussions and stakeholder management, and manage two junior data modellers.
Requirements:
GCP experience is a must.
Extensive experience modelling, designing, optimizing, and maintaining structured and semi-structured data models in any cloud data warehouse (e.g. BigQuery, Amazon Redshift, or Microsoft Azure Synapse Analytics) for analytical and operational use cases.
Knowledge of cloud storage and processing optimizations (e.g. in BigQuery, Amazon Redshift, or Microsoft Azure Synapse), ensuring efficient data structuring for cost-effective querying and scaling.
Strong understanding of normalized and denormalized data modelling techniques, including Star and Snowflake schemas, to support efficient querying and reporting (see the star-schema sketch after this list).
Proficiency in writing efficient SQL queries, using techniques such as partitioning, clustering, and materialized views to enhance performance and cost-effectiveness (see the optimization sketch after this list).
Ability to determine the right balance between normalization and denormalization to optimize query performance while maintaining data integrity.
Expertise in defining and evolving schemas over time to accommodate business needs while minimizing disruptions and ensuring backward compatibility.
Experience implementing data validation, consistency checks, and schema enforcement to maintain data accuracy, completeness, and reliability.
Strong emphasis on data documentation, including entity-relationship diagrams (ERDs), data dictionaries, and lineage tracking to improve data discoverability and usability.
Knowledge of BigQuery's storage and processing optimizations, ensuring efficient data structuring for cost-effective querying and scaling.
Ability to work closely with data engineers, analysts, and business teams to align data models with business needs, ensuring best practices in data architecture.
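
Purely as an illustration of the dimensional-modelling work described above (not part of the role itself), the sketch below creates a minimal star schema through the google-cloud-bigquery Python client. All names (the sales_dw dataset, dim_customer, fact_sales, and their columns) are hypothetical and assume application-default GCP credentials are configured.

```python
# Minimal star-schema sketch: one dimension table and one fact table.
# Dataset and table names (sales_dw, dim_customer, fact_sales) are illustrative only.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

ddl = """
CREATE SCHEMA IF NOT EXISTS sales_dw;

-- Dimension table: descriptive attributes, denormalized for query convenience.
CREATE TABLE IF NOT EXISTS sales_dw.dim_customer (
  customer_key INT64 NOT NULL,
  customer_name STRING,
  region STRING,
  segment STRING
);

-- Fact table: narrow rows of numeric measures keyed to the dimensions.
CREATE TABLE IF NOT EXISTS sales_dw.fact_sales (
  sale_id STRING NOT NULL,
  customer_key INT64 NOT NULL,   -- joins to dim_customer.customer_key
  sale_date DATE NOT NULL,
  quantity INT64,
  revenue NUMERIC
);
"""

# BigQuery multi-statement scripts allow several DDL statements in one job.
client.query(ddl).result()
```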
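Likewise, a hedged sketch of the partitioning, clustering, and materialized-view techniques mentioned above, building on the hypothetical fact table from the previous sketch; fact_sales_optimized and daily_revenue_mv are illustrative names, not prescribed designs.

```python
# Sketch: partition the fact table by date, cluster by customer, and
# pre-aggregate with a materialized view. Names are illustrative only.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
-- Partitioning by sale_date limits the bytes scanned by date-filtered queries;
-- clustering by customer_key co-locates rows that are frequently filtered together.
CREATE TABLE IF NOT EXISTS sales_dw.fact_sales_optimized
PARTITION BY sale_date
CLUSTER BY customer_key
AS
SELECT * FROM sales_dw.fact_sales;

-- Materialized view: BigQuery incrementally maintains the aggregate,
-- so reports read a small precomputed result instead of the full fact table.
CREATE MATERIALIZED VIEW IF NOT EXISTS sales_dw.daily_revenue_mv AS
SELECT
  sale_date,
  customer_key,
  SUM(revenue) AS total_revenue,
  COUNT(*) AS order_count
FROM sales_dw.fact_sales_optimized
GROUP BY sale_date, customer_key;
"""

client.query(ddl).result()
```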