Data Architect with a strong foundation in data strategy and architecture to support the Procurement Services – Data & Strategy team. The role holder will evaluate the current data landscape, recommend architectural improvements, and drive alignment between the data engineering and business analytics teams. This role is critical in shaping a scalable, efficient data model that empowers procurement analytics and reporting.
Key Responsibilities
- Perform a comprehensive assessment of the current procurement data architecture to identify gaps and inefficiencies.
- Design and propose enhancements to the data model that align with procurement objectives and reporting needs.
- Collaborate with data engineers to guide implementation of architectural changes and data pipeline improvements.
- Partner with BI developers and business stakeholders to understand data consumption needs and ensure effective delivery.
- Provide a strategic, end-to-end view of the procurement data ecosystem, ensuring alignment with long-term business goals.
- Serve as a bridge between technical and functional teams to ensure clarity, cohesion, and data-driven decision-making.
Technical Skills & Experience
- Hands-on experience with Google Cloud Platform (GCP), particularly GCP Data Lakes.
- Proficient in BigQuery for data querying and structuring.
- Familiarity with DBT (Data Build Tool) for modular SQL-based data transformations.
- Experience with Extract-Load-Transform (ELT) architectures and large-scale data pipelines.
- Strong understanding of data modeling concepts – including normalization, star / snowflake schemas, and dimensional modeling.
- Ability to assess and improve data quality, lineage, and governance within an enterprise environment.
- Familiarity with batch and streaming data processing patterns.
- Ability to design scalable, maintainable data pipelines aligned with data governance principles.
- Experience integrating diverse data sources such as Oracle Autonomous Database (ATP / ADW), Object Storage, file systems, REST APIs, and on-prem databases.
- Experience building robust data ingestion pipelines for structured and semi-structured data (JSON, CSV, XML, Parquet).
- Skilled in logical and physical data modelling.
- Strong knowledge of data normalization, schema design, and metadata management.
- SQL (advanced) and PL/SQL – for data manipulation, ETL logic, and performance tuning.
- Python – for building custom data transformation logic and orchestration scripts.
- Shell / Bash scripting – for automating data pipeline deployment and monitoring.
Soft Skills & Functional Expertise
- Strong communication and collaboration skills to engage both business and technical stakeholders.
- Strategic thinker capable of balancing technical depth with business context.
- Proven ability to translate business needs into scalable data architecture solutions.
Skills Required
BigQuery, PL/SQL, Python, SQL
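To illustrate the dimensional-modeling and semi-structured-ingestion skills listed above, here is a minimal, self-contained Python sketch. It splits raw JSON purchase-order events into a supplier dimension and a spend fact table, star-schema style. All field and table names are hypothetical examples for illustration, not a real procurement schema.

```python
import json

# Hypothetical raw purchase-order events, as semi-structured JSON lines.
raw_events = [
    json.dumps({"po_id": 1, "supplier": "Acme", "country": "US", "amount": 120.0}),
    json.dumps({"po_id": 2, "supplier": "Globex", "country": "DE", "amount": 75.5}),
    json.dumps({"po_id": 3, "supplier": "Acme", "country": "US", "amount": 40.0}),
]

def build_star_schema(events):
    """Normalize raw JSON events into one dimension table and one fact table."""
    supplier_keys = {}          # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for line in events:
        rec = json.loads(line)
        key = (rec["supplier"], rec["country"])
        if key not in supplier_keys:
            # Assign a surrogate key the first time a supplier appears.
            supplier_keys[key] = len(supplier_keys) + 1
            dim_rows.append({"supplier_sk": supplier_keys[key],
                             "supplier_name": rec["supplier"],
                             "country": rec["country"]})
        # Facts reference the dimension only through the surrogate key.
        fact_rows.append({"po_id": rec["po_id"],
                          "supplier_sk": supplier_keys[key],
                          "amount": rec["amount"]})
    return dim_rows, fact_rows

dims, facts = build_star_schema(raw_events)
print(len(dims), len(facts))  # 2 3 (two unique suppliers, three PO facts)
```

In practice this separation would be expressed as dbt models materialized in BigQuery rather than in-memory Python, but the shape of the transformation is the same.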