Role: Snowflake Data Architect
Years of Experience: 14+ years
Location: Pan India
Required Skills:
10-15 years of total experience, with at least 3 years of expertise in cloud data warehouse technologies on Snowflake with AWS, Azure, or GCP.
At least one end-to-end Snowflake implementation is a must, covering all aspects including architecture, design, data engineering, data visualization, and data governance (specifically data quality and lineage).
Significant experience with data migrations and with the design and development of Operational Data Stores, Enterprise Data Warehouses, and Data Marts.
Good hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement.
Experience with cloud ETL and ELT in at least one tool such as dbt, Glue, ADF, or Matillion (or any other ELT tool), and exposure to the big data ecosystem (Hadoop).
Expertise in at least one traditional data warehouse solution such as Oracle, Teradata, or Microsoft SQL Server.
Excellent communication skills to liaise with Business & IT stakeholders.
Expertise in project execution planning and effort estimation.
Understanding of Data Vault, data mesh and data fabric architecture patterns.
Exposure to Agile ways of working.
Good-to-have skills:
Experience with cloud services such as S3 / Blob / GCS, Lambda, Glue / ADF, and Apache Airflow.
Experience coding in Python and PySpark would be an added advantage.
Experience with DevOps, CI/CD, and GitHub is a big plus.
Experience in or understanding of the Banking and Financial Services business domain.