Role - Data Architect
Location - Pan India
Years of Experience - 10 to 15 years
Skills Required - Databricks, AWS / Azure / GCP Cloud, Data Architecture
Job Description:
- 10-15 years of total experience, including at least 3 years of expertise in cloud data warehouse technologies
- Azure data platform covering ADF, ADLS, Synapse, Microsoft Fabric, Databricks, etc.
- AWS data platform covering Glue, EMR, Redshift, Databricks, etc. At least one end-to-end AWS data platform implementation is a must, covering all aspects including architecture, design, data engineering, data visualization, and data governance (specifically data quality and lineage).
- GCP data platform covering Cloud Storage, Dataflow, Dataproc, BigQuery, etc.
- Significant experience with data migrations and the development of Operational Data Stores, Enterprise Data Warehouses, Data Lakes, and Data Marts.
- Good hands-on knowledge of SQL and the Data Warehousing life cycle is an absolute requirement.
- Experience with cloud ETL and ELT in tools such as dbt, ADF, or Matillion (or any other ELT tool), and exposure to the Big Data (Hadoop) ecosystem.
- Expertise with at least one traditional data warehouse solution: Oracle, Teradata, or Microsoft SQL Server.