Dear Candidates,
Greetings from TCS!
TCS is a leading global IT services, consulting, and business solutions organization that delivers real results to global businesses, ensuring a level of certainty no other firm can match.
TCS is hiring for a GCP Data & Analytics Architect; please find the JD below.
Experience range – 8 to 20 years
Location – Delhi, India
Required Skills -
GCP Data Engineering (BigQuery, Dataproc, Dataflow, Cloud Composer), ETL / Data Pipeline design, Data Lake / Data Warehouse, Hadoop Ecosystem
Role & Responsibilities –
Design and implement ETL / data pipelines and Data Lake / Data Warehouse solutions on Google Cloud.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using BigQuery, Dataproc, Dataprep, Cloud Composer, and Dataflow services / technologies.
10+ years of total experience, with at least 3 years of expertise in cloud data warehouse technologies on the Google Cloud data platform, covering BigQuery, Dataproc, Dataprep, Cloud Composer, Dataflow, Databricks, etc.
Extensive hands-on experience implementing data ingestion and data processing using Google Cloud services and related technologies: Dataproc, Dataprep, Cloud Bigtable, Dataflow, Cloud Composer, BigQuery, Databricks, Kafka, NiFi, CDC processing, Snowflake, Datastore, Firestore, Docker, App Engine, Spark, Cloud Data Fusion, Apigee API Management, etc.
Familiarity with the technology stack available in the industry for data management, ingestion, capture, processing, and curation: Kafka, Attunity, GoldenGate, Dataplex, MapReduce, Hadoop, Hive, HBase, Cassandra, PySpark, Flume, Impala, etc.
Design and implement analytics solutions that utilize the ETL / data pipelines to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
Create data tools for analytics and data science team members that assist them in building and optimizing products.
At least one end-to-end Google Cloud data platform implementation is a must, covering all aspects including architecture, design, data engineering, data visualization, and data governance (specifically data quality and lineage).
Significant experience with data migrations and with the development of Operational Data Stores, Enterprise Data Warehouses, Data Lakes, and Data Marts.
Good hands-on knowledge of BigQuery and the data warehousing life cycle is an absolute requirement.
Excellent communication skills to liaise with Business & IT stakeholders.
Expertise in planning project execution and effort estimation.
Understanding of Data Vault, data mesh, and data fabric architecture patterns.
Exposure to Agile ways of working.
Senior Data Architect • Delhi, India