Role : Associate Architect - Data
Experience Level : 6 to 9 Years
Work location : Mumbai, Bangalore, Trivandrum
Notice Period : 0-45 days
Role & Responsibilities :
- 5+ years of experience in building and managing Data Lakes, Data Warehouses, Data Integration, Data Migration, and Business Intelligence / Artificial Intelligence solutions on the Cloud (GCP / AWS / Azure).
- Ability to understand business requirements, translate them into functional and non-functional areas, and define non-functional boundaries in terms of Availability, Scalability, Performance, Security, Resilience, etc.
- Experience in architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets.
- Experience working in distributed computing and enterprise environments like Hadoop and GCP / AWS / Azure Cloud.
- Well-versed with various Data Integration and ETL technologies on the Cloud, like Spark, PySpark / Scala, Dataflow, DataProc, EMR, etc.
- Experience with traditional ETL tools like Informatica / DataStage / OWB / Talend, etc.
- Deep knowledge of one or more Cloud and On-Premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Exposure to any of the NoSQL databases like MongoDB, CouchDB, Cassandra, Graph DB, etc.
- Experience in architecting and designing scalable data warehouse solutions on the Cloud, on BigQuery or Redshift.
- Experience with one or more data integration, storage, and data pipeline toolsets like S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, DataProc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
- Preferred: experience with Machine Learning frameworks like TensorFlow, PyTorch, etc.
- Good understanding of Cloud solutions for IaaS, PaaS, SaaS, Containers, and Microservices Architecture and Design.
- Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud.
- Good understanding of BI Reporting and Dashboarding and one or more associated toolsets like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Understanding of Security features and Policies in one or more Cloud environments like GCP / AWS / Azure.
- Experience working on business transformation projects for the movement of On-Premise data solutions to Clouds like GCP / AWS / Azure.
Role :
- Lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehousing, and business intelligence.
- Interface with multiple stakeholders within IT and business to understand the data requirements.
- Take complete responsibility for the successful delivery of all allocated projects on the parameters of Schedule, Quality, and Customer Satisfaction.
- Be responsible for the design and development of distributed, high-volume, multi-threaded batch, real-time, and event processing systems.
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
- Work with the Pre-Sales team on RFPs and RFIs, helping them create data solutions.
- Mentor young talent within the team, and define and track their growth parameters.
- Contribute to building Assets and Accelerators.