About Client:
Our client is a French multinational information technology (IT) services and consulting company, headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations.
The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organizations accelerate their transition to a digital and sustainable world.
Their offerings span consulting, technology, professional, and outsourcing services.
Job Details:
Location: Pan India
Mode of Work: Hybrid
Notice Period: Immediate joiners
Experience: 6-8 years
Type of Hire: Contract to Hire
Job Description:
4+ years of relevant IT experience in the BI / DW domain, with a minimum of 2 years of hands-on experience on the Azure modern data platform, including Data Factory, Databricks, Synapse (Azure SQL DW), and Azure Data Lake
Meaningful experience in data analysis and transformation using Python / R / Scala on Azure Databricks or Apache Spark
Well versed in NoSQL data store concepts
Good knowledge of distributed processing using Databricks (preferred) or Apache Spark
Ability to debug using tools such as the Ganglia UI, with expertise in optimizing Spark jobs
Ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets
Expertise in creating data structures optimized for storage and various query patterns, e.g. Parquet and Delta Lake
Meaningful experience in at least one database technology from each of the following segments:
Traditional RDBMS (MS SQL Server, Oracle)
MPP (Teradata, Netezza)
NoSQL (MongoDB, Cassandra, Neo4J, CosmosDB, Gremlin)
Understanding of Information Security principles to ensure compliant handling and management of data
Experience with traditional data warehousing / ETL tools (Informatica, IBM DataStage, Microsoft SSIS)
Effective communication skills
Proficient at working with large and complex codebases (GitHub, Gitflow, fork / pull model)
Working experience in Agile methodologies (SCRUM, XP, Kanban)
Data Modelling - One to Three Years
Developer / Software Engineer - One to Three Years