As a Technical Specialist - Azure Databricks, you will play a crucial role in designing, building, and maintaining scalable data pipelines and infrastructure. You will work with a variety of technologies, including Scala, Python, Spark, Amazon Web Services (AWS), and SQL, to support our data processing and analytics needs.
Responsibilities:
- Develop robust data platforms and contribute to the organization's data-driven growth.
- Utilize expertise in Apache Airflow for orchestrating and automating complex data workflows.
- Support other team members in developing solutions.
- Design solutions on Databricks, including Delta Lake / Delta tables, data warehousing, and more.
- Apply best practices while developing the solutions.
- Work towards creating reusable components.
- Ensure that pipelines are designed to keep operating costs low.
Educational Qualifications:
Engineering Degree – BE / ME / BTech / MTech / BSc / MSc. Technical certification in multiple technologies is desirable.

Skills:

Mandatory Technical Skills:
- Hands-on experience in designing, developing, and optimizing scalable, large-volume pipelines to support the processing of structured and semi-structured data.
- In-depth knowledge of SQL, PySpark, Databricks, Delta Lake, Delta tables, Azure Data Factory, Azure services, Python, DevOps and Git, and relational databases (Oracle, SQL Server, and more).
- Understanding of designing and implementing data solutions involving real-time streaming technologies, such as Apache Kafka, AWS MSK, Kinesis, and Azure Event Hubs, ensuring seamless integration of streaming data into processing pipelines.
- Exposure to CI/CD pipelines for Azure and AWS resources.
- Excellent communication skills, with the ability to effectively convey complex analytical findings.
- Understanding of FinOps / observability patterns and data governance best practices.
- AWS / GCP / Azure.

Skills Required:
Cloud Architecture, Data Engineering, Spark SQL, Azure Databricks
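The streaming-integration skill listed above (Kafka feeding processing pipelines) can be sketched with Spark Structured Streaming landing a topic into a Delta table. This is a minimal sketch, not a reference implementation: it assumes `pyspark`, `delta-spark`, and the `spark-sql-kafka-0-10` connector are available and a Kafka broker is reachable; the broker address, topic, and paths are hypothetical placeholders.

```python
# Sketch: land a Kafka topic into a Delta table via Spark Structured Streaming.
# Assumes pyspark + delta-spark + the spark-sql-kafka-0-10 connector, and a
# reachable Kafka broker; broker, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder.appName("kafka-to-delta-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read the raw Kafka records and cast the binary key/value to strings.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
    .option("subscribe", "events")                       # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"),
            col("value").cast("string"),
            col("timestamp"))
)

# Append into a Delta table; the checkpoint gives exactly-once sink semantics.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/chk/events")     # placeholder path
    .outputMode("append")
    .start("/tmp/delta/events")                          # placeholder path
)
query.awaitTermination()
```

The checkpoint location is what lets the stream restart after failure without duplicating records in the Delta sink, which matters for the low-operating-cost pipeline goal above.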