Job Summary:
A PySpark Developer is responsible for designing, developing, and optimizing large-scale data processing applications and pipelines using Apache Spark and Python. The role involves using PySpark to ingest, transform, and analyze large datasets in distributed computing environments, often integrating with other big data technologies and cloud platforms.
Qualification:
PySpark Developer • Panchkula, Haryana, IN