Python Developer - Apache Spark

Anlage Infotech (India) Pvt Ltd, Bangalore
Job description

We Are Hiring: Python Developer

About the Role:

We're looking for a highly skilled and experienced Python Developer to join our team in Hyderabad or Bangalore.

You'll play a key role in designing, developing, and maintaining scalable data pipelines and high-performance applications. This is an exciting opportunity for someone with a strong background in big data technologies, cloud platforms, and modern API development to make a significant impact.

Work Location: Hyderabad / Bangalore (Hybrid - 3 days a week in office)

Experience: 8-12 years

Key Responsibilities

  • Develop and maintain robust applications: Design, build, and deploy efficient, reusable, and scalable code using Python.
  • Build high-performance APIs: Create and manage RESTful APIs and microservices using FastAPI to support various business needs (a FastAPI sketch follows this list).
  • Engineer data pipelines: Design and implement complex ETL pipelines using PySpark and Apache Spark for large-scale data processing (see the PySpark sketch after this list).
  • Manage big data processing: Write optimized Spark jobs and work with RDDs and DataFrames to handle data ingestion, cleansing, and transformation.
  • Develop orchestration workflows: Use Apache Airflow to schedule, monitor, and orchestrate data workflows.
  • Query and analyze data: Write and optimize complex SQL queries and integrate Spark SQL for data analysis and reporting.
  • Collaborate with cloud technologies: Deploy and manage applications in a cloud environment (AWS, Azure, or GCP).
  • Ensure code quality: Adhere to best practices for code development, testing, and documentation.
  • Mentor junior team members: Share knowledge and expertise with other developers on the team.

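The API bullet above is along the lines of this minimal FastAPI sketch; the Job model, the in-memory store, and the endpoints are illustrative assumptions, not a real service.

  # Minimal FastAPI sketch: one resource exposed through RESTful endpoints.
  # The Job model and the in-memory store are stand-ins for real business objects.
  from fastapi import FastAPI, HTTPException
  from pydantic import BaseModel

  app = FastAPI(title="example-service")

  class Job(BaseModel):
      id: int
      title: str
      location: str

  _jobs: dict[int, Job] = {}  # placeholder for a real datastore

  @app.post("/jobs", status_code=201)
  def create_job(job: Job) -> Job:
      # Create: store the posted resource and echo it back.
      _jobs[job.id] = job
      return job

  @app.get("/jobs/{job_id}")
  def get_job(job_id: int) -> Job:
      # Read: return the resource or a 404 if it does not exist.
      if job_id not in _jobs:
          raise HTTPException(status_code=404, detail="job not found")
      return _jobs[job_id]

  # Run locally with: uvicorn main:app --reload  (assuming the file is main.py)
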
Mandatory Skills & Qualifications

  • 8-12 years of professional experience as a Python Developer.
  • Deep expertise in Python programming, including a strong understanding of its libraries, language features, and best practices.
  • Extensive hands-on experience with Apache Spark and PySpark, including writing optimized Spark jobs, transformations, and actions.
  • Proven experience with Apache Airflow for workflow orchestration (see the DAG sketch after this list).
  • Solid experience developing high-performance RESTful APIs using FastAPI.
  • Proficiency in SQL and experience with Spark SQL.
  • Working knowledge of a major cloud platform such as AWS, Azure, or GCP.
  • Strong understanding of big data processing concepts, including distributed computing and fault tolerance.
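
To ground the Airflow requirement above, here is a minimal DAG sketch. It is a hypothetical example written against Airflow 2.x; the DAG id, schedule, and task callables are illustrative assumptions, not this team's actual workflow.

  # Minimal Airflow 2.x sketch: a daily DAG with two dependent tasks.
  # The DAG id, schedule, and callables are illustrative assumptions.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def extract():
      print("pull raw data from the source system")

  def transform():
      print("kick off the PySpark transformation job")

  with DAG(
      dag_id="example_daily_etl",
      start_date=datetime(2024, 1, 1),
      schedule_interval="@daily",
      catchup=False,
  ) as dag:
      extract_task = PythonOperator(task_id="extract", python_callable=extract)
      transform_task = PythonOperator(task_id="transform", python_callable=transform)

      extract_task >> transform_task  # transform runs only after extract succeeds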

Desired Skills & Qualifications

  • Familiarity with the Hadoop ecosystem (HDFS, Hive, etc.).
  • Experience with various data serialization formats like JSON, Parquet, Avro, and ORC.
  • Knowledge of software development lifecycle (SDLC) and Agile methodologies.
  • Excellent problem-solving skills and the ability to work independently as well as part of a team.
  • Strong communication and interpersonal skills.

(ref: hirist.tech)
