Sr. Software Engineer

IntraEdge – Delhi, Delhi, India
Job description

Job Title : Senior Data Engineer – PySpark | AWS Lambda | AWS Glue | Snowflake

Experience Required : 5 to 9 years

Location : Remote

Joining Timeline : Immediate to 15 Days

Job Summary :

We are seeking a highly skilled and experienced Senior Data Engineer with strong expertise in PySpark, AWS Lambda, AWS Glue, and Snowflake. The ideal candidate will be responsible for building and maintaining scalable data pipelines, ensuring seamless data integration and transformation across platforms, and delivering high-performance solutions in a cloud-native environment.

This is a remote position, open only to candidates who can join immediately or within 15 days.

Key Responsibilities :

  • Design, build, and maintain robust and scalable ETL / ELT pipelines using AWS Glue and AWS Lambda.
  • Integrate and manage PySpark-based solutions, ensuring secure and compliant data flow.
  • Work with large datasets and implement complex data transformation logic within AWS and Snowflake environments.
  • Optimize Snowflake queries, warehouse performance, and data architecture.
  • Automate data ingestion, cleansing, transformation, and loading processes.
  • Monitor and troubleshoot data pipeline failures and performance issues.
  • Collaborate with cross-functional teams including data analysts, data scientists, and DevOps.
  • Ensure adherence to data governance, quality, and security standards.
  • Provide technical leadership and mentorship to junior data engineers as needed.

Required Skills & Qualifications :

  • 5 to 9 years of overall experience in data engineering and cloud-based data solutions.
  • Hands-on expertise in the PySpark platform and its integrations.
  • Strong experience with AWS Glue (Job creation, Crawlers, Catalog, PySpark / Scala scripts).
  • Solid knowledge of AWS Lambda for serverless compute and data automation.
  • Proven expertise in Snowflake (data modeling, performance tuning, secure data sharing, etc.).
  • Proficient in SQL, Python, and working knowledge of PySpark or Scala.
  • Experience with CI / CD practices for data pipelines (e.g., using Git, CodePipeline, or similar tools).
  • Excellent problem-solving skills and the ability to work independently in a remote setup.
Good to Have :

  • Exposure to AWS services such as S3, Athena, Redshift, CloudWatch, Step Functions.
  • Familiarity with data security and compliance best practices (e.g., GDPR, HIPAA).
  • Experience with Agile / Scrum methodologies.

Employment Type : Full-time (Remote)

Notice Period : Immediate joiners preferred, or a maximum of 15 days.

How to Apply : Send your application to mkore1@intraedge.com
