Description & Summary: The primary purpose of this role is to translate business requirements and functional specifications into logical program designs and to deliver dashboards, schemas, data pipelines, and software solutions. Key duties include designing, developing, and maintaining scalable data pipelines to process and transform large volumes of structured and unstructured data; building and maintaining ETL/ELT workflows for data ingestion from various sources (APIs, databases, files, cloud storage); and ensuring data quality, integrity, and governance across the pipeline. The role also involves developing, configuring, or modifying data components within complex business and/or enterprise application solutions across various computing environments.
Responsibilities:
Mandatory skill sets:
‘Must have’ knowledge, skills and experiences
Preferred skill sets:
‘Good to have’ knowledge, skills and experiences
Years of experience required:
Experience and Qualifications
Education qualification:
• BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Education
Degrees / Field of Study required: Bachelor of Engineering, Bachelor of Technology, Master of Business Administration
Degrees / Field of Study preferred:
Certifications
Required Skills
Data Engineering
Optional Skills
Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
Desired Languages
Travel Requirements
Not Specified
Available for Work Visa Sponsorship?
No
Government Clearance Required?
No
Job Posting End Date
Pune, India