Job Description
Role and Responsibilities :
- Emphasis is on end-to-end delivery of analysis
- Extremely comfortable working with data, including managing a large number of data sources, analysing data quality, and proactively working with the client's data / IT teams to resolve issues
- Use a variety of analytical tools (Python, SQL, PySpark, etc.) to carry out analysis and draw conclusions
- Reformulate highly technical information into concise, understandable terms for presentations
Specific Requirements
Good to have : Any cloud platform knowledge (AWS), Airflow
Candidate Profile :
Required skills : Python, SQL, Big Data, Hive, PySpark, Hadoop
- 2-6 years of consulting / analytics delivery experience
- Experience in the Banking and Financial Services domain is preferred
- Master's or Bachelor's degree in math, statistics, economics, computer engineering, or a related analytics field
- Very strong analytical skills, with a demonstrated ability to research and make decisions on both day-to-day and complex customer problems
- Experience in the financial services and risk analytics domain is a plus
- Strong record of achievement, solid analytical ability, and an entrepreneurial, hands-on approach to work
- Outstanding written and verbal communication skills
- Data Engineer with a programming background in advanced Python
- Mandatory knowledge of Git and Linux commands
- Experience in building data ingestion pipeline scripts
- Comfortable with REST APIs and different auth mechanisms to retrieve data
- Good to have : experience handling complex data pipeline architectures, with strong debugging skills
- Experience in CI/CD deployment and application support activities
- Experience working with geographically distributed teams and clients
Job Location
Hybrid - 2 days work from office (Gurgaon, Bangalore, Pune, or Noida), 3 days work from home
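For candidates wondering what the "data ingestion pipeline scripts" and "REST APIs with auth" requirements look like in practice, here is a minimal, hedged sketch. The Bearer-token header and the paginated fetch loop are illustrative assumptions, not this employer's actual stack; endpoint, parameter names, and response shape are all hypothetical.

```python
# Minimal sketch of a token-authenticated, paginated REST ingestion loop.
# All names (endpoint, offset/limit parameters) are hypothetical examples.
import urllib.request


def build_request(url: str, token: str) -> urllib.request.Request:
    """Attach a Bearer token (one common auth mechanism) to a GET request."""
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})


def ingest_all(fetch_page, page_size: int = 100):
    """Pull every record from a paginated source.

    `fetch_page(offset, limit)` is injected so the loop can be exercised
    without a live API; in production it would wrap urllib or requests.
    """
    records, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        records.extend(page)
        if len(page) < page_size:  # a short page means no more data
            return records
        offset += page_size


# Example with an in-memory fake standing in for the remote API:
data = list(range(250))
fake_fetch = lambda off, lim: data[off:off + lim]
print(len(ingest_all(fake_fetch)))  # 250
```

Injecting `fetch_page` keeps the pagination logic testable and debuggable offline, which is the kind of design the "great debugging skills" bullet alludes to.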