Description & Summary : A career within….
Responsibilities :
1. Required years of experience : 3-5 years.
2. Engage in the design, development, and deployment of data integration solutions employing a three-tiered ETL methodology.
3. Use Python and PySpark to extract data from diverse sources, apply transformations such as filtering and joining, and write the processed data to designated destinations (see the sketch after this list).
4. Understanding of and experience with Azure Databricks is preferred. Knowledge of advanced Python programming concepts is mandatory.
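The responsibilities above describe a three-tiered ETL workflow (extract, transform, load) built with PySpark. The sketch below is a minimal illustration of that pattern, not part of the role description; the paths, column names, and join key are hypothetical placeholders.

```python
# Minimal PySpark ETL sketch: extract, transform (filter + join), load.
# All paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

# Extract: read from two hypothetical source locations.
orders = spark.read.parquet("/mnt/raw/orders")          # placeholder path
customers = spark.read.parquet("/mnt/raw/customers")    # placeholder path

# Transform: filter recent orders, then join with customer data.
recent_orders = orders.filter(F.col("order_date") >= "2024-01-01")
enriched = recent_orders.join(customers, on="customer_id", how="left")

# Load: write the processed data to a hypothetical destination.
enriched.write.mode("overwrite").parquet("/mnt/curated/orders_enriched")
```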
Mandatory skill sets :
Python / PySpark / Azure Databricks
Preferred skill sets :
Python / PySpark / Azure Databricks
Years of experience required :
3+
Education qualification :
B.Tech / B.E.
Education
Degrees / Field of Study required : Bachelor of Technology, Bachelor of Engineering
Degrees / Field of Study preferred :
Certifications
Required Skills
Full Stack Development
Optional Skills
Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration / Continuous Delivery (CI / CD), Creativity {+ 46 more}
Desired Languages
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
Data • Kolkata (Salt Lake City), India