Responsibilities :
- Understand the business case and translate it into a holistic solution involving AWS Cloud Services, PySpark, EMR, Python, data ingestion, and Cloud DB Redshift / Postgres PL/SQL development for high-volume data sets
- Experience in preparing data warehouse design artifacts based on given requirements (ETL framework design, data modeling, source-target-mapping), and DB query monitoring for tuning and optimization opportunities
- Proven experience with large, complex database projects in environments producing high-volume data
- Demonstrated problem-solving skills; familiarity with various root cause analysis methods; experience in documenting identified problems and determined resolutions
- Makes recommendations regarding enhancements and / or improvements
- Provides appropriate consulting, interfacing, and standards relating to database management, and monitors transaction activity and utilization
- Performance issue analysis and tuning
- Data Warehouse design and development, including logical and physical schema design
Other Responsibilities :
- Perform all activities in a safe and responsible manner and support all Environmental, Health, Safety and Security requirements and programs
- Customer / stakeholder focus; ability to build strong relationships with application teams, cross-functional IT, and global / local IT teams
Required Qualifications :
- Bachelor's or master's degree in Information Technology, Electrical Engineering, or a similar relevant field
- Proven experience (3 years minimum) with ETL development, design, performance tuning and optimization
- Very good knowledge of data warehouse architecture approaches and trends, and a strong interest in applying and further developing that knowledge, including an understanding of Dimensional Modelling and ERD design approaches
- Working experience in Kubernetes and Docker administration is an added advantage
- Good experience in AWS services, big data, PySpark, EMR, Python, and Cloud DB Redshift
- Proven experience with large, complex database projects in environments producing high-volume data
- Proficiency in SQL and PL/SQL
- Experience in preparing data warehouse design artifacts based on given requirements (ETL framework design, data modeling, source-target-mapping)
- Experience in developing streaming applications, e.g. SAP Data Intelligence, Spark Streaming, Flink, Storm, etc.
- Excellent conceptual abilities paired with very good technical documentation skills, e.g. the ability to understand and document complex data flows as part of business / production processes and infrastructure
- Familiarity with SDLC concepts and processes
Additional Skills :
- Experience using and developing on AWS services.
- Experience in the semiconductor industry
- Knowledge of semi-structured datasets
- Experience with reporting data solutions and business intelligence tools
- Experience in collecting, structuring, and summarizing requirements in a data warehouse environment
- Knowledge of statistical data analysis and data mining
- Experience in test management, test case definition and test processes
Preferred Qualifications :
- Bachelor's or master's degree with a minimum of 6 years' experience
- Must have experience in AWS Cloud Services, PySpark, EMR, Python, Cloud DB Redshift / Postgres, and data ingestion
- Experience in preparing Data Warehouse design (ETL framework design, data modeling, source-target-mapping)
Skills Required
Dimensional Modeling, SQL, IT, Python, SDLC, AWS