Strong conceptual understanding of the project context and business requirements: understand the business needs, produce High-Level and Low-Level design documents, and implement code following best practices.
- Hadoop development and implementation.
- Loading data from disparate source data sets.
- Pre-processing using Hive (see the ingestion and pre-processing sketch after this list).
- Ability to perform data quality checks in an orderly manner so that client data is understood and used accurately (see the data quality sketch after this list).
- Expert-level programming skills using Hadoop to meet the challenges of advanced data manipulation, complex programming logic, and large data volumes are required.
- Ability to communicate results and methodology to the project team and clients.
- Able to work in an offshore/onshore model, meet deadlines, and thrive in a banking environment.
- Provide solutions for data-driven applications involving large and complex data, including reconciliation and test cases.
- Understand the customer's business processes and the pain points that need attention.
- Source system understanding and analysis.
- Solution architecture for the entire flow, from source systems to the end reporting data marts.
- Design conceptual and physical data models for a global data warehouse in the Hadoop world (ETL versus ELT; see the ELT sketch after this list).
- High-Level and Low-Level design for ETL components in Hadoop.
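
To make the loading and Hive pre-processing bullets concrete, here is a minimal PySpark sketch of ingesting disparate source files and cleaning them with HiveQL. It is illustrative only: the paths, column names, and the staging database/table are assumptions, not details from any specific engagement.

```python
from pyspark.sql import SparkSession

# Hive-enabled session so spark.sql() runs HiveQL against the metastore.
spark = (
    SparkSession.builder
    .appName("ingest_and_preprocess")   # assumed application name
    .enableHiveSupport()
    .getOrCreate()
)

# Load disparate source data sets (formats and paths are assumptions).
customers = spark.read.option("header", True).csv("/landing/crm/customers.csv")
transactions = spark.read.json("/landing/core_banking/transactions.json")

# Expose the raw frames to HiveQL for pre-processing.
customers.createOrReplaceTempView("raw_customers")
transactions.createOrReplaceTempView("raw_transactions")

# Pre-processing in HiveQL: trim keys, standardise types, drop junk rows.
cleaned = spark.sql("""
    SELECT
        TRIM(c.customer_id)              AS customer_id,
        UPPER(c.country_code)            AS country_code,
        CAST(t.amount AS DECIMAL(18, 2)) AS amount,
        TO_DATE(t.txn_ts)                AS txn_date
    FROM raw_transactions t
    JOIN raw_customers   c ON t.customer_id = c.customer_id
    WHERE t.amount IS NOT NULL
""")

# Persist the pre-processed data as a partitioned Hive table (names assumed).
spark.sql("CREATE DATABASE IF NOT EXISTS staging")
cleaned.write.mode("overwrite").partitionBy("txn_date") \
    .saveAsTable("staging.cleaned_transactions")
```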
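
The data quality bullet could translate into checks like the hedged sketch below: null counts on critical columns, duplicate business keys, and a source-versus-target row-count reconciliation. All table, path, and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Hypothetical source extract and target Hive table to reconcile.
source_df = spark.read.json("/landing/core_banking/transactions.json")
target_df = spark.table("staging.cleaned_transactions")

# Check 1: null counts on critical columns.
null_counts = source_df.select(
    *[F.sum(F.col(c).isNull().cast("int")).alias(c)
      for c in ["customer_id", "amount"]]
).first()

# Check 2: duplicate business keys.
duplicate_keys = (
    source_df.groupBy("txn_id").count().filter(F.col("count") > 1).count()
)

# Check 3: row-count reconciliation, mirroring the pre-processing filter
# on the source side before comparing with the loaded target.
source_rows = source_df.filter(F.col("amount").isNotNull()).count()
target_rows = target_df.count()

print(f"null counts        : {null_counts.asDict()}")
print(f"duplicate txn_id   : {duplicate_keys}")
print(f"row reconciliation : source={source_rows}, target={target_rows}")

# Fail fast so downstream data marts are not loaded from bad data.
assert source_rows == target_rows, "source/target row counts do not reconcile"
```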
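
For the ETL-versus-ELT point, the sketch below illustrates the ELT side of the trade-off: land raw files in HDFS, expose them as an external Hive table, and transform inside the cluster with HiveQL. Database, table, and path names are assumptions.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("elt_sketch").enableHiveSupport().getOrCreate()

spark.sql("CREATE DATABASE IF NOT EXISTS landing")
spark.sql("CREATE DATABASE IF NOT EXISTS dwh")

# "L" before "T": files already landed in HDFS are exposed as an external table.
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS landing.transactions (
        txn_id      STRING,
        customer_id STRING,
        amount      STRING,
        txn_ts      STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/landing/core_banking/transactions/'
""")

# "T" inside Hadoop: transform with HiveQL into the warehouse-facing model.
spark.sql("""
    CREATE TABLE dwh.fct_transactions
    STORED AS PARQUET
    AS
    SELECT
        txn_id,
        customer_id,
        CAST(amount AS DECIMAL(18, 2)) AS amount,
        TO_DATE(txn_ts)                AS txn_date
    FROM landing.transactions
    WHERE txn_id IS NOT NULL
""")
```

Keeping the raw layer queryable is what makes the ELT variant attractive for reconciliation, whereas a classic ETL flow would transform the data before it ever reaches Hadoop.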