Builds end-to-end data applications, integrating backend APIs with analytical front-ends.
Requirements
- 3+ years of IT experience
- Good understanding of analytics tools for effective analysis of data
- Ability to learn new tools and technologies
- Should have worked on at least one relational database (SQL / Oracle / Postgres) and one NoSQL database
- Should have a very good understanding of data warehouse (DW), data mart, and data modelling concepts
- Should have been part of a Data Warehouse design team in at least one project
Roles & Responsibilities
- Develop high-performance, scalable solutions on GCP that extract, transform, and load big data
- Design and build production-grade data solutions from ingestion to consumption using Java / Python
- Design and optimize data models on GCP using data stores such as BigQuery
- Optimize data pipelines for performance and cost in large-scale data lakes
- Write complex, highly optimized queries across large data sets and create data processing layers (a minimal query sketch follows this list)
- Interact closely with Data Engineers to identify the right tools for delivering product features by building POCs
- Collaborate as a team player with business stakeholders, BAs, and other Data / ML engineers
- Research new use cases for existing data.
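To ground the query-writing responsibility above, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, table, and column names are hypothetical placeholders, not part of the role description.

```python
from google.cloud import bigquery

# Hypothetical project name, for illustration only.
client = bigquery.Client(project="example-project")

# Aggregate daily event counts; filtering on the event date lets
# BigQuery prune partitions *if* the table is partitioned by
# DATE(event_ts) (an assumption in this sketch), reducing bytes scanned.
query = """
    SELECT DATE(event_ts) AS event_day, COUNT(*) AS events
    FROM `example-project.analytics.events`
    WHERE DATE(event_ts) BETWEEN @start AND @end
    GROUP BY event_day
    ORDER BY event_day
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
        bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
    ]
)
for row in client.query(query, job_config=job_config).result():
    print(row.event_day, row.events)
```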
Preferred Skills
- Awareness of design best practices for OLTP and OLAP systems
- Exposure to load-testing methodologies, pipeline debugging, and delta load handling
- Creation of DAG files using Python and SQL for ETL (see the Airflow sketch after this list)
- Experience in exploratory analysis of log data
- Apache Beam development experience with Google Cloud Bigtable and Google BigQuery is desirable
- Experience in Google Cloud Platform (GCP)
- Skills in writing batch and stream processing jobs using the Apache Beam framework (Dataflow); a sketch follows this list
- Experience with Spring Boot
- Knowledge of microservices, Pub/Sub, Cloud Run, and Cloud Functions
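For the DAG-creation item above, a minimal Airflow sketch, assuming the apache-airflow-providers-google package is installed; the DAG id, project, and table names are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical daily ETL DAG: a single SQL-based load step into BigQuery.
with DAG(
    dag_id="daily_events_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    BigQueryInsertJobOperator(
        task_id="load_daily_summary",
        configuration={
            "query": {
                # '{{ ds }}' is Airflow's templated logical date,
                # so each run loads exactly one day's delta.
                "query": """
                    INSERT INTO `example-project.analytics.daily_summary`
                    SELECT DATE(event_ts) AS event_day, COUNT(*) AS events
                    FROM `example-project.analytics.events`
                    WHERE DATE(event_ts) = '{{ ds }}'
                    GROUP BY event_day
                """,
                "useLegacySql": False,
            }
        },
    )
```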
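Likewise, for the Beam / Dataflow item, a small batch word-count sketch; the gs:// paths are placeholders, and runner options are left at the default (DirectRunner) rather than Dataflow, so it can run locally.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Default options run locally on DirectRunner; pass
    # --runner=DataflowRunner plus project/region/temp_location
    # flags to submit the same pipeline to Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.txt")
            | "Split" >> beam.FlatMap(lambda line: line.split())
            | "Pair" >> beam.Map(lambda word: (word, 1))
            | "Count" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda word, n: f"{word},{n}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/counts")
        )


if __name__ == "__main__":
    run()
```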
Skills Required
Java, BigQuery, Spring Boot, SQL, Microservices, Pub/Sub, GCP, Postgres, Apache Beam, Dataflow, Oracle, Python