Role : Big Data Cloud Engineer
Required Experience : 8 – 12 years
Job Location : Work from Anywhere (Pune / Bengaluru / Chennai / Hyderabad / Kolkata / Mumbai / Indore / Ahmedabad / Jaipur / Noida / Patna / Gurgaon / Lucknow / Dehradun / Chandigarh)
Job Description :
Must have Skills : Spark, Python / Scala, AWS (Glue, Athena, Redshift, etc.), SQL, PostgreSQL.
Job Details :
- Build scalable, reliable, cost-effective solutions for both cloud and on-premises environments, with an emphasis on quality and best-practice coding standards
- Build and test cloud-based applications for new and existing backend systems to help development teams migrate to the cloud
- Build reusable platform code and components that can be used by multiple project teams
- Understand the enterprise architecture within the context of existing platforms, services, and strategic direction
- Implement end-to-end solutions with sound technical architecture in a Big Data analytics framework, along with customized solutions that are scalable, with a primary focus on performance, quality, maintainability, cost, and testability
Key Responsibilities :
- Experience working with data warehouse platforms (e.g. Snowflake, Redshift, Databricks, Teradata)
- Strong hands-on experience designing conceptual, logical, and physical data models for enterprise-level data warehouses
- Strong knowledge of dimensional modeling techniques
- Proficiency in data modeling tools (e.g. Erwin, Visio, or similar)
ETL and Reporting :
- Strong expertise in SQL and ETL processes
- Experience working with BI modeling and reporting tools (Power BI, Tableau)