Big Data Developer • Delhi, India
About the Company
Must have: Big Data, GCP, Dataproc, Dataflow, SQL, Spark.
About the Role
Must have: working knowledge of cloud computing platforms, GCP in particular, especially the BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, networking, Pub/Sub, Cloud Functions, and Composer services.
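By way of illustration, here is a minimal Python sketch touching one of these services (Pub/Sub). It assumes the google-cloud-pubsub client library is installed and application-default credentials are configured; the project and topic IDs are hypothetical.

```python
# Illustrative sketch only: publish a message to a GCP Pub/Sub topic.
# Assumes the google-cloud-pubsub library and default credentials;
# the project and topic IDs below are hypothetical.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "example-topic")

# publish() returns a future; result() blocks until the server-assigned
# message ID is available.
future = publisher.publish(topic_path, data=b"hello from the pipeline")
print(f"Published message ID: {future.result()}")
```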
Responsibilities
Should be familiar with columnar storage formats such as Parquet and ORC (see the sketch after this list).
Visualize and evangelize next-generation infrastructure in the cloud platform / Big Data space (batch, near-real-time, and real-time technologies).
Passionate about continuous learning: experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.
Develop and implement an overall organizational data strategy aligned with business processes.
The strategy covers data model design, database development standards, and the implementation and management of data warehouses and data analytics systems.
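For illustration, a minimal PySpark sketch of the columnar formats named above, reading Parquet and writing ORC; the bucket paths and column names are hypothetical.

```python
# Illustrative sketch only: read Parquet, prune columns, write ORC.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("columnar-demo").getOrCreate()

# Read a Parquet dataset; the schema comes from the file footers.
df = spark.read.parquet("gs://example-bucket/events/")

# Columnar formats let Spark prune columns and push down predicates.
daily = df.select("event_date", "user_id").where(df.event_date == "2024-01-01")

# Write the result back out in ORC, the other columnar format named above.
daily.write.mode("overwrite").orc("gs://example-bucket/daily_events/")

spark.stop()
```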
Qualifications
Expert-level proficiency in at least four to five GCP services.
Required Skills
Experience delivering technical solutions based on industry standards, using GCP IaaS, PaaS, and SaaS capabilities.
Strong understanding of, and experience with, distributed computing frameworks such as Spark.
Experience working in a Linux environment and using command-line tools, including shell/Python scripting to automate common tasks (a short sketch follows).
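As an example of the kind of automation this calls for, a small, self-contained Python sketch that compresses stale log files; the directory and retention window are hypothetical.

```python
# Illustrative sketch only: gzip-compress .log files older than a cutoff.
# The directory and retention window below are hypothetical.
import gzip
import shutil
import time
from pathlib import Path

LOG_DIR = Path("/var/log/myapp")  # hypothetical application log directory
RETENTION_DAYS = 7

def archive_old_logs() -> None:
    """Compress log files older than RETENTION_DAYS, then delete the originals."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    for log in LOG_DIR.glob("*.log"):
        if log.stat().st_mtime < cutoff:
            with log.open("rb") as src, gzip.open(f"{log}.gz", "wb") as dst:
                shutil.copyfileobj(src, dst)
            log.unlink()  # remove the uncompressed original

if __name__ == "__main__":
    archive_old_logs()
```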
Preferred Skills
None specified.
Pay range and compensation package
None specified.
Equal Opportunity Statement
None specified.