About the Company:
Brillio is one of the fastest-growing digital technology service providers and a partner of choice for many Fortune 1000 companies seeking to turn disruption into competitive advantage through innovative digital adoption. Brillio is renowned for its world-class professionals, known as "Brillians", who distinguish themselves by seamlessly combining cutting-edge digital and design-thinking skills with an unwavering dedication to client satisfaction.
Brillio takes pride in its status as an employer of choice, consistently attracting exceptional talent thanks to its emphasis on contemporary, groundbreaking technologies and exclusive digital projects. Its commitment to providing an outstanding experience for its Brillians and nurturing their full potential earns it the Great Place to Work® certification year after year.
About the Role: We’re hiring a Senior GCP Data Engineer to design and build scalable data solutions using Google Cloud’s advanced tools. This role demands hands-on expertise in BigQuery, Dataflow, Airflow, and Python, with a strong foundation in SQL and large-scale data architecture. You’ll work on high-impact analytics platforms, collaborating across teams to deliver clean, secure, and efficient data pipelines.
Required Skills:
4+ years of experience in data engineering preferred
3+ years of hands-on experience with GCP data services such as BigQuery, Pub/Sub, Dataflow (Apache Beam), Airflow (Cloud Composer), and Cloud Storage
Strong understanding of very large-scale data architecture, with experience designing and operationalizing data warehouses, data lakes, and analytics platforms
Strong hands-on experience with the following technologies:
1. Google BigQuery
2. Python
3. Apache Airflow
4. SQL (BigQuery dialect preferred)
Extensive hands-on experience working with data using SQL and Python
Hands-on experience with Cloud Functions
Experience with agile development methodologies
Excellent verbal and written communication skills, with the ability to clearly present ideas, concepts, and solutions
Qualifications
Bachelor's Degree in Computer Science, Information Technology, or closely related discipline
Responsibilities:
Design, build, and maintain scalable data pipelines using GCP tools such as BigQuery, Dataflow (Apache Beam), Pub/Sub, and Airflow (Cloud Composer)
Architect and operationalize large-scale data warehouses, data lakes, and analytics platforms
Write efficient, production-grade code in Python and SQL for data transformation and analysis
Implement and manage Cloud Functions and other GCP-native services for automation and integration
Collaborate with cross-functional teams to understand data requirements and deliver robust solutions
Ensure data quality, security, and performance across all stages of the pipeline
Participate in Agile ceremonies and contribute to iterative development cycles
Communicate technical concepts clearly to stakeholders through documentation and presentations
Preferred Skills
Comparable skills in AWS or other cloud big data engineering ecosystems will be considered
To apply, click here: https://tinyurl.com/bdesksnc
GCP Data Engineer • Delhi, India