Description
Sabre is seeking a talented Senior Data Science Engineer for the SabreMosaic team.
At Sabre, we’re passionate about building data science and data engineering solutions that solve real problems. In this role you will plan, design, develop and test data science / data engineering software systems or applications for software enhancements and new products based on cloud-based solutions.
Role and Responsibilities :
- Develops, codes, tests and debugs new complex data-driven software solutions or enhancements to existing products
- Designs, plans, develops and improves applications using advanced cloud-native technology
- Works on issues where analysis of situations or data requires in-depth knowledge of organizational objectives. Implements strategic policies when selecting methods and techniques
- Encourages high coding standards and the use of best practices to deliver high-quality code
- Regularly interacts with subordinate supervisors, architects, product managers and HR on matters concerning projects or team performance. Requires the ability to change the thinking of, or gain acceptance from, others in sensitive situations without damaging the relationship
- Provides technical mentorship and cultural / competency-based guidance to teams
- Provides larger business / product context. Mentors on specific tech stacks / technologies
Qualifications and Education Requirements :
- Minimum 4-6 years of related experience as a full stack developer
- Expert in the field of Data Engineering / DW projects with Google Cloud based Data Engineering solutions
- Experience designing and developing enterprise data solutions on the GCP cloud platform with native or third-party data technologies
- Good working experience with relational and NoSQL databases, including but not limited to Oracle, Spanner, BigQuery, etc.
- Expert-level SQL skills for data manipulation (DML) and data validation
- Experience in design and development of data modeling, data design, data warehouses, data lakes and analytics platforms on GCP
- Expertise in designing ETL data pipelines and data processing architectures for data warehouses
- Experience in technical design and building of both streaming and batch processing systems
- Good experience designing Star and Snowflake schemas and knowledge of dimensional data modeling
- Work with data scientists, the data team and engineering teams to use Google Cloud Platform to analyze data and build data models on BigQuery, Bigtable, etc.
- Working experience integrating datasets from multiple data sources for data modeling for analytical and AI / ML models
- Take ownership of production deployment of code
- Understanding of and experience with Pub/Sub, Kafka, Kubernetes, GCP, AWS, Hive and Docker
- Expertise in Java Spring Boot / Python or other programming languages used for Data Engineering and integration projects
- Strong problem-solving and analytical skills
- AI / ML exposure, MLOps and Vertex AI experience is a great advantage
- Familiarity with DevOps practices such as CI/CD pipelines
- Airline domain experience is a plus
- Excellent spoken and written communication skills
- GCP Professional Data Engineer certification is a plus
We will give careful consideration to your application and review your details against the position criteria. You will receive separate notification as your application progresses.
Please note that only candidates who meet the minimum criteria for the role will proceed in the selection process.
#LI-Hybrid #LI-BG1