Talent.com
GG Data Engineer

Astellas Pharma Inc. · Bengaluru, Karnataka, India
13 hours ago
Job description

Responsibilities and Accountabilities :

  • Data Pipeline Development : Design, build, and optimize data pipelines using DWH technologies, Databricks, Qlik, and other platforms as required, ensuring data quality, reliability, and scalability.
  • Application Transition : Support the migration of internal applications to Databricks-based (or equivalent) solutions, collaborating with application teams to ensure a seamless transition.
  • Manage Continuous Improvement, Continuous Development, DevOps, and RunOps activities at the application, data, and infrastructure levels, whether in the cloud or on premises.
  • Mentorship and Leadership : Lead and mentor junior data engineers. Share best practices, provide technical guidance, and foster a culture of continuous learning.
  • Data Strategy Contribution : Contribute to the organization’s data strategy by identifying opportunities for data-driven insights and improvements.
  • Participate in smaller, focused mission teams to deliver value-driven solutions aligned with our global and bold-move priority initiatives and beyond.
  • Design, develop and implement robust and scalable data analytics using modern technologies.
  • Collaborate with cross-functional teams and practices across the organization, including Commercial, Manufacturing, Medical, FoundationX, and GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions.
  • Provide technical support to internal users, troubleshooting complex issues and restoring system uptime as quickly as possible.
  • Champion continuous improvement initiatives, identifying opportunities to optimize the performance, security, and maintainability of existing data and platform architecture and other technology investments.
  • Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation, and deployment, and ensuring effective management of the FoundationX backlog.
  • Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization.
  • Stay up to date on the latest trends and technologies in data engineering and cloud platforms.
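
At its core, the data-pipeline work described above often reduces to merging a stream of change events into a target dataset (the Change Data Capture pattern named in the requirements below). A minimal, stdlib-only Python sketch of such a merge, illustrative only; all table and field names here are hypothetical, not from this posting:

```python
# Illustrative CDC merge: apply a batch of insert/update/delete events to a
# target table, mimicking what a Databricks/Delta MERGE does at scale.
# All names (customers, "op", "id", "row") are hypothetical.

def apply_cdc(target, changes):
    """Apply CDC events to a target table.

    target:  dict mapping primary key -> row dict
    changes: list of events, each {"op": "upsert" | "delete", "id": key, "row": dict}
    """
    for event in changes:
        if event["op"] == "delete":
            target.pop(event["id"], None)        # idempotent delete
        else:
            target[event["id"]] = event["row"]   # insert or overwrite
    return target

customers = {1: {"name": "Ada"}, 2: {"name": "Grace"}}
batch = [
    {"op": "upsert", "id": 2, "row": {"name": "Grace H."}},  # update
    {"op": "upsert", "id": 3, "row": {"name": "Alan"}},      # insert
    {"op": "delete", "id": 1},                               # delete
]
result = apply_cdc(customers, batch)
```

In production this merge would typically run as a Spark/Delta `MERGE INTO` over partitioned data rather than an in-memory dict; the event semantics are the same.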

Requirements

Experience :

5+ years of demonstrable experience in :

  • Data engineering, with a strong understanding of PySpark and SQL and a track record of building and optimizing data pipelines.
  • Data engineering and integration tools (e.g., Databricks, Change Data Capture)
  • Utilizing cloud platforms (AWS, Azure, GCP). A deeper understanding / certification of AWS and Azure is considered a plus.
  • Working with relational and non-relational databases.
Qualifications :

    Bachelor's degree in Computer Science, Information Technology, or a related field (Master's preferred), or equivalent experience.

    Any relevant cloud-based integration certification at associate or professional level. For example :

  • AWS Certified DevOps Engineer (Associate or Professional)
  • AWS Certified Developer (Associate or Professional)
  • Databricks Certified Engineer
  • Qlik Sense Data Architect / Business Analyst (or similar platform)
  • MuleSoft Certified Integration Architect (Level 1)
  • Microsoft Certified Azure Integration and Security
  • Proficient in RESTful APIs
  • AWS, CDMP, MDM, DBA, SQL, SAP, TOGAF, API, CISSP, VCP (any relevant certification)
  • MuleSoft

  • Understanding of MuleSoft's Anypoint Platform and its components
  • Experience with designing and managing API-led connectivity solutions
  • Knowledge of integration patterns and best practices
  • AWS

  • Experience provisioning, operating, and managing AWS environments
  • Experience developing code in at least one high-level programming language
  • Understanding of modern development and operations processes and methodologies
  • Ability to automate the deployment and configuration of infrastructure using AWS services and tools
  • Experience with continuous integration and continuous delivery (CI / CD) methodologies and tools
  • Microsoft Azure

  • Fundamental understanding of Microsoft Azure and AWS and the data services provided
  • Experience with Azure services related to computing, networking, storage, and security
  • Knowledge of general IT security principles and best practices
  • Understanding of cloud integration patterns and Azure integration services such as Logic Apps, Service Bus, and API Management
Preferred Qualifications :

  • Subject Matter Expertise : A strong understanding of data architecture, engineering, operations, and reporting within the Life Sciences / Pharma industry, across the Commercial, Manufacturing, and Medical domains.
  • Experience in other complex and highly regulated industries will also be considered, e.g. healthcare, government, or financial services.
  • Data Analysis and Automation Skills : Proficient in identifying, standardizing, and automating critical reporting metrics and modeling tools.
  • Analytical Thinking : Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
  • Technical Proficiency : Strong coding skills in SQL, R, and / or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.
  • Agile Champion : Adherence to DevOps principles and a proven track record with CI / CD pipelines for continuous delivery.
  • Other critical skills required :

  • Cross-Cultural Experience : Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
  • Innovation and Creativity : Ability to think innovatively and propose creative solutions to complex technical challenges.
  • Global Perspective : Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.