Modernizing Medicine is looking for an exceptional Data Operations Engineer with a passion for technology to help revolutionize the world of Healthcare IT. Data Operations Engineers are at the core of a data-driven business; they build and maintain the infrastructure that empowers customers, analysts, and data scientists to drive insights. We've built a team of passionate, creative, and innovative engineers and data scientists who are changing the world and having fun doing it.

You may be a great fit for ModMed's Data Operations Engineer opportunity. You are a gifted Data Operations Engineer with a background in large-scale data engineering support. You enjoy using technology to automate solutions and optimize outcomes, focusing on data engineering, continuous integration, and continuous deployment. You are a collaborator who thrives in environments that freely exchange ideas and viewpoints. You are an innovator who believes in making a difference and having fun doing it. You are passionate about data and about being part of a tight-knit Data Operations team.

Your Role:
- Automate, deploy and operate data pipelines
- Automate build, deployment, and quality processes
- Implement facilities to monitor all aspects of the data pipeline
- Manage data in Spark and other environments using scripts and automation
- Communicate and/or address build, deployment, and operational issues as they arise
- Implement, administer, and support the Qlik Sense systems
- Perform installations, upgrades, patches, and backups of the Data Operations systems
- Maintain documentation of Data Operations systems and processes
- Support the data usage needs of downstream analytics teams
- Ensure availability meets or exceeds agreed-upon SLAs
- Participate in a 24x7 on-call rotation for critical issues
Skills:
- BS in Computer Science or equivalent work experience
- Experience managing data in relational databases and developing ETL pipelines
- Experience using Spark SQL or other Big Data tools
- Experience implementing and administering logging, telemetry, and monitoring tools such as Opsview
- Experience using AWS or other cloud services
- Experience scripting for automation and configuration management (Chef, Puppet, Ansible)
- Experience with orchestration software such as Airflow or Luigi
- Fluent in at least one scripting or systems programming language (Python, Ruby, Bash, Go, Rust, Crystal, etc.)
- Deep knowledge of the Linux and Windows operating systems (OS, networking, process level)
- Strong problem-solving skills
- Interest in DevOps style engineering teams - we operate what we build!
- Strong verbal and written communication skills
- Strong commitment to quality, architecture, and documentation
- Experience working with the Qlik Sense product in a large-scale enterprise environment a plus
- Experience with Tableau and Domo a plus
Skills Required
Systems Programming, Tableau, Big Data, Data Operations, DevOps, SQL, AWS