Posts Tagged ‘analytics’

Reporter: Gail S. Thornton, M.A.

This article is excerpted from the Harvard Business Review, May 28, 2019

By Moni Miyashita, Michael Brady

In southeast England, patients discharged from a group of hospitals serving 500,000 people are being fitted with a Wi-Fi-enabled armband that remotely monitors vital signs such as respiratory rate, oxygen levels, pulse, blood pressure, and body temperature.

Under a National Health Service pilot program that now incorporates artificial intelligence to analyze all that patient data in real time, hospital readmission rates are down, and emergency room visits have been reduced. What’s more, the need for costly home visits has dropped by 22%. Longer term, adherence to treatment plans has increased to 96%, compared to the industry average of 50%.

The AI pilot is targeting what Harvard Business School Professor and Innosight co-founder Clay Christensen calls “non-consumption.” These are opportunity areas where consumers have a job to be done that isn’t currently addressed by an affordable or convenient solution.

Before the U.K. pilot at the Dartford and Gravesham hospitals, for instance, home monitoring had involved dispatching hospital staffers to drive up to 90 minutes round-trip to check in with patients in their homes about once per week. But with algorithms now constantly searching for warning signs in the data and alerting both patients and professionals instantly, a new capability is born: providing healthcare before you even know you need it.
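
The pilot's actual models aren't publicly documented, but the underlying pattern (continuously scanning incoming vital-sign readings and raising an alert the moment something drifts out of range) can be illustrated with a minimal sketch. The thresholds, reading format, and notify() hook below are assumptions for illustration, not the NHS system's real logic.

```python
# Minimal sketch of continuous vital-sign monitoring with instant alerts.
# Thresholds, reading format, and the notify() hook are assumptions for
# illustration; the NHS pilot's actual models are not public.

NORMAL_RANGES = {
    "respiratory_rate": (12, 20),    # breaths per minute
    "oxygen_saturation": (94, 100),  # percent SpO2
    "pulse": (50, 100),              # beats per minute
    "temperature": (36.1, 37.8),     # degrees Celsius
}

def check_reading(reading: dict) -> list:
    """Return any vital signs in this reading that fall outside their normal range."""
    alerts = []
    for vital, (low, high) in NORMAL_RANGES.items():
        value = reading.get(vital)
        if value is not None and not (low <= value <= high):
            alerts.append(f"{vital}={value} outside [{low}, {high}]")
    return alerts

def monitor(readings, notify):
    """Scan a stream of armband readings and notify on any out-of-range vital sign."""
    for reading in readings:
        alerts = check_reading(reading)
        if alerts:
            notify(reading["patient_id"], alerts)

# Example with a stand-in notifier that would page both patient and care team.
sample = [{"patient_id": "p001", "pulse": 118, "oxygen_saturation": 91}]
monitor(sample, notify=lambda patient_id, alerts: print(patient_id, alerts))
```

A production system would learn patient-specific baselines and trends rather than rely on fixed population ranges, which is where the machine learning comes in.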

The biggest promise of artificial intelligence — accurate predictions at near-zero marginal cost — has rightly generated substantial interest in applying AI to nearly every area of healthcare. But not every application of AI in healthcare is equally well-suited to benefit. Moreover, very few applications serve as an appropriate strategic response to the largest problems facing nearly every health system: decentralization and margin pressure.

Take, for example, medical imaging AI tools — an area in which hospitals are projected to spend $2 billion annually within four years. Accurately diagnosing diseases from cancers to cataracts is a complex task, with difficult-to-quantify but typically major consequences. However, the task is today part of larger workflows performed by extensively trained, highly specialized physicians who are among the world’s best minds. These doctors might need help at the margins, but this is a job already being done. Such factors make disease diagnosis an extraordinarily difficult area for AI to create transformative change. And so the application of AI in such settings — even if beneficial to patient outcomes — is unlikely to fundamentally improve the way healthcare is delivered or to substantially lower costs in the near term.

However, leading organizations seeking to decentralize care can deploy AI to do things that have never been done before. For example: there’s a wide array of non-acute health decisions that consumers make daily. These decisions do not warrant the attention of a skilled clinician but play a large role in determining a patient’s health — and, ultimately, the cost of healthcare.

According to the World Health Organization, 60% of the factors related to individual health and quality of life are correlated with lifestyle choices, including taking prescriptions such as blood-pressure medications correctly, getting exercise, and reducing stress. Aided by AI-driven models, it is now possible to provide patients with interventions and reminders throughout this day-to-day process based on changes in their vital signs.

Home health monitoring itself isn’t new. Active programs and pilot studies are underway at leading institutions including Partners Healthcare, United Healthcare, and the Johns Hopkins School of Medicine, with positive results. But those efforts have yet to harness AI to make better judgments and recommendations in real time. Because of the massive volumes of data involved, machine learning algorithms are particularly well suited to scaling that task for large populations. After all, large sets of data are what power AI by making those algorithms smarter.

By deploying AI, for instance, the NHS program is able to scale up not only in the U.K. but also internationally. Current Health, the venture-capital-backed maker of the patient monitoring devices used in the program, recently received FDA clearance to pilot the system in the U.S. and is now testing it with New York’s Mount Sinai Hospital. It’s part of an effort to reduce patient readmissions, which cost U.S. hospitals about $40 billion annually.

The early success of such efforts drives home three lessons in using AI to address non-consumption in the new world of patient-centric healthcare:

1) Focus on impacting critical metrics – for example, reducing costly hospital readmission rates.

Start small to home in on the goal of making an impact on a key metric tied to both patient outcomes and financial sustainability. As in the U.K. pilot, this can be done through a program with select hospitals or provider locations. In another case, Grady Hospital, the largest public hospital in Atlanta, points to $4 million in savings from a 31% reduction in readmission rates over two years, thanks to the adoption of an AI tool that identifies ‘at-risk’ patients. The system alerts clinical teams to initiate special patient touch points and interventions.
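
Grady’s tool is proprietary, so the sketch below is only a schematic of the general approach: train a simple model on historical discharge records and flag new discharges whose predicted 30-day readmission risk crosses a threshold. The features, training data, and 0.6 threshold are hypothetical assumptions for illustration.

```python
# Schematic sketch of flagging discharged patients at high readmission risk.
# The features, training data, and 0.6 threshold are illustrative assumptions,
# not Grady Hospital's actual (proprietary) system. Requires scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: [age, prior_admissions, num_medications, length_of_stay]
X_train = np.array([
    [82, 3, 12, 9],
    [45, 0,  2, 2],
    [67, 1,  8, 5],
    [73, 4, 10, 7],
    [38, 0,  1, 1],
    [59, 2,  6, 4],
])
y_train = np.array([1, 0, 0, 1, 0, 1])  # 1 = readmitted within 30 days

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def flag_at_risk(discharges: np.ndarray, threshold: float = 0.6) -> np.ndarray:
    """Return indices of patients whose predicted readmission risk exceeds the threshold."""
    risk = model.predict_proba(discharges)[:, 1]
    return np.where(risk >= threshold)[0]

# Each flagged index would trigger an alert to the clinical team for follow-up.
new_discharges = np.array([[78, 2, 11, 8], [41, 0, 3, 2]])
print(flag_at_risk(new_discharges))
```

In practice such models are trained on far richer records (diagnoses, labs, social factors) and validated carefully before any alert reaches a care team.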

2) Reduce risk by relying on new kinds of partners.

Don’t try to do everything alone. Instead, form alliances with partners that are aiming to tackle similar problems. Consider the Synaptic Healthcare Alliance, a collaborative pilot program between Aetna, Ascension, Humana, Optum, and others. The alliance is using blockchain to create a giant dataset across various health care providers, with AI trials on the data getting underway. The aim is to streamline health care provider data management, reducing the cost of processing claims while also improving access to care. Going it alone can be risky, if only because of data incompatibility issues: the M.D. Anderson Cancer Center, for instance, had to write off millions in costs for a failed AI project due in part to incompatibility with its electronic health records system. By joining forces, Synaptic’s dataset will be in a standard format that makes records and results portable.

3) Use AI to collaborate, not compete, with highly trained professionals.

Clinicians are often looking to augment their knowledge and reasoning, and AI can help. Many medical AI applications do, in fact, compete with doctors. In radiology, for instance, some algorithms have performed image-based diagnosis as well as or better than human experts. Yet it’s unclear whether patients and medical institutions will trust AI to automate that job entirely. A University of California at San Diego pilot in which AI diagnosed childhood diseases more accurately than junior-level pediatricians still required senior doctors to personally review and sign off on the diagnosis. The real aim is always going to be to use AI to collaborate with clinicians seeking higher precision — not to replace them.

MIT and MGH have developed a deep learning model which identifies patients likely to develop breast cancer in the future. Learning from data on 60,000 prior patients, the AI system allows physicians to personalize their approach to breast cancer screening, essentially creating a detailed risk profile for each patient.
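
The MIT/MGH model itself is a deep network trained on prior patients’ imaging and isn’t reproduced here. As a simplified illustration of what a per-patient risk profile can drive in practice, the sketch below maps a hypothetical predicted five-year risk score to a screening interval; the risk bands and intervals are invented for illustration and are not clinical guidance.

```python
# Illustrative mapping from a model's predicted risk score to a personalized
# screening interval. The bands and intervals are hypothetical; this is not
# the MIT/MGH model and not clinical guidance.

def screening_interval_months(five_year_risk: float) -> int:
    """Map a predicted 5-year breast-cancer risk (0.0 to 1.0) to a follow-up interval."""
    if five_year_risk >= 0.15:
        return 6    # high risk: bring the next screen or supplemental imaging forward
    if five_year_risk >= 0.05:
        return 12   # elevated risk: annual screening
    return 24       # average risk: biennial screening

for risk in (0.02, 0.08, 0.20):
    print(f"risk={risk:.2f} -> next screen in {screening_interval_months(risk)} months")
```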

Taken together, these three lessons, paired with solutions targeted at non-consumption, have the potential to provide a clear path to effectively harnessing a technology that has been subject to rampant over-promising. Longer term, we believe one of the transformative benefits of AI will be deepening relationships between health providers and patients. The U.K. pilot, for instance, is resulting in more frequent proactive check-ins that never would have happened before. That’s good both for improving health and for building customer loyalty in the emerging consumer-centric healthcare marketplace.



Read Full Post »

Turing Institute Engaging the Science of Big Data

Larry H. Bernstein, MD, FCAP, Curator



Alan Turing Institute Will Lead Research in Data Science

12/08/2015 – Duncan Roweth, Cray

Cray is partnering with the Alan Turing Institute, the new U.K. data science research organization in London, to support the U.K. as it expands its data science research for the benefit of science and industry.

Earlier this month Fiona Burgess, U.K. senior account manager, and I attended the launch of the institute. At the event, U.K. Minister for Science and Universities Jo Johnson paid tribute to Turing and his work. Institute director Professor Andrew Blake told the audience that the Turing Institute is about much more than just big data — it is about data science, analyzing that data and gaining a new understanding that leads to decisions and actions.

Alan Turing was a pioneering British computer scientist. He has become a household name in the U.K. following publicity surrounding his role in breaking the Enigma machine ciphers during the Second World War. This was a closely guarded secret until a few years ago, but has recently become the subject of numerous books and several films. Turing was highly influential in the development of computer science, providing a formalization of the concepts of algorithm and computation with the Turing machine. After the war, he worked at the National Physical Laboratory, where he designed ACE, one of the first stored-program computers.

The Alan Turing Institute is a joint venture between the universities of Cambridge, Edinburgh, Oxford, Warwick, University College London, and the U.K. Engineering and Physical Science Research Council (EPSRC). The Institute received initial funding in excess of £75 million ($110 million) from the U.K. government, the university partners and other business organizations, including the Lloyd’s Register Foundation.

The Turing Institute will, among other topics, research how knowledge and predictions can be extracted from large-scale and diverse digital data. It will bring together people, organizations and technologies in data science for the development of theory, methodologies and algorithms. The U.K. government is looking to this new Institute to enable the science community, commerce and industry to realize the value of big data for the U.K. economy.

Cray will be working with the Turing Institute and EPSRC to provide data analytics capability to the U.K.’s data science community. EPSRC’s ARCHER supercomputer, a Cray XC30 system based at the University of Edinburgh, has been chosen for this work. Much as we worked with NERSC to port Docker to Cray systems, we will be working with ATI to port analytics software to ARCHER and then to XC systems generally.

ARCHER is currently the largest supercomputer for scientific research in the U.K. — with its recent upgrade ARCHER’s 118,080 cores can access in excess of 300 TB of memory. What sort of problem might need that amount of processing power? Genomics England is collecting around 200 GB of DNA sequence data from each of 100,000 people. Finding patterns in all this information will be a mammoth task!
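
To make the scale concrete, here is a rough back-of-the-envelope calculation using only the figures quoted above (not official project totals); even ARCHER’s 300 TB of memory holds only a small fraction of the full dataset at any one time.

```python
# Back-of-the-envelope scale of the Genomics England dataset, using the
# figures quoted above rather than official project totals.
people = 100_000
gb_per_person = 200

total_gb = people * gb_per_person      # 20,000,000 GB
total_pb = total_gb / 1_000_000        # ~20 petabytes of raw sequence data

archer_memory_tb = 300
fraction_in_memory = archer_memory_tb / (total_gb / 1_000)  # both sides in TB

print(f"Total sequence data: ~{total_pb:.0f} PB")
print(f"ARCHER memory holds ~{fraction_in_memory:.1%} of it at any one time")
```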

ATI has put together a wide-ranging programme of workshops and data science summits, details of which can be found on its website.

Duncan Roweth is a principal engineer in the Cray CTO Office in Bristol, U.K.  

Read Full Post »