How GE Healthcare Is Turning to Deep-Learning Algorithms

Maureen Kingsley

November 21, 2016


The algorithms will be deployed via GE's Health Cloud and smart imaging machines to improve diagnoses and treatment recommendations.


GE Healthcare has forged a partnership with the University of California, San Francisco (UCSF) to create a library of deep-learning algorithms intended to help physicians diagnose and treat patients more accurately and effectively.

"Deep learning" refers to the significant clinical and operational connections and insights machines can make from huge amounts of data. The library of algorithms (complex problem-solving formulas) will offer a way for providers to access and use the deep learning achieved by machines.

UCSF's Center for Digital Health Innovation (CDHI) and GE will begin with algorithms aimed at expediting differential diagnosis in acute situations, such as trauma. The goal is to speed treatment, improve survival, and reduce complications. These algorithms, the partners announced, can be deployed globally via the GE Health Cloud and on "smart" GE imaging machines, "sharing the research of healthcare leaders with clinicians around the world who have varied expertise."

As the library of algorithms grows, so will the potential of related applications that draw from it. Providers may be able to leverage the algorithms to predict patient trajectories, automate the triage of routine care, improve process efficiency, and enable the development of more personalized therapies, UCSF predicts. As a result, diagnostic accuracy, patient outcomes, clinical workflow, and productivity could all improve.

"Next-generation data-science techniques have already transformed the industrial and consumer worlds," said Michael Blum, MD, associate vice chancellor for informatics, director of the digital health center and professor of medicine at UCSF. With the GE-UCSF collaboration, he said, "these technologies will be applied to our clinical data and images to provide clinicians with actionable information in near-real time. Together, we will develop tools and algorithms that will allow clinicians and researchers to identify problems and ask questions that are only achievable with vast computing power and datasets."

For GE Healthcare's part, John Flannery, president and CEO, said, "This partnership is about the future of healthcare: technology, analytics, and cloud-computing power all combining to enable clinicians to make faster decisions for better patient outcomes." By "working hand-in-hand with a leading academic medical center" to design, build, and verify new deep-learning tools, Flannery explained, "we are defining how digital health solutions can be seamlessly integrated into care."

The two parties are initially focusing on high-volume, high-impact imaging, creating algorithms that reliably distinguish between normal scan results and those requiring follow-up or acute intervention. One algorithm in early development targets pneumothorax (a collapsed lung): it will be trained to distinguish normal from abnormal images so that clinicians can prioritize and more quickly treat patients with this potentially life-threatening condition.
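To make the idea of training a machine to separate normal from abnormal images concrete, here is a minimal, purely illustrative sketch in Python. It is not GE's or UCSF's method: real systems of this kind use deep convolutional networks trained on large sets of clinical images, whereas this toy uses synthetic 8×8 "images" (abnormal ones get an injected bright patch, a crude stand-in for a radiographic finding) and a simple logistic-regression classifier trained by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_images(n):
    """Synthetic stand-ins for scans: 'abnormal' images (label 1)
    get a bright 3x3 patch; 'normal' images (label 0) are pure noise."""
    X = rng.normal(0.0, 1.0, size=(n, 8, 8))
    y = rng.integers(0, 2, size=n)
    X[y == 1, 2:5, 2:5] += 3.0      # inject a signal into abnormal cases
    return X.reshape(n, -1), y      # flatten each image to a 64-vector

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lr=0.1, epochs=200):
    """Logistic regression fitted with batch gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)          # predicted probability of 'abnormal'
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

X_train, y_train = make_images(400)
X_test, y_test = make_images(100)
w, b = train(X_train, y_train)

pred = (sigmoid(X_test @ w + b) > 0.5).astype(int)
accuracy = np.mean(pred == y_test)
```

The triage idea described in the article corresponds to sorting cases by the predicted probability, so likely-abnormal scans reach a clinician first; the clinical challenge, of course, lies in achieving this reliably on real imaging data rather than synthetic toys.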

As the partnership between GE and UCSF progresses, the two will seek to integrate data not only from various imaging technologies such as computed tomography, magnetic resonance, and x-ray, but also from electronic health records and other sources to "enrich algorithm development and improve sensitivity," they announced.

A number of healthcare companies and academic centers are touting deep learning, and the algorithms that leverage it, as the future of personalized medicine. In April of this year, for instance, Qmed reported on Samsung Medison's RS80A ultrasound imaging machine with the S-Detect for Breast feature, which uses big data collected from breast-exam cases to recommend whether a particular lesion is benign or malignant.

In the same story, Qmed reported that researchers at the Regenstrief Institute and Indiana University School of Informatics had determined that existing algorithms and open-source machine-learning tools were as good as, or better than, human reviewers in detecting cancer cases using data from free-text pathology reports. The computerized approach was also faster than human effort and used fewer resources.

Maureen Kingsley is a contributor to Qmed. 



