Researchers at Stanford University have developed an algorithm that taught itself to diagnose medical conditions from chest X-rays, using a dataset of more than 100,000 images, each labeled with the condition the patient had been diagnosed with.
The algorithm was given no guidance about what to look for; its only job was to teach itself by searching for patterns in the images, using deep learning.
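A minimal, illustrative sketch of this kind of training loop appears below; it is not the Stanford group's actual code, and the dataset paths, backbone choice, and hyperparameters are assumptions made for the example. The point it shows is that the network is only ever given images paired with diagnosis labels and discovers its own visual features by minimizing prediction error.

```python
# Illustrative sketch (not the Stanford model): train a CNN on labeled chest X-rays.
# Folder layout, model choice, and hyperparameters below are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Grayscale X-rays resized and replicated to 3 channels for a standard CNN backbone.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),
    transforms.ToTensor(),
])

# Hypothetical folder layout: xray_data/train/<condition_name>/*.png
train_set = datasets.ImageFolder("xray_data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# A DenseNet backbone with its final layer replaced by one output per condition.
model = models.densenet121(weights=None)
model.classifier = nn.Linear(model.classifier.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# No hand-crafted rules: the model only minimizes prediction error on the
# labeled images, learning its own diagnostic patterns in the process.
model.train()
for epoch in range(10):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```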
Now the researchers are using a similar method to diagnose tuberculosis among HIV-positive patients in South Africa, hoping the program will help fill an urgent medical need.
Currently, the algorithm gets the diagnosis right 75% of the time, while human doctors are correct about 62% of the time.
Says Stanford researcher Matthew Lungren, "The ultimate thought from our group is that if we can combine the best of what humans offer in their diagnostic work and the best of what these models can offer, I think you're going to have a better level of health care for everybody."
From NPR