AI scans lung X-rays to find patients infected with Covid-19
An AI that can detect signs of Covid-19 by looking at X-ray images of a patient’s lungs has been developed by Northwestern University researchers.
Called DeepCOVID-XR, the machine-learning algorithm outperformed a team of specialised thoracic radiologists – spotting Covid-19 in X-rays about 10 times faster and 1-6 per cent more accurately.
The researchers believe physicians could use the system to rapidly screen patients who are admitted to hospital for reasons other than Covid-19. Faster, earlier detection of the highly contagious virus could potentially protect healthcare workers and other patients by prompting positive patients to isolate sooner.
The study’s authors also believe the algorithm could potentially flag patients for isolation and testing who are not otherwise under investigation for the virus.
“We are not aiming to replace actual testing,” said Northwestern’s Aggelos Katsaggelos, senior author of the study. “X-rays are routine, safe and inexpensive. It would take seconds for our system to screen a patient and determine if that patient needs to be isolated.”
Cardiologist Dr Ramsey Wehbe said: “AI doesn’t confirm whether or not someone has the virus. But if we can flag a patient with this algorithm, we could speed up triage before the test results come back.”
For many patients with Covid-19, chest X-rays display similar patterns. Instead of clear, healthy lungs, their lungs appear patchy and hazy.
The problem is that pneumonia, heart failure and other illnesses in the lungs can look similar on X-rays. It takes a trained eye to tell the difference between Covid-19 and something less contagious.
To develop, train and test the new algorithm, the researchers used 17,002 chest X-ray images – the largest published clinical dataset of chest X-rays from the Covid-19 era used to train an AI system. Of those images, 5,445 came from Covid-19-positive patients from sites across the Northwestern Memorial Healthcare System.
The team then tested DeepCOVID-XR against five experienced cardiothoracic fellowship-trained radiologists on 300 random test images. Each radiologist took approximately two-and-a-half to three-and-a-half hours to examine this set of images, whereas the AI system took about 18 minutes.
The radiologists’ accuracy ranged from 76 to 81 per cent, whereas DeepCOVID-XR achieved 82 per cent.
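The comparison above boils down to the fraction of the 300 test images each reader labelled correctly. The sketch below illustrates that arithmetic only; the per-reader counts are hypothetical stand-ins chosen to match the reported percentages, not figures from the study itself.

```python
# Illustrative only: how accuracy on a fixed test set is computed.
# The raw counts below are hypothetical, back-calculated from the
# reported percentages, not the study's actual data.

def accuracy(correct: int, total: int) -> float:
    """Fraction of test images labelled correctly."""
    return correct / total

TEST_IMAGES = 300
radiologist_correct = 237  # hypothetical: 79%, mid-range of the 76-81% span
model_correct = 246        # hypothetical: matches the reported 82%

print(f"Radiologist:  {accuracy(radiologist_correct, TEST_IMAGES):.0%}")
print(f"DeepCOVID-XR: {accuracy(model_correct, TEST_IMAGES):.0%}")
```

On a test set this size, each additional per cent of accuracy corresponds to three more correctly labelled images, which is why the reported 1-6 percentage-point gap is a modest but measurable edge.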
“These are experts who are sub-specialty trained in reading chest imaging,” Wehbe said. “Whereas the majority of chest X-rays are read by general radiologists or initially interpreted by non-radiologists, such as the treating clinician. A lot of times decisions are made based off that initial interpretation.”
“Radiologists are expensive and not always available,” Katsaggelos said. “X-rays are inexpensive and already a common element of routine care. This could potentially save money and time – especially because timing is so critical when working with Covid-19.”
The Northwestern researchers have made the algorithm publicly available with hopes that others can continue to train it with new data.