Radiologists Outperformed AI in Identifying Lung Diseases on Chest X-Ray

In a study of more than 2,000 chest X-rays published in Radiology, a journal of the Radiological Society of North America (RSNA), radiologists outperformed AI in accurately identifying the presence and absence of three common lung diseases.

"Chest radiography is a common diagnostic tool, but significant training and experience are required to interpret exams correctly," said lead researcher Louis L. Plesner, M.D., resident radiologist and Ph.D. fellow in the Department of Radiology at Herlev and Gentofte Hospital in Copenhagen, Denmark.

While commercially available, FDA-approved AI tools exist to assist radiologists, Dr. Plesner said the clinical use of deep-learning-based AI tools for radiological diagnosis is still in its infancy.

"While AI tools are increasingly being approved for use in radiological departments, there is an unmet need to further test them in real-life clinical scenarios," Dr. Plesner said. "AI tools can assist radiologists in interpreting chest X-rays, but their real-life diagnostic accuracy remains unclear."

Dr. Plesner and a team of researchers compared the performance of four commercially available AI tools with a pool of 72 radiologists in interpreting 2,040 consecutive adult chest X-rays taken over a two-year period at four Danish hospitals in 2020. The median age of the patient group was 72 years. Of the sample chest X-rays, 669 (32.8%) had at least one target finding.

The chest X-rays were assessed for three common findings: airspace disease (a chest X-ray pattern caused, for example, by pneumonia or pulmonary edema), pneumothorax (collapsed lung) and pleural effusion (a buildup of fluid around the lungs).

AI tools achieved sensitivity rates ranging from 72 to 91% for airspace disease, 63 to 90% for pneumothorax, and 62 to 95% for pleural effusion.
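Sensitivity here is the standard true-positive rate: of all X-rays that actually contain the finding, the fraction the tool flags. A minimal sketch of the calculation, using hypothetical counts (not figures from the study):

```python
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """True-positive rate: TP / (TP + FN)."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical example: a tool flags 85 of 100 actual pneumothorax cases.
print(f"{sensitivity(85, 15):.0%}")  # 85%
```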

"The AI tools showed moderate to high sensitivity, comparable to radiologists, for detecting airspace disease, pneumothorax and pleural effusion on chest X-rays," he said. "However, they produced more false-positive results (predicting disease when none was present) than the radiologists, and their performance decreased when multiple findings were present and for smaller targets."

For pneumothorax, positive predictive values (the probability that patients with a positive screening test truly have the disease) ranged between 56 and 86% for the AI systems, compared to 96% for the radiologists.

"AI performed worst at identifying airspace disease, with positive predictive values ranging between 40 and 50%," Dr. Plesner said. "In this difficult and elderly patient sample, the AI predicted airspace disease where none was present five to six out of 10 times. You cannot have an AI system working on its own at that rate."
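The "five to six out of 10" figure follows directly from the positive predictive value: the fraction of positive calls that are wrong is 1 minus the PPV. A short sketch with hypothetical counts chosen to match the reported 40-50% range:

```python
def positive_predictive_value(tp: int, fp: int) -> float:
    """Probability a positive result is a true finding: TP / (TP + FP)."""
    return tp / (tp + fp)

# Hypothetical counts: 45 true and 55 false airspace-disease calls.
ppv = positive_predictive_value(45, 55)
false_alarm_fraction = 1 - ppv  # share of positives that are wrong
print(f"PPV {ppv:.0%}, false alarms {false_alarm_fraction:.0%}")  # PPV 45%, false alarms 55%
```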

According to Dr. Plesner, the radiologist's goal is to balance the ability to find disease against the ability to exclude it, avoiding both overlooked disease and overdiagnosis.

"AI systems seem very good at finding disease, but they aren't as good as radiologists at identifying the absence of disease, especially when the chest X-rays are complex," he said. "Too many false-positive diagnoses would result in unnecessary imaging, radiation exposure and increased costs."

Dr. Plesner said most studies tend to evaluate the ability of AI to determine the presence or absence of a single disease, a much easier task than real-life scenarios where patients often present with multiple diseases.

"In many prior studies claiming AI superiority over radiologists, the radiologists reviewed only the image without access to the patient’s clinical history and previous imaging studies," he said. "In everyday practice, a radiologist’s interpretation of an imaging exam is a synthesis of these three data points. We speculate that the next generation of AI tools could become significantly more powerful if capable of this synthesis as well, but no such systems exist yet."

"Our study demonstrates that radiologists generally outperform AI in real-life scenarios where there is a wide variety of patients," he said. "While an AI system is effective at identifying normal chest X-rays, AI should not be autonomous for making diagnoses."

Dr. Plesner noted that these AI tools could boost radiologists’ confidence in their diagnoses by providing a second look at chest X-rays.

Lind Plesner L, Müller FC, Brejnebøl MW, Laustrup LC, Rasmussen F, Nielsen OW, Boesen M, Brun Andersen M.
Commercially Available Chest Radiograph AI Tools for Detecting Airspace Disease, Pneumothorax, and Pleural Effusion.
Radiology. 2023 Sep;308(3):e231236. doi: 10.1148/radiol.231236
