AI-Aided Interpretation of Chest X-Ray Improves Reader Performance and Efficiency

By MedImaging International staff writers
Posted on 03 Sep 2022
Image: AI-aided chest radiograph interpretation improves reader performance and efficiency (Photo courtesy of Pexels)

With the rise of deep learning and artificial intelligence (AI) applications in medical imaging, there has been growing interest in developing chest radiograph AI algorithms that help clinicians detect key radiographic findings accurately and efficiently. Research shows that AI algorithms can improve reader performance when used concurrently with interpretation. However, there are concerns about the real-world impact of AI, given that most research has been conducted in simulated settings without an observer performance tool that mimics real-world workflow. There is also a lack of evidence on the impact of AI on reader efficiency, particularly the time readers take to complete their reports. Now, a new study that explored the impact of AI on reader performance, in terms of both accuracy and efficiency, has found that an AI algorithm can improve reader performance and efficiency in interpreting chest radiograph abnormalities.

Researchers at the Massachusetts General Hospital (Boston, MA, USA) conducted a multicenter cohort study from April to November 2021 in which radiologists, including attending radiologists, thoracic radiology fellows, and residents, independently participated in two observer performance test sessions. The study involved a total of 497 frontal chest radiographs from adult patients with and without four target findings (pneumonia, nodule, pneumothorax, and pleural effusion). A commercially available AI algorithm (Lunit INSIGHT CXR, version 3.1.2.0) was used to process the chest radiograph images. The sessions comprised a reading session with AI and a session without AI, performed in a randomized crossover manner with a four-week washout period in between. For each image, the AI produced a heat map and an image-level probability of the presence of a referable lesion.
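
As an illustration of the kind of concurrent-reading output described above, the sketch below shows how an image-level probability and a heat-map overlay might be produced from a per-pixel probability map. The arrays, the pixel-wise maximum reduction, and the overlay styling are assumptions made purely for illustration; they do not describe the internals of the commercial algorithm used in the study.

# Minimal sketch (illustrative assumptions only, not the commercial algorithm):
# derive an image-level probability and a heat-map overlay from a per-pixel
# probability map for one finding.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
radiograph = rng.random((512, 512))          # stand-in for a frontal chest radiograph
prob_map = rng.random((512, 512)) ** 8       # stand-in per-pixel lesion probabilities

# One common reduction: take the pixel-wise maximum as the image-level score.
image_level_prob = float(prob_map.max())

fig, ax = plt.subplots()
ax.imshow(radiograph, cmap="gray")           # base image
ax.imshow(prob_map, cmap="jet", alpha=0.35)  # semi-transparent heat-map overlay
ax.set_title(f"Image-level probability: {image_level_prob:.2f}")
ax.axis("off")
fig.savefig("ai_overlay.png", dpi=150)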

The ground truths for the labels were created via consensus reading by two thoracic radiologists. Each reader documented their findings in a customized report template, in which the four target chest radiograph findings and the reader's confidence in the presence of each finding were recorded. The time taken to report each chest radiograph was also recorded. Sensitivity, specificity, and area under the receiver operating characteristic curve (AUROC) were calculated for each target finding. The target findings were present in 351 of the 497 chest radiographs. The AI was associated with higher sensitivity for all findings compared with the readers. AI-aided interpretation was associated with significantly improved reader sensitivity for all target findings, without a negative impact on specificity. Overall, reader AUROCs improved for all four target findings, with significant improvements in the detection of pneumothorax and nodule. Reporting time with AI was 10% lower than without AI.
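
For readers unfamiliar with these metrics, the sketch below shows how per-finding sensitivity, specificity, and AUROC are commonly computed from ground-truth labels and confidence scores. The data are synthetic and the 0.5 operating point is arbitrary; this is not the study's analysis code.

# Minimal sketch with synthetic data: per-finding sensitivity, specificity,
# and AUROC from ground-truth labels and reader/AI confidence scores.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=497)                              # 1 = finding present, 0 = absent
y_score = np.clip(0.6 * y_true + 0.4 * rng.random(497), 0.0, 1.0)  # toy confidence scores

y_pred = (y_score >= 0.5).astype(int)                 # arbitrary illustrative threshold
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

sensitivity = tp / (tp + fn)                          # true-positive rate
specificity = tn / (tn + fp)                          # true-negative rate
auroc = roc_auc_score(y_true, y_score)                # threshold-free discrimination

print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}  AUROC={auroc:.2f}")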

In conclusion, the use of an AI algorithm was associated with improved sensitivity for detection of the four target chest radiograph findings (pneumonia, lung nodules, pleural effusion, and pneumothorax) for attending radiologists, thoracic imaging fellows, and radiology residents, while maintaining specificity. These findings suggest that an AI algorithm can improve reader performance and efficiency in interpreting chest radiograph abnormalities.

Related Links:
Massachusetts General Hospital
