Novel AI Algorithm for Mammography Interpretation Can Successfully Spot Breast Cancer Years Before Radiologists

By MedImaging International staff writers
Posted on 13 Jan 2021
Image: DeepHealth's AI identifies cancer in a patient one year earlier than detected in practice (Photo courtesy of DeepHealth)
A novel artificial intelligence (AI) algorithm for mammography interpretation has demonstrated the ability to detect breast cancer a year or more earlier than current practice.

DeepHealth (Cambridge, MA, USA), a wholly owned subsidiary of RadNet, Inc. (Los Angeles, CA, USA), compared its AI to five full-time, breast-fellowship-trained expert radiologists reading the same screening mammograms. The software exhibited higher performance than all five radiologists, and the results suggest that the AI could help detect cancer one to two years earlier than standard interpretation in many cases.

Additionally, the software showed promising generalization capabilities, demonstrating strong performance when tested across clinical sites and populations that were not directly involved in training the AI algorithms. While AI holds tremendous promise for improving screening mammography interpretation, there remain substantial challenges in developing expert-level AI. The new study by DeepHealth demonstrates progress in resolving these challenges.

“Reaching world-class performance requires a new way of building AI,” said Gregory Sorensen, M.D., CEO and co-founder of DeepHealth. “The brute-force methods that have worked so well in other domains, such as self-driving cars or game playing, where data is plentiful, have not translated effectively to many parts of medicine, where human data is often scarce. For example, to train the technology for better detection, AI algorithms must be developed from annotated data where the cancer status is known. Such data can be difficult to obtain. Then, to validate performance, the AI should be tested across different clinical sites and patient populations in different scenarios.”

“We have developed an approach that mimics how humans often learn by progressively training the AI models on more difficult tasks. By leveraging prior information learned in each successive training stage, this strategy results in AI that detects cancer accurately while also relying less on highly annotated data,” said lead author Bill Lotter, Ph.D., CTO and co-founder of DeepHealth. “Our approach and validation extend to 3D mammography, which is particularly important given its growing use and the significant challenges it presents for AI.”
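The staged approach Lotter describes is in the spirit of curriculum-style transfer learning: a model is first trained on an easier, strongly annotated task (for example, lesion-level patches), and the features it learns are then reused for the harder, weakly labeled whole-image task. The sketch below is a hypothetical illustration of that general idea in PyTorch, not DeepHealth's actual code; the model, toy data, and hyperparameters are assumptions for illustration only.

# Hypothetical sketch of staged ("curriculum-style") training; not DeepHealth's implementation.
# Stage 1 trains a small CNN on strongly annotated patches; Stage 2 reuses that backbone
# for whole-image labels, so the later stage needs less pixel-level annotation.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Backbone plus a binary cancer/no-cancer head; works for any input size."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def train_stage(model, loader, epochs=1, lr=1e-3):
    """Generic supervised loop shared by both training stages."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images).squeeze(1), labels.float())
            loss.backward()
            opt.step()

# Stage 1: the "easy" task -- patch classification with strong (lesion-level) labels.
patch_model = SmallCNN()
patch_loader = [(torch.randn(8, 1, 64, 64), torch.randint(0, 2, (8,)))]  # toy stand-in data
train_stage(patch_model, patch_loader)

# Stage 2: the harder task -- whole images with only weak (exam-level) labels,
# initialized from the Stage 1 features so prior knowledge carries over.
image_model = SmallCNN()
image_model.features.load_state_dict(patch_model.features.state_dict())
image_loader = [(torch.randn(4, 1, 256, 256), torch.randint(0, 2, (4,)))]  # toy stand-in data
train_stage(image_model, image_loader, lr=1e-4)  # lower learning rate to preserve prior features

In this kind of scheme, the expensive, strongly annotated data is only needed for the first stage, while later stages can draw on much larger pools of weakly labeled exams.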

Related Links:
DeepHealth
RadNet, Inc.

