
AI-Powered Ultrasound Imaging Detects Breast Cancer

By MedImaging International staff writers
Posted on 14 Mar 2023
Image: An AI network system for ultrasonography accurately detects and diagnoses breast cancer (Photo courtesy of Pexels)

Breast cancer is the most commonly reported cancer among women and, unlike other major cancer types, its incidence has risen continuously over the past two decades. Early detection and treatment improve the probability of recovery, but the survival rate declines sharply to less than 75% beyond the third stage, making regular medical check-ups critical for reducing mortality. Ultrasonography is a major medical imaging technique for assessing breast lesions, and computer-aided diagnosis (CAD) systems have assisted radiologists by segmenting lesions and identifying features that distinguish benign from malignant ones. Now, a team of researchers has developed an AI network system for ultrasonography that accurately detects and diagnoses breast cancer.

A team of researchers from Pohang University of Science and Technology (POSTECH, Gyeongbuk, Korea) has developed a deep learning (DL)-based multimodal fusion network that segments breast lesions and classifies them as benign or malignant using both B-mode and strain elastography (SE-mode) ultrasound images. First, the team constructed a ‘weighted multimodal U-Net (W-MM-U-Net) model’ that assigns an optimum weight to each imaging modality for lesion segmentation, using a weighted-skip connection method. The researchers also proposed a ‘multimodal fusion framework (MFF)’ that classifies lesions as benign or malignant from cropped B-mode and SE-mode ultrasound (US) lesion images.
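The weighted-skip idea can be illustrated with a minimal NumPy sketch: features from the two modality encoders are blended at a skip connection according to a modality weight. The function name, scalar weight, and feature shapes below are illustrative assumptions; in the actual W-MM-U-Net the weights are learned per skip level inside the network.

```python
import numpy as np

def weighted_skip_fusion(feat_bmode, feat_semode, w):
    """Blend skip-connection features from two ultrasound modalities.

    feat_bmode, feat_semode: feature maps of identical shape (H, W, C),
    one per imaging modality (B-mode and SE-mode).
    w: scalar in [0, 1] weighting the B-mode branch; (1 - w) goes to SE-mode.
    This is a hypothetical simplification of a learned weighted-skip connection.
    """
    return w * feat_bmode + (1.0 - w) * feat_semode

# Toy example: two 4x4 single-channel feature maps.
b = np.ones((4, 4, 1))    # stand-in for B-mode encoder features
s = np.zeros((4, 4, 1))   # stand-in for SE-mode encoder features
fused = weighted_skip_fusion(b, s, w=0.7)
```

Giving each modality its own weight lets the network lean on whichever imaging mode carries more reliable boundary information for a given lesion, rather than averaging the two blindly.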

The MFF consists of an integrated feature network (IFN) and a decision network (DN). Unlike other recent fusion methods, the proposed MFF can simultaneously learn complementary information from convolutional neural networks (CNNs) trained on B-mode and SE-mode US images. The CNN features are ensembled using the multimodal EmbraceNet model, and the DN classifies the images from those fused features. Experimental results on clinical data show that the method identified seven benign patients as benign in three out of five trials and six malignant patients as malignant in five out of five trials. This indicates that the proposed method outperforms conventional single-modal and multimodal methods and could improve radiologists’ classification accuracy for breast cancer detection in ultrasound images.
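EmbraceNet-style fusion combines modality embeddings by stochastically drawing each embedding dimension from one modality or the other. The sketch below is a loose NumPy illustration under stated assumptions (function name, vector sizes, and the fixed selection probability are all hypothetical), not the paper's implementation:

```python
import numpy as np

def embrace_fusion(z_bmode, z_semode, p_bmode=0.5, seed=None):
    """EmbraceNet-style stochastic feature fusion (illustrative sketch).

    z_bmode, z_semode: 1-D embedding vectors of equal length, one per modality.
    p_bmode: probability of taking a given dimension from the B-mode embedding.
    Each output dimension is copied from exactly one modality, so both
    branches must learn features that are useful on their own.
    """
    rng = np.random.default_rng(seed)
    mask = rng.random(z_bmode.shape) < p_bmode  # per-dimension modality choice
    return np.where(mask, z_bmode, z_semode)

# Toy example: distinguishable stand-in embeddings from the two CNN branches.
z_b = np.full(8, 1.0)
z_s = np.full(8, -1.0)
fused = embrace_fusion(z_b, z_s, p_bmode=0.5, seed=0)
```

Because every fused dimension originates from a single modality, the training signal discourages either branch from free-riding on the other, which is one way complementary information can be learned jointly.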

“We were able to increase the accuracy of lesion segmentation by determining the importance of each input modal and automatically giving the proper weight,” explained Professor Chulhong Kim from POSTECH, who led the team of researchers. “We trained each deep learning model and the ensemble model at the same time to have a much better classification performance than the conventional single modal or other multimodal methods.”

Copyright © 2000-2023 Globetech Media. All rights reserved.