
AI Algorithm Reads Ultrasound Images from Hand-Held Devices and Smartphone

By MedImaging International staff writers
Posted on 31 Mar 2022
Image: Researchers are teaching AI to read fetal ultrasound to identify high-risk patients (Photo courtesy of Northwestern University)

Ultrasound technology is becoming more portable and more affordable. However, up to half of all birthing parents in developing countries are not screened during pregnancy because existing hand-held devices require a trained technician to precisely manipulate the probe to capture the right images. In addition, the images must be interpreted by a radiologist or specially trained obstetrician, specialists who are in short supply in many underserved communities and developing countries. That is where artificial intelligence (AI) comes in.

Northwestern University (Evanston, IL, USA) and Google Health (Menlo Park, CA, USA) are collaborating on a project to bring fetal ultrasound to developing countries by combining AI, low-cost hand-held ultrasound devices and a smartphone. The project will develop algorithms that enable AI to read ultrasound images captured with these devices by lightly trained community health workers, and even by pregnant people at home, with the aim of assessing the wellness of both the birthing parent and the baby. The low-cost device will capture raw ultrasound images and send them to a smartphone, where AI developed by Google Health will interpret critical features such as fetal age and position.
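As a rough illustration of the device-to-smartphone pipeline described above, the sketch below shows how such a triage flow might be structured. All names here (UltrasoundFrame, estimate_gestational_age, classify_presentation, triage) are hypothetical placeholders, not the project's actual software, and the model calls are stubbed so the example runs as-is.

```python
# Hypothetical sketch of the device-to-smartphone-to-AI pipeline described in the article.
# Class and function names are illustrative placeholders, not Google Health's or
# Northwestern's actual software.
from dataclasses import dataclass
import numpy as np


@dataclass
class UltrasoundFrame:
    pixels: np.ndarray   # raw B-mode image streamed from the hand-held probe
    sweep_index: int     # which blind sweep produced this frame
    timestamp_ms: int


def estimate_gestational_age(frames: list[UltrasoundFrame]) -> float:
    """Placeholder for an on-phone model that regresses gestational age (weeks)."""
    # A real model would analyze the full sweep; a dummy value keeps the sketch runnable.
    return 28.0


def classify_presentation(frames: list[UltrasoundFrame]) -> str:
    """Placeholder classifier for fetal position (cephalic, breech, transverse)."""
    return "cephalic"


def triage(frames: list[UltrasoundFrame]) -> dict:
    """Runs the two read-outs mentioned in the article and flags cases for referral."""
    age_weeks = estimate_gestational_age(frames)
    presentation = classify_presentation(frames)
    needs_referral = presentation != "cephalic" and age_weeks >= 36
    return {
        "gestational_age_weeks": age_weeks,
        "presentation": presentation,
        "refer_to_higher_care": needs_referral,
    }


if __name__ == "__main__":
    # Simulate one blind sweep of 60 frames arriving on the phone.
    sweep = [UltrasoundFrame(np.zeros((480, 640), dtype=np.uint8), 0, i * 33)
             for i in range(60)]
    print(triage(sweep))
```

In the system the article describes, output like this would inform a community health worker's next step rather than replace a clinician's read.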

As a first step toward developing the algorithms, the researchers will conduct a study in which pregnant patients perform their own ultrasounds with a low-cost hand-held device. Northwestern technicians will also perform fetal ultrasounds on the patients, and even family members will participate. Each patient will then have a standard clinical fetal ultrasound. All of the images and other pregnancy-related data will be downloaded into a database.

Study participants will use hand-held ultrasound devices pre-installed with Google Health’s custom application to collect, process and deliver fetal ultrasound “blind sweeps.” A “blind-sweep” ultrasound consists of six freehand sweeps across the abdomen that are used to generate a computer image. The goal is to collect a broad set of data and related information, including reports on fetal-growth restriction, placental location, gestational age and other relevant conditions and risk factors. Data will be gathered across all three trimesters and from a diverse, representative group of patients; the study will collect ultrasound images from several thousand patients over the next year. The AI will receive both professional and amateur images covering the many conditions physicians typically want to monitor, such as the age of the fetus and whether it has a heart defect. With these side-by-side captures, the AI can learn to interpret the amateur images more accurately.
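To make the pairing concrete, the following sketch shows one way such study records might be organized, linking a participant’s amateur blind-sweep captures to the labels from their standard clinical exam. The field names and file paths are illustrative assumptions, not the project’s actual database schema.

```python
# Hypothetical record layout for the paired study data described above;
# field names and paths are assumptions, not the real schema.
from dataclasses import dataclass, field


@dataclass
class BlindSweepCapture:
    operator: str              # "patient", "family_member", or "technician"
    sweep_videos: list[str]    # file paths for the six freehand sweeps
    trimester: int             # 1, 2, or 3


@dataclass
class ClinicalReference:
    gestational_age_weeks: float   # from the standard clinical ultrasound
    placental_location: str        # e.g. "anterior", "posterior"
    growth_restriction: bool
    other_findings: list[str] = field(default_factory=list)


@dataclass
class StudyRecord:
    patient_id: str
    captures: list[BlindSweepCapture]
    reference: ClinicalReference


# Example: one participant whose own sweeps are paired with the clinical exam,
# giving the AI professional labels for an amateur capture.
record = StudyRecord(
    patient_id="P-0001",
    captures=[BlindSweepCapture(
        operator="patient",
        sweep_videos=[f"sweeps/p0001_self_{i}.mp4" for i in range(6)],
        trimester=2,
    )],
    reference=ClinicalReference(
        gestational_age_weeks=22.5,
        placental_location="anterior",
        growth_restriction=False,
    ),
)
print(record.patient_id, len(record.captures[0].sweep_videos), "sweeps")
```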

“We want to make high-quality fetal ultrasound as easy as taking your temperature,” said Dr. Mozziyar Etemadi, assistant professor of anesthesiology at Northwestern University Feinberg School of Medicine and leader of the project at Northwestern. “The real power of this AI tool will be to allow for earlier triaging of care, so a lightly trained community health provider can conduct scans of birthing parents. The patients don’t have to go to the city to get it. The AI will help inform what to do next – if the patient is OK or they need to go to a higher level of care. We really believe this will save the lives of a lot of birthing parents and babies.”

Related Links:
Northwestern University 
Google Health 

