

AI Algorithm Reads Ultrasound Images from Hand-Held Devices and Smartphone

By MedImaging International staff writers
Posted on 31 Mar 2022
Image: Researchers are teaching AI to read fetal ultrasound to identify high-risk patients (Photo courtesy of Northwestern University)

Ultrasound technology is becoming more portable and more affordable. However, up to half of all birthing parents in developing countries are not screened during pregnancy because existing hand-held devices require a trained technician to precisely manipulate the probe to capture the right images. In addition, the images must be interpreted by a radiologist or specially trained obstetrician, specialists who are scarce in many underserved communities and developing countries. That is where artificial intelligence (AI) comes in.

Northwestern University (Evanston, IL, USA) and Google Health (Menlo Park, CA, USA) are collaborating on a project to bring fetal ultrasound to developing countries by combining AI, low-cost hand-held ultrasound devices and a smartphone. The project will develop algorithms enabling AI to read ultrasound images captured with these devices by lightly trained community health workers, and even by pregnant people at home, with the aim of assessing the wellness of both the birthing parent and baby. The low-cost device will capture raw ultrasound images and send them to a smartphone, where the AI will identify critical features such as fetal age and position. Google Health will then develop an AI to perform the full fetal interpretation.

In the first step toward developing the algorithms, the researchers will conduct a study in which pregnant patients perform their own ultrasounds with a low-cost hand-held device. Northwestern technicians will also perform fetal ultrasounds on the patients, and even family members will participate. The patients will then have a regular clinical fetal ultrasound. All the images and other pregnancy-related data will be stored in a database.

Study participants will use handheld ultrasound devices pre-installed with Google Health’s custom application to collect, process and deliver fetal ultrasound “blind sweeps.” A “blind-sweep” ultrasound consists of six freehand sweeps across the abdomen that together generate a computer image. The goal is to collect a broad set of data and related information, including reports on fetal-growth restriction, placental location, gestational age and other relevant conditions and risk factors. Data will be gathered across all three trimesters and from a diverse, representative group of patients. The study will collect ultrasound images from several thousand patients over the next year. The AI will receive both professional and amateur images covering the many conditions that physicians typically want to monitor, such as the age of the fetus and whether it has a heart defect. With these side-by-side captures, the AI can learn to interpret the amateur images more accurately.

“We want to make high-quality fetal ultrasound as easy as taking your temperature,” said Dr. Mozziyar Etemadi, assistant professor of anesthesiology at Northwestern University Feinberg School of Medicine and leader of the project at Northwestern. “The real power of this AI tool will be to allow for earlier triaging of care, so a lightly trained community health provider can conduct scans of birthing parents. The patients don’t have to go to the city to get it. The AI will help inform what to do next – if the patient is OK or they need to go to a higher level of care. We really believe this will save the lives of a lot of birthing parents and babies.”

Related Links:
Northwestern University 
Google Health 




Copyright © 2000-2022 Globetech Media. All rights reserved.