Artificial Intelligence Tool Improves Accuracy of Breast Cancer Imaging

A computer program trained to see patterns among thousands of breast ultrasound images can aid physicians in accurately diagnosing breast cancer, a new study shows.

When tested on a separate set of 44,755 already completed ultrasound exams, the artificial intelligence (AI) tool improved radiologists’ ability to correctly identify the disease by 37 percent and reduced the number of tissue samples, or biopsies, needed to confirm suspect tumors by 27 percent.
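To make clear how such relative figures are read, here is a minimal sketch of the arithmetic behind a 27 percent reduction; the counts are invented for illustration and are not from the study.

```python
# Hypothetical counts, for illustration only; not figures from the study.
biopsies_without_ai = 1000  # biopsies requested by readers working alone
biopsies_with_ai = 730      # biopsies requested with AI assistance

# A relative reduction is (old - new) / old.
reduction = (biopsies_without_ai - biopsies_with_ai) / biopsies_without_ai
print(f"Biopsies reduced by {reduction:.0%}")  # prints "Biopsies reduced by 27%"
```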

Led by researchers from the Department of Radiology at NYU Langone Health and its Laura and Isaac Perlmutter Cancer Center, the team’s AI analysis is believed to be the largest of its kind, involving 288,767 separate ultrasound exams taken from 143,203 women treated at NYU Langone hospitals in New York City between 2012 and 2018. The team’s report publishes online Sept. 24 in the journal Nature Communications.

“Our study demonstrates how artificial intelligence can help radiologists reading breast ultrasound exams to reveal only those that show real signs of breast cancer and to avoid verification by biopsy in cases that turn out to be benign,” says study senior investigator Krzysztof Geras, PhD.

Ultrasound exams use high-frequency sound waves passing through tissue to construct real-time images of the breast and other tissues. Although ultrasound is not generally used as a breast cancer screening tool, it has served as an alternative to mammography or as a follow-up diagnostic test for many women, says Geras, an assistant professor in the Department of Radiology at NYU Grossman School of Medicine and a member of the Perlmutter Cancer Center.

Ultrasound is cheaper than mammography, more widely available in community clinics, and does not involve exposure to radiation, the researchers say. Moreover, ultrasound is better than mammography at penetrating dense breast tissue and distinguishing packed but healthy cells from compact tumors.

However, the technology has also been found to produce too many false-positive diagnoses of breast cancer, causing anxiety and unnecessary procedures for women. Some studies have shown that a majority of breast ultrasound exams that indicate signs of cancer turn out to be noncancerous after biopsy.

“If our efforts to use machine learning as a triaging tool for ultrasound studies prove successful, ultrasound could become a more effective tool in breast cancer screening, especially as an alternative to mammography, and for those with dense breast tissue,” says study co-investigator and radiologist Linda Moy, MD. “Its future impact on improving women’s breast health could be profound,” adds Moy, a professor at NYU Grossman School of Medicine and a member of the Perlmutter Cancer Center.
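As a rough illustration of the triage idea Moy describes, the sketch below uses a model's estimated probability of malignancy to decide which exams need radiologist review and which are likely benign. The threshold value, function name, and scores are hypothetical assumptions for illustration, not the team's actual system.

```python
from typing import List, Tuple

# Hypothetical triage rule: the 0.10 threshold and the scores below are
# illustrative assumptions, not parameters from the published model.
BENIGN_THRESHOLD = 0.10  # scores under this are treated as likely benign

def triage(exams: List[Tuple[str, float]]) -> Tuple[List[str], List[str]]:
    """Split exams into (needs review, likely benign) using an AI score.

    Each exam is an (exam_id, malignancy_probability) pair, where the
    probability would come from a trained ultrasound classifier.
    """
    needs_review, likely_benign = [], []
    for exam_id, prob_malignant in exams:
        if prob_malignant >= BENIGN_THRESHOLD:
            needs_review.append(exam_id)
        else:
            likely_benign.append(exam_id)
    return needs_review, likely_benign

# Example with made-up scores for three exams.
flagged, cleared = triage([("exam-001", 0.82), ("exam-002", 0.03), ("exam-003", 0.15)])
print(flagged)   # ['exam-001', 'exam-003']  -> prioritized for radiologist review
print(cleared)   # ['exam-002']              -> candidate to skip biopsy
```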

Geras cautions that while the initial results are promising, the team’s latest analysis looked only at past exams; clinical trials of the tool in current patients under real-world conditions are needed before it can be routinely deployed. He also plans to refine the AI software to incorporate additional patient information, such as a woman’s added risk from having a family history or a genetic mutation tied to breast cancer, which was not included in the latest analysis.

For the study, more than half of the ultrasound exams were used to train the computer program. Ten radiologists then each reviewed a separate set of 663 breast exams, achieving an average diagnostic accuracy of 92 percent. When aided by the AI model, their average accuracy in diagnosing breast cancer improved to 96 percent. All diagnoses were checked against tissue biopsy results.
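As a minimal sketch of how such reader-study accuracies can be scored, the snippet below compares each reader call against the biopsy-confirmed label; all data and the simple accuracy metric are illustrative assumptions, not the study’s actual protocol.

```python
# Minimal sketch of scoring a reader study against biopsy-confirmed labels.
# All data here are invented for illustration; the study's actual protocol
# and metrics are described in the Nature Communications paper.

def accuracy(predictions, ground_truth):
    """Fraction of exams where the reader's call matches the biopsy result."""
    correct = sum(p == g for p, g in zip(predictions, ground_truth))
    return correct / len(ground_truth)

biopsy_labels  = [1, 0, 0, 1, 0]  # 1 = malignant, 0 = benign (invented)
reader_alone   = [1, 1, 0, 0, 0]  # reader's unaided calls (invented)
reader_with_ai = [1, 0, 0, 1, 0]  # reader's AI-assisted calls (invented)

print(f"Unaided:     {accuracy(reader_alone, biopsy_labels):.0%}")    # 60%
print(f"AI-assisted: {accuracy(reader_with_ai, biopsy_labels):.0%}")  # 100%
```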

The latest statistics from the American Cancer Society estimate that one in eight women (13 percent) in the U.S. will be diagnosed with breast cancer over their lifetime, with more than 300,000 new diagnoses in 2021 alone.

Funding support for the study was provided by National Institutes of Health grants P41 EB017183 and R21 CA225175; National Science Foundation grant HDR-1922658; Gordon and Betty Moore Foundation grant 9683; and Polish National Agency for Academic Exchange grant PPN/IWA/2019/1/00114/U/00001.

Besides Geras and Moy, other NYU researchers involved in this study are co-lead investigators Yiqiu “Artie” Shen; Farah Shamout; and Jamie Oliver; and co-investigators Jan Witowski; Kawshik Kannan; Jungkyu Park; Nan Wu; Connor Huddleston; Stacey Wolfson; Alexandra Millet; Robin Ehrenpreis; Divya Awal; Cathy Tyma; Naziya Samreen; Yiming Gao; Chloe Chhor; Stacey Gandhi; Cindy Lee; Sheila Kumari-Subaiya; Cindy Leonard; Reyhan Mohammed; Christopher Moczulski; Jaime Altabet; James Babb; Alana Lewin; Beatriu Reig; and Laura Heacock.


LINKS FOR STUDY AFTER EMBARGO LIFTS

The DOI for this paper will be 10.1038/s41467-021-26023-2. Once the paper is published, it will be available to view online at http://www.nature.com/ncomms
