Media contact: Beena Thannickal
In a multi-institution study presented this week at the annual meeting of the American Society of Clinical Oncology, researchers from the O’Neal Comprehensive Cancer Center at the University of Alabama at Birmingham compared the current practice of evaluating tumor response in advanced cancer with an artificial intelligence software tool designed to assist radiologists.
“It turns out that human-guided AI is more accurate, more reproducible and faster,” said Andrew Smith, M.D., Ph.D., associate professor, vice chair of Clinical Research and co-director of AI in the UAB Department of Radiology, and director of the Tumor Metrics Lab, a component of the O’Neal Cancer Center’s Human Imaging shared resource, who presented the findings today at ASCO.
The Tumor Metrics Lab conducts image interpretation for all cancer patients on clinical trials in the O’Neal Cancer Center who have tumor imaging to determine their response to treatment. The lab does more than 1,000 tumor metric reads per year.
Doctors track the progress of tumors using computed tomography scans. Radiologists measure the tumors manually on digital images of those scans and usually dictate their findings into text-based reports. But the group of researchers behind the new study hypothesized that this traditional system could be improved with some assistance from artificial intelligence. They used AI Mass, a cancer-specific implementation of the medical AI software platform AI Metrics, trained with more than 15,000 expert-labeled images.
“AI Mass uses AI to, one, measure tumors after a single mouse click; two, automatically label the anatomic location of tumors; and three, track tumors over time,” Smith said. AI Metrics is a product of a startup company of the same name, with Smith as CEO, that was spun off from UAB in 2019 and is now raising a seed round of capital.
In the study presented at ASCO by Smith, body CT images from 120 consecutive patients with advanced cancer were independently evaluated by 24 radiologists. The patients all had multiple serial imaging exams and had been treated with systemic therapy. Each radiologist categorized treatment responses and dictated text-based reports. Meanwhile, the AI-assisted software automatically calculated percent changes in tumor burden and categorized treatment response using standardized methods commonly found in clinical trials. A team of researchers evaluated three outcomes: major errors, such as incorrect measurements, erroneous language in reports or misidentified tumor locations; time spent in image interpretation; and inter-reader agreement on the final tumor response. Twenty oncologic providers then evaluated the accuracy of the manually dictated text reports versus AI-assisted reports that included a graph, table and key images.
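The "standardized methods commonly found in clinical trials" for categorizing treatment response are typified by criteria such as RECIST 1.1, which classify response from percent changes in the summed diameters of measured tumors. As a rough illustration only (this is not the study's actual software, and it omits several real-world rules such as new-lesion detection and non-target disease), the core arithmetic looks like:

```python
def percent_change(reference_mm: float, current_mm: float) -> float:
    """Percent change in tumor burden (sum of lesion diameters, in mm)."""
    return 100.0 * (current_mm - reference_mm) / reference_mm

def categorize(baseline_mm: float, nadir_mm: float, current_mm: float) -> str:
    """Simplified RECIST 1.1-style response category.

    Illustrative sketch: progression is judged against the smallest
    prior burden (nadir), shrinkage against baseline.
    """
    if current_mm == 0:
        return "Complete Response"
    # >=20% increase from nadir plus >=5 mm absolute increase -> progression
    if percent_change(nadir_mm, current_mm) >= 20 and current_mm - nadir_mm >= 5:
        return "Progressive Disease"
    # >=30% decrease from baseline -> partial response
    if percent_change(baseline_mm, current_mm) <= -30:
        return "Partial Response"
    return "Stable Disease"

# Example: baseline burden 100 mm, nadir 60 mm, current 65 mm
print(categorize(100, 60, 65))  # prints "Partial Response"
```

Automating this arithmetic is exactly the kind of step where dictated text reports invite transcription errors and AI-extracted digital measurements do not.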
“The AI-assisted approach increased accuracy by 25 percent, reduced major errors by 99 percent, was nearly two times faster than current practice methods, and improved inter-reader agreement by 45 percent,” Smith said. The only error by the AI-assisted software “was a freeform text note that we could not interpret,” he noted. “All of the tumor measurements, percent changes, etc., were correct.” Smith, who is owner of AI Metrics as well as its CEO, did not directly participate in data gathering, have access to study data or conduct any of the statistical analysis.
“It is gratifying to see such a practical application of artificial intelligence,” said Cheri Canon, M.D., professor, chair and Witten-Stanley Endowed Chair of the Department of Radiology. “Seldom do we hear of such overwhelmingly positive results from a study: 99 percent reduction in major errors. The impact this will have on patients, specifically cancer patients, will be far-reaching.”
The work has other benefits, Canon noted. These include “improved workflow for the radiologists, who are now more than ever burdened with complex imaging studies and increased incidence of burnout, and for our clinical colleagues, a clear and concise longitudinal report. This is a monumental improvement to the current standard of care and will in fact set a new standard.”
The project involves collaborators from 21 institutions and three small businesses, including AI Metrics. The AI behind the software was trained using carefully annotated images from UAB and the National Institutes of Health by a team of UAB clinical research scientists, Smith says.
“We drew freeform shapes around the edge of each tumor to train the AI to do the same,” he explained. “We also labeled the anatomic location of all kinds of tumors located across different parts of the body. We were able to train the AI to provide an anatomic location of the tumor. That had never been done before.”
In practice, “the user guides the AI, but the AI does the measuring and labeling,” Smith said. “We can extract the measurements in a digital form. Because the data is digital, we can generate a graph or table, and we can even save key images of all image findings. The AI-assisted reports are a major leap beyond a text-based report.”
Importantly, radiologists work with the AI throughout the process, Smith says.
“Let’s say there is a tumor in the liver that needs to be followed over time,” he said. “In our AI Metrics software, the user simply clicks on a lesion and a first AI algorithm measures it. The user can change it within two seconds if something is wrong. As you can imagine, the AI is more reliable than having different radiologists do this manually. This is what we call transparent AI, where the user both directs the AI and can check it. Then a second AI algorithm provides the anatomic location. The user can easily check and correct this as well within about two seconds.”
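The propose-then-review pattern Smith describes can be sketched in a few lines. This is a hypothetical illustration, not the AI Metrics API: the `segment`, `locate` and `review` callables are stand-ins for the first AI algorithm, the second AI algorithm and the radiologist's check, respectively.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class Measurement:
    diameter_mm: float   # tumor size proposed by the first AI model
    location: str        # anatomic label proposed by the second AI model

def measure_with_review(
    click_point: Tuple[int, int],
    segment: Callable[[Tuple[int, int]], float],
    locate: Callable[[Tuple[int, int]], str],
    review: Callable[[Measurement], Optional[Measurement]],
) -> Measurement:
    """Sketch of the 'transparent AI' loop: the AI proposes a
    measurement and label from one click; the radiologist either
    accepts the proposal (review returns None) or corrects it."""
    proposal = Measurement(diameter_mm=segment(click_point),
                           location=locate(click_point))
    corrected = review(proposal)
    return corrected if corrected is not None else proposal

# Toy usage with stand-in models and a reviewer who accepts as-is
result = measure_with_review(
    (120, 88),
    segment=lambda p: 23.5,
    locate=lambda p: "liver segment VII",
    review=lambda m: None,
)
print(result.diameter_mm, result.location)  # prints: 23.5 liver segment VII
```

The design point is that the human correction step sits inside the loop rather than after it, which is what distinguishes this "human-guided AI" from a fully automated pipeline.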
As part of the study, radiologists and 20 oncologic providers were asked to rate the experience of using the AI-assisted software and the value of the AI-assisted reports.
“The AI-assisted software was preferred by 96 percent of radiologists, and the AI-assisted reports were preferred by 100 percent of oncologic providers,” Smith said. “We have established a new standard of care with AI. I think that having this software could save lives, though we don’t yet have that kind of data.”
Smith says the team is not done.
“Since the study, we re-trained the AI on 55,000 tumors, and we hope to get closer to 100,000 in the coming months,” he said. “That is more tumors than an average radiologist measures in a lifetime. This is how we leverage the power of AI.” The researchers are now writing an NIH grant to take their work into cancer screening and early cancer detection and management, Smith says. “Most cancer therapies apply to only a few cancers or even a subtype of a single cancer. This technology applies to all solid cancers imaged on CT and MRI. We can apply this technology to many other stages of cancer.”
Read an abstract of the study, “Multi-institutional comparative effectiveness of advanced cancer longitudinal imaging response evaluation methods: Current practice versus artificial intelligence-assisted,” on ASCO’s website.