Researchers at Harvard Medical School and National Cheng Kung University in Taiwan have developed a new artificial intelligence (AI) tool that examines tumor sample images to provide doctors with prognoses and guidance on the most appropriate treatments for colorectal cancer. According to the team, the tool can discern how aggressive the colorectal tumor is, how likely the patient is to survive—both with and without cancer recurrence—and the optimal therapy for that patient.
The investigators noted that the tool is meant to enhance care, not replace human expertise. Their report on the tool was published Thursday in Nature Communications.
“Our model performs tasks that human pathologists cannot do based on image viewing alone,” said study co-senior author Kun-Hsing Yu, MD, PhD, assistant professor of biomedical informatics at the Blavatnik Institute at Harvard Medical School. “What we anticipate is not a replacement of human pathology expertise, but augmentation of what human pathologists can do. We fully expect that this approach will augment the current clinical practice of cancer management.”
While other AI tools are available to enhance cancer care and the practice of pathology, the team says those tools simply replicate or optimize human expertise. The new freely available tool, called MOMA (Multi-omics Multi-cohort Assessment), detects and interprets visual patterns on microscopy images that are not discernible to the human eye.
To train the model, the researchers gathered data from roughly 2,000 colorectal cancer patients drawn from several different patient cohorts that collectively include more than 450,000 participants. The data sources included the Health Professionals Follow-up Study, the Nurses’ Health Study, the Cancer Genome Atlas Program, and the NIH’s PLCO (Prostate, Lung, Colorectal, and Ovarian) Cancer Screening Trial.
The investigators fed the model data on the patients’ age, sex, cancer stage, and outcomes, as well as information about the tumors’ genomic, epigenetic, protein, and metabolic profiles. They then showed the model pathology images of tumor samples and tasked it with finding visual markers related to tumor types, genetic mutations, epigenetic alterations, disease progression, and patient survival.
Once the model was trained, the team fed it a previously unseen set of tumor sample images from different patients to see how it performed in a real-world setting. When they compared the model’s predictions to patient outcomes, they found it accurately predicted the patients’ overall survival following diagnosis and how many of those years would be cancer-free. Finally, the model also accurately predicted how individual patients would likely respond to therapy based on whether a patient’s tumor harbored mutations that make the cancer more or less aggressive.
Now, with a working model in hand, the team noted that they will continue to refine its performance as more information becomes available.
“It is critical that with any AI model, we continuously monitor its behavior and performance because we may see shifts in the distributions of disease burden or new environmental toxins that contribute to cancer development,” Yu said. “It’s important to augment the model with new and more data as they come along so that its performance never lags behind.”
SOURCE: Inside Precision Medicine