Prostate cancer is the most common cancer and the second leading cause of cancer death among men in the United States.
Some prostate cancers are slow-growing and can be monitored over time, whereas others need to be treated right away. To determine how aggressive a cancer is, doctors look for abnormalities in thin slices of biopsied tissue mounted on glass slides. But this 2D method makes borderline cases hard to diagnose properly.
Now a team led by the University of Washington has developed a new, non-destructive method that images entire 3D biopsies instead of just a slice. In a proof-of-principle experiment, the researchers imaged 300 3D biopsies taken from 50 patients — six biopsies per patient — and had a computer use 3D and 2D results to predict the likelihood that a patient had aggressive cancer. The 3D features made it easier for the computer to identify the cases that were more likely to recur within five years.
The team published these results Dec. 1 in Cancer Research.
“We show for the first time that compared to traditional pathology — where a small fraction of each biopsy is examined in 2D on microscope slides — the ability to examine 100% of a biopsy in 3D is more informative and accurate,” said senior author Jonathan Liu, a UW professor of mechanical engineering and of bioengineering. “This is exciting because it is the first of hopefully many clinical studies that will demonstrate the value of non-destructive 3D pathology for clinical decision-making, such as determining which patients require aggressive treatments or which subsets of patients would respond best to certain drugs.”
The researchers used prostate specimens from patients who underwent surgery more than 10 years ago, so the team knew each patient’s outcome and could use that information to train a computer to predict those outcomes. In this study, half of the samples contained a more aggressive cancer.
To create 3D samples, the researchers extracted “biopsy cores” — cylindrically shaped plugs of tissue — from surgically removed prostates and then stained the biopsy cores to mimic the typical staining used in the 2D method. Then the team imaged each entire biopsy core using an open-top light-sheet microscope, which uses a sheet of light to optically “slice” through and image a tissue sample without destroying it.
Shown here is a video of a volume rendering of glands in two 3D biopsy samples from prostates (yellow: the outer walls of the gland; red: the fluid-filled space inside the gland; purple: what researchers called the “gland skeleton,” a stick-like model of the fluid-filled spaces inside the glands). The cancer sample (top) shows smaller and more densely packed glands compared to the benign tissue sample (bottom). Credit: Xie et al./Cancer Research
The 3D images provided more information than a 2D image — specifically, details about the complex tree-like structure of the glands throughout the tissue. These additional features increased the likelihood that the computer would correctly predict a cancer’s aggressiveness.
The researchers used new AI methods, including deep-learning image transformation techniques, to help manage and interpret the large datasets this project generated.
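The core idea — that adding 3D gland features to 2D features improves outcome prediction — can be illustrated with a minimal sketch. Everything below is hypothetical: the feature names, the synthetic data, and the use of logistic regression with cross-validated AUC are stand-ins for illustration only, not the deep-learning pipeline the paper actually describes.

```python
# Hypothetical sketch: does adding 3D gland features improve recurrence
# prediction over 2D features alone? Data and features are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 50  # patients, as in the study's cohort size

# Invented 2D features (e.g. per-slice gland measurements): here pure noise.
feat_2d = rng.normal(size=(n, 3))
# Invented extra 3D features (e.g. gland-skeleton branching): here the first
# column is deliberately made predictive of the synthetic outcome.
feat_3d_extra = rng.normal(size=(n, 2))
# Synthetic label: 1 = cancer recurred within five years.
labels = (feat_3d_extra[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

auc_2d = cross_val_score(LogisticRegression(), feat_2d, labels,
                         cv=5, scoring="roc_auc").mean()
auc_3d = cross_val_score(LogisticRegression(),
                         np.hstack([feat_2d, feat_3d_extra]), labels,
                         cv=5, scoring="roc_auc").mean()
print(f"2D-only AUC: {auc_2d:.2f}  |  2D+3D AUC: {auc_3d:.2f}")
```

Because the synthetic outcome is driven by a 3D feature, the combined model scores a higher cross-validated AUC — mirroring, in toy form, the study's finding that 3D features helped identify cancers likely to recur within five years.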
“Over the past decade or so, our lab has focused primarily on building optical imaging devices, including microscopes, for various clinical applications. However, we started to encounter the next big challenge toward clinical adoption: how to manage and interpret the massive datasets that we were acquiring from patient specimens,” Liu said. “This paper represents the first study in our lab to develop a novel computational pipeline to analyze our feature-rich datasets. As we continue to refine our imaging technologies and computational analysis methods, and as we perform larger clinical studies, we hope we can help transform the field of pathology to benefit many types of patients.”
The lead author on this paper is Weisi Xie, a UW mechanical engineering doctoral student. Other co-authors on this paper are Robert Serafin, Gan Gao, and Lindsey Barner, all UW mechanical engineering doctoral students; Kevin Bishop, a UW bioengineering doctoral student; Nicholas Reder, a clinical instructor in the laboratory medicine and pathology department in the UW School of Medicine; Hongyi Huang, UW research staff in mechanical engineering; Chenyi Mao, a UW doctoral student in the chemistry department; Nadia Postupna, a research scientist in the laboratory medicine and pathology department in the UW School of Medicine; Soyoung Kang, a UW assistant teaching professor in the mechanical engineering department; Qinghua Han, a UW undergraduate student studying bioengineering; Jonathan Wright, a professor in the urology department in the UW School of Medicine; C. Dirk Keene and Lawrence True, both professors in the laboratory medicine and pathology department in the UW School of Medicine; Joshua Vaughan, a UW associate professor of chemistry; Adam Glaser, a senior scientist at the Allen Institute who completed this research as a UW mechanical engineering postdoctoral researcher; Can Koyuncu, Pingfu Fu, Andrew Janowczyk and Anant Madabhushi, all at Case Western Reserve University; Patrick Leo at Genentech, who completed this research as a doctoral student at Case Western Reserve University; and Sarah Hawley at the Canary Foundation.
This research was funded by the Department of Defense Prostate Cancer Research Program; the National Cancer Institute; the National Heart, Lung and Blood Institute; the National Institute of Biomedical Imaging and Bioengineering; the National Institute of Mental Health; the VA Merit Review Award; the National Science Foundation; the Nancy and Buster Alvord Endowment; and the Prostate Cancer Foundation Young Investigator Award.
Nicholas Reder, Adam Glaser, Lawrence True and Jonathan Liu are co-founders and shareholders of the UW spinout Lightspeed Microscopy Inc. This company has licensed the technology used in this paper.
For more information, contact Liu at email@example.com.
Grant numbers: W81XWH-18-10358, W81XWH-19-1-0589, W81XWH-15-1-0558, W81XWH-20-1-0851, K99 CA24068, R01CA244170, U24CA199374, R01CA249992, R01CA202752, R01CA208236, R01CA216579, R01CA220581, R01CA257612, U01CA239055, U01CA248226, U54CA254566, R01HL151277, R01EB031002, R43EB028736, R01MH115767, IBX004121A, 1934292 HDR: I-DIRSE-FW, DGE-1762114
Original Article: Weisi Xie, Nicholas P Reder, Can F Koyuncu, Patrick Leo, Sarah Hawley, Hongyi Huang, Chenyi Mao, Nadia Postupna, Soyoung Kang, Robert Serafin, Gan Gao, Qinghua Han, Kevin W Bishop, Lindsey A Barner, Pingfu Fu, Jonathan L Wright, C Dirk Keene, Joshua C Vaughan, Andrew Janowczyk, Adam K Glaser, Anant Madabhushi, Lawrence D True and Jonathan TC Liu. Prostate cancer risk stratification via non-destructive 3D pathology with deep learning-assisted gland analysis. Cancer Res. December 1 2021. DOI: 10.1158/0008-5472.CAN-21-2843
Source: University of Washington