Efficient 3D light-sheet imaging of very large-scale optically cleared human brain and prostate tissue samples

Anna Schueth*, Sven Hildebrand, Iryna Samarska, Shubharthi Sengupta, Annemarie Kiessling, Andreas Herrler, Axel Zur Hausen, Michael Capalbo, Alard Roebroeck*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

The ability to image human tissue samples in 3D, with both cellular resolution and a large field of view (FOV), can improve fundamental and clinical investigations. Here, we demonstrate the feasibility of light-sheet imaging of ~5 cm³ formalin-fixed human brain samples and of up to ~7 cm³ formalin-fixed paraffin-embedded (FFPE) prostate cancer samples, processed with the FFPE-MASH protocol. We present a light-sheet microscopy prototype, the cleared-tissue dual-view Selective Plane Illumination Microscope (ct-dSPIM), capable of fast 3D high-resolution acquisitions of cm³-scale cleared tissue. We used mosaic scans for fast 3D overviews of entire tissue samples, or higher-resolution overviews of large ROIs, at various speeds: (a) Mosaic 16 (16.4 µm isotropic resolution, ~1.7 h/cm³), (b) Mosaic 4 (4.1 µm isotropic resolution, ~5 h/cm³) and (c) Mosaic 0.5 (0.5 µm near-isotropic resolution, ~15.8 h/cm³). We could visualise cortical layers and neurons around the border of human brain areas V1 and V2, and demonstrate imaging quality suitable for Gleason score grading in thick prostate cancer samples. We show that ct-dSPIM imaging is an excellent technique to quantitatively assess entire MASH-prepared large-scale human tissue samples in 3D, with considerable future clinical potential.
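The three mosaic modes trade isotropic resolution against acquisition time per unit volume. As a rough illustration of that trade-off, the sketch below converts the h/cm³ rates quoted in the abstract into estimated scan times for a sample of a given volume. The rates and mode names come directly from the abstract; the `MOSAIC_MODES` table and `acquisition_hours` helper are hypothetical constructs for illustration only, not part of the published ct-dSPIM protocol.

```python
# Illustrative back-of-envelope estimate of ct-dSPIM acquisition times,
# using the throughput figures quoted in the abstract (hours per cm^3).
# Assumption: scan time scales linearly with sample volume at the quoted rate.

MOSAIC_MODES = {
    "Mosaic 16":  {"resolution_um": 16.4, "hours_per_cm3": 1.7},
    "Mosaic 4":   {"resolution_um": 4.1,  "hours_per_cm3": 5.0},
    "Mosaic 0.5": {"resolution_um": 0.5,  "hours_per_cm3": 15.8},
}

def acquisition_hours(volume_cm3: float, mode: str) -> float:
    """Estimated scan time (hours) for a cleared-tissue sample of the given volume."""
    return volume_cm3 * MOSAIC_MODES[mode]["hours_per_cm3"]

if __name__ == "__main__":
    # Example: the ~5 cm^3 human brain sample described in the abstract.
    for mode, params in MOSAIC_MODES.items():
        hours = acquisition_hours(5.0, mode)
        print(f"{mode} ({params['resolution_um']} um): ~{hours:.1f} h for 5 cm^3")
```

Under this linear assumption, a full-sample overview at Mosaic 16 takes under a day (~8.5 h for 5 cm³), while the same volume at Mosaic 0.5 would take roughly 79 h, which is why the faster modes are used for whole-sample overviews and the slowest mode for selected ROIs.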

Original language: English
Article number: 170
Number of pages: 15
Journal: Communications Biology
Volume: 6
Issue number: 1
DOIs
Publication status: Published - 13 Feb 2023
