From Slides to Screens: Virtual Staining in Action

Take a closer look at the behind-the-scenes work of virtual staining

Virtual Staining: a State-of-the-Art Technology

Built on a deep learning backbone, virtual staining algorithms are trained on thousands of paired tissue images. By reading nothing more than unstained biopsy slide scans, the virtual staining software generates predicted H&E and IHC slides on your screen.
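The paired-training idea can be sketched in a few lines. The toy below is a stand-in, not the actual software: a real virtual-staining model is a deep network trained on paired image patches, whereas here a single gain/bias pair is fitted to synthetic unstained-to-stained pixel pairs, purely to show how paired supervision drives the learned mapping.

```python
# Toy sketch of paired-image supervision for virtual staining.
# A real model is a deep network (often a conditional GAN); here a
# per-pixel linear map g*x + b stands in for it.

def train(pairs, lr=0.1, epochs=200):
    g, b = 1.0, 0.0  # gain and bias of the toy "model"
    for _ in range(epochs):
        for x, y in pairs:        # x: unstained pixel, y: stained target
            err = (g * x + b) - y  # supervised squared-error residual
            g -= lr * err * x      # gradient step on the gain
            b -= lr * err          # gradient step on the bias
    return g, b

# Synthetic paired data: the "stain" doubles intensity and adds 0.1
pairs = [(x / 10, 2 * (x / 10) + 0.1) for x in range(10)]
g, b = train(pairs)  # recovers roughly g = 2, b = 0.1
```

The same loop shape scales up once the linear map is replaced by a network and the pixels by whole image patches.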

By bridging cutting-edge R&D with real-world deployment, we help laboratories save time, cut chemical reagent usage and create value.

March 2019

Label-Free Staining

The first class of virtual staining is called label-free staining. It converts autofluorescence images of unstained samples into virtual H&E stains.

October 2020

Stain-to-Stain Translation

As its name suggests, the second class of virtual staining converts one type of stain to another. The most common conversion is from H&E to IHC stain.

July 2025

In-silico Multiplex Staining

This third class of virtual staining predicts the expression of multiple protein biomarkers from H&E images. It enables computational cell phenotyping and microenvironment analysis.

Validation Studies

We deliver trusted AI technologies for smarter, faster pathology.

Explore our scientific paper digest here!

Label-Free Staining

Validated precision (SSIM)

Using a dataset of 523 samples, Zhang et al. demonstrated that label-free staining can reliably generate high-quality H&E stain predictions with strong structural similarity (SSIM) to physical stains.
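The SSIM metric behind this claim is simple to compute. Below is a minimal sketch of a single-window SSIM over flat lists of pixel intensities; published validation studies typically use a sliding-window SSIM over whole-slide patches, and the `real`/`virtual` names are illustrative only.

```python
from statistics import fmean

def ssim(x, y, c1=0.01**2, c2=0.03**2):
    """Global SSIM between two images given as flat intensity lists in [0, 1]."""
    mx, my = fmean(x), fmean(y)
    vx = fmean((a - mx) ** 2 for a in x)            # variance of x
    vy = fmean((b - my) ** 2 for b in y)            # variance of y
    cov = fmean((a - mx) * (b - my) for a, b in zip(x, y))
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))

real = [0.1, 0.5, 0.9, 0.3]     # toy "physical stain" patch
virtual = [0.2, 0.4, 0.8, 0.4]  # toy "virtual stain" prediction
score = ssim(real, virtual)      # 1.0 means structurally identical
```

A score of 1.0 indicates identical structure; virtual-staining papers report how close predicted stains come to that ceiling against the physical ground truth.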

Pathologists' Review

In a double-blind study, pathologists compared real and virtually stained slides from 250 samples, confirming that the AI-generated stains produced similar diagnoses with high structural and perceptual quality (De Haan et al., 2021).

Reference
De Haan, K. et al. Deep learning-based transformation of H&E stained tissues into special stains. Nat Commun 12, 4884 (2021).

Zhang, Y. et al. Digital synthesis of histological stains using micro-structured and multiplexed virtual staining of label-free tissue. Light Sci Appl 9, 78 (2020).

Stain-to-Stain Translation

High-quality output with validated precision

Using breast cancer biopsy samples, Klöckner et al. (2025) validated a deep learning pipeline that translates H&E into virtual IHC stains (ER, PR, HER2, Ki-67).

Across multiple test sets, the model showed strong structural and perceptual similarity to real IHC, with PSPStain achieving the best performance.

Reference
Klöckner, P. et al. H&E to IHC virtual staining methods in breast cancer: an overview and benchmarking. npj Digit. Med. 8, 384 (2025).

In-silico Multiplex Staining

Validated Clinical Relevance Across Sites

Using 4 independent datasets, the algorithm showed robust prediction of 50 protein biomarkers from H&E slides.

Across datasets, the algorithm significantly outperformed baseline H&E- and morphology-based models.

Cell phenotyping analysis further showed accurate classification of 7 cell types, with performance comparable to ground-truth data.

Reference
Wu, E. et al. ROSIE: AI generation of multiplex immunofluorescence staining from histopathology images. Preprint at https://doi.org/10.1101/2024.11.10.622859 (2024).

Precision detection of biomarkers on any slide...


Why it matters

Accurate biomarker detection is the foundation of precision medicine. By identifying the right molecular signals, pathologists can deliver more personalised, targeted cancer care.

At RainPath, our AI-powered software suite is designed to support your lab, strengthen diagnostic confidence, and prepare you for the era of precision medicine.

See our use case of biomarker identification: Ki-67

Why is the new wave of AI in digital pathology reliable?

Explore our news digest of AI healthcare cases in real life!

AI and healthcare
The reliability of AI in healthcare

AI models for virtual staining are trained on large datasets of hundreds to thousands of paired biopsy slide images.

Models are validated with internal and external cohorts of data as shown in the previous slides.

Models are validated by certified pathologists on multiple organ sites including salivary gland, thyroid, kidney, liver and lung (Rivenson et al., 2019).

Rivenson, Y. et al. Virtual histological staining of unlabelled tissue-autofluorescence images via deep learning. Nat Biomed Eng 3, 466–477 (2019).
Clinical use of AI
Radiotherapy

OSAIRIS: 
Addenbrooke’s Hospital, UK
Dr Raj Jena and his team used AI to automatically segment organs, cutting radiotherapy planning time by 2.5×, from hours to minutes.

iSeg:
Northwestern University, USA
Dr Abazeed and Dr Teo deployed a 3D deep learning model for motion-resolved segmentation of lung tumours during radiotherapy planning, improving tumour detection accuracy.

How AI is helping one doctor treat cancer: ‘It’s moved out of the hype phase’ (2025) Financial Times, 8 September. Available at: https://www.ft.com/content/fcc33754-d583-4302-aaff-de9c09783706 (Accessed: 8 September 2025).

Sarkar, S., Teo, P. T. & Abazeed, M. E. Deep learning for automated, motion-resolved tumor segmentation in radiotherapy. npj Precis. Onc. 9, 173 (2025).
Clinical use of AI
Cardiology & Vascular

NHS, UK: 
The NHS deployed an AI tool that analyses brain scans across 107 stroke centres in England.
Outcome: full recovery rates increased from 16% to 48%.

NewYork–Presbyterian Hospital, USA:  
Dr Poterucha and his team developed a deep learning model (EchoNext) trained on over 1 million ECG + imaging records.
Outcome: in controlled external validation studies, EchoNext identified structural heart disease more accurately than cardiologists.

Gregory, A. (2025) Stroke centres in England given AI tool that will help 50% of patients recover, The Guardian, 1 September. Available at: https://www.theguardian.com/society/2025/sep/01/stroke-centres-in-england-given-ai-tool-that-will-help-50-of-patients-recover (Accessed: 8 September 2025).

Poterucha, T. J. et al. Detecting structural heart disease from electrocardiograms using AI. Nature 644, 221–230 (2025).