
    Transformer-based methods offer a promising way to model the global context of gigapixel whole slide images (WSIs). However, two main problems remain in applying Transformers to WSI-based survival analysis. First, the training data for survival analysis is limited, which makes models prone to overfitting; this problem is even worse for Transformer-based models, which require large-scale data to train. Second, a WSI has extremely high resolution (up to 150,000 × 150,000 pixels) and is typically organized as a multi-resolution pyramid. The vanilla Transformer cannot model the hierarchical structure of a WSI (such as patch cluster-level relationships), so it cannot learn a hierarchical WSI representation. To address these problems, in this paper we propose a novel Sparse and Hierarchical Transformer (SH-Transformer) for survival analysis. Specifically, we introduce sparse self-attention to alleviate the overfitting problem, and propose a hierarchical Transformer structure to learn the hierarchical WSI representation. Experimental results on three WSI datasets show that the proposed framework outperforms state-of-the-art methods.
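    The abstract does not specify the exact sparsity pattern used by SH-Transformer, but the general idea of sparse self-attention can be illustrated with a simple top-k masking scheme: each query (e.g. a WSI patch embedding) attends only to its k highest-scoring keys, which reduces the effective number of attention parameters being fit and can help with overfitting. The following is a minimal NumPy sketch under that assumption, not the paper's actual implementation:

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax; exp(-inf) terms become exactly 0.
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def topk_sparse_attention(Q, K, V, k):
        """Self-attention where each query attends only to its k
        highest-scoring keys; all other logits are masked to -inf.
        Illustrative sketch only (hypothetical sparsity pattern)."""
        d = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d)                    # (n, n) attention logits
        # k-th largest logit in each row, used as the keep/mask threshold.
        kth = np.partition(scores, -k, axis=-1)[:, -k][:, None]
        masked = np.where(scores >= kth, scores, -np.inf)
        weights = softmax(masked, axis=-1)               # sparse attention weights
        return weights @ V

    # Usage: 6 patch embeddings of dimension 4; each attends to its 2 best matches.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((6, 4))
    out = topk_sparse_attention(X, X, X, k=2)
    print(out.shape)  # (6, 4)
    ```

    Dense self-attention is recovered by setting k equal to the sequence length; smaller k gives each token a harder attention budget, which is the kind of inductive bias the abstract credits with alleviating overfitting on small survival-analysis cohorts.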

    Citation

    Rui Yan, Zhilong Lv, Zhidong Yang, Senlin Lin, Chunhou Zheng, Fa Zhang. Sparse and Hierarchical Transformer for Survival Analysis on Whole Slide Images. IEEE Journal of Biomedical and Health Informatics. 2023 Aug 22;PP


    PMID: 37607153
