Abstract

Transformers are a class of neural network architecture that typically depends on large-scale pre-training and carries substantial computational cost, demanding considerable time and computing resources to train and deploy. Their key advantage is the ability to extract long-range features effectively through the self-attention mechanism. In this paper, a Global Self-Attention Transformer module is applied to address these issues within a brain tumor segmentation model, Brain-GS, whose attention mechanism takes several forms, one of which is global self-attention. The aim of the experiments is to achieve the highest precision in lesion segmentation. Unlike localized self-attention, which restricts each position to a local neighborhood, global self-attention attends to all items within a sequence. The global attention mechanism is combined with a U-Net, whose efficiency makes it suitable as the fundamental component of the deep network, enabling the model to capture the long-range relationships present in the data. Using DenseNet and ResNet50 backbones, our approach is compared with the proposed architecture in the context of multimodal brain tumor segmentation. The proposed models may have a substantial impact on the prognosis and treatment of patients with glioblastoma, a highly lethal form of brain cancer. Our model achieved a Dice score of 0.896, an accuracy of 0.987, and a Jaccard index of 0.901 on the validation data for the tumor core.
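The paper's implementation details are not given here, but as a rough sketch of the mechanism the abstract describes, single-head scaled dot-product global self-attention over a flattened feature map could look like the following. All names, shapes, and projection matrices below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def global_self_attention(X, Wq, Wk, Wv):
    """Illustrative single-head global self-attention.

    X          : (n, d) sequence of n tokens, e.g. a flattened image feature map
    Wq, Wk, Wv : (d, d_k) learned projection matrices (hypothetical names)

    Every token attends to every other token, so the attention map is n x n.
    This is what lets the model capture long-range dependencies, at the cost
    of O(n^2) memory and compute -- the complexity issue the abstract notes.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (n, n) similarity scores
    weights = softmax(scores, axis=-1)        # each row is a distribution over all tokens
    return weights @ V                        # (n, d_k) globally attended features
```

The reported Dice score and Jaccard index are standard overlap metrics for segmentation masks; for reference, on binary masks they reduce to:

```python
def dice_and_jaccard(pred, target, eps=1e-7):
    """Dice = 2|A∩B| / (|A| + |B|); Jaccard (IoU) = |A∩B| / |A∪B|.
    pred, target: boolean masks of the same shape."""
    inter = np.logical_and(pred, target).sum()
    dice = 2.0 * inter / (pred.sum() + target.sum() + eps)
    jacc = inter / (np.logical_or(pred, target).sum() + eps)
    return dice, jacc
```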

Keywords

Brain-GS, Brain tumor, Global self-attention, Segmentation, Transformer network

Subject Area

Computer Science

Article Type

Article

First Page

18167

Last Page

18180

Creative Commons License

Creative Commons Attribution 4.0 International License
This work is licensed under a Creative Commons Attribution 4.0 International License.
