W-Core Transformer Model For Chinese Word Segmentation
Chinese word segmentation is an important research topic in the field of natural language processing (NLP). In this paper, we combine a window-based preprocessing module with the Transformer model and propose the W-core (Window Core) Transformer for this task. In this model, the W-core preprocesses sentence information according to the characteristics of Chinese and incorporates the features extracted by the Transformer model. Experimental results show that the W-core Transformer improves on the performance of the original Transformer model for Chinese word segmentation. Finally, we further improve the performance of the W-core Transformer by increasing the number of encoder layers and by oversampling.
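The abstract does not specify the internal mechanics of the W-core, but a window-based preprocessing step over Chinese characters can be sketched as follows. This is a minimal illustration under the assumption that the W-core groups each character with its neighbors inside a fixed-size window before the features are passed to the Transformer encoder; the function name `window_features` and the padding token are hypothetical, not from the paper.

```python
def window_features(sentence, window=3):
    """Hypothetical sketch of window-based character preprocessing.

    For each character in the sentence, collect a tuple of the
    characters inside a centered window of the given (odd) size,
    padding the sentence boundaries with a "<pad>" token.
    """
    assert window % 2 == 1, "window size must be odd"
    half = window // 2
    pad = ["<pad>"] * half
    padded = pad + list(sentence) + pad
    # one window per original character position
    return [tuple(padded[i:i + window]) for i in range(len(sentence))]


# Example: windows over a four-character Chinese sentence
feats = window_features("中文分词", window=3)
# feats[0] is ("<pad>", "中", "文"); feats[3] is ("分", "词", "<pad>")
```

Each window could then be embedded and fed to the encoder, giving every character position local context before self-attention is applied; the actual W-core design in the paper may differ.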