9th World Conference on Information Systems and Technologies


W-Core Transformer Model For Chinese Word Segmentation

Chinese word segmentation is an important research task in the field of natural language processing (NLP). In this paper, we build on the Transformer model to propose the W-core (Window Core) Transformer for this task. In this model, the W-core preprocesses sentence information according to the characteristics of Chinese and incorporates the features extracted by the Transformer model. Experimental results show that the W-core Transformer improves on the original Transformer model for Chinese word segmentation. Finally, we further improve the performance of the W-core Transformer by increasing the number of encoder layers and by oversampling.
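The abstract does not specify how the W-core windows sentence information, so as a hedged illustration only, the sketch below shows one common way a window-based preprocessor for Chinese segmentation can work: each character is paired with a fixed-size context window of surrounding characters before being fed to a downstream encoder. The function name `window_features`, the window size, and the padding token are all assumptions for illustration, not the authors' actual W-core.

```python
# Hypothetical sketch of a window-based ("W-core"-style) preprocessor.
# Each character of a Chinese sentence is mapped to a context window of
# 2*window + 1 characters, padded at the sentence boundaries. Names and
# parameters here are illustrative assumptions, not the paper's method.

def window_features(sentence, window=2, pad="<P>"):
    # Pad both ends so boundary characters still get full windows.
    padded = [pad] * window + list(sentence) + [pad] * window
    feats = []
    for i in range(window, window + len(sentence)):
        # Collect the character plus `window` neighbors on each side.
        feats.append(padded[i - window : i + window + 1])
    return feats

# Example: a 4-character sentence yields 4 windows of 5 characters each.
feats = window_features("中文分词", window=2)
```

A downstream Transformer encoder could then embed and attend over these per-character windows; the paper's contribution is how such window features are combined with the Transformer's own representations.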

Hai Lin
Guangxi University
China

Lina Yang
Guangxi University
China

Patrick Shen-Pei Wang
Computer and Information Science
United States

 

