arXiv:2211.11304

TCBERT: A Technical Report for Chinese Topic Classification BERT

Published on Nov 21, 2022
Authors:

Abstract

Bidirectional Encoder Representations from Transformers (BERT; Devlin et al., 2019) has been one of the base models for various NLP tasks due to its remarkable performance. Variants customized for different languages and tasks have been proposed to further improve performance. In this work, we investigate supervised continued pre-training (Gururangan et al., 2020) of BERT for the Chinese topic classification task. Specifically, we incorporate prompt-based learning and contrastive learning into the pre-training. To adapt the model to Chinese topic classification, we collect around 2.1M Chinese samples spanning various topics. The pre-trained Chinese Topic Classification BERTs (TCBERTs) with different parameter sizes are open-sourced at https://huggingface.co/IDEA-CCNL.
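
Since the abstract describes prompt-based learning on top of an MLM-style BERT, the released checkpoints can in principle be used for zero-shot topic classification by scoring candidate labels in a cloze prompt with the Hugging Face transformers library. The sketch below is illustrative only: the checkpoint id, the prompt template, and the label set are assumptions rather than specifics from this report; consult https://huggingface.co/IDEA-CCNL for the actual released model names.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# NOTE: the checkpoint id below is an assumption for illustration; the actual
# released TCBERT checkpoints are listed at https://huggingface.co/IDEA-CCNL.
model_name = "IDEA-CCNL/Erlangshen-TCBert-110M-Classification-Chinese"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

# Candidate topic labels (sports, finance, technology) -- illustrative only.
labels = ["体育", "财经", "科技"]
text = "该球队在昨晚的比赛中逆转取胜。"

def score_label(label: str) -> float:
    """Score a candidate label by filling one [MASK] per label character
    in a hand-written cloze template (an assumed template, not the paper's)."""
    masks = tokenizer.mask_token * len(label)
    prompt = f"这是一则关于{masks}的新闻：{text}"
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)
    mask_positions = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    label_ids = tokenizer.convert_tokens_to_ids(list(label))
    log_probs = torch.log_softmax(logits[0, mask_positions], dim=-1)
    # Sum the log-probability of each label character at its mask slot.
    return sum(log_probs[i, tok_id].item() for i, tok_id in enumerate(label_ids))

print(max(labels, key=score_label))  # expected to print 体育 for this example
```

Because the contrastive objective described in the abstract is applied during pre-training, inference in this sketch only needs the masked-language-model head; fine-tuning with a standard classification head over the same checkpoints is also possible.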
