huggingface
An NLP helper: Hugging Face's transformers library.
git: https://github.com/huggingface/transformers
paper: https://arxiv.org/abs/1910.03771v5
Overall structure
A quick tutorial:
https://blog.csdn.net/weixin_44614687/article/details/106800244
from_pretrained
Under the hood it calls load_state_dict. Checkpoint weights that have no matching parameter in the target model are skipped and reported as unused, e.g.:
```
Some weights of the model checkpoint at ../../../../test/data/chinese-roberta-wwm-ext were not used when initializing listnet_bert: ['cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.seq_relationship.weight', 'cls.predictions.bias', 'cls.predictions.transform.LayerNorm.weight']
```
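For instance, loading a full pretrained checkpoint into a plain `BertModel` reproduces this kind of warning, because the checkpoint's MLM/NSP head has no counterpart in the encoder-only model. A minimal sketch, assuming the Hub ID `hfl/chinese-roberta-wwm-ext` in place of the local path above:

```python
from transformers import BertModel

# The checkpoint was saved with cls.predictions.* / cls.seq_relationship.*
# head weights; plain BertModel has no such parameters, so load_state_dict
# skips them and from_pretrained reports them as "not used".
model = BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext")
```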
About the model
BertModel -> our model
1. Load a model from transformers:

```python
from transformers import BertPreTrainedModel, BertModel, AutoTokenizer, AutoConfig
```

2. Build your own architecture on top of the model from step 1, as in the sketch below.
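A minimal sketch of both steps. The class name `ListNetBert`, the scoring head, and the Hub ID `hfl/chinese-roberta-wwm-ext` are illustrative assumptions (the class is loosely named after the `listnet_bert` model in the warning above), not the original author's code:

```python
import torch
from torch import nn
from transformers import BertPreTrainedModel, BertModel, AutoTokenizer

class ListNetBert(BertPreTrainedModel):
    """Custom architecture: a pretrained BERT backbone plus a new scoring head."""

    def __init__(self, config):
        super().__init__(config)
        self.bert = BertModel(config)                    # backbone from step 1
        self.scorer = nn.Linear(config.hidden_size, 1)   # our own head (step 2)
        self.post_init()                                 # initialize the new weights

    def forward(self, input_ids, attention_mask=None, token_type_ids=None):
        outputs = self.bert(input_ids,
                            attention_mask=attention_mask,
                            token_type_ids=token_type_ids)
        # pooler_output: [batch, hidden_size] representation of the [CLS] token
        return self.scorer(outputs.pooler_output)

# from_pretrained restores self.bert via load_state_dict; checkpoint weights
# with no matching parameter (the cls.* MLM/NSP head) are reported as unused.
model = ListNetBert.from_pretrained("hfl/chinese-roberta-wwm-ext")
tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")

inputs = tokenizer("举个例子", return_tensors="pt")
with torch.no_grad():
    scores = model(**inputs)   # shape: [1, 1]
```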
# Related Posts
1. bert_serving
2. bertviz: an attention visualization tool
3. pretrain
4. A prompt-learning helper: OpenPrompt
5. The difference between AutoTokenizer and BertTokenizer
6. Chinese text classification tools
7. The complete Seq2Seq toolkit in TensorFlow
8. nlpcda: a Chinese NLP data augmentation tool, highly recommended