bertviz: an attention visualization tool

It lets you inspect the attention of different layers and different heads.

Note:

from bertviz.neuron_view import show
from bertviz.transformers_neuron_view import BertModel, BertTokenizer  # bertviz's bundled classes

model_type = 'bert'
# path: a local checkpoint directory or a model name such as 'bert-base-uncased'
model1 = BertModel.from_pretrained(path)
tokenizer = BertTokenizer.from_pretrained(path)
sentence_a = "The cat sat on the mat"   # example inputs
sentence_b = "The cat lay on the rug"

show(model1, model_type, tokenizer, sentence_a, sentence_b, layer=4, head=3)
This works.
###########################

from bertviz.neuron_view import show
from transformers import BertTokenizer, BertModel   # standard Hugging Face classes

model_type = 'bert'
model1 = BertModel.from_pretrained(path)
tokenizer = BertTokenizer.from_pretrained(path)

show(model1, model_type, tokenizer, sentence_a, sentence_b, layer=4, head=3)
This raises an error: the neuron view requires bertviz's bundled model and tokenizer classes (from bertviz.transformers_neuron_view), which expose the intermediate query and key vectors; the standard transformers BertModel does not.
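If the goal is only to browse attention across layers and heads with a standard transformers model, bertviz's head_view can be used instead of the neuron view. The following is a minimal sketch, assuming 'bert-base-uncased' and the head_view(attention, tokens) entry point; treat it as illustrative rather than the post's original setup.

from transformers import BertTokenizer, BertModel
from bertviz import head_view

# Standard Hugging Face model; output_attentions=True returns per-layer attention weights
model = BertModel.from_pretrained('bert-base-uncased', output_attentions=True)
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

inputs = tokenizer.encode("The cat sat on the mat", return_tensors='pt')  # example sentence
outputs = model(inputs)
attention = outputs.attentions                        # tuple: one tensor per layer
tokens = tokenizer.convert_ids_to_tokens(inputs[0])

head_view(attention, tokens)                          # interactive view of every layer/head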

Reference

https://zhuanlan.zhihu.com/p/457043243
