# token embedding

Author: Lavine Hu. Posted on 2021-11-22, updated on 2022-01-23.

Reference: https://www.cnblogs.com/d0main/p/10447853.html

The token-embedding pipeline, as a flowchart.js definition:

```
op1=>operation: Input text
op2=>operation: tokenize
op3=>operation: Word-vector matrix (pretrained or randomly initialized)
op4=>operation: token embedding

op1->op2->op3->op4
```

# Related Posts

1. Sentence-vector generation with the BERT family
2. SimCSE: Simple Contrastive Learning of Sentence Embeddings
3. ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer
4. word2vec
5. What is the difference between using pretrained word vectors and randomly initialized word vectors in NLP?
6. Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks
7. Text representation
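The flowchart above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the post: the vocabulary, dimensions, and whitespace tokenizer are all placeholder assumptions, and the embedding matrix is randomly initialized (the diagram notes it could equally be loaded from pretrained vectors).

```python
import numpy as np

# Hypothetical toy vocabulary and embedding size, for illustration only.
vocab = {"hello": 0, "world": 1, "token": 2, "embedding": 3}
embed_dim = 8

# Word-vector matrix: randomly initialized here; in practice it may instead
# be loaded from pretrained vectors (word2vec, GloVe, BERT, ...).
rng = np.random.default_rng(0)
embedding_matrix = rng.normal(size=(len(vocab), embed_dim))

def tokenize(text):
    # Trivial whitespace tokenizer; real pipelines use BPE/WordPiece etc.
    return text.lower().split()

def token_embeddings(text):
    # Input text -> tokenize -> look up each token's row in the matrix.
    ids = [vocab[tok] for tok in tokenize(text) if tok in vocab]
    return embedding_matrix[ids]  # shape: (num_tokens, embed_dim)

vecs = token_embeddings("Hello world")
print(vecs.shape)  # (2, 8)
```

Each row of the result is one token's embedding; whether the matrix is pretrained or random only changes its initial values, not this lookup step.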