Attention Summary

https://easyai.tech/ai-definition/attention/#type

1. Show, Attend and Tell: Neural Image Caption Generation with Visual Attention

Proposed two attention modes: hard attention and soft attention.
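
Roughly, soft attention takes a softmax-weighted average of the image region features (fully differentiable), while hard attention samples a single region from that distribution. A minimal NumPy sketch; the additive scoring parameters `W_f`, `W_h`, `v` and the shapes are illustrative assumptions, not the paper's exact parameterization:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def soft_attention(features, hidden, W_f, W_h, v):
    # features: (L, D) image region vectors; hidden: (H,) decoder state
    # additive scoring MLP over each region, then a softmax over regions
    scores = np.tanh(features @ W_f + hidden @ W_h) @ v   # (L,)
    alpha = softmax(scores)                                # weights sum to 1
    context = alpha @ features                             # expected ("soft") context, (D,)
    return context, alpha

# hard attention instead samples a single region from alpha:
#   idx = np.random.choice(len(alpha), p=alpha); context = features[idx]
```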

2. Effective Approaches to Attention-based Neural Machine Translation

Proposed two improved attention variants: global attention and local attention.
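
Global attention attends over all source positions at every decoding step; local attention first predicts a source position p_t and only attends within a small window around it. A rough NumPy sketch using the dot score and the local-p Gaussian weighting (the window size `D` and other details are simplified assumptions):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(enc_states, dec_state):
    # score every source position (Luong "dot" score), attend over all of them
    alpha = softmax(enc_states @ dec_state)        # (S,)
    return alpha @ enc_states                      # context vector, (H,)

def local_attention(enc_states, dec_state, p_t, D=2):
    # attend only inside the window [p_t - D, p_t + D] around a predicted source
    # position p_t; weights are damped by a Gaussian centered at p_t (local-p)
    S = enc_states.shape[0]
    lo, hi = max(0, p_t - D), min(S, p_t + D + 1)
    window = enc_states[lo:hi]
    alpha = softmax(window @ dec_state)
    alpha = alpha * np.exp(-((np.arange(lo, hi) - p_t) ** 2) / (2 * (D / 2) ** 2))
    return alpha @ window
```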

3. Attention Is All You Need

Proposed self-attention.
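
Self-attention lets every position of a sequence attend to every other position of the same sequence; the Transformer uses the scaled dot-product form. A single-head sketch (multi-head and masking omitted):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    # queries, keys and values all come from the same sequence X: (T, d_model)
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # scaled dot-product, (T, T)
    return softmax(scores, axis=-1) @ V       # each position mixes all positions
```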

4. Hierarchical Attention Networks for Document Classification

Proposed hierarchical attention for document classification.
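
The hierarchy mirrors document structure: word-level attention pools each sentence's word encodings into a sentence vector, then sentence-level attention pools the sentence vectors into a document vector used for classification. A compact sketch (the paper's GRU encoders are omitted; `attn_pool` and the parameter names are illustrative):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attn_pool(H, W, b, u):
    # score each row of H against a learned context vector u, then pool
    alpha = softmax(np.tanh(H @ W + b) @ u)
    return alpha @ H

def han_document_vector(doc, word_params, sent_params):
    # doc: list of sentences, each an (n_words, d) array of word encodings
    sent_vecs = np.stack([attn_pool(s, *word_params) for s in doc])  # word-level attention
    return attn_pool(sent_vecs, *sent_params)                        # sentence-level attention
```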

5. Attention-over-Attention Neural Networks for Reading Comprehension

Proposed the attention-over-attention mechanism.
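
The idea is to put one attention on top of another: a document-query matching matrix is softmax-normalized column-wise (document-side attention per query word) and row-wise (query-side attention per document word), and the averaged row-wise attention weights the column-wise one. A sketch under assumed shapes:

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_over_attention(doc, query):
    # doc: (D, h) document token encodings; query: (Q, h) query token encodings
    M = doc @ query.T              # pairwise matching scores, (D, Q)
    alpha = softmax(M, axis=0)     # per query word: attention over document words
    beta = softmax(M, axis=1)      # per document word: attention over query words
    beta_avg = beta.mean(axis=0)   # averaged query-level importance, (Q,)
    return alpha @ beta_avg        # "attended attention" over document words, (D,)
```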

6. Convolutional Sequence to Sequence Learning

The paper also adopts multi-step attention.
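
"Multi-step" here means every decoder layer computes its own attention over the encoder outputs, rather than a single attention at the top. A simplified per-position sketch (the paper additionally adds input embeddings to the attended values and combines each context with the decoder state; those details are omitted):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_step_attention(dec_states_per_layer, enc_keys, enc_values):
    # one attention pass per decoder layer over the same encoder outputs
    contexts = []
    for h in dec_states_per_layer:        # one decoder state per conv layer, each (H,)
        alpha = softmax(enc_keys @ h)     # (S,)
        contexts.append(alpha @ enc_values)
    return contexts                       # one context vector per decoder layer
```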

Author: Lavine Hu · Posted on 2021-10-12 · Updated on 2024-07-14
