RNN Summary

1. Cells
   1.1 Vanilla RNN cell
   1.2 LSTM
   1.3 GRU
2. Structure
   2.1 Inputs and outputs
   2.2 Bidirectional or not
   2.3 Stacked or not

References:
- https://blog.csdn.net/gaohanjie123/article/details/88699664
- https://www.cnblogs.com/Luv-GEM/p/10788849.html

Author: Lavine Hu. Posted on 2021-09-15, updated on 2021-11-26.
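The three cell types in section 1 differ only in how they compute the next hidden state from the current input and the previous state. Below is a minimal NumPy sketch of the standard update equations for each cell; all weight shapes, function names, and the stacked-gate layout are illustrative assumptions, not code from the referenced posts.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def rnn_cell(x, h, Wx, Wh, b):
    # Vanilla RNN: h_t = tanh(Wx x_t + Wh h_{t-1} + b)
    return np.tanh(Wx @ x + Wh @ h + b)

def lstm_cell(x, h, c, W, U, b):
    # W (4H, D), U (4H, H), b (4H,) hold the four gates stacked:
    # input, forget, output, candidate (an assumed layout).
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    o = sigmoid(z[2 * H:3 * H])  # output gate
    g = np.tanh(z[3 * H:4 * H])  # candidate cell state
    c_new = f * c + i * g        # cell state: forget old, write new
    h_new = o * np.tanh(c_new)   # hidden state gated by output gate
    return h_new, c_new

def gru_cell(x, h, W, U, b, Wc, Uc, bc):
    # W (2H, D), U (2H, H) hold the update and reset gates stacked.
    H = h.shape[0]
    zr = sigmoid(W @ x + U @ h + b)
    z, r = zr[:H], zr[H:]
    # Candidate state sees a reset-gated copy of the previous state.
    h_tilde = np.tanh(Wc @ x + Uc @ (r * h) + bc)
    # Convention used here: z interpolates toward the candidate.
    return (1.0 - z) * h + z * h_tilde
```

The GRU merges the LSTM's cell and hidden state and uses two gates instead of three, which is why it has fewer parameters per unit.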
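Section 2's structural choices (how inputs map to outputs, bidirectionality, stacking) are independent of the cell type. The sketch below, a set of assumed helper functions built on a vanilla RNN cell, shows the three ideas: unrolling over a sequence, concatenating a forward and a backward pass, and feeding one layer's hidden-state sequence to the next layer as input.

```python
import numpy as np

def run_rnn(xs, h0, Wx, Wh, b):
    # Unroll a vanilla RNN over a sequence xs of shape (T, D);
    # returns the hidden state at every step, shape (T, H).
    h, hs = h0, []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h + b)
        hs.append(h)
    return np.stack(hs)

def bidirectional(xs, h0, fwd_params, bwd_params):
    # One pass left-to-right, one right-to-left, states concatenated
    # per step, so the output is (T, 2H).
    hf = run_rnn(xs, h0, *fwd_params)
    hb = run_rnn(xs[::-1], h0, *bwd_params)[::-1]
    return np.concatenate([hf, hb], axis=-1)

def stacked(xs, layers, h0s):
    # Stacking: layer k's hidden-state sequence is layer k+1's input.
    out = xs
    for (Wx, Wh, b), h0 in zip(layers, h0s):
        out = run_rnn(out, h0, Wx, Wh, b)
    return out
```

Input/output wiring (one-to-many, many-to-one, many-to-many) then amounts to which of these per-step hidden states are read out: the last one only, or all of them.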