In this video, I cover the top ten most cited and influential papers in the history of natural language processing, ranked by the number of Google Scholar citations. Some of these papers are new, while others are quite old. Not all of them use neural networks, but each one has made a significant impact in the field.
0:43 - Transformer (2017)
1:27 - LSTM (1997)
2:31 - BERT (2019)
3:17 - LDA (2003)
4:11 - Word2Vec (2013)
5:04 - GloVe (2014)
5:54 - Encoder-decoder (2014)
6:46 - Attention (2015)
8:06 - BLEU (2002)
8:59 - Seq2seq (2014)
9:32 - WordNet (1995)
Papers referenced:
1. "Attention Is All You Need" by Ashish Vaswani et al. (2017)
2. "Long Short-Term Memory" by Sepp Hochreiter and Jürgen Schmidhuber (1997)
3. "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" by Jacob Devlin et al. (2019)
4. "Latent Dirichlet Allocation" by David M. Blei, Andrew Y. Ng, and Michael I. Jordan (2003)
5. "Efficient Estimation of Word Representations in Vector Space" by Tomas Mikolov et al. (2013)
6. "GloVe: Global Vectors for Word Representation" by Jeffrey Pennington, Richard Socher, and Christopher D. Manning (2014)
7. "Neural Machine Translation by Jointly Learning to Align and Translate" by Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio (2015)
8. "Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation" by Kyunghyun Cho et al. (2014)
9. "BLEU: a Method for Automatic Evaluation of Machine Translation" by Kishore Papineni et al. (2002)
10. "Sequence to Sequence Learning with Neural Networks" by Ilya Sutskever, Oriol Vinyals, and Quoc V. Le (2014)
11. "WordNet: A Lexical Database for English" by George A. Miller (1995)