ylqfp / awesome-deep-nlp

Awesome deep-learning-based NLP papers and surveys, plus some awesome machine learning/vision material

awesome-deep-nlp

Attention

  1. ICLR15, Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio. Neural Machine Translation by Jointly Learning to Align and Translate.

  2. ACL15, Fandong Meng, Zhengdong Lu, Mingxuan Wang, Hang Li, Wenbin Jiang, Qun Liu. Encoding Source Language with Convolutional Neural Network for Machine Translation.

  3. ACL15, Jiwei Li, Minh-Thang Luong, Dan Jurafsky. A Hierarchical Neural Autoencoder for Paragraphs and Documents.

  4. EMNLP15, Alexander M. Rush, Sumit Chopra, Jason Weston. A Neural Attention Model for Sentence Summarization.

  5. EMNLP15 short, Wang Ling, Chu-Cheng Lin, Yulia Tsvetkov, et al. Not All Contexts Are Created Equal: Better Word Representations with Variable Attention.

  6. NAACL15, Two/Too Simple Adaptations of Word2Vec for Syntax Problems.

  7. NIPS14, Volodymyr Mnih, Nicolas Heess, Alex Graves, Koray Kavukcuoglu. Recurrent Models of Visual Attention.

  8. End-to-End Attention-based Large Vocabulary Speech Recognition.

  9. NIPS14 ws, End-to-end Continuous Speech Recognition using Attention-based Recurrent NN: First Results. http://arxiv.org/abs/1412.1602

  10. NIPS15, Attention-Based Models for Speech Recognition.

  11. EMNLP15, Effective Approaches to Attention-based Neural Machine Translation.

  12. ICML15, Kelvin Xu, Jimmy Ba, Ryan Kiros, et al. Show, Attend and Tell: Neural Image Caption Generation with Visual Attention.

  13. Karol Gregor, Ivo Danihelka, Alex Graves, et al. DRAW: A Recurrent Neural Network For Image Generation.

  14. NIPS15, Karl Moritz Hermann, Tomáš Kočiský, Edward Grefenstette, et al. Teaching Machines to Read and Comprehend.

  15. NIPS15, Lei Jimmy Ba, Roger Grosse, Ruslan Salakhutdinov, Brendan Frey. Learning Wake-Sleep Recurrent Attention Models.
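The papers above share one core mechanism: score each encoder state against the current query/decoder state, normalize the scores with a softmax, and return the weighted sum as the context vector. A minimal sketch in plain Python, using dot-product scoring for brevity (Bahdanau et al. use an additive MLP scorer; the vectors in any usage are toy values, not trained states):

```python
import math

def softmax(scores):
    """Normalize raw attention scores into weights that sum to 1."""
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attend(query, memory):
    """Dot-product attention: score each memory vector against the
    query, softmax the scores, and return the weighted sum of the
    memory vectors (the context vector)."""
    weights = softmax([dot(query, h) for h in memory])
    dim = len(memory[0])
    return [sum(w * h[i] for w, h in zip(weights, memory))
            for i in range(dim)]
```

With a query aligned to one memory vector, the context vector collapses onto that vector; with a zero query, it averages the memory.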

IR

  1. DSSM

    Learning Deep Structured Semantic Models for Web Search Using Clickthrough Data, CIKM2013

  2. CDSSM (a.k.a. CLSM)
    A Latent Semantic Model with Convolutional-Pooling Structure for Information Retrieval, MSR, CIKM2014

  3. ARC-I
    Convolutional Neural Network Architectures for Matching Natural Language Sentences, NIPS2014

  4. ARC-II
    Convolutional Neural Network Architectures for Matching Natural Language Sentences, NIPS2014

  5. RAE
    Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection, NIPS2011

  6. Deep Match
    A Deep Architecture for Matching Short Texts, NIPS2013

  7. CNTN
    Convolutional Neural Tensor Network Architecture for Community-based Question Answering, IJCAI2015

  8. CNNPI
    Convolutional Neural Network for Paraphrase Identification, NAACL2015

  9. MultiGranCNN
    MultiGranCNN: An Architecture for General Matching of Text Chunks on Multiple Levels of Granularity, ACL2015

  10. CLSTM
    Contextual LSTM (CLSTM) Models for Large Scale NLP Tasks, Google, Arxiv201602

  11. Recurrent-DSSM
    Palangi, H., Deng, L., Shen, Y., Gao, J., He, X., Chen, J., Song, X., and Ward, R. Learning Sequential Semantic Representations of Natural Language Using Recurrent Neural Networks, ICASSP2015

  12. LSTM-DSSM
    Semantic Modelling with Long-Short-Term Memory for Information Retrieval, ICLR2016 workshop

  13. DCNN: Dynamic convolutional neural network
    A Convolutional Neural Network for Modelling Sentences, ACL2014; Convolutional Neural Network Architectures for Matching Natural Language Sentences, NIPS2014, Noah's Ark Lab

  14. BRAE: bilingually-constrained recursive auto-encoders
    Bilingually-constrained Phrase Embeddings for Machine Translation, ACL2014, long paper

  15. LSTM-RNN
    Deep Sentence Embedding Using LSTM Networks: Analysis and Application to Information Retrieval, 201602

  16. SkipThought
    Skip-Thought Vectors, NIPS2015

  17. Bidirectional LSTM-RNN
    Bi-directional LSTM Recurrent Neural Network for Chinese Word Segmentation, 201602, Arxiv

  18. MV-DNN
    A Multi-View Deep Learning Approach for Cross Domain User Modeling in Recommendation Systems, WWW2015

Knowledge Graph

Deep Learning Methods

  1. UM
    Bordes et al. Joint Learning of Words and Meaning Representations for Open-Text Semantic Parsing
  2. LFM
    A Latent Factor Model for Highly Multi-relational Data
  3. SE
    Learning Structured Embeddings of Knowledge Bases
  4. SME
    A Semantic Matching Energy Function for Learning with Multi-Relational Data
  5. RESCAL
    A Three-Way Model for Collective Learning on Multi-Relational Data
  6. NTN
    Reasoning With Neural Tensor Networks for Knowledge Base Completion
  7. TransE
    Translating Embeddings for Modeling Multi-relational Data
  8. TransH
    Knowledge Graph Embedding by Translating on Hyperplanes
  9. TransR
    Learning Entity and Relation Embeddings for Knowledge Graph Completion
  10. TransM
  11. TransG
  12. KG2E
  13. PTransE
  14. TransA@CAS
  15. TransA@THU
  16. STransE
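The Trans* family above scores a triple (head, relation, tail) by how well the relation embedding translates the head embedding onto the tail embedding; TransE uses the energy ||h + r - t||, with lower energy meaning a more plausible triple. A toy sketch with the L2 norm (the example vectors in the usage note are made-up values, not trained embeddings):

```python
import math

def transe_score(head, relation, tail):
    """TransE energy ||h + r - t||_2 for a triple of embedding
    vectors; lower means the triple is more plausible."""
    return math.sqrt(sum((h + r - t) ** 2
                         for h, r, t in zip(head, relation, tail)))
```

For example, with head [1, 2], relation [3, 4], and tail [4, 6] the translation is exact and the energy is 0; TransH and TransR refine this by projecting entities onto relation-specific hyperplanes or spaces before applying the same translation idea.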

Multimodal/Transfer/Multitask/Ensemble/Semisupervised

Deep Generative Models
