ravi03071991 / Attentive_LSTM_question-answering

Bi-directional LSTM with attention for question answering

Attentive LSTM for question answering

This is an implementation of the paper Improved Representation Learning for Question Answer Matching, built with TensorFlow 1.3.0.

Run Model

python train.py

Model Architecture

The model uses bidirectional LSTMs to construct the question vector and applies attention, conditioned on the question embedding, to construct the answer vector. The loss function is based on the cosine similarity between the question and answer vectors. For more details, see the paper mentioned above.

(Figure: model architecture)
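
The following is a minimal TensorFlow 1.x sketch of this idea, not a copy of the repository's model.py: the placeholder names, the max pooling over the question states, and the exact attention parameterization (score_t = w^T tanh(W_a h_t + W_q q)) are illustrative assumptions.

```python
import tensorflow as tf

# Hypothetical hyperparameters; the real values live in config.yml.
embed_dim = 300
hidden_size = 100

def bilstm(inputs, scope):
    """Bidirectional LSTM; returns per-timestep states of shape [batch, time, 2*hidden]."""
    with tf.variable_scope(scope):
        fw = tf.contrib.rnn.LSTMCell(hidden_size)
        bw = tf.contrib.rnn.LSTMCell(hidden_size)
        (out_fw, out_bw), _ = tf.nn.bidirectional_dynamic_rnn(
            fw, bw, inputs, dtype=tf.float32)
    return tf.concat([out_fw, out_bw], axis=2)

# Pre-embedded question and answer sequences (placeholder names are illustrative).
question = tf.placeholder(tf.float32, [None, None, embed_dim], name="question")
answer = tf.placeholder(tf.float32, [None, None, embed_dim], name="answer")

# Question vector: bidirectional LSTM followed by max pooling over time (assumed).
q_states = bilstm(question, "question_encoder")
q_vec = tf.reduce_max(q_states, axis=1)                       # [batch, 2*hidden]

# Answer vector: attention over answer hidden states, conditioned on the question vector.
a_states = bilstm(answer, "answer_encoder")
with tf.variable_scope("attention"):
    d = 2 * hidden_size
    W_a = tf.get_variable("W_a", [d, d])
    W_q = tf.get_variable("W_q", [d, d])
    w = tf.get_variable("w", [d, 1])
    time_steps = tf.shape(a_states)[1]
    # score_t = w^T tanh(W_a h_t + W_q q_vec)
    a_proj = tf.reshape(tf.matmul(tf.reshape(a_states, [-1, d]), W_a),
                        tf.stack([-1, time_steps, d]))
    m = tf.tanh(a_proj + tf.expand_dims(tf.matmul(q_vec, W_q), 1))
    scores = tf.reshape(tf.matmul(tf.reshape(m, [-1, d]), w),
                        tf.stack([-1, time_steps, 1]))
    alpha = tf.nn.softmax(scores, dim=1)                       # attention weights over time
    a_vec = tf.reduce_sum(alpha * a_states, axis=1)            # [batch, 2*hidden]

# Cosine similarity between the question and answer vectors.
cosine = tf.reduce_sum(tf.nn.l2_normalize(q_vec, dim=1) *
                       tf.nn.l2_normalize(a_vec, dim=1), axis=1)
```

Weighting the answer hidden states with question-conditioned attention lets the answer vector focus on the tokens most relevant to the question before the cosine similarity is computed.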

Files

  1. WikiQA-test.tsv - Test split of the WikiQA dataset
  2. WikiQA-train.tsv - Training split of the WikiQA dataset
  3. config.yml - Model configuration file for hyperparameter tuning (see the loading sketch after this list)
  4. input_wikiqa.py - Text processing for the WikiQA dataset
  5. train.py - The script to run for training the model
  6. model.py - The file containing the model
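
As a rough illustration of how these files fit together, the sketch below loads hyperparameters from config.yml and reads WikiQA question/answer/label triples. The config keys shown (hidden_size, learning_rate) and the assumption that the TSVs carry the standard WikiQA header (Question, Sentence, Label among other columns) are hypothetical and may not match what input_wikiqa.py and train.py actually do.

```python
import csv
import yaml  # PyYAML

# Load hyperparameters from config.yml. The key names below are hypothetical
# examples; the actual file may use different names.
with open("config.yml") as f:
    config = yaml.safe_load(f)
hidden_size = config.get("hidden_size", 100)
learning_rate = config.get("learning_rate", 0.001)

def read_wikiqa(path):
    """Read (question, candidate answer sentence, label) triples from a WikiQA TSV.

    Assumes the standard WikiQA column names; input_wikiqa.py may process
    the files differently.
    """
    triples = []
    with open(path) as f:
        for row in csv.DictReader(f, delimiter="\t"):
            triples.append((row["Question"], row["Sentence"], int(row["Label"])))
    return triples

train_data = read_wikiqa("WikiQA-train.tsv")
print(len(train_data), "question/answer pairs loaded")
```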

Find more info at this blog post.

Languages

Python 100.0%