deeplearningparis / dl-attention

Attention models

dl-attention

Update me!

Attention models at DL Paris workshop session 3

Learn and implement attention models on RNNs
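To give an idea of the mechanism being implemented, here is a minimal NumPy sketch of additive (Bahdanau-style) attention over RNN encoder states. The function names, weight matrices and shapes are illustrative assumptions, not the workshop's reference code.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(encoder_states, decoder_state, W_enc, W_dec, v):
    """Return a context vector and attention weights.

    encoder_states: (T, d_enc) one hidden state per input character
    decoder_state:  (d_dec,)   current decoder hidden state
    W_enc: (d_att, d_enc), W_dec: (d_att, d_dec), v: (d_att,)
    """
    # Score each encoder state against the current decoder state.
    scores = np.tanh(encoder_states @ W_enc.T + decoder_state @ W_dec.T) @ v  # (T,)
    weights = softmax(scores)            # attention distribution over the input positions
    context = weights @ encoder_states   # (d_enc,) weighted sum of encoder states
    return context, weights

# Toy usage with random parameters (hypothetical sizes).
rng = np.random.default_rng(0)
T, d_enc, d_dec, d_att = 7, 16, 16, 8   # e.g. T characters in "123+89 "
enc = rng.normal(size=(T, d_enc))
dec = rng.normal(size=(d_dec,))
ctx, w = additive_attention(enc, dec,
                            rng.normal(size=(d_att, d_enc)),
                            rng.normal(size=(d_att, d_dec)),
                            rng.normal(size=(d_att,)))
print(w.round(3), ctx.shape)
```

At each decoding step the decoder would consume this context vector alongside its own state, so the weights show which input characters the model attends to when emitting each output digit.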

Tasks and Datasets

Character-level addition task, adapted from https://github.com/fchollet/keras/blob/master/examples/addition_rnn.py

Inputs are provided in data_numbers.csv or can be generated with generate_data.py.

Each line is formatted like "123+89 |212"; given the input sequence "123+89", the decoder must produce the characters "212".
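A short sketch of how lines in this "&lt;expr&gt; |&lt;answer&gt;" format could be generated and parsed is shown below. data_numbers.csv and generate_data.py may differ in details (padding, digit limits, separator), so the helper names here are assumptions for illustration only.

```python
import random

def generate_line(max_digits=3):
    """Produce one example such as '123+89 |212'."""
    a = random.randint(0, 10**max_digits - 1)
    b = random.randint(0, 10**max_digits - 1)
    return f"{a}+{b} |{a + b}"

def parse_line(line):
    """Split a line into the encoder input and the decoder target."""
    source, target = line.split("|")
    return source.strip(), target.strip()

if __name__ == "__main__":
    line = generate_line()
    src, tgt = parse_line(line)
    print(line, "->", repr(src), repr(tgt))
```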

Other tasks to consider
