Learn and implement attention models on RNNs.
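As a reference point for the mechanism itself, here is a minimal NumPy sketch of Bahdanau-style additive attention: the decoder state (query) is scored against each encoder state (key), the scores are normalized with a softmax, and the context vector is the resulting weighted sum. All names and shapes here are illustrative, not taken from this repo's code.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(query, keys, values, Wq, Wk, v):
    # Bahdanau-style scores: e_i = v^T tanh(Wq q + Wk k_i)
    scores = np.array([v @ np.tanh(Wq @ query + Wk @ k) for k in keys])
    weights = softmax(scores)   # attention distribution over input timesteps
    context = weights @ values  # weighted sum of encoder states
    return context, weights

rng = np.random.default_rng(0)
d = 4                                # toy hidden size
keys = rng.normal(size=(6, d))       # encoder hidden states, one per input char
values = keys                        # values often coincide with the keys
query = rng.normal(size=d)           # current decoder state
Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))
v = rng.normal(size=d)

context, weights = additive_attention(query, keys, values, Wq, Wk, v)
```

The attention weights always form a probability distribution over the input positions, which is what makes them easy to visualize for tasks like the addition problem below.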
The character-level addition task is taken from https://github.com/fchollet/keras/blob/master/examples/addition_rnn.py.
Inputs are read from data_numbers.csv or can be generated with generate_data.py.
Each line is formatted like "123+89 |212": given the input sequence "123+89", the decoder must produce the characters "212".
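A small sketch of how such lines could be encoded and decoded, assuming space-padded fixed-width fields (the exact padding convention in generate_data.py may differ; `encode_example` and `decode_example` are hypothetical helpers, not functions from this repo):

```python
def encode_example(a, b, input_len=7, output_len=4):
    # Hypothetical formatter: pad the query and answer to fixed widths
    # with trailing spaces, matching the "123+89 |212" line shape.
    query = f"{a}+{b}".ljust(input_len)
    answer = str(a + b).ljust(output_len)
    return f"{query}|{answer}"

def decode_example(line):
    # Split a stored line back into (query, answer) and drop the padding.
    query, answer = line.split("|")
    return query.strip(), answer.strip()

line = encode_example(123, 89)   # "123+89 |212 "
q, ans = decode_example(line)    # ("123+89", "212")
```

Fixed-width padding keeps every example the same length, so batches can be one-hot encoded into tensors of a single shape.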
Other tasks to consider:
- CNN QA dataset https://github.com/deepmind/rc-data
- Facebook bAbI https://github.com/facebook/bAbI-tasks
- Machine translation, see e.g. https://github.com/kyunghyuncho/dl4mt-material/tree/master/data