A TensorFlow and Keras based implementation of the recurrent cell and RNN layer from *A bio-inspired bistable recurrent cell allows for long-lasting memory* by Nicolas Vecoven, Damien Ernst and Guillaume Drion.
To use the library, clone the repo and place the `bistablernn` folder inside your project directory.
The Neuromodulated Bistable Recurrent Cell can be imported as

```python
from bistablernn import NBRCell
```

and used to create an RNN layer per the Keras API:

```python
tf.keras.layers.RNN(NBRCell(num_units))
```
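For instance, a minimal sketch wiring the cell into a model (`num_units`, the variable sequence length, and the single input feature are placeholder values, not prescribed by the library):

```python
import tensorflow as tf
from bistablernn import NBRCell

# Toy model: the nBRC cell is wrapped in the generic Keras RNN layer.
num_units = 64
model = tf.keras.Sequential([
    tf.keras.layers.RNN(NBRCell(num_units), input_shape=(None, 1)),
    tf.keras.layers.Dense(1),
])
model.summary()
```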
Alternatively, a Neuromodulated Bistable RNN layer can be imported and used directly:

```python
from bistablernn import NBR
```
For example:
```python
model = tf.keras.Sequential([
    NBR(units=num_hidden, input_shape=input_shape),
    tf.keras.layers.Dense(num_classes)
])
```
An example of training and evaluation on MNIST data is in the `notebooks` folder.
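Along those lines, here is a minimal training sketch, assuming the `NBR` layer supports the usual Keras training workflow since it inherits from `GRU`; the notebook contains the actual, tested example. MNIST rows are fed as a sequence of 28 timesteps of 28 features.

```python
import tensorflow as tf
import tensorflow_datasets as tfds
from bistablernn import NBR

def preprocess(image, label):
    # Flatten the trailing channel axis: (28, 28, 1) -> (28, 28).
    image = tf.cast(tf.squeeze(image, -1), tf.float32) / 255.0
    return image, label

train = (tfds.load("mnist", split="train", as_supervised=True)
         .map(preprocess)
         .batch(128))

model = tf.keras.Sequential([
    NBR(units=128, input_shape=(28, 28)),
    tf.keras.layers.Dense(10),
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(train, epochs=1)
```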
## Dependencies
- TensorFlow 2 for the layers.
- TensorFlow Datasets for MNIST.
- Jupyter for the notebooks.
## Notes

- The Bistable Recurrent Cell is a modification of the GRU, so this code inherits the Keras `GRUCell` and `GRU` classes and overrides the relevant methods to reflect the changed equations (see the equation sketch after this list).
- The implementation is based on my understanding of the equations and modifications. The authors' implementation can be found here.
- The options for dropout and recurrent dropout remain (assuming they work the same as on GRU layers) but are untested.
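For reference, a sketch of the nBRC update equations as I read them from the paper ($\sigma$ is the logistic sigmoid, $\odot$ the elementwise product):

```math
\begin{aligned}
a_t &= 1 + \tanh(U_a x_t + W_a h_{t-1}) \\
c_t &= \sigma(U_c x_t + W_c h_{t-1}) \\
h_t &= c_t \odot h_{t-1} + (1 - c_t) \odot \tanh(U x_t + a_t \odot h_{t-1})
\end{aligned}
```

Relative to the GRU, the update gate becomes $c_t$ and the reset gate becomes the modulation $a_t$, which lies in $(0, 2)$; values of $a_t$ above 1 are what allow a unit to become bistable. In the plain BRC the recurrent connections for $a_t$ and $c_t$ are elementwise (each unit sees only its own past state); the neuromodulated variant uses full matrices.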