nebuly-ai / optimate

A collection of libraries to optimise AI model performance

Home Page: https://www.nebuly.com/

Repository from Github: https://github.com/nebuly-ai/optimate

Forward Forward Algorithm Questions

and-rewsmith opened this issue

Hi @diegofiori, I am conducting some research for the Allen Institute on the recurrent Forward Forward model based on Hinton’s approach. I am attempting to extend his work with the following:

  1. Inverting the objective function to be more biologically plausible and to show a closer similarity to predictive coding (see the sketch after this list).
  2. Hiding the label for the first few timesteps, playing into the concept of predictive coding (i.e. high activations initially, followed by low activations for successfully predicted samples).
  3. Supporting sparse connectivity between layers, playing into the concept of modularity and biological plausibility.
  4. It was unclear whether Hinton actually implemented the recurrent connections, since the network diagram he provided was copied from his GLOM paper, but I did implement them.
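
For concreteness, the inverted objective (item 1) and the label hiding (item 2) look roughly like the sketch below. This is illustrative PyTorch, not the actual code from my repo; the threshold, `hide_steps`, and helper names are placeholders.

```python
import torch
import torch.nn.functional as F


def goodness(h: torch.Tensor) -> torch.Tensor:
    # Standard Forward-Forward "goodness": mean squared activation per sample.
    return h.pow(2).mean(dim=1)


def inverted_ff_loss(h_pos: torch.Tensor, h_neg: torch.Tensor,
                     threshold: float = 2.0) -> torch.Tensor:
    # Inverted objective (item 1): push positive (correctly labelled) samples
    # to LOW goodness and negative samples to HIGH goodness, so a successfully
    # predicted input settles into low activity, as in predictive coding.
    g_pos = goodness(h_pos)
    g_neg = goodness(h_neg)
    return F.softplus(torch.cat([g_pos - threshold, threshold - g_neg])).mean()


def label_input(y: torch.Tensor, t: int, hide_steps: int = 3,
                num_classes: int = 10) -> torch.Tensor:
    # Label hiding (item 2): inject an all-zero label vector for the first few
    # timesteps, then reveal the one-hot label.
    if t < hide_steps:
        return torch.zeros(y.shape[0], num_classes)
    return F.one_hot(y, num_classes=num_classes).float()
```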

My architecture reaches about 94% test accuracy on MNIST, while Hinton reports 99%+. I am curious: did you achieve SOTA performance with the recurrent model? If so, I have some follow-up questions.

My project is here:
https://github.com/and-rewsmith/RecurrentForwardForward

@valeriosofi Do you happen to know the answer to this?

> My architecture reaches about 94% test accuracy on MNIST, while Hinton reports 99%+. I am curious: did you achieve SOTA performance with the recurrent model? If so, I have some follow-up questions.

I took a closer look at the code, and it seems your architecture doesn't match figure 3 from the paper.
[Figure 3 from Hinton's Forward-Forward paper]

Instead, your implementation seems to track the last few activities and feed them into every layer's forward pass, while the network is otherwise the same as the forward-only Forward-Forward network.
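
For reference, my reading of the figure-3 recurrence is roughly the sketch below. This is illustrative PyTorch, not code from either repo; `layers` is assumed to be a list of `nn.Linear` modules sized for the concatenated inputs, and `y_embed` is the label representation fed in at the top.

```python
import torch


def layer_norm_ff(h: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    # Forward-Forward normalisation: pass on only the direction of the
    # activity vector, so the goodness itself is not forwarded.
    return h / (h.norm(dim=1, keepdim=True) + eps)


def recurrent_step(layers, h_prev, x, y_embed):
    # One timestep of the figure-3-style recurrence: layer i at time t is
    # driven by layers i-1 and i+1 at time t-1 (image at the bottom, label
    # representation at the top), not by a history of its own activity.
    h_new = []
    for i, layer in enumerate(layers):
        below = x if i == 0 else layer_norm_ff(h_prev[i - 1])
        above = y_embed if i == len(layers) - 1 else layer_norm_ff(h_prev[i + 1])
        h_new.append(torch.relu(layer(torch.cat([below, above], dim=1))))
    return h_new
```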

Still, I'm curious how the accuracy of this approach compares to your other variations.