gaoyuankidult / conceptor

Conceptor Networks

Home Page: http://minds.jacobs-university.de/conceptors

WARNING: This repository is not complete or functional yet

Conceptors

From: http://minds.jacobs-university.de/conceptors by H. Jaeger

In biological brains, "higher" cognitive control modules regulate "lower" brain layers in many ways. Examples of such top-down processing pathways include triggering motion commands ("reach for that cup"), setting attentional focus ("look closer... there!"), and predicting upcoming sensory impressions ("oops - that will hit me"). Little is known about the computational mechanisms that implement such top-down governance functions at the neural level. As a consequence, top-down regulation is rarely implemented in machine learning systems based on artificial neural networks. In particular, today's top-performing pattern recognition systems ("deep learning" architectures) do not exploit top-down regulation pathways.

The most recent research line in the MINDS group addresses such top-down governance mechanisms in modular neural learning architectures. We discovered a computational principle, called conceptors, which allows higher neural modules to control lower ones in a dynamic, online-adaptive fashion. The conceptor mechanism lends itself to numerous purposes:

  • A single neural network can learn a large number of different dynamical patterns (e.g., words or motions).
  • After a neural network has learnt some patterns, it can re-generate not only the learnt "prototypes" but also a large collection of morphed, combined, or abstracted patterns.
  • Patterns learnt by a neural network can be logically combined with the operations AND, OR, and NOT, subject to the rules of Boolean logic (see the sketch after this list). This reveals a fundamental link between the worlds of "subsymbolic" neural dynamics and "symbolic" cognitive operations.
  • This intimate connection between neural dynamics and logical-symbolic operations yields novel algorithms and architectures for lifelong learning, signal filtering, attending to particular signal sources (the "party talk" effect), and more.
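
To make the Boolean operations concrete, here is a minimal NumPy sketch following the definitions in the technical report listed under Resources: a conceptor is computed from the correlation matrix R of reservoir states as C = R(R + alpha^-2 I)^-1, where alpha is the "aperture", and NOT, AND, and OR act directly on conceptor matrices. Variable names here are illustrative, and the AND formula shown assumes invertible conceptors (the report gives a more general definition).

```python
import numpy as np

def conceptor(X, aperture):
    """Conceptor C = R (R + aperture^-2 I)^-1 for states X (one per column)."""
    N, L = X.shape
    R = X @ X.T / L                        # N x N state correlation matrix
    return R @ np.linalg.inv(R + aperture ** -2 * np.eye(N))

def NOT(C):
    return np.eye(len(C)) - C

def AND(C, B):
    # valid when C and B are invertible; see the report for the general case
    return np.linalg.inv(np.linalg.inv(C) + np.linalg.inv(B) - np.eye(len(C)))

def OR(C, B):
    return NOT(AND(NOT(C), NOT(B)))

# Toy usage: conceptors for two random state clouds, then their disjunction.
rng = np.random.default_rng(0)
C1 = conceptor(rng.normal(size=(50, 500)), aperture=10.0)
C2 = conceptor(rng.normal(size=(50, 500)), aperture=10.0)
C_or = OR(C1, C2)                          # "C1 or C2" as a single matrix
```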

In a nutshell, conceptors enable "full top-down logico-conceptual control" of the nonlinear, pattern-generating dynamics of recurrent neural networks. Thanks to its robustness, simplicity, computational efficiency, and versatility, we see the conceptor mechanism as a key to designing flexible, multifunctional neural learning architectures, which will become crucial for future human-computer interaction systems and robots.
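
The control step itself is simple: once patterns have been loaded into a recurrent network, a conceptor C is inserted into the state update, filtering the state at every step so that the network re-generates the corresponding pattern. The sketch below illustrates only this update, x(n+1) = C tanh(W x(n) + b); the weights W, bias b, readout W_out, and conceptor C are random or trivial placeholders here, standing in for quantities that are learned in the actual workflow described in the technical report.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100                                    # reservoir size (illustrative)

# Placeholders: in practice W is obtained by "loading" driven patterns into
# the reservoir, W_out by ridge regression, and C is a learned conceptor.
W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
b = rng.normal(scale=0.2, size=N)
W_out = rng.normal(size=(1, N))
C = np.eye(N)                              # identity = "no constraint" conceptor

x = rng.normal(size=N)                     # initial reservoir state
ys = []
for _ in range(200):
    x = C @ np.tanh(W @ x + b)             # conceptor filters every update
    ys.append((W_out @ x).item())          # readout produces the pattern
```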

Resources

H. Jaeger (2014): Conceptors: an easy introduction. (arXiv) Short, informal, richly illustrated.

H. Jaeger (2014): Controlling Recurrent Neural Networks by Conceptors. Jacobs University Technical Report No. 31, 195 pages. (pdf) (arXiv) (MATLAB code) Long, detailed, mathy. The first 20 pages provide a self-contained survey.

Source Code

Contains the original MATLAB source code, an automatic MATLAB-to-Python translation, and other derivatives.

Languages

  • Python: 53.1%
  • MATLAB: 46.6%
  • CLIPS: 0.2%
  • M: 0.1%