

Intellectics Group: Technical Report 96-08

Multi-Flip Networks: Extending Symmetric Networks to Real Parallelism

Antje Strohmaier

In general, neural networks are regarded as models for massively parallel computation. Very often, however, this parallelism is rather limited, especially in symmetric networks. For instance, Hopfield networks do not really compute in parallel, since their updating algorithm requires sequential execution. Nevertheless, Hopfield networks can be used as auto-associative memories, have been shown to have an expressive power equivalent to propositional logic, and can be used to solve several combinatorial problems. Extensions like the Boltzmann machine with continuous activation functions can additionally be used to solve optimization problems. All of these approaches, however, suffer from one disadvantage, namely the impossibility of performing simultaneous computations in more than one unit, i.e. real parallelism. We describe a recurrent network corresponding to a symmetric network and introduce a method for updating multiple units in parallel. We show how this may be extended to Boltzmann machines with continuous activation functions, and point out possible applications of this architecture.
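The contrast drawn in the abstract can be made concrete with a small sketch. The code below (an illustration under standard Hopfield-network conventions, not the multi-flip method of the report itself) stores a pattern with the Hebb rule and then recalls it two ways: the classic sequential update, where only one unit changes at a time, and a naive synchronous update of all units at once, which is the kind of simultaneous computation whose convergence plain Hopfield dynamics does not guarantee.

```python
import numpy as np

def hebb_weights(patterns):
    """Symmetric weight matrix from +/-1 patterns (Hebb rule), zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def sequential_update(W, state, steps=100):
    """Classic asynchronous updating: exactly one unit is updated per step."""
    s = state.copy()
    n = len(s)
    for t in range(steps):
        i = t % n                          # visit units in a fixed order
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

def parallel_update(W, state, steps=10):
    """Naive synchronous updating: all units recompute simultaneously.
    For general symmetric W this may oscillate instead of converging,
    which is the limitation motivating multi-flip networks."""
    s = state.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1.0, -1.0)
    return s

pattern = np.array([[1.0, -1.0, 1.0, -1.0]])
W = hebb_weights(pattern)
noisy = np.array([1.0, 1.0, 1.0, -1.0])    # stored pattern with one bit flipped
recalled = sequential_update(W, noisy)      # recovers the stored pattern
```

On this tiny example both schemes happen to recover the stored pattern; the point of the report is that parallel updating needs extra machinery before it is safe in general.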

Full Paper: Compressed postscript Compressed DVI

BibTeX entry