Pelincec 2005 - 06: Generalized Brain-State-in-a-Box (GBSB) Neural Network
The Generalized Brain-State-in-a-Box (gBSB) Neural Network: Model, Analysis, and Applications
By Cheolhwan Oh, Stefan Hui, Stanislaw H. Żak
Abstract: The generalized Brain-State-in-a-Box (gBSB)
neural network is a generalized version of the Brain-
State-in-a-Box (BSB) neural network. The BSB net is a
simple nonlinear autoassociative neural network that was
proposed by J. A. Anderson, J. W. Silverstein, S. A. Ritz,
and R. S. Jones in 1977 as a memory model based on
neurophysiological considerations. The BSB model gets
its name from the fact that the network trajectory is
constrained to reside in the hypercube H_n = [−1, 1]^n.
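The hypercube constraint comes from a saturating activation applied at every iteration. A minimal sketch, assuming the commonly cited BSB update x(k+1) = sat(x(k) + αW x(k)), where sat clips each component to [−1, 1]; the step size alpha and the identity weight matrix below are illustrative choices, not parameters taken from the paper:

```python
import numpy as np

def bsb_step(x, W, alpha=0.1):
    # One BSB-style iteration: a linear update followed by
    # component-wise saturation, which keeps the trajectory
    # inside the hypercube H_n = [-1, 1]^n.
    return np.clip(x + alpha * W @ x, -1.0, 1.0)

# Illustrative run: with identity weights, each nonzero component
# grows in magnitude until it saturates, so the state converges
# to a vertex of the hypercube.
rng = np.random.default_rng(0)
n = 4
W = np.eye(n)
x = rng.uniform(-0.5, 0.5, size=n)
for _ in range(100):
    x = bsb_step(x, W)
print(x)  # every component ends at -1.0 or +1.0
```

The clipping is what distinguishes the model from a plain linear system: trajectories cannot escape the box, so vertices can act as attractors.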
The BSB model was used primarily to model effects and
mechanisms seen in psychology and cognitive science. It
can be used as a simple pattern recognizer and also
as a pattern recognizer that employs a smooth nearness
measure and generates smooth decision boundaries. Three
different generalizations of the BSB model were proposed
by Hui and Żak, Golden, and Anderson. In particular, the
network considered by Hui and Żak, referred to as the
generalized Brain-State-in-a-Box (gBSB), has the property
that the network trajectory constrained to a hyperface
of Hn is described by a lower-order gBSB type model.
This property significantly simplifies the analysis of the
dynamical behavior of the gBSB neural network. Another
tool that makes the gBSB model suitable for constructing
associative memory is the stability criterion of the vertices
of Hn. Using this criterion, a number of systematic methods
to synthesize associative memories were developed.
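As a toy illustration of how a stable vertex supports associative recall, the sketch below stores one bipolar pattern with a simple outer-product weight rule (an illustrative choice, not one of the synthesis methods mentioned above) and retrieves it from a perturbed initial state inside the hypercube:

```python
import numpy as np

def recall(x, W, alpha=0.1, steps=200):
    # Iterate the saturated-linear update until the state settles
    # at a vertex of the hypercube.
    for _ in range(steps):
        x = np.clip(x + alpha * W @ x, -1.0, 1.0)
    return x

# Store one bipolar pattern via an outer-product rule
# (illustrative only; the paper's synthesis methods differ).
p = np.array([1.0, -1.0, 1.0, 1.0, -1.0])
W = np.outer(p, p) / len(p)

noisy = p.copy()
noisy[1] *= -1            # flip one component to simulate noise
noisy = 0.5 * noisy       # shrink into the interior of the box

print(recall(noisy, W))   # converges back to the stored vertex p
```

Because the stored pattern is a stable vertex of the dynamics, the perturbed state is pulled back to it; this is exactly the error-correcting behavior that makes the model usable as associative memory.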
In this paper, an introduction to some useful properties
of the gBSB model and some applications of this model
are presented first. In particular, the gBSB-based hybrid
neural network for storing and retrieving pattern sequences
is described. The hybrid network consists of autoassociative
and heteroassociative parts. In the autoassociative part,
where patterns are usually represented as vectors, a set
of patterns is stored by the neural network. A distorted
(noisy) version of a stored pattern is subsequently presented
to the network and the task of the neural network is
to retrieve (recall) the original stored pattern from the
noisy pattern. In the heteroassociative part, an arbitrary
set of input patterns is paired with another arbitrary set of
output patterns. After presenting hybrid networks, a neural
associative memory that processes large-scale patterns
efficiently is described. Then the gBSB-based hybrid
neural network and the pattern decomposition concept are
used to construct a neural associative memory. Finally,
an image storage and retrieval system is constructed
using the subsystems described above. Results of extensive
simulations are included to illustrate the operation of the
proposed image storage and retrieval system.