
PARAMETRICAL NEURAL NETWORK



B.V. Kryzhanovsky, L.B. Litinskii


Institute of Optical Neural Technologies RAS, Moscow



A. Fonarev


Dep. of Engin. Science and Physics, The College of Staten Island, NY



Abstract


We develop a formalism that allows us to describe the operation of a
network based on the parametric four-wave mixing process well known in
nonlinear optics. In the network, the signals propagate in the form of
quasi-monochromatic pulses at q different frequencies. The retrieval
properties of the network are investigated. It is shown that the storage
capacity of such a network is higher than that of the Potts-glass neural
network.



This work was supported by the RFBR (grants 02-0100457 and 01-01-00090) and by the
program "Intellectual Computer Systems" (project 4.5).


References


[1] B.V. Kryzhanovsky, A.L. Mikaelian. On the recognition ability of a neural
network based on neurons with parametrical frequency conversion.
Doklady RAS (2002), v. 383(3), pp. 1-4.
[2] A. Fonarev, B.V. Kryzhanovsky et al. Parametric dynamic neural network
recognition power. Optical Memory & Neural Networks (2001), v. 10(4), pp. 31-48.
[3] N. Bloembergen. Nonlinear Optics. 1966.
[4] N. Chernov. Ann. Math. Statistics (1952), v. 23, pp. 493-507.
[5] I. Kanter. Potts-glass models of neural networks.
Physical Review A (1988), v. 37(7), pp. 2739-2742.
[6] D. Bolle, P. Dupont, J. Huyghebaert. Thermodynamic properties of the
Q-state Potts-glass neural network. Physical Review A (1992), v. 45(6), pp. 4194-4197.
INTRODUCTION


The goal of this work is to analyze the properties of a network capable of
holding and handling information encoded in the form of phase-frequency
modulation. Schematically, the operation of this network can be described as
follows.

The network consists of N connected neurons. The signals propagate along the
interconnections in the form of quasi-monochromatic pulses at q different
frequencies [pic]:

[pic]

While propagating along the interconnections, the signals are transformed by
parametric four-wave mixing processes of the form
[pic]
and arrive at the next neuron as a packet.
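The explicit expressions here are hidden in the [pic] placeholders of the
source file. As a hedged sketch only, a quasi-monochromatic pulse at one of
the q allowed frequencies and the textbook four-wave mixing frequency
relation of nonlinear optics [3] could be written as

\[ x_j(t) \propto \exp\bigl[\pm i\,\omega_{k_j} t\bigr], \qquad k_j \in \{1,\dots,q\}, \]
\[ \omega_{\mathrm{out}} = \omega_k - \omega_l + \omega_r . \]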

The information about p patterns

[pic]

is stored in the [pic] interconnection matrices [pic] (i, j = 1, ..., N). The
components of the patterns are preassigned quasi-monochromatic pulses.
For example, a pattern can be a colored picture on a screen, with the states
of the neurons encoding the pixels of the screen.

A neuron has a complex structure. It is composed of:
1) a summator of input signals;
2) a set of q ideal frequency filters [pic];
3) a block comparing the amplitudes of the signals; and
4) q generators of quasi-monochromatic signals [pic].

The signals that reach a given neuron after transformation in the
interconnections
1) are summed up;
2) the summed signal passes through the q parallel filters;
3) the output signals are compared with respect to their amplitudes;
4) the neuron emits the signal with the frequency corresponding to the
maximal amplitude (and the relevant phase), as sketched below.
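A minimal numerical sketch of this four-step update, assuming each signal is
represented as a q-component vector of complex amplitudes (one component per
frequency channel); all function and variable names are illustrative:

import numpy as np

def neuron_update(input_signals, q):
    # 1) summator: add up all signals arriving at the neuron;
    #    each signal is a length-q vector of complex amplitudes
    total = np.sum(np.asarray(input_signals, dtype=complex), axis=0)
    # 2) q ideal frequency filters: in this representation each
    #    component of `total` is already the output of one filter
    amplitudes = np.abs(total)
    # 3) compare the filtered amplitudes and find the strongest channel
    k_max = int(np.argmax(amplitudes))
    # 4) emit a unit quasi-monochromatic signal at that frequency,
    #    keeping the phase of the winning component
    out = np.zeros(q, dtype=complex)
    out[k_max] = total[k_max] / amplitudes[k_max] if amplitudes[k_max] > 0 else 1.0
    return out

The winner-take-all choice in step 3 is what later lets each "spin" align
with its local field in the vector formalism.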
THE VECTOR FORMALISM

We consider a network consisting of N neurons. The states of the neurons
are given by vectors [pic] from a space [pic]:
[pic] (1)
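Equation (1) itself is lost in the placeholder. Judging by the rest of the
text (q frequencies plus a binary phase), the state space is presumably

\[ \mathbf{x}_i \in \{\pm\mathbf{e}_1, \dots, \pm\mathbf{e}_q\} \subset \mathbf{R}^q, \]

where e_k are the basis unit vectors: the index k encodes the frequency of
the pulse and the sign encodes its phase.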

With the aid of the q-dimensional vectors [pic], the problem can be
formulated in terms of vector spaces:

[pic]


The p N-dimensional patterns are

[pic]

Their components [pic] are vectors of the form (1).

According to the generalized Hebb rule, the patterns are stored in the
[pic] matrices [pic]:
[pic] i, j = 1, 2, ..., N. (2)

Then, if [pic], we have
[pic]
and the matrix elements are
[pic][pic]

[pic]
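Formula (2) itself is a placeholder; under the usual reading of the
generalized Hebb rule for vector neurons it would be a sum of outer products,

\[ T_{ij} = \sum_{\mu=1}^{p} \mathbf{x}_i^{\mu} \, (\mathbf{x}_j^{\mu})^{+}, \qquad i \neq j, \]

each T_{ij} being a q x q matrix (with the diagonal terms usually set to
zero); this is an assumed reconstruction, not the original formula.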


The dynamics of the network


Let the network at time t be in the state

[pic]

The local field acting on the i-th neuron is
[pic] (3)

If, in the expansion (3), [pic] is the amplitude that is maximal in modulus,
[pic]
then at the next time step, t+1, the i-th neuron takes the value
[pic] (4)

The "spin" is oriented as closely to the local field (3) as possible.

The fixed points of the network are the local minima of the energy
functional

[pic]
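The functional itself is a placeholder; in this vector notation it presumably
has the familiar quadratic form

\[ E = -\tfrac{1}{2} \sum_{i \neq j} \mathbf{x}_i^{+} \, T_{ij} \, \mathbf{x}_j , \]

which for q = 1 turns into the Hopfield energy quoted next.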


In the case q=1, we have the ordinary Hopfield model:

[pic]

When q > 1, the model is similar to the Potts-glass neural network
(I. Kanter, 1988 [5]; D. Bolle et al., 1992 [6]).








RETRIEVAL PROPERTIES OF THE NETWORK



Let p randomized patterns be stored in the network's memory:


[pic]

Let the network start from the distorted m-th pattern:

[pic]
where

[pic]

[pic]
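The definitions above are lost in the placeholders; judging by the examples
below, presumably each component of the m-th pattern independently suffers a
phase failure (a sign flip) with probability a and a frequency failure
(replacement of its basis vector by another one) with probability b.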

Using the Chebyshev-Chernov method [4], it can be shown that the probability
of an error in the recognition of the pattern [pic] is equal to
[pic]

If [pic], the error probability always vanishes when the number of
patterns p is less than
[pic]

[pic] is the asymptotically attainable value of the storage capacity of the
parametrical neural network (PNN).
It differs by the factor [pic] from the analogous characteristic of the
Hopfield model:
[pic]
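For reference, the Hopfield characteristic referred to here is presumably the
classical asymptotic estimate for error-free retrieval,

\[ p_c^{\mathrm{Hopf}} \simeq \frac{N}{2 \ln N}, \]

so the PNN capacity above would exceed it by the q-dependent factor hidden in
the placeholder.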


Example 1

If q = 10 and the probability of a frequency failure is [pic] (the noise is
equal to 33%), then

[pic]   [pic] is 44 times greater than [pic].


Example 2

If q = 10, the number of neurons is N = 100, the number of patterns is
p = 200, and the frequencies of the pattern are 40% noisy, then
the probability of restoring the pattern is 0.98.

FIXED POINTS OF PNN

- For any [pic], the only fixed points of a network constructed with the
aid of two patterns [pic] and [pic] are these patterns themselves
(and their negatives [pic]).

- Let us examine the case q = 2, when the vectors [pic] are taken
without the signs [pic] (the signals have no phase failure):
[pic]
Since the neurons possess only one of the two values [pic] or [pic],
formally this case resembles the Hopfield model. Note that here a
"contrast" configuration [pic] corresponds to every configuration X,
rather than its negative -X. The components [pic] of the contrast
configuration [pic] take the values alternative to [pic].
This system has some nontrivial properties distinguishing it from the
Hopfield model.
For example, if a configuration X is a fixed point, then its contrast
configuration [pic] is not a fixed point.
Another difference: a network constructed with the aid of three
patterns always has them as fixed points.
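In this q = 2, phase-free case the contrast configuration can be written
explicitly (notation assumed, with e_1 and e_2 the two frequency states):

\[ \tilde{\mathbf{x}}_i = \mathbf{e}_2 \ \text{if } \mathbf{x}_i = \mathbf{e}_1,
\qquad \tilde{\mathbf{x}}_i = \mathbf{e}_1 \ \text{if } \mathbf{x}_i = \mathbf{e}_2 , \]

i.e. every frequency is swapped for the other one.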



Restoration of a pattern with frequency noise.

N = 100, p = 200, q = 22. The pattern is a picture of a dog.
The frequency noise is 80% (a = 0, b = 0.8).

The gray squares are noisy pixels.
The states of the network after 50 and 100 steps are shown.



[Figure: the network state at t = 0, t = 50, and t = 100.]







Restoration of a pattern with phase noise.

N = 100, p = 200, q = 15. The pattern is a picture of a dog.
The phase noise is 20% (a = 0.2, b = 0).

The states of the network after 50 and 100 steps are shown.


[Figure: the network state at t = 0, t = 50, and t = 100.]


POTTS-GLASS NEURAL NETWORK

The properties of our model are close to those of the PGNN model, which was
suggested at the end of the 1980s and examined in the 1990s.

The PGNN is related to the Potts model of a magnetic solid in the same way as
the Hopfield model is related to the Ising model.

In the PGNN, neurons take q different values, which are represented with the
aid of q special vectors [pic]:
[pic]
The vectors [pic] are nonorthogonal and linearly dependent:
[pic]
In all other respects, the PGNN model is similar to the vector formalism
described above.

In our PNN model the signals are transformed by the action of the elementary
matrices
[pic]
In this case filtration of the signals occurs: if the frequency of the signal
[pic] is not equal to the frequency [pic], the signal is not transmitted
through the interconnection: [pic]
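Assuming the q frequency states are represented by orthonormal unit vectors
e_k, as in the vector formalism above, the elementary matrices and the
filtration property could be sketched as

\[ E_{kl} = \mathbf{e}_k \mathbf{e}_l^{+}, \qquad E_{kl}\,\mathbf{e}_m = \delta_{lm}\,\mathbf{e}_k , \]

so a signal passes through the interconnection only if its frequency matches
the index l; with the nonorthogonal PGNN vectors the analogous product does
not vanish, which is exactly the difference stated next.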
In the PGNN model the matrices
[pic]
play the same role; however, because of the nonorthogonality of the vectors
[pic], the filtration does not occur:
[pic]
In the PGNN model the asymptotically attainable value of the storage capacity
is half that of our PNN model:
[pic]
