CNNA'96: Fourth IEEE International Workshop on Cellular Neural Networks and their Applications, Seville, Spain, June 24-26, 1996

INFORMATION PROCESSING IN 1-D AND 2-D MAP: RECURRENT AND CELLULAR NEURAL NETWORKS IMPLEMENTATION.
Yu. V. Andreyev, A. S. Dmitriev, D. A. Kuminov, V. V. Pavlov
Institute of RadioEngineering and Electronics of the Russian Academy of Sciences, Mokhovaya St., 11, 103907 Moscow, Russia. Phone: (095) 203-4693, FAX: (095) 203-84-14, E-mail: dmitr@ire.rc.ac.ru

ABSTRACT: Mathematical models of neural networks based on one-dimensional and two-dimensional maps are proposed, in which complex dynamics and chaos are used for storing and processing information.

1. Introduction.
The possibility of using chaos and complicated dynamics in neural networks to solve information-processing problems has already been discussed in the literature [1-3]. It should be noted, however, that in most papers dealing with the use of chaos and complex dynamics in neural networks, the network itself plays the role of a "black box" that is trained according to certain rules in order to produce a desired response to a certain input signal. The aim of this report is to demonstrate the application of complex dynamics and chaos to storing and processing information in recurrent neural networks used as a hardware realization of simple mathematical models, namely 1-D and 2-D maps, which were used for storing and retrieving information [4,5]. Evidently this concept differs substantially from the presentation of a neural network as a "black box". We show below that the existence of a mathematical model realized in a neural network of this kind leads to striking computational efficiency. Another important property of such networks is the limited number of interconnections; therefore they can be considered a special kind of CNN.

2. Storing and Retrieving Information in 1-D and 2-D Maps.
The procedure of storing and retrieving information on the basis of limit cycles of 1-D dynamical systems was introduced in [4]. Let there be given a sequence of symbols (information block)

    a_1 a_2 ... a_n,    (1)

each element a_i of which belongs to an alphabet of N symbols. For this sequence, a 1-D map of a segment into itself is designed that possesses a stable limit cycle of period n, whose elements are in one-to-one correspondence with the elements of sequence (1). In the simplest case, each element of the alphabet is related to its own value of the map variable and to an interval of length 1/N of the map variable. The example of the map storing the information block 174 is shown in Fig.1. An information block is retrieved by setting the initial conditions within one of the segments of the unit interval [0,1] that correspond to the symbols of the information block, and by transforming the sequence of numbers produced by iterating the map into a sequence of symbols. If the initial conditions are arbitrary, the convergence of the system trajectory to the limit cycle is preceded by a chaotic transient process.

This method of storing and retrieving was generalized to storing information in 2-D maps [5]. It was shown that the use of two-dimensional systems for information processing leads to high information capacity. To store the information block (1) in a 2-D map of the unit square [0,1]x[0,1] into itself, it is necessary to transform (1) into the pairs of elements

    (a_1, a_2), (a_2, a_3), ..., (a_n, a_1).    (2)

To each element j of the given alphabet of length N there correspond intervals on the X and Y axes

    I_x^j = [j/N, (j+1)/N),   I_y^j = [j/N, (j+1)/N),   j = 0, 1, ..., N-1.    (3)
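The 1-D storing scheme described above can be sketched in code. In the sketch below the decimal alphabet (N = 10), the contracting slope 0.5 on the informative intervals, and the logistic map used as chaotic filler outside them are all illustrative assumptions, not values prescribed by the paper:

```python
# Sketch of the 1-D storing scheme (illustrative parameters: N = 10 digits,
# slope s = 0.5, logistic map as chaotic filler outside informative intervals).
N = 10
block = [1, 7, 4]        # information block "174" (symbols assumed distinct)
s = 0.5                  # slope on informative intervals, |s| < 1 for stability

def centre(j):
    """Centre of the interval [j/N, (j+1)/N) assigned to symbol j."""
    return (j + 0.5) / N

def f(x):
    j = min(int(x * N), N - 1)                 # symbol interval containing x
    if j in block:                             # informative interval:
        k = block[(block.index(j) + 1) % len(block)]   # next symbol of the cycle
        return s * (x - centre(j)) + centre(k)         # contract toward its centre
    return 3.9 * x * (1 - x)                   # chaotic dynamics elsewhere

# Retrieval: start at the centre of symbol 1's interval and read the cycle off.
x = centre(block[0])
retrieved = []
for _ in range(6):
    retrieved.append(min(int(x * N), N - 1))
    x = f(x)
print(retrieved)         # -> [1, 7, 4, 1, 7, 4]
```

Starting from arbitrary initial conditions instead of `centre(block[0])` typically produces the chaotic transient mentioned above before the trajectory locks onto the stored cycle.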

0-7803-3261-X/96/$5.00 © 1996 IEEE.


MODELLING OF BIOLOGICAL SYSTEMS

The midpoints of these intervals, (j + 0.5)/N, serve as the representative points of the symbols. In this case, to a pair of elements (a_m, a_{m+1}) from sequence (2) there corresponds the square

    I_x^{a_m} x I_y^{a_{m+1}} = [a_m/N, (a_m + 1)/N) x [a_{m+1}/N, (a_{m+1} + 1)/N).    (4)

Fig.1. 1-D map with storing information block 174.

It is shown that the required 2-D map should have the form

    x_{m+1} = y_m,
    y_{m+1} = f(x_m, y_m).    (5)

The function f(x, y) is built so that map (5) has a unique limit cycle passing through the squares (2). In the squares through which the cycle (2) passes (informative squares), f(x, y) has the form

    f(x, y) = g2 (x - (i + 0.5)/N) + (k + 0.5)/N,   |g2| < 1.    (6)

In the other (noninformative) squares

    f(x, y) = g1 (x - i/N)(1 - 1/N) + 1/N,   g1 ~ 10^-2.    (7)

An example of a 2-D map in which the string 174 is stored is shown in Fig.2.
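The 2-D scheme of Eqs. (5)-(7) can be sketched directly. In the sketch below g2 = 0.5 and g1 = 0.01 are illustrative values (the text only requires |g2| < 1 and g1 ~ 10^-2), and the informative squares are those of the pair sequence for the block 174:

```python
# Sketch of the 2-D map (5)-(7) for block "174" (illustrative values:
# g2 = 0.5, g1 = 1e-2; only |g2| < 1 and g1 ~ 1e-2 are required).
N = 10
# informative squares: pair (a_m, a_{m+1}) -> next symbol a_{m+2} of the cycle
pairs = {(1, 7): 4, (7, 4): 1, (4, 1): 7}
g2, g1 = 0.5, 1e-2

def f(x, y):
    i = min(int(x * N), N - 1)
    k_next = pairs.get((i, min(int(y * N), N - 1)))
    if k_next is not None:                   # informative square, Eq. (6)
        return g2 * (x - (i + 0.5) / N) + (k_next + 0.5) / N
    return g1 * (x - i / N) * (1 - 1 / N) + 1 / N   # noninformative, Eq. (7)

def step(x, y):                              # Eq. (5)
    return y, f(x, y)

x, y = 1.5 / N, 7.5 / N                      # centre of informative square (1, 7)
seq = []
for _ in range(6):
    seq.append(min(int(x * N), N - 1))
    x, y = step(x, y)
print(seq)                                   # -> [1, 7, 4, 1, 7, 4]
```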

3. Realization of the Dynamics of 1-D and 2-D Maps with Stored Information Using Neural Networks.
Let us consider a network composed of three layers of neurons (an input layer, a hidden layer, and an output layer) and a feedback loop between the output layer and the input layer. The three layers of the neural network are used to emulate a 1-D map function, and the feedback loop with a unit delay is used to organize the iteration process [6-8]. The input and output layers contain one element each. There are no couplings between the elements of the hidden layer. A signal corresponding to the initial conditions is applied to the input element. The signal from the output of this element is sent to the hidden-layer neurons. The sum of the signals from the hidden-layer neurons is applied to the input of the output-layer neuron. The coefficients of the couplings between the layers of the neural network are determined by the function of the map with stored information. The signal at the network output represents the value of the next iteration of the map.
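The structure just described can be sketched as follows. Here a simple tent map stands in for a map with stored information, and the coupling coefficients are written down by hand; this is an illustrative sketch, not the authors' network:

```python
# Sketch of the three-layer recurrent structure: input -> hidden (piecewise-
# linear neurons) -> output (weighted sum) -> unit-delay feedback to input.
# The tent map M(x) = 1 - |2x - 1| stands in for a map with stored information.
def pwl(x):
    """Common piecewise-linear neuron characteristic."""
    return 0.0 if x <= 0 else (x if x <= 1 else 1.0)

T1 = [2.0, 2.0]          # input -> hidden couplings
theta = [0.0, 1.0]       # hidden-layer thresholds
T2 = [1.0, -1.0]         # hidden -> output couplings

def network(x):
    """One pass through the network: the next iterate of the map."""
    return sum(t2 * pwl(t1 * x - th) for t1, th, t2 in zip(T1, theta, T2))

x = 0.2                  # initial condition applied to the input element
for _ in range(3):       # the feedback loop with unit delay iterates the map
    x = network(x)       # 0.2 -> 0.4 -> 0.8 -> 0.4
print(round(x, 6))       # -> 0.4
```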


All the neural-like elements have the same structure and the same piecewise-linear characteristic. The threshold values for the elements of the input and output layers are zero; the threshold values for the elements of the hidden layer are determined by the form of the function realized in the network.

Fig.2. 2-D map with storing information block 174.

An arbitrary 1-D piecewise-linear map M(x), defined on the unit interval I = [0,1] by the set of its points

    (x_1, y_1), (x_2, y_2), ..., (x_{p+1}, y_{p+1}),    (8)

where p is the number of linear segments of the map and (x_i, y_i) are the coordinates of the endpoints of the linear segments, can be realized precisely as a neural network composed of elements with the identical piecewise-linear characteristic

    f(x) = 0 for x <= 0;   f(x) = x for 0 < x <= 1;   f(x) = 1 for x > 1.    (9)

The map function M(x) may be represented as

    M(x) = Sum_{i=1}^{p} Dy_i f((x - x_i)/Dx_i) = Sum_{i=1}^{p} T_i^2 f(T_i^1 x - theta_i),    (10)

where Dx_i = x_{i+1} - x_i and Dy_i = y_{i+1} - y_i; T_i^1 = 1/Dx_i are the couplings from the input layer to the hidden layer, T_i^2 = Dy_i are the couplings from the hidden layer to the output layer, and theta_i = x_i/Dx_i are the thresholds of the elements of the hidden layer. In Fig.3 an example of a neural net which realizes the dynamics of the 1-D map storing the string 174 is presented.
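The construction in Eq. (10) can be checked numerically. The sketch below derives the coupling coefficients from an illustrative breakpoint list of form (8) (chosen with y_1 = 0 so the sum in Eq. (10) needs no constant offset) and compares the network form against direct piecewise-linear evaluation:

```python
# Derive T1_i = 1/Dx_i, T2_i = Dy_i, theta_i = x_i/Dx_i from a breakpoint
# list (8) and verify Eq. (10).  The breakpoints are illustrative.
pts = [(0.0, 0.0), (0.3, 0.9), (0.7, 0.2), (1.0, 0.6)]   # (x_i, y_i), p = 3

def pwl(x):
    """Common piecewise-linear neuron characteristic, Eq. (9)."""
    return 0.0 if x <= 0 else (x if x <= 1 else 1.0)

T1, T2, theta = [], [], []
for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
    dx, dy = x1 - x0, y1 - y0
    T1.append(1.0 / dx)          # input -> hidden coupling
    T2.append(dy)                # hidden -> output coupling
    theta.append(x0 / dx)        # hidden-layer threshold

def M(x):
    """Neural-network form of the map, Eq. (10)."""
    return sum(t2 * pwl(t1 * x - th) for t1, t2, th in zip(T1, T2, theta))

def M_direct(x):
    """Direct piecewise-linear interpolation of the breakpoints."""
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

for x in [0.0, 0.15, 0.3, 0.5, 0.85, 1.0]:
    assert abs(M(x) - M_direct(x)) < 1e-9
print("Eq. (10) matches direct evaluation")
```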

When a piecewise-linear map is realized as a neural net, some of the neural elements are used to realize nonlocal coupling between the input and output elements, as well as nonlocal connections between these elements and the elements of the hidden layer. To emulate a 2-D map function, a neural network with a more complex structure than in the 1-D case is used.

Fig.3. Neural net which realizes the dynamics of the 1-D map storing the string 174.

The neural network is built as follows. From Fig.2 it can be seen that the 2-D map can be constructed from segments of three types (one for the informative squares and two for the noninformative ones). Each of these segments is realized with the help of neural units of simple structure. An example of a net that approximates the 2-D map is shown in Fig.4. The three modules in the right part of the network in Fig.4 realize the informative squares; the others realize the noninformative squares. Training the network reduces to analytical formulas. An important property of such neural networks is the limited number of interconnections. This is the same property that we have in cellular neural networks.

Fig.4. Neural net which realizes the dynamics of 2-D map storing the string 174.

Acknowledgment
This study was supported by INTAS Grant INTAS-94-2899.


References
[1] I. Guyon, L. Personnaz, J. P. Nadal, G. Dreyfus: "Storing and retrieval of complex sequences in neural networks". Phys. Rev. A, Vol. 38, pp. 6365-6372, 1988.
[2] D. Servan-Schreiber, A. Cleeremans, J. L. McClelland: "Learning sequential structure in simple recurrent networks". In Advances in Neural Information Processing Systems 1, ed. by D. S. Touretzky, Morgan Kaufmann Publishers Inc., Palo Alto, CA, 1989.
[3] W. J. Freeman: "Tutorial on neurobiology: from single neurons to brain chaos". Int. J. Bifurcation and Chaos, Vol. 2, pp. 451-482, 1992.
[4] A. S. Dmitriev, A. I. Panas, S. O. Starkov: "Storing and recognition information based on stable cycles of one-dimensional maps". Physics Letters A, Vol. 155, pp. 494-499, 1991.
[5] Yu. V. Andreyev, Yu. L. Belsky, A. S. Dmitriev: "Storing and recognition of information using stable cycles of 2-D and multi-dimensional maps". Radiotekhnika i Elektronika, Vol. 39, pp. 114-123, 1994 (in Russian).
[6] Yu. V. Andreyev, Yu. L. Belsky, A. S. Dmitriev, D. A. Kuminov: "Dynamic systems with chaos as a medium for storing and processing information". Izvestiya Vuzov Radiophizica, Vol. 37, pp. 1003-1019, 1994 (in Russian).
[7] A. S. Dmitriev, D. A. Kuminov: "Recurrent neural networks with chaos for storing and retrieving information". Proceedings of the 2nd Int. Symposium on Neuroinformatics and Neurocomputers, Rostov-on-Don, Russia, pp. 24-31, 1995.
[8] Yu. V. Andreyev, Yu. L. Belsky, A. S. Dmitriev, D. A. Kuminov: "Information processing using dynamic chaos. Neural networks implementation". IEEE Transactions on Neural Networks, Vol. 7, pp. 1-11, 1996.
