
MAPS WITH STORED INFORMATION IN MULTIPLE-ACCESS COMMUNICATIONS SYSTEMS
Yuri Andreyev, Alexander S. Dmitriev, Dmitry Kuminov, and Sergei Starkov
Institute of Radioengineering and Electronics of the Russian Academy of Sciences
Mokhovaya st. 11, GSP-3, 103907 Moscow, Russia
Email: chaos@mail.cplire.ru

ABSTRACT
In this report we discuss ways to realize code-division multiple access (CDMA) systems based on dynamic chaos. The use of chaos allows information to be transmitted with robustness close to that of conventional spread-spectrum systems, while providing additional useful consumer properties.

1. Introduction
Conventional spread-spectrum systems employ pseudo-random cyclic spreading sequences (e.g., two-valued pseudo-noise (PN) sequences) in order to turn a narrowband information signal into a wideband signal. In the receiver, the periodic nature of these pseudo-random sequences is exploited by special synchronization circuits to organize coherent reception. Another way to construct spread-spectrum systems is to use generators of chaotic signals [1–9]. The attractiveness of chaos generators as sources of pseudo-random signals for spread-spectrum communications systems is determined by (1) simplicity of realization, (2) versatility, and (3) the possibility of self-synchronization of the transmitter and the receiver. This explains the great interest in the application of chaos and its properties to spread-spectrum communications systems. In this report, we discuss ways to realize these potential features. Before we proceed to the analysis, note that only those circuits and principles of chaotic communications can prove viable whose characteristics are comparable to those of conventional solutions while offering other useful properties. Below we discuss only such solutions.

2. Binary Codes
Application of chaotic signals as PN-sequences in classical systems with correlation processing of the signal has already been discussed in the literature [10–12]. As was shown, such systems can not only be used for masking and encoding, but are also robust with respect to external perturbations. A possibility of simple realization was also demonstrated (simplicity of the chaotic source), which is important for communications systems. Note that such communications systems require external synchronization circuits and the presence of exact signal copies in the transmitter and in the receiver, which can be either a drawback or an advantage.

3. M-ary Codes
An important concept of information theory is the notion of the code base, i.e., the length of the alphabet in which the information message is expressed. The information transmission systems discussed above use a binary code ('0' and '1'). Increasing the code base at a fixed transmission rate of one code element increases the communications channel throughput C as

C = C0 log2 M = C0 log2 2^K = K C0,   (1)


where M = 2^K is the code base and C0 is the throughput for a code base equal to two. The code elements in (1) are assumed to be used with equal probability. As follows from (1), increasing the code base is an important reserve for increasing the channel throughput. A set of orthogonal codes can also be used to organize multi-user (CDMA) communications systems. Let us show how to build large-base codes and multiple access with the help of chaotic signal sources.
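As a worked instance of (1), take the code base M = 32 that appears later in Fig. 2: C = C0 log2 32 = 5 C0, i.e., each code element carries K = 5 bits instead of one, so the throughput grows fivefold at the same element transmission rate.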

3.1. Unstable cycles of a chaotic attractor and the system of codes
A chaotic attractor is known to contain a countable set of periodic orbits (cycles). The number of different cycles grows exponentially with the cycle period. For example, in the 1-D logistic map (quadratic parabola map) or in the 2-D Henon map, the number of simultaneously existing distinct period-20 cycles amounts to several thousand [13]. This system of cycles, or a part of it, can be used as a system of codes for information transmission. In the receiver, the cycles can be selected either with standard methods of correlation analysis or with methods utilizing the specific features of these special trajectories, the unstable cycles. For example, with 1024 alphabet elements (code cycles), the information transmission rate increases by a factor of 10 with respect to the binary code. This approach employs special trajectories of chaotic dynamical systems, namely the "skeleton" of the chaotic attractor, i.e., the system of unstable periodic orbits, as the alphabet elements. The next approach is based on chaotic trajectories of a general kind.
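As an illustration, the sketch below numerically extracts the unstable period-p cycles of the logistic map. The parameter value λ = 4, the grid density, and the tolerances are assumptions made for this example, and the brute-force root search is practical only for short periods; it is not the cycle-detection method used in the receiver.

```python
# A minimal sketch, assuming the logistic map x_{n+1} = lam*x_n*(1 - x_n) with lam = 4:
# unstable period-p cycles are found as fixed points of the p-th iterate of the map.
import numpy as np

def f(x, lam=4.0):
    return lam * x * (1.0 - x)

def f_iter(x, p, lam=4.0):
    for _ in range(p):
        x = f(x, lam)
    return x

def period_p_cycles(p, lam=4.0, grid=200_000, tol=1e-8):
    """Locate fixed points of the p-th iterate by bisection on sign changes of
    g(x) = f^p(x) - x, then group them into distinct cycles of exact period p."""
    xs = np.linspace(0.0, 1.0, grid)
    g = f_iter(xs, p, lam) - xs
    idx = np.where((g[:-1] == 0.0) | (g[:-1] * g[1:] < 0.0))[0]
    roots = []
    for i in idx:
        lo, hi = xs[i], xs[i + 1]
        for _ in range(60):                               # bisection refinement
            mid = 0.5 * (lo + hi)
            if (f_iter(lo, p, lam) - lo) * (f_iter(mid, p, lam) - mid) <= 0.0:
                hi = mid
            else:
                lo = mid
        roots.append(0.5 * (lo + hi))
    cycles, assigned = [], []
    for r in roots:
        if any(abs(r - a) < tol for a in assigned):       # point already belongs to a found cycle
            continue
        orbit, x = [r], f(r, lam)
        while abs(x - r) > tol and len(orbit) < p + 1:
            orbit.append(x)
            x = f(x, lam)
        assigned.extend(orbit)
        if len(orbit) == p:                               # exact period p, not a divisor of p
            cycles.append(orbit)
    return cycles

# The logistic map at lam = 4 has six distinct period-5 cycles (30 period-5 points).
print(len(period_p_cycles(5)))
```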

3.2. Fragments of chaotic time series as codes
In this case, the system of codes is built on the property of very weak correlation between randomly taken fragments of a chaotic time series. For example, consider the correlation properties of fragments generated by the cubic parabola map

Xn+1 = µ Xn (1 − Xn²).   (2)

In Fig. 1 we present an image of the correlation matrix calculated for 256 fragments of the chaotic attractor, each 400 samples long and started from randomly chosen initial conditions on the attractor (µ = 3.0). As can be seen, the off-diagonal elements of the correlation matrix do not exceed 0.1.

Fig. 1. Cross-correlation function of 256 codes with the processing length 400.
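The following sketch reproduces this kind of correlation check for map (2). The transient length, the normalization, and the random seed are assumptions, so the exact maximum off-diagonal value will differ somewhat from Fig. 1.

```python
# A minimal sketch: 256 fragments of 400 samples of the cubic map
# Xn+1 = mu*Xn*(1 - Xn^2), mu = 3.0, started from random initial conditions,
# and their normalized cross-correlation matrix.
import numpy as np

MU, N_CODES, N_SAMPLES, TRANSIENT = 3.0, 256, 400, 1000

def fragment(x0, n, transient=TRANSIENT):
    """Iterate the cubic map from x0 and return n post-transient samples."""
    x = x0
    for _ in range(transient):
        x = MU * x * (1.0 - x * x)
    out = np.empty(n)
    for i in range(n):
        x = MU * x * (1.0 - x * x)
        out[i] = x
    return out

rng = np.random.default_rng(0)
codes = np.array([fragment(x0, N_SAMPLES)
                  for x0 in rng.uniform(-0.9, 0.9, N_CODES)])

# Normalized cross-correlations of the zero-mean fragments
centered = codes - codes.mean(axis=1, keepdims=True)
normed = centered / np.linalg.norm(centered, axis=1, keepdims=True)
corr = normed @ normed.T

off_diag = corr[~np.eye(N_CODES, dtype=bool)]
print("max |off-diagonal correlation| =", np.abs(off_diag).max())
```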


With these fragments taken as codes, we can implement practically optimum reception with signal accumulation. This allows the redundancy (the signal base) to be fully exploited to overcome the negative effect of channel noise, of interference from other users, and of multipath signal propagation.

The transmitter involves a chaos generator with a set of preset initial conditions. The number of initial conditions M is equal to the number of chaotic time series fragments that serve as code elements. Each code element is associated with its own initial conditions, which are set when an element of the information message {bk} arrives at the transmitter input. The chaos generator then makes n iterations and waits for the next information element. The fragment of the chaotic series is transformed in the modulator into a physical signal that is sent to the communications channel. In the channel, the signal is subject to additive noise n(t), so the signal at the receiver input is the sum of the transmitted signal m(t) and the noise: r(t) = m(t) + n(t).

To retrieve the original message, the received signal r(t) is demodulated and the obtained sequence is copied to M branches. In the i-th branch, the elements of the received fragment are subtracted one by one from a copy of the corresponding i-th code element. Then, in each branch, the residual magnitudes are summed (accumulated) over the fragment. The alphabet element corresponding to the branch with the smallest accumulated sum is taken as received. The characteristics of the system with accumulation are close to those of optimum systems with orthogonal signals (Fig. 2); a sketch of this receiver is given below.
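A minimal sketch of the M-branch accumulation receiver follows. Random stand-in code fragments are used instead of chaotic ones, and the Gaussian noise level is an assumption, so the curves of Fig. 2 are not reproduced exactly.

```python
# A minimal sketch of the accumulation receiver: the received fragment is
# compared with each stored code fragment, and the branch with the smallest
# accumulated |residual| wins.  Codes here are random stand-ins for chaotic
# fragments; the noise level is an assumption.
import numpy as np

def transmit(codes, symbol, noise_std, rng):
    """Send code fragment number `symbol` through an additive Gaussian noise channel."""
    m = codes[symbol]
    return m + rng.normal(0.0, noise_std, m.shape)

def receive(codes, r):
    """Accumulate |r - code_i| over the fragment in each branch and pick the minimum."""
    residuals = np.abs(r[None, :] - codes).sum(axis=1)
    return int(np.argmin(residuals))

rng = np.random.default_rng(1)
M, L = 32, 400                                   # code base and fragment length
codes = rng.uniform(-1.0, 1.0, (M, L))           # stand-in for chaotic code fragments

errors = 0
for _ in range(1000):
    b = rng.integers(M)                          # transmitted alphabet element
    r = transmit(codes, b, noise_std=1.0, rng=rng)
    errors += int(receive(codes, r) != b)
print("symbol error rate:", errors / 1000)
```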

Fig. 2. Receiver with accumulation: (1) code base 2, optimum receiver with orthogonal codes; (2) code base 32, optimum receiver with orthogonal codes; (+) two chaotic codes; ( ) 32 chaotic codes.

Special cases of the discussed communications system are a system with two orthogonal code fragments and a system with a pair of antipodal signals using the same piece of the chaotic time series. Note that the proposed approach, like the approach discussed in [10–12], requires the presence of exact copies of the chaotic signal fragments (the codes) both in the transmitter and in the receiver, as well as external synchronization.

4. Self-Synchronization

Dynamic systems with stored information [14–20] have a number of features of chaotic systems of a general kind. In particular, they can be synchronized and can play the role of nonlinear matched


filters for special signals, e.g., for the code sequences stored in these maps. At the same time, their periodic trajectories assume a finite number of values, and in this sense they are close to systems with pseudo-random sequences. Let us show that these properties allow us to construct communications systems with self-synchronizing transmitters and receivers.

Consider a communications system with a binary information source. The transmitter involves a map with spreading binary code sequences stored in it; one or several code sequences can be stored in the map, and each pair of users can have its own pair of spreading codes. The codes are stored in the map in a generalized alphabet, so a decoder translating them into a binary code is also included in the transmitter. The product of the information-carrying signal and the spreading signal gives a signal whose spectrum is the convolution of the two initial spectra. This and the other operations are quite similar to those used in classical systems of binary information transmission based on PN-sequences, so the signal properties at all stages of the transformation are well known. If the message signal has a narrow bandwidth and the noise-like signal a broad bandwidth, their product, the modulated signal, has a broad spectrum close to that of the noise-like signal.

To retrieve the initial message, the received signal is fed to a demodulator consisting of a multiplier, an integrator, and a decision-making unit. Besides the received signal, a code sequence generated by the map is fed to the multiplier. If the receiver operates in ideal synchronization with the transmitter, then passing the multiplier output through a low-pass filter whose bandwidth matches that of the information signal filters out most of the noise.

Let us show that synchronization between the transmitter and the receiver can be achieved without special external circuits. To investigate the synchronization establishment process, consider a communications system composed of a transmitter and a receiver, both containing identical 1-D maps with the same random sequence stored in them. This 10,000-symbol random sequence, taking the values ±1, plays the role of the PN-sequence of classical spread-spectrum communications systems. In the transmitter, the sequence is modulated by the binary message symbols and sent through the communications channel to the receiver; in the channel, the transmitted signal can be distorted by noise.

The proposed synchronization scheme is fundamentally different from correlation-based synchronization circuits. Synchronization here is established owing to the associative memory property inherent in these maps [14–20]. The received sequence is fed to the map for recognition, and if it contains a sufficiently long undistorted fragment of the stored sequence, the initial conditions for the map are formed. Starting from these initial conditions, the receiver map generates the same sequence (and from the same point) as the transmitter map, i.e., synchronization is established. The synchronization establishment time was estimated as the length of the received sequence fragment necessary to find the initial conditions for the map. In the absence of channel noise, synchronization occurs very quickly, within less than ~30 symbols of the spreading sequence.
The presence of noise (here, Gaussian white noise) changes the situation only at a sufficiently large noise level. The mean synchronization establishment time as a function of the noise level is shown in Fig. 3. As can be seen, up to a noise level of about 0.3 the system is practically insensitive to the noise in the channel. Beyond that, the synchronization establishment time grows exponentially, and at noise levels close to 0.85 and higher synchronization becomes practically impossible, the other system parameters being kept constant. At lower noise levels synchronization occurs quickly, mostly after reception of just a few symbols, as seen in Fig. 4, where the distribution of synchronization times is shown for a noise level of 0.3. These results can be explained by the fact that at low noise levels the received sequence is only slightly corrupted, i.e., corrupted symbols are rare in it.
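The fragment search that underlies this synchronization mechanism can be sketched as follows. The 30-symbol window (matching the ~30-symbol estimate above), the number of corrupted symbols, and the replacement of the associative-memory map of [14–20] by a direct search are assumptions of this illustration.

```python
# A minimal sketch: the receiver looks for a sufficiently long undistorted
# fragment of the stored +/-1 spreading sequence inside the received symbol
# stream and uses its position to restart its own copy of the sequence from
# the right point (i.e., to synchronize).
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def find_sync(stored, received, window=30):
    """Return the index in `stored` aligned with received[0], or None if no
    undistorted window of length `window` is found."""
    stored_windows = sliding_window_view(stored, window)    # all length-`window` pieces
    for start in range(len(received) - window + 1):
        frag = received[start:start + window]
        hits = np.where((stored_windows == frag).all(axis=1))[0]
        if hits.size:                                        # undistorted fragment recognized
            return int(hits[0]) - start
    return None

rng = np.random.default_rng(2)
stored = rng.choice([-1, 1], size=10_000)        # stored +/-1 spreading sequence
offset = 4321                                    # transmitter position, unknown to the receiver
received = stored[offset:offset + 200].copy()
received[rng.choice(200, size=5, replace=False)] *= -1     # a few symbols corrupted in the channel

print(find_sync(stored, received))               # recovers the offset, 4321
```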


Fig. 3. Synchronization establishment time versus the noise level.

Fig. 4. Distribution of the synchronization establishment time.

This is illustrated in Fig. 5, where the average portion of corrupted symbols in the received sequences is shown as a function of the noise level. At noise levels of about 0.85 and above, the average length of undistorted code-sequence fragments becomes shorter than the critical length necessary for the associative memory to operate, and synchronization becomes impossible.

Fig. 5. Number of symbols corrupted in the channel as a function of the noise level.

Conclusions
Thus, dynamic chaos can be used to construct robust multi-user communications systems and communications systems with a large alphabet base. Dynamic chaos can also be used to provide robust synchronization between the transmitter and the receiver without any special external circuits.

Acknowledgment
This work is supported in part by the Russian Foundation for Basic Research, Grant No. 97-01-00800.

References
[1] L. Kocarev, K.S. Halle, K. Eckert, L. Chua, and U. Parlitz, "Experimental Demonstration of Secure Communications via Chaotic Synchronization", Int. J. Bifurcation and Chaos, Vol. 2, No. 3, pp. 709–713, 1992.


[2] U. Parlitz, L. Chua, L. Kocarev, K. Halle, and A. Shang, "Transmission of Digital Signals by Chaotic Synchronization", Int. J. Bifurcation and Chaos, Vol. 2, No. 4, pp. 973–977, 1992.
[3] K. Cuomo and A. Oppenheim, "Circuit Implementation of Synchronized Chaos With Applications to Communications", Phys. Rev. Lett., Vol. 71, No. 1, pp. 65–68, 1993.
[4] Yu. Belsky and A. Dmitriev, "Transmission of Information Using Deterministic Chaos", Radiotekhnika i Elektronika, Vol. 38, No. 7, pp. 1310–1315, 1993.
[5] A. Volkovsky and N. Rul'kov, "Synchronous Chaotic Response of a Nonlinear Communications System With Chaotic Carrier", Pis'ma v GTF, Vol. 19, No. 3, pp. 71–75, 1993.
[6] A. Kozlov and V. Shalfeev, "Selective Suppression of Deterministic Chaotic Signals", Pis'ma v GTF, Vol. 19, No. 23, pp. 83–87, 1993.
[7] H. Dedieu, M. Kennedy, and M. Hasler, "Chaos Shift Keying: Modulation and Demodulation of a Chaotic Carrier Using Self-Synchronizing Chua's Circuits", IEEE Trans. Circuits Syst., Vol. CAS-40, No. 10, pp. 634–642, Oct. 1993.
[8] K.S. Halle, C.W. Wu, M. Itoh, and L.O. Chua, "Spread Spectrum Communications Through Modulation of Chaos", Int. J. Bifurcation and Chaos, Vol. 3, No. 2, pp. 469–477, 1993.
[9] A. Dmitriev, A. Panas, and S. Starkov, "Experiments on Speech and Music Signals Transmission Using Chaos", Int. J. Bifurcation and Chaos, Vol. 5, No. 3, pp. 371–376, 1995.
[10] T. Kohda and A. Tsuneda, "Pseudonoise Sequences by Chaotic Nonlinear Maps and their Correlation Properties", IEICE Trans. Commun., Vol. E76-B, No. 8, pp. 855–862, 1993.
[11] T. Kohda, A. Oshiumi, A. Tsuneda, and K. Ishii, "A Study of Pseudonoise-Coded Image Communications", SPIE, Vol. 2308, pp. 874–884, 1994.
[12] U. Parlitz and S. Ergezinger, "Robust Communications Based on Chaotic Spreading Sequences", Phys. Lett. A, Vol. 188, pp. 146–150, 1994.
[13] A.S. Dmitriev, S.O. Starkov, and M.E. Shirokov, "Structure of Periodic Orbits of a Chaotic Oscillation System Described by 2nd-Order Difference Equations", Radiotekhnika i Elektronika, Vol. 39, No. 8-9, pp. 2387–2396, 1994.
[14] A. Dmitriev, "Storing and Recognition of Information in One-Dimensional Dynamic Systems", Radiotekhnika i Elektronika, Vol. 36, No. 1, pp. 101–108, 1991.
[15] A.S. Dmitriev, A.I. Panas, and S.O. Starkov, "Storing and Recognition Information Based on Stable Cycles of One-Dimensional Maps", Phys. Lett. A, Vol. 155, No. 8/9, pp. 494–499, 1991.
[16] Yu.V. Andreyev, A.S. Dmitriev, L.O. Chua, and C.W. Wu, "Associative and Random Access Memory Using One-Dimensional Maps", Int. J. Bifurcation and Chaos, Vol. 2, No. 3, pp. 483–504, 1992.
[17] Yu.V. Andreyev, Yu.L. Belsky, and A.S. Dmitriev, "Storing and Recognition of Information Using Stable Cycles of 2-D and Multi-Dimensional Maps", Radiotekhnika i Elektronika, Vol. 39, No. 1, pp. 114–123, 1994.
[18] Yu.V. Andreyev, Yu.L. Belsky, A.S. Dmitriev, and D.A. Kuminov, "Information Processing Using Dynamical Chaos: Neural Networks Implementation", IEEE Trans. Neural Networks, Vol. 7, No. 2, pp. 290–299, 1996.
[19] Yu.V. Andreyev, A.S. Dmitriev, D.A. Kuminov, L.O. Chua, and C.W. Wu, "1-D Maps, Chaos and Neural Networks for Information Processing", Int. J. Bifurcation and Chaos, Vol. 6, No. 4, pp. 627–646, 1996.
[20] Yu.V. Andreyev, A.S. Dmitriev, and S.O. Starkov, "Information Processing in 1-D Systems With Chaos", IEEE Trans. Circuits Syst.-I, Vol. CAS-44, No. 1, pp. 21–28, Jan. 1997.
