Astronomical Data Analysis Software and Systems X
ASP Conference Series, Vol. 238, 2001
F. R. Harnden Jr., F. A. Primini, and H. E. Payne, eds.

A New Field-Matching Method for Astronomical Images
C. Thiebaut, M. Boër

Centre d'Etude Spatiale des Rayonnements (CESR-CNRS), BP 4346, 31028 Toulouse Cedex 4, France

Abstract. We propose a new field-matching algorithm for astronomical images. The method is based on a multiresolution analysis. We tried two cases: first, we compared the test image with a synthetic image built from a point source catalog; second, we used a reference image of the relevant portion of the sky. The structures of the images are obtained at different scales by applying the wavelet transform. An appropriate thresholding of the wavelet coefficients gives the significant pixels in wavelet transform space. To compare the selected coefficients between the test and the reference images we used a genetic algorithm. We applied this method to images taken by the automatic TAROT (Rapid Action Telescope for Transient Objects) telescope. The reference data are taken from the USNO-A2.0 catalog and from the Digitized Sky Survey. The results are more robust and reliable than those obtained with the FOCAS algorithm. Moreover, the new algorithm is faster than FOCAS.

1. Introduction

Field-matching consists in recognizing the field of an image with unknown coordinates in a reference image taken from a catalog. Existing field-matching methods, like FOCAS, require a good knowledge of the approximate centroid position. We have developed a new field-matching method which takes into account the geometrical characteristics of the test image and compares these characteristics with those of a reference image. To do this, we use a multiresolution analysis, which gives us the details of the image at different scales. First we use Mallat's analysis, which is anisotropic. Because the images we study are fairly isotropic (astronomical objects: stars, galaxies, etc.), we also try an isotropic analysis: the "à trous" algorithm. After having obtained the different wavelet planes, we threshold the coefficients in order to keep only the most significant ones. The pixels we keep represent the image structure. We thus obtain two structures that we match using a genetic algorithm. It gives us the vertical and horizontal offsets between the two structures and, from them, the offset between the two original images.

© Copyright 2001 Astronomical Society of the Pacific. All rights reserved.


2. The Developed Method

2.1. The Multiresolution Analysis


Thanks to a multiresolution analysis, we obtain the details of an image at different scales by applying a wavelet transform. Mallat introduced this concept, which led to the discrete wavelet transform (Mallat 1989).

Mallat's Analysis. This analysis is non-redundant because the amount of data is divided by two at each scale: this is called a dyadic analysis. In two dimensions, Mallat's analysis uses three wavelets, which leads to an anisotropic analysis: we obtain the horizontal, diagonal, and vertical details of the image at each scale. We used the Daubechies wavelet of degree four. However, astronomical objects are usually isotropic, without privileged directions. That is why we also use the so-called "à trous" algorithm (Starck et al. 1995).

The "à trous" Algorithm. This analysis is isotropic but redundant. The image is smoothed at the different scales. Because of the redundancy, this algorithm is not as fast as Mallat's algorithm.
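Since the paper describes the decomposition only in prose, the following NumPy/SciPy sketch may help. It implements the isotropic "à trous" decomposition under the usual assumptions (a B3-spline scaling kernel, as in the Starck et al. 1995 formulation); the function name `a_trous` is ours, and the paper's own implementation was written in Matlab, not Python.

```python
import numpy as np
from scipy.ndimage import convolve1d

# B3-spline kernel commonly used with the "a trous" (starlet) transform.
B3 = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0

def a_trous(image, n_scales):
    """Isotropic, redundant wavelet decomposition.

    Returns the wavelet planes w_1..w_n plus the final smoothed plane,
    all with the same shape as the input image (hence the redundancy).
    """
    c = np.asarray(image, dtype=float)
    planes = []
    for j in range(n_scales):
        # Insert 2**j - 1 zeros ("holes") between the kernel taps at scale j.
        kernel = np.zeros(4 * 2**j + 1)
        kernel[::2**j] = B3
        # Separable smoothing along rows and columns.
        smooth = convolve1d(c, kernel, axis=0, mode='reflect')
        smooth = convolve1d(smooth, kernel, axis=1, mode='reflect')
        planes.append(c - smooth)   # wavelet (detail) plane at scale j + 1
        c = smooth
    planes.append(c)                # residual smoothed plane
    return planes
```

For the anisotropic Mallat analysis, a library routine such as PyWavelets' `pywt.wavedec2(image, 'db4', level=3)` returns the horizontal, vertical, and diagonal detail planes at each dyadic scale (wavelet naming conventions vary, so 'db4' may not correspond exactly to the paper's "Daubechies wavelet of degree four").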

2.2. Thresholding of the Coefficients

Once we have obtained the detail planes at the different scales, we keep only the significant wavelet coefficients. Because of the wavelet shape, the best coefficients of Mallat's analysis are the most negative and most positive ones, whereas the best coefficients of the "à trous" analysis are the most positive ones. As a consequence, we take the absolute value of the wavelet planes from the first analysis and keep the 20 best coefficients of these images; for the "à trous" analysis, we keep the 20 best coefficients of the original wavelet planes. At each scale and for each detail image, we thus have 20 significant pixels. The set we obtain is called the structure of the studied image. We apply one of the algorithms to the test image and to the reference image and obtain two structures. Finally, we have to match both structures and find the original offset between the two images.
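A minimal sketch of this selection step, assuming the wavelet planes produced by the decomposition above; the function name and the `n_best` parameter are ours (the paper fixes it at 20):

```python
import numpy as np

def structure_from_plane(plane, n_best=20, use_abs=True):
    """Pixel coordinates of the n_best strongest wavelet coefficients.

    use_abs=True mimics the treatment of the Mallat detail planes (largest
    absolute coefficients); use_abs=False keeps the most positive
    coefficients, as for the "a trous" planes.
    """
    values = np.abs(plane) if use_abs else plane
    flat_idx = np.argpartition(values.ravel(), -n_best)[-n_best:]
    rows, cols = np.unravel_index(flat_idx, plane.shape)
    return np.column_stack([rows, cols])   # (n_best, 2) array of (y, x) pixels
```

The pixel sets obtained from the detail images of a given scale form the "structure" matched in the next step.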

2.3. Matching Both Structures: the Genetic Algorithm

To match the two obtained structures we use a genetic algorithm (Houck et al. 1995). These algorithms are inspired by the theory of natural evolution: they maintain and manipulate a family, or population, of solutions and implement a "survival of the fittest" strategy in their search for better solutions. They have been shown to solve linear and nonlinear problems by exploring all regions of the state space and exploiting promising areas through mutation, crossover, and selection functions applied to the individuals of the population. Here we want to find the vertical and horizontal offsets between the two original images, so each individual of our population is a pair of vertical and horizontal offsets to apply to one of the structures. The algorithm converges to the best offset pair. Taking into account the scale of the studied structure, we recover the original offset by multiplying by the relevant factor.
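The paper uses the Matlab genetic-algorithm toolbox of Houck et al. (1995) with a population of 100 offset pairs, tournament selection, and two crossover and two mutation functions (Section 3). The following is only a self-contained NumPy sketch of the same idea; the overlap-count fitness function, the parameter values, and all names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(offset, struct_test, struct_ref, tol=1.5):
    """Number of test pixels landing within tol pixels of a reference pixel
    after shifting the test structure by the candidate (dy, dx) offset."""
    shifted = struct_test + offset
    d = np.linalg.norm(shifted[:, None, :] - struct_ref[None, :, :], axis=2)
    return np.sum(d.min(axis=1) < tol)

def match_offsets(struct_test, struct_ref, pop_size=100, n_gen=60, span=128):
    """Toy GA over integer (dy, dx) offsets: tournament selection,
    crossover of the dx gene, and Gaussian mutation."""
    pop = rng.integers(-span, span + 1, size=(pop_size, 2))
    for _ in range(n_gen):
        scores = np.array([fitness(p, struct_test, struct_ref) for p in pop])
        # Tournament selection: the better of two random individuals survives.
        a, b = rng.integers(0, pop_size, (2, pop_size))
        parents = np.where((scores[a] > scores[b])[:, None], pop[a], pop[b])
        # Crossover: the first half of the children take their dx gene
        # from the second half of the parents.
        children = parents.copy()
        half = pop_size // 2
        children[:half, 1] = parents[half:2 * half, 1]
        # Gaussian mutation on a random subset of individuals.
        mutate = rng.random(pop_size) < 0.2
        children[mutate] += rng.normal(0, 4, (mutate.sum(), 2)).astype(int)
        pop = children
    scores = np.array([fitness(p, struct_test, struct_ref) for p in pop])
    return pop[np.argmax(scores)]   # best (dy, dx) at the analysed scale
```

The offset found at a given wavelet scale must then be multiplied by the corresponding factor to recover the offset between the original images; for the third dyadic scale of Mallat's analysis used in Section 3 this factor is presumably 2^3 = 8, which is consistent with the multiples of 8 reported in Table 1.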


3. Application and Results

We want to find the offset between the astronomical images taken by the automatic TAROT telescope (Boër et al. 2001) and a reference image. The first reference image, of the relevant portion of the sky, is taken from the Digitized Sky Survey. The second reference image is built from the USNO-A2.0 point source catalog: the pixel image is convolved with a Gaussian which represents the Point Spread Function of the TAROT telescope (see the sketch at the end of this section).

We made a Matlab implementation of the algorithms. For the genetic algorithm, we took a population of 100 offset pairs and a tournament selection function, and we used only two crossover and two mutation functions. With a convergence time of 40 s, Mallat's analysis is faster than the "à trous" algorithm (120 s), so we decided to use the anisotropic method. We matched the horizontal and vertical detail images of the third scale.

We then compare the matching against the DSS images and against the USNO-A2.0 catalog images. In Table 1 we give the original vertical and horizontal offsets and those found after convergence. Finally, we show the results of the FOCAS method (matching with the USNO catalog): the number of matched stars and the number of objects found on the image. For the DSS images, 30 of the 31 images are matched. For the single unmatched image, taking the structures of the second scale gives a new offset of (-40,-64), and the matching then succeeds. For the USNO images, only 13 of the 31 images are matched.
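Returning to the construction of the synthetic reference image: a minimal sketch, assuming that the USNO-A2.0 positions have already been projected to TAROT pixel coordinates (a WCS step omitted here) and that `psf_sigma` and the flux values are illustrative rather than the paper's actual calibration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_reference(positions, fluxes, shape, psf_sigma=1.5):
    """Build a reference image from catalog point sources.

    positions : (N, 2) array of (row, col) pixel coordinates of catalog stars,
    fluxes    : (N,) array of source intensities,
    shape     : output image shape,
    psf_sigma : width in pixels of the Gaussian approximating the TAROT PSF.
    """
    img = np.zeros(shape)
    rows = np.clip(np.round(positions[:, 0]).astype(int), 0, shape[0] - 1)
    cols = np.clip(np.round(positions[:, 1]).astype(int), 0, shape[1] - 1)
    np.add.at(img, (rows, cols), fluxes)      # drop each star onto its pixel
    return gaussian_filter(img, psf_sigma)    # convolve with the Gaussian PSF
```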

4. Conclusion

The matching with the USNO images is not as good as the matching with the DSS images, which is very good. A multiresolution analysis is perhaps not well adapted to such constructed images, which present no structure; we could instead apply the genetic algorithm directly to the brightest objects of both images. Nevertheless, the new method is faster and more robust than other methods, and it does not require a good knowledge of the centroid coordinates.

References

Boër, M., et al. 2001, this volume, 111

Valdes, F., Campusano, L., Velasquez, J., & Stetson, P. 1995, PASP, 107, 1119

Starck, J.-L., Murtagh, F., & Bijaoui, A. 1995, in ASP Conf. Ser., Vol. 77, Astronomical Data Analysis Software and Systems IV, ed. R. A. Shaw, H. E. Payne, & J. J. E. Hayes (San Francisco: ASP), 279

Lega, E., Bijaoui, A., Alimi, J. M., & Scholl, H. 1996, A&A, 309, 23

Houck, C., Joines, J., & Kay, M. 1995, NCSU-IE TR 95-09

Mallat, S. 1989, IEEE Trans. on Pattern Anal. and Mach. Intell., 11, 7




Table 1. Summary of the results.

                             New method                 FOCAS
Image   Initial offset   DSS Images    USNO Images   Matched  Objects  Matching
1a      (7,44)           (8,48)        (8,-232)          23      593      OK
1b      (19,53)          (16,48)       (-40,-102)       624     1295      OK
1c      (16,46)          (24,48)       (-24,-104)        28      141      OK
1d      (20,52)          (16,48)       (8,56)          1408     1668      OK
2a      (-8,-101)        (-8,-96)      (-8,-88)         382      484      OK
2b      (-1,-101)        (-8,-88)      (0,-72)            4      109      NO
3a      (-15,-71)        (-8,-64)      (-48,16)           3      364      NO
3b      (-35,-63)        (-24,-48)     (-64,40)         205      254      OK
3c      (-38,-65)        (-32,-56)     (0,64)             2       37      NO
3d      (-38,-65)        (-32,-56)     (-8,64)            1       44      NO
4a      (-10,-94)        (-8,-88)      (-16,-88)        198      231      OK
4b      (-20,-98)        (-24,-88)     (-16,-88)        117      155      OK
4c      (-20,-98)        (-24,-88)     (-16,-88)        159      198      OK
5a      (-1,-76)         (8,-72)       (-8,-88)           1       29      NO
5b      (-18,-78)        (-8,-72)      (0,-80)          286      347      OK
5c      (-19,-80)        (-8,-80)      (-32,-120)         1       36      NO
5d      (-20,-82)        (-8,-80)      (-40,-136)        20       69      OK
6a      (-8,-73)         (-8,-72)      (64,136)           1       55      NO
6b      (-32,-79)        (-24,-80)     (80,152)          69       74      OK
6c      (-33,-78)        (-24,-72)     (24,152)          46       51      OK
7a      (18,-3)          (16,-8)       (32,-64)           1       97      NO
10a     (-40,-60)        (80,-178)     (-136,-72)        68       77      OK
11a     (10,-83)         (16,-72)      (8,-64)          267      311      OK
11b     (0,-88)          (0,-72)       (0,-72)          278      342      OK
11c     (-4,-90)         (0,-80)       (0,-88)            2       62      NO
11d     (-5,-90)         (-24,-80)     (-16,-80)          2       75      NO
12a     (10,-14)         (16,0)        (136,-160)       127      149      OK
12b     (-2,-14)         (0,-8)        (120,-136)       157      175      OK
12c     (-14,-12)        (-16,8)       (32,-48)           2       36      NO
13a     (10,19)          (8,24)        (8,16)             7      202      NO
13b     (9,18)           (8,24)        (-120,-96)        22       57      OK