Source: http://www.adass.org/adass/proceedings/adass00/P1-27/
Integral Field Spectroscopy (IFS) is a technique for producing spectra over a contiguous 2-D field, yielding as a final data product a 3-D data cube with two spatial coordinate axes plus an additional axis in wavelength. Although existing techniques, such as stepping a longslit spectrograph or scanning a Fabry-Perot device, can produce such a data cube, the IFS technique collects the data simultaneously, with obvious savings in observing efficiency. However, IFS has only recently approached maturity as a hardware technique (e.g., Haynes et al. 1998; Haynes et al. 1999; Allington-Smith et al. 2000).
Initial data reduction, which includes instrumental corrections such as flat fielding and cosmic-ray removal as well as the mapping between the 2-D detector coordinates and the data cube, is highly instrument dependent.
There are two paradigms for IFS data reduction. The first is the ``traditional'' method, adapted from multi-object spectroscopy (MOS), in which the output from each fibre is extracted by tracing its spectrum and accounting for wavelength-dependent distortion; this is normally referred to as the MOS paradigm. More recently, with the arrival of TEIFU, in which the fibre outputs are under-sampled by the detector, an alternative has arisen, usually referred to as the longslit paradigm. Although the independence of the spatial samples is lost because the detector under-samples the PSF, it can be shown that this is irrelevant so long as the target is critically sampled by the IFU (Allington-Smith & Content 1998). Here the methods adapted from MOS cannot be used, and the resulting dataset bears more resemblance to traditional longslit spectroscopy than to MOS data.
While data reduction software is available for the currently operating IFUs, e.g., SMIRFS (Haynes et al. 1999), software to deal with data from the next generation of instruments, such as GMOS (Allington-Smith et al. 2000) or GNIRS, is either still in development or it is unclear who is tasked with providing the software. This is worrying, as it seems unlikely that (with currently available resources) a comparison between the two data reduction paradigms will be made for the upcoming generation of IFUs, many of which fall between the two reduction paradigms (e.g., GMOS).
While the initial data reduction software for IFUs is highly instrument dependent, the data analysis of the final science data product for all these instruments should be fairly generic. The end product of the data reduction for IFS is, almost naturally, an (x, y, λ) data cube. Once assembled, with associated variance and quality arrays, scientifically interesting information can be extracted from the cube.
While not every possible operation can be anticipated, there are several standard processes that most observers will want to carry out during the data analysis stage.
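A few of these standard cube operations can be sketched with NumPy. This is an illustrative sketch only (the array sizes, aperture position, and use of NumPy are assumptions, not part of the original software): collapsing the cube along the wavelength axis to form a white-light image, extracting the spectrum at one spaxel, and summing spectra within a circular aperture.

```python
import numpy as np

# Hypothetical 3-D data cube: two spatial axes plus wavelength.
nx, ny, nlam = 20, 20, 512
cube = np.random.normal(100.0, 5.0, size=(nx, ny, nlam))

# Collapse along the wavelength axis to form a "white-light" image.
white_light = cube.sum(axis=2)

# Extract the spectrum at a single spatial position (spaxel).
spectrum = cube[10, 10, :]

# Sum the spectra within a circular aperture (radius 5 spaxels,
# centred on spaxel (10, 10)) to mimic aperture photometry.
yy, xx = np.mgrid[0:nx, 0:ny]
aperture = (xx - 10) ** 2 + (yy - 10) ** 2 <= 5 ** 2
aperture_spectrum = cube[aperture].sum(axis=0)

print(white_light.shape, spectrum.shape, aperture_spectrum.shape)
# (20, 20) (512,) (512,)
```

Because the cube is a plain N-dimensional array, each of these operations is a one-line slice or reduction; this is precisely what makes generic N-dimensional tools applicable.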
Many of these required tasks can be carried out using pre-existing Starlink software, with only minor or no modifications to the code. This situation has arisen through the use of the extensible N-Dimensional Data Format (NDF), a format for storing bulk data in the form of N-dimensional arrays of numbers. It is typically used for storing spectra, images, and similar datasets of higher dimensionality. The NDF format is based on the Hierarchical Data System (HDS) and is extensible: not only does it provide a comprehensive set of standard ancillary items to describe the data, it can also be extended indefinitely to handle additional user-defined information of any type.
While most Starlink applications were written with 2-D CCD data in mind, they were written generically against the NDF format, and hence a great many can handle data with more than the anticipated two dimensions; e.g., many KAPPA and FIGARO applications can be used on multi-dimensional data.
No. | Type     | Name          | Format         | BITPIX | INH
0   |          | ifs_data.fits |                | 16     |
1   | BINTABLE | TAB           | num. of fibres | 8      |
2   | IMAGE    | SCI           | num. of fibres | -32    | F
3   | IMAGE    | VAR           | num. of fibres | -32    | F
4   | IMAGE    | DQ            | num. of fibres | 16     | F
Here the first extension is a binary FITS table with columns ID, RA, DEC, and SKY. This table holds information specific to the individual lenslets/fibres, such as the relative fibre positions on the sky (RA, DEC) and whether the fibre carries a sky or object spectrum (SKY). The three image planes are multispec-like: each row is a separate spectrum.
However, this MOS-style MEF format is not a particularly natural way of handling IFS data. Indeed, under the longslit paradigm such files cannot be generated at all. A program to convert GMOS and CIRPASS data to a more easily analysed data cube, which will involve re-binning the input spectra onto a rectangular array, is therefore desirable:
No. | Type  | Name          | Format    | BITPIX | Comment
0   |       | ifs_data.fits |           |        |
1   | IMAGE | SCI           | x, y, λ   | -32    | 3-D science array
2   | IMAGE | VAR           | x, y, λ   | -32    | 3-D variance array
3   | IMAGE | DQ            | x, y, λ   | 16     | 3-D data quality array
In this case the IFU geometry information is no longer needed, but if the user is not taking the raw data home from the telescope it would make sense to include the sky coordinates of each fibre, presumably as a FITS binary table.
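The core of such a conversion, in the simplest case where each fibre maps onto exactly one spaxel, can be sketched as follows. This is a minimal illustration under strong assumptions (integer fibre positions, no resampling); a real converter for GMOS or CIRPASS would have to rebin fractional fibre positions onto the rectangular grid, and the fibre layout here is invented.

```python
import numpy as np

# Per-fibre spectra with variances, as extracted under the MOS paradigm.
nfib, nlam = 5, 128
spectra = np.random.normal(1.0, 0.1, size=(nfib, nlam))
var = np.full_like(spectra, 0.01)

# Relative fibre positions on the sky, in whole spaxel units
# (illustrative; real positions would be fractional and need rebinning).
x = np.array([0, 1, 2, 0, 1])
y = np.array([0, 0, 0, 1, 1])

# Assemble the (x, y, lambda) cube with variance and quality arrays.
nx, ny = x.max() + 1, y.max() + 1
cube = np.zeros((nx, ny, nlam))
cube_var = np.zeros((nx, ny, nlam))
quality = np.ones((nx, ny, nlam), dtype=np.int16)  # 1 = no data

cube[x, y, :] = spectra
cube_var[x, y, :] = var
quality[x, y, :] = 0  # 0 = good data

print(cube.shape)  # (3, 2, 128)
```

Spaxels not covered by any fibre keep the "no data" quality flag, so downstream analysis can distinguish genuinely empty sky positions from unfilled grid cells.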
Allington-Smith, J. R., Content, R., Haynes, R., & Robertson, D. 2000, in ASP Conf. Ser., Vol. 195, Imaging the Universe in Three Dimensions, ed. W. van Breugel & J. Bland-Hawthorn (San Francisco: ASP), 319
Allington-Smith, J. R. & Content, R., 1998, PASP, 110, 1216
Haynes, R., et al. 1999, PASP, 111, 1451
Haynes, R., Doel, A. P., Content, R., Allington-Smith, J. R., & Lee, D. 1998, Proc. SPIE, 3355, 788