
Data Flow System Overview


Introduction

In order to realize the optimal scientific return from the VLT, ESO has undertaken to develop an end-to-end data flow system covering everything from proposal entry to the science archive. The VLT Data Flow System (DFS) is being designed and implemented by the DFS Group (ESO Internal access only) in collaboration with the VLT Division and the Instrumentation Division. The DFS Group is part of the DMD.

The major components of the DFS include:

  Program Handling
  Observation Handling
  VLT Control System
  Science Archive
  Data Pipelines
  Quality Control

Below you will find a brief summary of the operational aspects and of the components of the DFS.
More complete and up-to-date information can be found on the DFS Group website (ESO Internal access only). A series of DFS-related papers is also accessible from here.

An introduction to the Beowulf Project is also available.

Operations Aspects

The operations model of the VLT allows PIs to apply for visitor-mode or service-mode observation programs. Visitor mode corresponds to the mode of operation that has prevailed until now at most ground-based observatories: the visiting astronomer is physically present at the telescope and can adapt the observation program to specific target properties, changing observing conditions, or calibration needs. Service mode takes its philosophy from the operation of space-borne facilities such as the Hubble Space Telescope. Adapting that scheme to a ground-based observatory of the scale of the VLT has made it necessary to redesign major parts of the operational concept. The Data Flow System binds together the components involved in the observation life-cycle, from observation preparation and execution to processing and archiving.

Program Handling

Proposal preparation in the Data Flow System involves a Phase I and a Phase II. In Phase I, proposals are submitted electronically to ESO and evaluated by the Observing Programmes Committee (OPC). After the OPC selection has taken place, Phase II preparation is based on template forms describing standard instrument modes and configurations. Observation Blocks are created by specifying the template parameters, target information, and user-defined scheduling constraints. The user is assisted in these phases by an Instrument Scientist and by observation preparation tools. These tools include generic systems, such as finding-chart generators and guide-star selection systems, and instrument-related tools, such as exposure time calculators (ETCs). Feasibility checks of the proposals are performed by the observatory and cover technical feasibility and exposure time control.
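
The exposure time calculators mentioned above are, at their core, signal-to-noise estimators. The following is a minimal sketch of that kind of calculation, based on the standard CCD noise equation; the function name and all parameter values are illustrative assumptions and do not reflect any actual ESO ETC.

    import math

    def snr(source_rate, sky_rate_per_pix, dark_rate_per_pix,
            read_noise, n_pixels, exposure_time):
        """Standard CCD signal-to-noise estimate (illustrative only).

        source_rate       -- source counts in the aperture (e-/s)
        sky_rate_per_pix  -- sky background (e-/s/pixel)
        dark_rate_per_pix -- dark current (e-/s/pixel)
        read_noise        -- read noise per pixel (e- RMS)
        n_pixels          -- pixels in the photometric aperture
        exposure_time     -- integration time (s)
        """
        signal = source_rate * exposure_time
        noise = math.sqrt(signal + n_pixels * (sky_rate_per_pix * exposure_time
                                               + dark_rate_per_pix * exposure_time
                                               + read_noise ** 2))
        return signal / noise

    # Example: find the integration time needed to reach SNR ~ 50
    # for a made-up faint source.
    t = 1.0
    while snr(12.0, 3.0, 0.05, 5.0, 40, t) < 50.0:
        t += 1.0
    print("estimated exposure time: %.0f s" % t)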

Observation Handling

During Phase II of the proposal preparation process, astronomers with successful programs convert their observing programs into structures that are schedulable and executable at the VLT. These structures are called Observation Blocks (OBs) and combine target and instrument data for one or more exposures, typically of a single astronomical object. The description of an instrument within an OB is called a Template: a subset of the instrument modes and functions supported by ESO, presented as a form whose input values are set by the astronomer during OB creation. All ESO instruments under the DFS/VCS system are operated via templates and OBs, whether in service or in visitor mode. OBs are the quanta of data that flow within the DFS, collecting state, data, and status information as they flow. When scheduling service-mode observations, OBs represent the smallest schedulable unit of telescope resources. OBs are submitted to an OB Repository, from which medium-term and short-term schedules can be constructed for service-mode observations, or from which a visiting astronomer can schedule his or her nights' observations.
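
As described above, an OB ties together a target, user-defined constraints, and one or more templates with their parameter values filled in. The sketch below models that structure with Python dataclasses; the field names, template name, and status values are assumptions chosen for illustration, not the actual DFS schema.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Template:
        # An instrument mode exposed by ESO; the parameter values are
        # filled in by the astronomer when the OB is created.
        name: str
        parameters: Dict[str, str] = field(default_factory=dict)

    @dataclass
    class ObservationBlock:
        ob_id: int
        target: str                    # typically a single astronomical object
        constraints: Dict[str, float]  # user-defined scheduling constraints
        templates: List[Template] = field(default_factory=list)
        status: str = "defined"        # collects state as the OB flows through the DFS

    # One OB combining target, constraints and a single exposure template.
    ob = ObservationBlock(
        ob_id=1001,
        target="NGC 1068",
        constraints={"seeing_arcsec": 0.8, "moon_distance_deg": 30.0},
        templates=[Template(name="img_obs_exposure",
                            parameters={"filter": "R", "exptime": "300"})],
    )
    print(ob.ob_id, ob.status)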

VLT Control System

Observation Blocks contain instructions and data that can be executed by the VLT Control System (VCS), resulting in telescope movements, instrument control actions, and data being taken by the VLT instruments. Completion status for this execution is signaled back to the DFS, and the data are stored in the on-line archive system.
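
A highly simplified sketch of this execution contract follows: the control system runs each template of an OB, the resulting frames go to the on-line archive, and a completion status is returned to the DFS. All names and the status value are illustrative assumptions.

    from collections import namedtuple

    Template = namedtuple("Template", "name parameters")
    ObservationBlock = namedtuple("ObservationBlock", "ob_id templates")

    def execute_ob(ob, archive):
        """Run each template of an OB in order, archive the frames produced,
        and return a completion status for the DFS (illustrative only)."""
        for template in ob.templates:
            # A real system would drive telescope and instrument control here;
            # this sketch only fabricates an identifier for the resulting frame.
            frame = "OB%d_%s.fits" % (ob.ob_id, template.name)
            archive.append(frame)
        return {"ob_id": ob.ob_id, "status": "completed"}

    online_archive = []
    ob = ObservationBlock(ob_id=1001,
                          templates=[Template("img_obs_exposure", {"exptime": "300"})])
    print(execute_ob(ob, online_archive))
    print(online_archive)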

Science Archive

The Science Archive stores all raw frames produced by the instruments, as well as reference calibration data and log files, including maintenance and ambient-conditions logs. It is available to archive researchers and astronomers for catalog access, for retrieval of scientific data as they become available after the end of the proprietary period, and for retrieval of calibration data, instrument data, and logs as soon as they have been processed and verified by Data Flow Operations.
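
The retrieval rule sketched below illustrates the distinction made above: science frames are released to the general user only after their proprietary period, while calibration data are not embargoed. The one-year period, the PI exemption, and the field names are assumptions made for the purposes of the example.

    from datetime import date, timedelta

    PROPRIETARY_PERIOD = timedelta(days=365)   # assumed length, for illustration

    def is_retrievable(frame, requester_is_pi, today):
        """Decide whether an archive frame may be delivered to a requester.

        `frame` is a dict with 'category' ('SCIENCE' or 'CALIB') and
        'obs_date' (a datetime.date); the keys are hypothetical.
        """
        if requester_is_pi or frame["category"] == "CALIB":
            return True                        # PIs and calibration data: no embargo
        return today >= frame["obs_date"] + PROPRIETARY_PERIOD

    frame = {"category": "SCIENCE", "obs_date": date(1999, 4, 1)}
    print(is_retrievable(frame, requester_is_pi=False, today=date(1999, 9, 1)))  # False
    print(is_retrievable(frame, requester_is_pi=False, today=date(2000, 5, 1)))  # True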

Data Pipelines

A subset of instrument modes is supported by Data Pipelines that remove instrumental signatures and apply physical-unit calibrations. The Data Organizer assembles the calibration and raw data to be processed by the pipeline, following the analysis recipes specified in a reduction block (RB). Data analysis recipes are scripts written for a particular data reduction system (DRS) and are the point where the DFS makes contact with that DRS.
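
The sketch below illustrates the kind of grouping the Data Organizer performs: each raw science frame is bundled with matching calibration frames and the name of the recipe that will reduce it, forming a reduction block. The recipe name, keywords, and matching rule are assumptions, far simpler than the real classification rules.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ReductionBlock:
        """One unit of pipeline work: a recipe, its raw inputs and calibrations."""
        recipe: str
        raw_frames: List[str]
        calib_frames: List[str] = field(default_factory=list)

    def organize(raw_frames, calib_frames):
        """Group raw frames with flat fields taken through the same filter.

        Frames are dicts with 'name', 'type' and 'filter' keys (hypothetical);
        a real Data Organizer applies far richer classification rules.
        """
        blocks = []
        for raw in raw_frames:
            flats = [c["name"] for c in calib_frames
                     if c["type"] == "FLAT" and c["filter"] == raw["filter"]]
            blocks.append(ReductionBlock(recipe="imaging_reduce",
                                         raw_frames=[raw["name"]],
                                         calib_frames=flats))
        return blocks

    raws = [{"name": "sci_0001.fits", "type": "SCIENCE", "filter": "R"}]
    calibs = [{"name": "flat_R.fits", "type": "FLAT", "filter": "R"},
              {"name": "flat_B.fits", "type": "FLAT", "filter": "B"}]
    for rb in organize(raws, calibs):
        print(rb)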

Quality Control

Every VLT instrument has a calibration plan which specifies the series of data-taking actions necessary to properly calibrate raw data and monitor instrument performance. OBs corresponding to these calibration observations are created by the Quality Control system. The resulting raw data are processed by Quality Control and used to repopulate the calibration database, track instrument performance in the short and long term, and maintain the accuracy of the instrument simulators.
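
A minimal sketch of the performance-tracking step named above: time-ordered calibration measurements are compared against a nominal value and a tolerance, and outliers are flagged for follow-up. The quantity, numbers, and thresholds below are made up for illustration.

    def check_trend(measurements, nominal, tolerance):
        """Flag calibration measurements that drift outside a tolerance band.

        `measurements` is a time-ordered list of (date_string, value) pairs;
        the nominal value and tolerance are illustrative, not real QC limits.
        """
        return [(when, value) for when, value in measurements
                if abs(value - nominal) > tolerance]

    # e.g. nightly bias level of a detector, in ADU (made-up numbers)
    bias_levels = [("1999-04-01", 201.3), ("1999-04-02", 200.8),
                   ("1999-04-03", 207.9), ("1999-04-04", 201.1)]
    print(check_trend(bias_levels, nominal=201.0, tolerance=3.0))
    # -> [('1999-04-03', 207.9)], a night worth investigating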

Quality Control also defines new technical programs for the instruments and submits them for telescope-time approval, and it provides on-line systems at the telescope to help Data Flow Operations staff assess whether service-mode data have been taken under the conditions specified by the astronomer. DFS Quality Control takes place both at the VLT and in Garching, using the same pipeline infrastructure.
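
The on-line check described above amounts to comparing the measured ambient conditions with the constraints the astronomer attached to the OB. A minimal sketch of that comparison follows; the constraint names are hypothetical.

    def conditions_fulfilled(requested, measured):
        """Return (ok, violations) for a service-mode constraint check.

        `requested` holds the upper limits specified by the astronomer and
        `measured` the conditions during execution; the keys used here
        (seeing, airmass) are illustrative.
        """
        violations = {}
        for key, limit in requested.items():
            value = measured.get(key, float("inf"))   # missing measurement counts as a violation
            if value > limit:
                violations[key] = value
        return (not violations), violations

    requested = {"seeing_arcsec": 0.8, "airmass": 1.5}
    measured = {"seeing_arcsec": 1.1, "airmass": 1.3}
    ok, bad = conditions_fulfilled(requested, measured)
    print(ok, bad)   # False {'seeing_arcsec': 1.1} -> the OB must be repeated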

DFS Development

The first Unit Telescope of the VLT (UT1) saw first light on 25 May 1998 using the VLT test camera. During the remainder of 1998, the first two VLT instruments (FORS and ISAAC) were commissioned. Science operations on the first Unit Telescope (UT1, ANTU) commenced with both instruments on 31 March 1999.

Prototypes of the various components of the DFS were first tested on ESO's New Technology Telescope. The DFS software was subsequently installed on the first Unit Telescope of the VLT (ANTU) and commissioned there. The DFS for ANTU commenced operations with the Call for Proposals for the first VLT semester on 1 August 1998.

