ADASS '97 oral paper abstracts

The VLT Science Archive System

M.A. Albrecht, E. Angeloni, A. Brighton, J. Girvan, F. Sogni, A.J. Wicenec, H. Ziaeepour (ESO)

The ESO Very Large Telescope (VLT) will deliver a Science Archive of astronomical observations exceeding the 100 Terabyte mark within its first five years of operations.

In order to safely store and subsequently maximize the scientific return of these data, ESO is undertaking the design and development of both On-Line and Off-Line Archive Facilities. The main objective of these facilities is to provide the infrastructure needed to offer the Science Archive as an additional instrument of the VLT. The main capabilities of the system will be a) handling of very large data volumes, b) routine computer-aided feature extraction from raw data, c) a data mining environment operating on both data and extracted parameters, and d) an Archive Research Programme to support user-defined projects.

This talk reviews the current planning and development state of the VLT Science Archive project.


Astronomy On-Line - the world's biggest astronomy event on the World-Wide-Web

R. Albrecht (ESO/ESA/ST-ECF), R. West (ESO), C. Madsen (ESO)

This educational programme was organised as a collaboration between ESO, the European Association for Astronomy Education (EAAE) and the European Union (EU) during the 4th European Week for Scientific and Technological Culture. Astronomy On-Line brings together thousands of students from all over Europe and other continents. Learning to use the vast resources of tomorrow's communication technology, they also experience the excitement of real-time scientific adventure and the virtues of international collaboration. The central web site is hosted by ESO, and there are satellite sites in all participating countries. Astronomy On-Line features an electronic newspaper which reports on current astronomical events and provides hotlinks to appropriate sites. The "Marketplace" provides a gateway to collaborative projects, astronomical data and software, and to professional astronomers in the different participating countries who have agreed to support the project.


TAKO: Astro-E's Mission Independent Scheduling Suite

A. Antunes (HSTX/GSFC), P. Hilton (Hughes/ISAS), A. Saunders (GSFC)

The next generation of Mission Scheduling software will be cheaper, easier to customize for a mission, and faster than current planning systems. TAKO ("Timeline Assembler, Keyword Oriented", or in Japanese, "octopus") is our in-progress suite of software that takes database input and produces mission timelines. Our approach uses openly available hardware, software, and compilers, and applies current scheduling and N-body methods to reduce the scope of the problem. A flexible set of keywords lets the user define mission-wide and individual target constraints, and alter them on the fly. Our goal is that TAKO will be easily adapted for many missions, and will be usable with a minimum of training. The especially pertinent deadline of Astro-E's launch motivates us to convert theory into software within 2 years. The design choices, methods for reducing the data and providing flexibility, and steps to get TAKO up and running for any mission are discussed herein.


Suggested presentation: Demo

European Southern Observatory - MIDAS + SKYCAT

K. Banse, M. Albrecht (ESO)

The latest version of On-line Midas as used for ESO's Dataflow System will be shown. The Archive group will demonstrate SKYCAT, the catalog display tool which is based on ESO's Real Time Display (RTD).

P.S. Also, the 96NOV Midas CD-ROM should be ready.


Invited talk

Parkes Multibeam realtime object-oriented data reduction using AIPS++

D. Barnes (University of Melbourne), L. Staveley-Smith, T. Ye, T. Oosterloo (Australia Telescope National Facility (ATNF))

We present algorithms and their implementation details for the Australia Telescope National Facility (ATNF) Parkes Multibeam Software. The new thirteen-beam Parkes 21 cm Multibeam Receiver is being used for the neutral hydrogen (HI) Parkes All Sky Survey (HIPASS). This survey will search the entire southern sky for neutral hydrogen in the redshift range -1200 km/s to +12600 km/s, with a limiting column density of approximately 5 x 10^17 atoms per square centimetre. Observations for the survey began in late February 1997 and will continue through to the year 2000.

A complete reduction package for the HIPASS survey data has been developed, based on the AIPS++ library. The major software component is realtime, and uses advanced inter-process communication coupled to a graphical user interface (GUI), provided by AIPS++, to apply bandpass removal, flux calibration, velocity frame conversion and spectral smoothing to 26 spectra of 1024 channels each, every five seconds. AIPS++ connections have been added to ATNF-developed visualization software to provide on-line visual monitoring of the data quality. The non-realtime component of the software is responsible for gridding the spectra into position-velocity cubes; typically 200000 spectra are gridded into an 8 x 8 degree cube.
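The gridding stage lends itself to a simple illustration. The following sketch is not the Multibeam software itself (which is built on the AIPS++ library); the function name, the nearest-cell weighting scheme and the 4-arcminute cell size are assumptions made purely for illustration.

    import numpy as np

    def grid_spectra(ra, dec, spectra, ra0, dec0, cube_size=8.0,
                     cell=4.0 / 60.0, n_chan=1024):
        """Grid 1-D spectra onto a position-velocity cube by nearest-cell
        accumulation.  ra/dec are in degrees; spectra is (n_spec, n_chan)."""
        n_pix = int(round(cube_size / cell))
        cube = np.zeros((n_pix, n_pix, n_chan))
        weight = np.zeros((n_pix, n_pix))
        # Convert sky offsets from the cube centre into pixel indices.
        ix = np.round((np.asarray(ra) - ra0) * np.cos(np.radians(dec0)) / cell
                      + n_pix / 2).astype(int)
        iy = np.round((np.asarray(dec) - dec0) / cell + n_pix / 2).astype(int)
        for k in range(len(spectra)):
            if 0 <= ix[k] < n_pix and 0 <= iy[k] < n_pix:
                cube[iy[k], ix[k]] += spectra[k]
                weight[iy[k], ix[k]] += 1.0
        # Normalise cells that received at least one spectrum.
        nz = weight > 0
        cube[nz] /= weight[nz][:, None]
        return cube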


Object-Relational DBMSs for Large Astronomical Catalogues Management

A. Baruffolo, L. Benacchio (Astronomical Observatory of Padova)

Astronomical catalogues containing from a million up to hundreds of millions of records (e.g. Tycho, GSC-I, USNO-A 1.0) are becoming commonplace. While they are of fundamental importance for supporting the operations of current and future large telescopes and space missions, they are also powerful research tools for galactic and extragalactic astronomy.

Since even larger catalogues will be released in a few years (e.g. the GSC-II), researchers are faced with the problem of accessing these databases in a general but efficient manner, in order to be able to fully exploit their scientific content.

Traditional database technologies (i.e. relational DBMSs) have proven to be inadequate for this task. Segmentation of catalogues into a catalogue-specific file structure accessed by a set of programs provides fast access but only limited query capabilities. Other approaches, based on new access technologies, must thus be explored.

In this paper we describe the results of our pilot project aimed at assessing the feasibility of employing Object-Relational DBMSs for the management of large astronomical catalogues. In particular, we will show that the database query language can be extended with astronomical functionality to support typical astronomical queries. Further, access methods based on spatial data structures can be employed to speed up the execution of queries containing astronomical predicates.
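As an illustration of the kind of astronomical predicate such an extended query language would evaluate, the sketch below implements a cone-search filter in Python; in an Object-Relational DBMS this test would typically be registered as a user-defined function and answered through a spatial index rather than the brute-force scan used here. All names are illustrative, not those of the actual system.

    import math

    def angular_separation(ra1, dec1, ra2, dec2):
        """Great-circle separation in degrees between two sky positions."""
        ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
        cos_sep = (math.sin(dec1) * math.sin(dec2)
                   + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

    def cone_search(rows, ra0, dec0, radius_deg):
        """Brute-force equivalent of a hypothetical 'within_circle' predicate;
        an ORDBMS would answer this via a spatial index instead."""
        return [r for r in rows
                if angular_separation(r["ra"], r["dec"], ra0, dec0) <= radius_deg]

    # Hypothetical usage: rows would come from the catalogue table.
    stars = [{"id": 1, "ra": 10.0, "dec": -5.0}, {"id": 2, "ra": 10.3, "dec": -5.1}]
    print(cone_search(stars, 10.0, -5.0, 0.25))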


Parallel tree N-body code: data distribution and DLB on CRAY T3D for large simulations

U. Becciani, V. Antonuccio-Delogu (Obs. Catania), G. Erbacci (CINECA), R. Ansaloni (Silicon Graphics Italy), M. Gambera (Obs. Catania), A. Pagliaro (Inst. Astr. Catania)

During the last 3 years we have developed an N-Body code to study the origin and the evolution of the Large Scale Structure of the Universe (Becciani et al. 1996, 1997). The code, based on the Barnes-Hut tree algorithm, has been developed under the CRAFT environment to share work and data among the PEs involved in the run. The main purpose of this work was the study of the optimal data distribution in the T3D memory, and of a strategy for Dynamic Load Balancing (DLB), in order to obtain good performance when running simulations with more than 10 million particles. To maximize the number of particles updated per second at each step, we studied the optimal data distribution and the criterion for choosing the PE executing the force computation phase, so as to reduce the load imbalance. The results of our tests show that the step duration depends on two main factors: data locality and T3D network contention. By increasing data locality we are able to minimize the step duration. In very large simulations, network contention gives rise to an unbalanced load. The DLB consists of an automatic scheme: each PE executes the force computation phase only for a fixed portion N of the bodies residing in its local memory, and the computation for all the remaining bodies is shared among all the PEs. The results obtained show that, for a fixed number of PEs and particles, the same value of N gives the best performance in both uniform and clustered conditions. This means that it is possible to fix this quantity, which can then be adopted at run time without introducing any significant overhead, to obtain good Dynamic Load Balancing.
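A minimal sketch of the fixed-fraction load-balancing rule described above may help; this is not the CRAFT/T3D implementation, and the function names and numbers are purely illustrative.

    def split_work(local_bodies_per_pe, local_fraction):
        """Split each PE's bodies into a locally computed part and a shared pool.

        local_bodies_per_pe : number of bodies resident on each PE
        local_fraction      : the fixed fraction N computed locally on the owner PE
        Returns (local_counts, shared_pool_size).
        """
        local_counts = [int(n * local_fraction) for n in local_bodies_per_pe]
        shared_pool = sum(n - c for n, c in zip(local_bodies_per_pe, local_counts))
        return local_counts, shared_pool

    def assign_shared(shared_pool, n_pe):
        """Spread the shared pool evenly over all PEs (remainder to the first PEs)."""
        base, extra = divmod(shared_pool, n_pe)
        return [base + (1 if pe < extra else 0) for pe in range(n_pe)]

    # Illustrative numbers only: 4 PEs with uneven local populations, 80% local.
    local, pool = split_work([3000, 2500, 1200, 900], 0.8)
    print(local, pool, assign_shared(pool, 4))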


Teaching Astronomy via the Internet

L. Benacchio, M. Brolis (Padova Astronomical Observatory), I. Saviane (Padova Department of Astronomy)

A project is being carried out at the Padova Astronomical Observatory, in partnership with Italian Telecom, whose aim is to supply high-quality multimedia educational material and tools to public schools (ages 14-18) via the Internet. A WWW server has been set up, and in the early experimental phase a number of schools in the city area will be connected to the Observatory and hence to the Internet. Teachers and students will use it for the annual course (1997/98) in astronomy.

Our purpose is to fill a gap among the astronomical WWW sites currently active, by providing a carefully designed server which will deliver reliable information in a structured way and, at the same time, take full advantage of the medium. There appear to be no sites devoted to the explanation of basic astronomy and astrophysics at the middle school level.

Our educational approach is based on the so-called Karplus cycle, that is: introduction of new concepts by means of proposed experiments; discussion and selection of the discovered laws; and 'correct' explanation of the observations and application to new situations. To prevent the subject from trying to fit the new knowledge into his/her pre-existing incorrect schemes, a preliminary phase removes existing misconceptions. The knowledge is then introduced according to a hierarchical order of concepts.

The medium involved allows the full exploitation of this approach, since it permits direct experimentation by means of animation, java applets, and personalized answers to the proposed questions. Also, automatic tests and evaluations can be straightforwardly implemented. In this respect, it has a clear advantage over the traditional static book. Finally, the user can choose his/her own pace and path through the material offered.

We also propose a number of hands-on activities which extend and reinforce the concepts, and which require the presence of the teacher as an active guide.


Suggested presentation: Demo

Demonstration of Starlink Software

M. Bly, R. Warren-Smith (Rutherford Appleton Laboratory)

We will demonstrate the latest Starlink applications which will be available on the late summer Starlink CD-ROM release. The highlights include FIGARO with handling of error data, the GAIA GUI, an enhanced CURSA (catalogue access), and FIGARO running from the IRAF CL. Applications using the NDF data library will be able to work with non-NDF (e.g. IRAF) data formats using on-the-fly data conversion.

The demo needs:

SUN Ultra model 140 workstation or higher, with 8-bit colour TGX graphics (or equivalent)
128Mb memory
4Gb disk (2Gb to be available for software and data)
20-inch colour display
4x or better CD-ROM drive
Solaris 2.5 operating system with CDE (Common Desktop Environment)
Sparc Compiler 4.2: Fortran 77, C and C++
Internet connection if available.


Nightly Scheduling of ESO's Very Large Telescope

A.M. Chavan (ESO), G. Giannone (Serco), D. Silva (ESO), T. Krueger, G. Miller (STScI)

A key challenge for ESO's Very Large Telescope (VLT) will be responding to changing observing conditions in order to maximize the scientific productivity of the observatory. For queued observations, the nightly scheduling will be performed by staff astronomers using an Operational Toolkit. This toolkit consists of a Medium Term Scheduler (MTS) and a Short Term Scheduler (STS), both integrated and accessible through a Graphical User Interface (GUI). The Medium Term Scheduler, developed by ESO, will be used to create candidate lists of observations based on different scheduling criteria. There may be different candidate lists based on seeing, priority, or any other criterion selected by the staff astronomer. An MTS candidate list is then selected and supplied to the Short Term Scheduler for detailed nightly scheduling. The STS uses the Spike scheduling engine, which was originally developed by STScI for use on the Hubble Space Telescope.
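As a rough illustration of the kind of selection a medium-term scheduler performs, the sketch below filters a queue of observations by a seeing criterion and ranks the survivors by priority; the field names and the exact selection rule are assumptions for illustration, not the Operational Toolkit's actual interface.

    def candidate_list(observations, max_seeing, key="priority"):
        """Keep observations whose seeing requirement tolerates the current
        conditions, ranked by the chosen key (field names are assumed)."""
        eligible = [o for o in observations if o["required_seeing"] >= max_seeing]
        return sorted(eligible, key=lambda o: o[key])

    queue = [
        {"id": "OB-001", "required_seeing": 0.6, "priority": 2},
        {"id": "OB-002", "required_seeing": 1.2, "priority": 1},
        {"id": "OB-003", "required_seeing": 0.9, "priority": 3},
    ]
    # Current seeing of 0.8" excludes OB-001, which needs better conditions.
    print(candidate_list(queue, max_seeing=0.8))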


Invited talk:

CyberHype or Educational Technology - What is being learned from all those BITS?

C. Christian

I will discuss various information technology methods being applied to science education and public information. Of interest to the group at STScI and our collaborators is how science data can be mediated to the non-specialist client/user. In addition, I will draw attention to interactive and/or multimedia tools being used in astrophysics that may be useful, with modification, for educational purposes. In some cases, straightforward design decisions early on can improve the wide applicability of the interactive tool.


How to exploit an astronomical gold mine: Automatic classification of Hamburg/ESO Survey spectra

N. Christlieb (Hamburg Obs.), G. Graeshoff (MPI History of Science, Berlin / Univ. Hamburg), A. Nelke, A. Schlemminger (Univ. Hamburg), L. Wisotzki (Hamburg Obs.)

We present methods for the automatic one-dimensional classification of digitized objective prism spectra, developed in the course of the Hamburg/ESO Survey (HES) for bright QSOs. The HES covers about 10,000 deg^2 in the southern extragalactic sky, yielding several million usable spectra in the range 12 ≲ B ≲ 17. The resolution of the HES spectra is ~15 Å at Hγ, permitting detection of the strongest stellar absorption features.

Our astronomical aims are:

• Construction of complete samples of quasar candidates by identification of objects that do not have stellar absorption patterns, via classification with the Bayes rule plus a reject option.

• Construction of complete samples of rare stellar objects, e.g. white dwarfs, horizontal branch A-stars, or extremely metal-poor halo stars. Here a minimum cost rule is used.

• "Simple" classification of all HES spectra with the Bayes rule, e.g. to provide a data basis for cross-identification with surveys in other wavelength ranges.

The feature space used for classification consists of equivalent widths of stellar absorption features.
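A minimal sketch of the Bayes rule with a reject option over such a feature vector is given below. It assumes Gaussian class-conditional densities and an illustrative rejection threshold; these are our assumptions for the example and not necessarily those of the HES pipeline.

    import numpy as np

    def classify_with_reject(x, class_means, class_covs, priors, reject_threshold=0.99):
        """Bayes rule with a reject option over a feature vector of equivalent
        widths: assign the class with the highest posterior unless no posterior
        exceeds the threshold, in which case the spectrum is rejected (here,
        flagged as lacking a clear stellar absorption pattern)."""
        likes = []
        for mu, cov in zip(class_means, class_covs):
            d = x - mu
            inv = np.linalg.inv(cov)
            norm = 1.0 / np.sqrt((2 * np.pi) ** len(x) * np.linalg.det(cov))
            likes.append(norm * np.exp(-0.5 * d @ inv @ d))
        post = np.array(likes) * np.array(priors)
        post /= post.sum()
        best = int(np.argmax(post))
        return best if post[best] >= reject_threshold else "reject"

    # Illustrative two-class example in a two-feature space of equivalent widths.
    means = [np.array([5.0, 3.0]), np.array([0.5, 0.2])]
    covs = [np.eye(2), np.eye(2) * 0.25]
    print(classify_with_reject(np.array([0.6, 0.3]), means, covs, priors=[0.9, 0.1]))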

We report on the discovery of the extremely metal-poor halo star HE 2319-0852, [Fe/H] = -3.5 ± 0.5, which was found in a test survey for these objects on a few of our plates, using simulated spectra as the learning sample.


Building Software Systems from Heterogeneous Components

M. Conroy, E. Mandel, J. Roll (SAO)

Over the past few years there has been a movement within astronomical software towards "Open Systems". This activity has resulted in the ability of individual projects or users to build customized processing systems from a variety of existing components. We will present examples of user customizable systems that can be built from existing systems where the only requirements on the components are:

a) Use of a common parameter interface library.

b) Use of FITS as the input/output file format.

c) Unix + X-windows environment

With these three minimal assumptions it is possible to build a customized image-display driven data analysis system as well as automated data reduction pipelines.
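To illustrate how loosely coupled such a system can be, here is a hedged sketch of chaining two stand-alone FITS-in/FITS-out tasks through a shared name=value parameter convention; the tool names and parameter spelling are hypothetical, not those of any particular package.

    import subprocess

    def run_task(task, params):
        """Run an external FITS-in/FITS-out task, passing parameters on the
        command line in a common 'name=value' form (task names are hypothetical)."""
        args = [task] + [f"{k}={v}" for k, v in params.items()]
        subprocess.run(args, check=True)

    # A two-step pipeline: each component only needs to agree on FITS I/O
    # and the shared parameter convention, not on a common internal library.
    run_task("flatfield", {"input": "raw.fits", "output": "flat.fits"})
    run_task("detect", {"input": "flat.fits", "output": "sources.fits", "thresh": 5.0})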


Suggested presentation: Demo

Demonstration of AIPS++

T. Cornwell, B. Glendenning (NRAO), J. Noordam (NFRA)

AIPS++ is a package for radio-astronomical data reduction now under development by a consortium of radio observatories. It is currently in beta release and is expected to be publicly released in late 1997.

Description of demo:

We will demonstrate the beta version of AIPS++. This will consist of a demonstration by an AIPS++ Project Member at regularly scheduled times. In addition, we will make the system available for use by others.


VRML and Collaborative Environments: New Tools for Networked Visualization

R.M. Crutcher, R.L. Plante, and P. Rajlich (National Computational Science Alliance/Univ. of IL)

We present two new applications that engage the network as a tool for astronomical research and/or education. The first is a VRML (virtual reality modeling language) server which allows users over the Web to interactively create three-dimensional (3D) visualizations of FITS images contained in the NCSA Astronomy Digital Image Library (ADIL). The server's Web interface allows users to select images from the ADIL, fill in processing parameters, and create renderings featuring isosurfaces, slices, contours, and annotations; the often extensive computations are carried out on an NCSA SGI supercomputer server without the user having an individual account on the system. The user can then download the 3D visualizations as VRML files, which may be rotated and manipulated locally on virtually any class of computer. The second application is the ADILBrowser, a part of the NCSA Horizon Image Data Browser Java package. ADILBrowser allows a group of participants to browse images from the ADIL within a collaborative session. The collaborative environment is provided by the NCSA Habanero package which includes text and audio chat tools and a white board. The ADILBrowser is just an example of a collaborative tool that can be built with the Horizon and Habanero packages. The classes provided by these packages can be assembled to create custom collaborative applications that visualize data either from local disk or from anywhere on the network.


Fitting and Modeling of the AXAF Data with the ASC Fitting Application

S. Doe, A. Siemiginowska, M. Ljungberg, W. Joye (SAO)

The AXAF mission will provide X-ray data with unprecedented spatial and spectral resolution. Because of the high quality of these data, the AXAF Science Center will provide a new data analysis system, part of which is a new fitting application. Our intent is to enable users to do fitting that is too awkward with, or beyond the scope of, existing astronomical fitting software. Our main goals are: 1) to take advantage of the full capabilities of AXAF by providing a more sophisticated modeling capability (i.e., models that are f(x,y,E,t), models to simulate the response of AXAF instruments, and models that enable "joint-mode" fitting, i.e., combined spatial-spectral or spectral-temporal fitting); and 2) to provide users with a wide variety of models, optimization methods, and fit statistics. In this paper, we discuss the use of an object-oriented approach in our implementation, the current features of the fitting application, and the features scheduled to be added in the coming year of development. Current features include: an interactive, command-line interface; a modeling language, which allows users to build models from arithmetic combinations of base functions; a suite of optimization methods and fit statistics; the ability to perform fits to multiple data sets simultaneously; and an interface with SM and SAOtng to plot or image data, models, and/or residuals from a fit. We currently provide a modeling capability in one or two dimensions, and have recently made an effort to perform spectral fitting in a manner similar to XSPEC. We also allow users to dynamically link the fitting application to algorithms written by users. Our goals for the coming year include: incorporating the XSPEC model library as a subset of models available in the application; enabling "joint-mode" analysis; and adding support for new algorithms.
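To make the idea of building models from arithmetic combinations of base functions concrete, here is a hedged sketch using Python operator overloading; it is not the application's actual modeling language, and the base functions and their fixed parameters are invented for illustration (a real fitting application would expose the parameters to the optimizer).

    import numpy as np

    class Model:
        """A base model component that supports '+' and '*' composition."""
        def __init__(self, func):
            self.func = func
        def __call__(self, x):
            return self.func(x)
        def __add__(self, other):
            return Model(lambda x: self(x) + other(x))
        def __mul__(self, other):
            return Model(lambda x: self(x) * other(x))

    # Hypothetical base functions with fixed parameters, for brevity.
    powerlaw = Model(lambda e: 2.0 * e ** -1.7)
    gauss = Model(lambda e: 0.5 * np.exp(-0.5 * ((e - 6.4) / 0.1) ** 2))
    absorb = Model(lambda e: np.exp(-0.2 * e ** -3))

    composite = absorb * (powerlaw + gauss)   # e.g. absorbed power law plus a line
    print(composite(np.linspace(1.0, 10.0, 5)))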


New Capabilities of the ADS Abstract and Article Service

G. Eichhorn, A. Accomazzi, C.S. Grant, M.J. Kurtz, S.S. Murray (SAO)

The ADS abstract service at http://adswww.harvard.edu has been updated considerably in the last year. New capabilities in the search engine include searching for multi-word phrases and searching for various logical combinations of search terms. Through optimization of the custom-built search software, search times have decreased by a factor of 4 in the last year.

The WWW interface now uses WWW cookies to store and retrieve individual user preferences. This allows our users to set preferences for printing, accessing mirror sites, fonts, colors, etc. Information about most recently accessed references allows customized retrieval of the most recent unread volume of selected journals. The information stored in these preferences is kept completely confidential and is not used for any other purposes.

Two mirror sites (at the CDS in Strasbourg, France and at NAO in Tokyo, Japan) provide faster access for our European and Asian users.

To include new information in the ADS as fast as possible, new indexing and search software was developed to allow updating the index data files within minutes of receipt of time critical information (e.g., IAU Circulars which report on supernova and comet discoveries).

The ADS is currently used by over 10,000 users per month, who retrieve over 4.5 million references and over 250,000 full article pages each month.


Invited talk:

Object-Oriented Experiences with GBT Monitor and Control

J.R. Fisher (NRAO)

The Green Bank Telescope Monitor and Control software group adopted object-oriented design techniques as implemented in C++. The experience has been generally positive, but there certainly have been many lessons learned in the process. The long analysis phase of the OO approach has led to a fairly coherent software system and a lot of module (class) reuse. Many devices (front-ends, spectrometers, LO's, etc.) share the same software structure, and implementing new devices in the latter part of the project has been relatively easy, as is to be hoped with an OO design. One disadvantage of a long design phase is that it is hard to evaluate progress and to have much sense of how the design satisfies the real user needs. In retrospect, the project might have been divided into smaller units with tangible products at early and mid stages of the project. The OO process is only as good as the requirement specifications, and the process has had to deal with continually emerging requirements all through the analysis, design, and implementation phases. Changes and fixes to core software modules have not been too painful, but they do require a robust software version control system. Large- and medium-scale tests of the system in the midst of the implementation phase have required quite a bit of time and coordination effort. This has tended to inhibit progress evaluations.


News on the ISOPHOT Interactive Analysis (PIA)

C. Gabriel (ESA-SSD)

The ISOPHOT Interactive Analysis system, a calibration and scientific analysis tool for the ISOPHOT instrument on board ESA's Infrared Space Observatory (ISO), has been further developed while ISO is in operation.

After 18 months of ISO operations, considerable experience has been gained through the use of PIA, which has led to several new features in the package. This experience has been gained not only by the ISOPHOT Instrument Dedicated Team in its tasks of, e.g., calibration, instrument performance checking and refinement of analysis techniques, but also by a large number of ISOPHOT observers in around 100 astronomical institutes all over the world. PIA has been distributed freely for more than a year to all astronomers wishing to use it for ISOPHOT data reduction and analysis. The feedback from the different users is reflected not only in the extension of the analysis capabilities but also in a friendlier graphical interface, better documentation, and an easier installation. PIA has thus become not only a very powerful calibration tool but also the software tool of choice for the scientific analysis of ISOPHOT data.

In this paper we concentrate on some of the PIA enhancements in scientific analysis, in documentation, and in the related general service to the astronomical community.


Distributed Searching of Astronomical Databases with Pizazz

K. Gamiel (National Computational Science Alliance/Univ. of IL)

The NCSA Pizazz SDK is an information retrieval communications toolkit that includes code and applications for easily integrating existing database systems into a globally accessible, open standards-based system. The toolkit includes a TCP-based server and information retrieval protocol engine that handles all network communication between client and server. The server is designed as a drop-in application, extending the functionality of legacy database systems and creating a global infrastructure of astronomical database resources. The toolkit uses the Z39.50 information retrieval protocol.


Achieving Stable Observing Schedules in an Unstable World

M. Giuliano (STScI)

Operations of the Hubble Space Telescope (HST) require the creation of stable and efficient observation schedules in an environment where inputs to the plan can change daily. Operations must allow observers to adjust observation parameters after submitting the proposal. PIs must also be informed well in advance of the approximate date of an observation so they can plan for coordinated observations and data analysis. Scheduling is complicated by ongoing changes in the HST operational parameters and because the precise ephemeris for HST is not known in advance. Given these constraints, it is not possible to create a single static schedule of observations. Instead, scheduling should be considered an ongoing process which creates and refines schedules as required. Unlike other applications of replanning, the HST problem places a premium on ensuring that a replan minimally disturbs the existing plan. A process and architecture is presented which achieves these goals by dividing scheduling into long term and short term components. The long term scheduler, the main focus of this paper, provides approximate 4-8 week plan windows for observations. A plan window is a subset of an observation's constraint windows, and represents a best-effort commitment to schedule in the window. The long range planner ensures plan stability, balances resources, and provides the short term scheduler with the proper mixture of visits to create week-long schedules. The short term scheduler builds efficient week-long observation schedules by selecting observations whose plan windows are open within the week.
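A hedged sketch of the plan-window idea: the long term scheduler commits each observation to a window, and the short term scheduler picks up the observations whose windows overlap the week being built. The interval representation and field names below are assumptions for illustration only, not Spike's actual data structures.

    def overlaps(window, week):
        """True if two (start, end) intervals, e.g. in days, intersect."""
        return window[0] < week[1] and week[0] < window[1]

    def visits_for_week(observations, week):
        """Short-term selection: keep observations whose plan window (a subset
        of their constraint windows, assigned by the long-term scheduler)
        opens within the week being built."""
        return [o for o in observations if overlaps(o["plan_window"], week)]

    pool = [
        {"id": "obs-1", "plan_window": (100, 156)},   # roughly an 8-week window
        {"id": "obs-2", "plan_window": (160, 188)},   # roughly a 4-week window
    ]
    print(visits_for_week(pool, week=(150, 157)))     # only obs-1 qualifies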

The long term scheduler, as implemented within the Spike software system, provides support for achieving stable observation schedules. Spike models the planning process as a function which takes as input a previous plan, a set of proposals, and some search criteria, and produces as output a new plan. Stability is ensured by using the input plan to guide the creation of the new plan. Through this mechanism Spike can handle instabilities such as changed observation specifications, out-of-date observation products, and errors in loading observation specifications. Special routines are provided for planning, and for ensuring stability, for observations linked by timing requirements (e.g. observation 2 after observation 1 by 6-8 days). Spike provides a combined heuristic and stochastic search engine with user-defined weights for finding near-optimal plans.


Suggested presentation: Demo

New applications of Artificial Neural Networks in stellar spectroscopy

R. Gupta, R.K. Gulati (IUCAA), H.P. Singh (University of Delhi)

Recently, Artificial Neural Networks (ANNs) have been proved to be a very efficient technique for stellar spectral classification in spectral regions of UV, Optical and IR. Various groups including ours have used this technique with the main aim to evolve an automated procedure for use with large upcoming stellar spectral libraries which will be the major outcome of several ongoing surveys being undertaken at many astronomical observatories. In an attempt to explore newer areas of applications, we have extended this technique to obtain stellar atmospheric parameter Teff; determination of a third dimension of classification from UV data i.e. color excess E(B-V) and applying Principal Component Analysis (PCA) as a pre-processor before using the ANN on spectral data. In the application of stellar atmospheric effective temperature, we present the first ever attempt to obtain Teff for dwarf stars by ANN technique and obtain results comparable to earlier attempts by other statistical techniques. In the second