Press Release

New Method for Data Treatment Developed at ESO

How Future Astronomical Observations Will be Done

2 August 1996

The past four centuries have seen dramatic improvements in astronomical equipment: better and larger telescopes, more accurate and sensitive detectors and, not least, advanced space instruments with access to new spectral regions. However, until recently there has been little progress on another, equally important front, that of quantifying the unavoidable influence of this equipment on the astronomical data it produces. Astronomers have long wished to remove these `instrumental effects' efficiently from their data, in order to gain a clearer understanding of the objects in the Universe and their properties. Only now, with the advent of digital imaging techniques and powerful computers, can this fundamental problem finally be tackled efficiently.

Two researchers at the ESO Headquarters, Michael R. Rosa of the Space Telescope European Co-ordinating Facility (ST/ECF [1]) and Pascal Ballester of the Data Management Division (DMD), are now developing a new approach to this age-old problem. These results are important for the future use of the ESO Very Large Telescope (VLT), the Hubble Space Telescope (HST) and other large facilities as well [2].

The observational process

Observations are crucial to the progress of all natural sciences, including astronomy. Nevertheless, the properties of the observed objects are rarely revealed directly.

First, observational data are gathered at the telescopes with instruments such as cameras and spectrophotometers. These `raw' data are then processed with advanced computer programs to produce scientifically meaningful data, which are finally scrutinized by the astronomers in order to learn more about the observed celestial objects.

A basic problem in this chain is the influence of the telescopes and instruments on the data they produce. The `raw' observational data carry the marks not only of the celestial objects that are observed, but also of the `recording equipment' and, in the case of ground-based observations, of the atmospheric conditions as well.

These disturbing effects, for example straylight in the telescope and light absorption in the atmosphere, are referred to as the instrumental and atmospheric `signatures'. Only when they have been `removed' from the data can the data be properly interpreted. In fact, unless these effects are completely known, an observation may not result in any new knowledge at all or, even worse, may lead to erroneous results.

The history of astronomy contains many examples of the battle with instrumental effects; see also the Appendix. With the advent of new and advanced astronomical facilities like the VLT and HST, the need for an efficient solution of this fundamental problem has become particularly acute.

The calibration challenge

Until now, the usual procedure to tackle this common problem has been to observe so-called `reference sources' (celestial objects with well-known properties [3]) with exactly the same instrument and observational mode and under the same atmospheric conditions as the celestial object under study, referred to as the `target'.

A comparison between the `raw' observational data recorded for the reference sources and their known properties then makes it possible to determine, more or less accurately, the instrumental and atmospheric signatures. These effects can subsequently be removed, during the data processing, from the raw data obtained for the programme targets. This leaves behind - at least in theory - `clean data' which contain only the desired information about the celestial object under investigation. This fundamental observational procedure is known as `calibration'.
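
In essence, the correction amounts to dividing the target data by the response derived from the reference source. The short Python sketch below illustrates this empirical calibration step with entirely hypothetical numbers; it is a toy illustration only, not the actual ESO or ST-ECF software.

```python
# Toy illustration of empirical flux calibration: a reference star of known
# brightness is observed, the instrumental/atmospheric "signature" (here a
# simple response curve) is derived, and the same correction is applied to
# the target data.  All numbers are hypothetical.
import numpy as np

wavelength = np.linspace(400.0, 700.0, 7)  # nm, toy wavelength grid

# Known (tabulated) flux of the reference star and the counts it produced.
reference_true_flux = np.array([5.0, 5.2, 5.1, 5.0, 4.9, 4.8, 4.7])      # arbitrary flux units
reference_raw_counts = np.array([900., 1300., 1600., 1700., 1500., 1100., 700.])  # detector counts

# Response of instrument + atmosphere: counts recorded per unit of true flux.
response = reference_raw_counts / reference_true_flux

# Raw counts recorded for the science target under (ideally) the same conditions.
target_raw_counts = np.array([450., 700., 880., 930., 800., 560., 330.])

# `Clean' target spectrum: the signature is divided out.
target_calibrated_flux = target_raw_counts / response

for wl, flux in zip(wavelength, target_calibrated_flux):
    print(f"{wl:5.0f} nm : {flux:.2f} (calibrated flux, arbitrary units)")
```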

Nevertheless, serious limitations are inherent in such a calibration procedure. In principle, it is only logically valid if the reference source has the same properties as the target and both are observed under identical instrumental and atmospheric conditions. These requirements, however, are never fulfilled in practice. One way around this obstacle is to observe a sufficient number of reference sources whose properties are expected to bracket those of the targets. Likewise, repeated observations must be made whenever the observing conditions change. In this way, one hopes to obtain, by interpolation, estimates of the instrumental and atmospheric signatures at the time of the observation of the target.
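
For example, if a quantity such as the atmospheric extinction is measured on reference stars before and after the target exposure, its value at the time of the target observation can be estimated by simple interpolation. The sketch below, again with purely hypothetical numbers, shows the principle.

```python
# Illustrative only: a calibration quantity is measured on reference sources
# before and after the target exposure, and its value at the time of the
# target observation is estimated by linear interpolation.
import numpy as np

calib_times = np.array([22.0, 26.0])       # hours (UT) of the two calibration exposures
extinction_coeff = np.array([0.12, 0.18])  # magnitudes per airmass measured at those times

target_time = 23.5                         # hours (UT) of the target exposure
estimated_extinction = np.interp(target_time, calib_times, extinction_coeff)

print(f"Estimated extinction at target time: {estimated_extinction:.3f} mag/airmass")
```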

Until now, this empirical calibration process was the only one available. Unfortunately, it demands a great deal of valuable telescope time just for repeated observations of the reference sources, significantly diminishing the time available for observations of the scientifically important objects. Moreover, every time the instrument is even slightly changed or some condition is altered, a new calibration procedure must be carried out.

Maximizing observational efficiency

In just over one year from now, ESO will begin to operate the largest optical telescope ever built, the Very Large Telescope (VLT) at the new Paranal Observatory in Chile. Because of its enormous light-collecting area and superior optical quality, the VLT is destined to make a breakthrough in ground-based observational astronomy.

The demand by astronomers for observing time at this unique facility is overwhelming. Even with the unsurpassed number of clear nights at Paranal, each available minute will be extremely precious and everything must be done to ensure that no time will be lost to unnecessary actions.

This is a major challenge to the scientists. For instance, how long should an exposure last to ensure an optimum of new knowledge about the object observed? And how much time should be spent defining, in sufficient detail, the `signatures' of the atmosphere, the telescope and the instruments that must be removed from the `raw' data before the resulting `clean' data can be interpreted in a trustworthy way?

In short, how can the scientific return from the VLT and other telescopes such as the HST best be optimised? It is exactly for this reason that astronomers and engineers at ESO are now busy developing new methods of telescope operation and data analysis alongside the VLT instrument hardware itself.

The new solution by means of models

The appropriate strategy for resolving the inherent conflict between calibration demands and the time available for scientific observations is to obtain a physically correct understanding of the effects exerted on the data by the different instruments. In this way, it is possible to decide which calibration data are actually required and on which timescale they have to be updated. One can then use computer models of these instruments to predict calibration solutions that are valid for the full range of target properties and that handle environmental conditions properly. Such computer models can also be used to simulate observations.

This brings many benefits for the entire observational process. First, the astronomer can prepare observations and select instrumental modes and exposure times suited for optimal information return. Secondly, it provides confidence in the validity of the calibration process, and therefore in the cleanliness of the corrected data. Finally, once a theory about the target and its properties has been developed, one may simulate observations of a set of theoretical targets whose properties are slightly modified in order to study their influence on the raw data.

For the observatory there are also advantages. Optimization from the point of view of data analysis can now take place already during instrument design; calibration and data analysis procedures for any observational mode can be tested before real observations are obtained; and the maintenance staff can make sure that the instrument performs as expected and designed.

How far have we come along this road?

The present project is a close collaboration between the ESO Data Management Division (DMD) and the Space Telescope European Co-ordinating Facility (ST/ECF). The VLT and HST facilities have quite similar demands, because both observatories are committed to making data from a variety of instruments rapidly available to the world-wide community on a large scale. Once the basic concept had been defined, several groups at ESO started to develop models for particular instruments in order to study the general validity of this concept.

One of the VLT instruments under construction is the high-resolution echelle spectrograph UVES; first light is planned for 1999. The DMD model for this instrument now succeeds in predicting the geometrical properties of the observational data to better than one resolution element (pixel) of the detector. In parallel, the ST/ECF has produced a computer model for the low-resolution Faint Object Spectrograph (FOS) on HST. This software is tuned in particular to simulate the effects of internally scattered light, which is a serious nuisance for observations of faint targets.

Direct derivatives of such models are accurate exposure time calculators, which the observer can use to estimate the length of each exposure when preparing his or her observing programme. This is the time an electronic detector is exposed to the light of the astronomical object under study. If it is too short, the resulting image of the object will not contain enough information. On the other hand, if the exposure time is too long, the image may be degraded by too many artefacts from cosmic rays that hit the detector during the exposure, or the detector may saturate completely. Clearly, that time would be better used to observe other objects.

In order to plan correctly the length of the exposure for each astronomical target during an observing programme, it is necessary to estimate the total effect of the instrument and the atmosphere on the light produced by the target. For this, one must take into account the colour-dependent atmospheric absorption and the spreading of light by turbulence (seeing), the complete propagation of the light by the telescope mirrors and the various optical components of the instrument (reflection, diffusion, absorption), as well as the properties of the electronic detector. In order to give the scientific community wide access to such tools, the software for these calculators is being made available on the Internet.
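
As an illustration of the chain of effects just described, the following simplified Python sketch estimates how long an exposure must be to reach a given signal-to-noise ratio. It is only a schematic model with hypothetical numbers, not one of the ESO exposure time calculators.

```python
# Highly simplified exposure-time estimate: atmospheric extinction,
# telescope/instrument throughput and detector noise are chained together
# to predict the signal-to-noise ratio of an exposure.  Every number below
# is a hypothetical placeholder.
import math

def snr(exposure_s,
        source_photon_rate=200.0,      # photons/s collected by the aperture before losses (hypothetical)
        extinction_mag_per_airmass=0.15,
        airmass=1.3,
        throughput=0.35,               # mirrors + instrument optics + detector quantum efficiency
        sky_rate=30.0,                 # sky photons/s falling in the measurement aperture
        read_noise_e=5.0,              # detector read noise (electrons) per pixel
        n_pixels=25):
    """Photon-counting signal-to-noise ratio for a single exposure."""
    atmospheric_transmission = 10.0 ** (-0.4 * extinction_mag_per_airmass * airmass)
    signal = source_photon_rate * atmospheric_transmission * throughput * exposure_s
    sky = sky_rate * throughput * exposure_s
    noise = math.sqrt(signal + sky + n_pixels * read_noise_e ** 2)
    return signal / noise

# Find the shortest exposure (in whole seconds) that reaches a target S/N of 50.
for t in range(1, 3600):
    if snr(t) >= 50.0:
        print(f"Roughly {t} s are needed to reach S/N = 50 under these assumptions")
        break
```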

In co-operation with a contractor, ESO has developed a complete computer model for each of the 8.2-m telescopes. This simulation model includes a large number of effects, for instance atmospheric disturbances, wind shaking of the telescope and structural vibrations. Using this model, it is possible to predict from simulations the achievable image quality, i.e. the signature of the telescope. Furthermore, the model can be used to study the effect of changes before they are implemented in practice.

The success of these first modelling experiments has led to the definition of a common framework for the development of such models and to the creation of a versatile software package and associated database. Within this environment, a slightly modified version of the UVES software was efficiently re-used to model an existing high-resolution spectrograph, CASPEC at the ESO 3.6-metre telescope, and is currently being transformed into a model for the STIS spectrograph on HST.

The next steps will be to provide models for all those instruments that will become operational on the VLT and the HST in the coming years, and to study further the impact of the improved calibrations on new data analysis techniques.

Appendix: Limits of observations

Ever since the beginning of astronomical observations with instruments, the problem of instrumental influence has played a significant role. Indeed, a key challenge for past and present astronomers has always been to convince critical colleagues that they have been able to achieve a clear separation in their data between the intrinsic properties of the observed celestial object on the one hand, and disturbing instrumental and atmospheric effects on the other. Through the ages, many learned disputes have centered on this basic problem.

For instance, the famous astronomer Tycho Brahe spent a major part of his time at the Uraniborg observatory (1576 - 1597) trying to describe and understand the `errors' (i.e. `signatures') of his pointing instruments. This was a new approach among observers of his day, and it contributed greatly to his successful studies.

Another early historical example is the first detection of a structure around the planet Saturn by Galileo in 1610. He was the first ever to point an optical telescope - albeit of very small size and rather poor optical quality by today's standards - towards celestial objects. To his great surprise, the disk of Saturn appeared to have two `handles' [4]. He had no means of knowing whether they were artifacts from light reflections inside the telescope or real objects, and in the latter case what kind of natural object this might be. In fact, it was only 50 years later that improved optical equipment producing sharper images (`higher optical resolution') finally revealed their true nature: the well-known Saturnian rings of small particles.

In this case, the issue could only be solved by awaiting the technical progress of the optical telescope. Today, digital imaging and computer processing allow astronomers to reach beyond the limits of the raw observations.

But even though the equipment available to astronomers has recently made tremendous progress - the HST and VLT are prime examples - the basic problem of verifying the reality of results and correcting the `raw' data for instrumental and atmospheric signatures remains.

Notes

[1] The ST/ECF is a joint undertaking of the European Space Agency (ESA) and the European Southern Observatory (ESO).

[2] A presentation of the ideas and results described in this Press Release was made at the recent international workshop on `High Precision Data Analysis', held at the National Astronomical Observatory, Tokyo, Japan.

[3] There exist, for instance, many `photometric standard stars' in the sky. The apparent brightness of these stars has been repeatedly measured with different instruments and is assumed known to a high degree of accuracy.

[4] See also eso9603 of 19 January 1996.

About the Release

Release No.: eso9634
Legacy ID: PR 12/96
Type: Unspecified : Technology : Observatory
Facility: Other

Images

ESO logo