In the foregoing analysis we have shown that algorithmically rather simple, yet very precise models can be constructed for spectrographs, even for significantly off-plane designs. The first-principles approach and its inherent predictive power allow one to quantitatively assess the degree of simplification permitted for various applications, which are the focus of this discussion.

The most direct application of instrument models is certainly in the area of data calibration. Historically, data calibration has been approached as a "cleaning from instrumental signatures", usually employing empirical approximations obtained from observations of "standards" (cf. Rosa 1995). A typical example is the empirical determination of a dispersion relation by fitting a low-order polynomial to a list of positions of calibration lines. Dahlem & Rosa (1997) have shown for low-dispersion, first-order grating spectra from the FOS on board HST that, in the presence of noise, small line lists, centering errors and line blending, the analysis of the dispersion relation is greatly improved by the use of optical relations. This is entirely due to the predictive power of a relation derived from first principles for the local curvature, which avoids the overshoot of polynomials at the boundaries of the data range.
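The effect can be illustrated with a small numerical sketch. All numbers below are invented (they are not the actual FOS parameters): a short, noisy line list covering only the inner part of the detector is fitted both with a cubic polynomial and with a two-parameter grating-equation model, and both fits are then extrapolated to the detector edge, where the polynomial lacks any built-in knowledge of the curvature:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical engineering values for a first-order grating spectrograph
# (illustrative only -- not the actual FOS parameters).
SIGMA = 1.0e6 / 600.0        # groove spacing in nm (600 grooves/mm)
ALPHA = 0.20                 # incidence angle in rad, assumed known
BETA0, FCAM = -0.10, 500.0   # true field-angle offset [rad] and camera focal length [mm]

def dispersion(x, beta0, fcam):
    """First-order grating equation lambda = sigma*(sin(alpha) + sin(beta)),
    with the diffraction angle beta varying linearly across the detector."""
    return SIGMA * (np.sin(ALPHA) + np.sin(beta0 + x / fcam))

rng = np.random.default_rng(1)
x_lines = np.linspace(-9.0, 9.0, 6)   # a few calibration lines, inner detector only
lam_obs = dispersion(x_lines, BETA0, FCAM) + rng.normal(0.0, 0.05, x_lines.size)

# (a) empirical cubic polynomial fit to the same noisy line list
poly = np.polyfit(x_lines, lam_obs, deg=3)
# (b) physical fit of the two uncertain engineering parameters only
(beta0_fit, fcam_fit), _ = curve_fit(dispersion, x_lines, lam_obs, p0=[-0.09, 480.0])

x_edge = 15.0                          # extrapolate to the detector edge
lam_true = dispersion(x_edge, BETA0, FCAM)
err_poly = abs(np.polyval(poly, x_edge) - lam_true)
err_phys = abs(dispersion(x_edge, beta0_fit, fcam_fit) - lam_true)
print(f"edge error: polynomial {err_poly:.3f} nm, physical model {err_phys:.3f} nm")
```

Because the physical model carries the correct local curvature by construction, its extrapolation error at the edge stays at the level of the centering noise, whereas the polynomial is free to overshoot there.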

As was shown in Sect. 5, the first-principles analytical model of echelle spectrographs can very significantly ease the tedious wavelength calibration of 2D echellograms. We stress that the problem must be simplified with care in order to preserve the accuracy of the approach while limiting the complexity of the formulae to be implemented. In particular, it is mandatory to use 3D geometrical transformations in order to reproduce at the sub-pixel scale the observed dispersion relations of off-plane echelle designs and in the presence of detector rotation.
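A minimal sketch of why the full 3D treatment matters, under an assumed geometry (grating in the x-y plane, grooves along y; one sign convention of many, with the camera projection and detector rotation omitted): in the vectorial grating equation the groove-parallel component of the unit direction vector is conserved, while only the dispersion component is shifted by the grating. An off-plane incidence angle therefore tilts every diffracted direction out of the dispersion plane, which a purely in-plane 2D formula cannot represent:

```python
import numpy as np

def diffract(k_in, m, lam, sigma):
    """Vectorial grating equation for a reflection grating lying in the x-y
    plane with grooves along y (assumed geometry): the dispersion (x)
    component shifts by m*lam/sigma, the groove-parallel (y) component is
    conserved, and the normal (z) component follows from unit length."""
    kx = k_in[0] + m * lam / sigma      # grating equation along the dispersion axis
    ky = k_in[1]                        # off-plane component: conserved
    kz = np.sqrt(1.0 - kx**2 - ky**2)   # reflected back off the grating
    return np.array([kx, ky, kz])

# In-plane case (gamma = 0): reduces to the scalar relation sin(alpha) + sin(beta) = m*lam/sigma
alpha = 0.2
k_in = np.array([np.sin(alpha), 0.0, -np.cos(alpha)])
k_out = diffract(k_in, m=-1, lam=500.0, sigma=2500.0)

# Off-plane case: a groove-parallel component gamma survives diffraction and
# tilts the whole order, producing the curvature a 2D model cannot reproduce.
gamma = 0.05
k_in3 = np.array([np.sin(alpha) * np.cos(gamma), np.sin(gamma), -np.cos(alpha) * np.cos(gamma)])
k_out3 = diffract(k_in3, m=-1, lam=500.0, sigma=2500.0)
print("in-plane out:", k_out, " off-plane out:", k_out3)
```

The sketch only demonstrates the conservation law; a model of a real instrument chains such transformations (collimator, grating(s), camera, rotated detector) with the actual construction angles.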

On this basis one can set up template-based calibration procedures by providing sets of parameters with associated uncertainty ranges, so that the expression can be fitted to the actual calibration exposures pertaining to a given scientific observation. We emphasize that these parameters are engineering values such as focal lengths, grating constants, construction and grating angles, and so forth. For the case of CASPEC we have shown in Sect. 4 how these engineering parameters can be determined and verified by adjusting the proper instrument model to observational data.
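Such a template naturally maps onto a bounded least-squares problem. The sketch below uses invented parameter values and a deliberately simplified long-slit projection (no detector rotation or tilt) purely to show how the uncertainty ranges of the engineering values translate into fit bounds:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical calibration template (all numbers invented): each engineering
# parameter is given as (nominal value, uncertainty range), not frozen.
template = {
    "sigma": (1.0e6 / 300.0, 10.0),  # groove spacing [nm]
    "alpha": (0.55, 0.01),           # incidence angle [rad]
    "fcam":  (760.0, 5.0),           # camera focal length [mm]
}
p0 = np.array([v[0] for v in template.values()])
lo = p0 - np.array([v[1] for v in template.values()])
hi = p0 + np.array([v[1] for v in template.values()])

def model_x(params, lam):
    """Predicted line position [mm]: invert the first-order grating equation
    for beta, then project through an idealized camera."""
    sigma, alpha, fcam = params
    beta = np.arcsin(lam / sigma - np.sin(alpha))
    return fcam * np.tan(beta)

# Simulated calibration exposure: the instrument sits somewhere inside the
# engineering tolerances, and the line centroids carry measurement noise.
p_true = p0 + np.array([4.0, 0.004, -2.0])
lams = np.linspace(1600.0, 1880.0, 8)    # known lamp wavelengths [nm]
x_obs = model_x(p_true, lams) + np.random.default_rng(2).normal(0.0, 0.005, lams.size)

fit = least_squares(lambda p: model_x(p, lams) - x_obs, p0, bounds=(lo, hi))
print("adjusted engineering parameters:", dict(zip(template, fit.x)))
```

The fitted quantities retain their engineering meaning throughout, so the adjusted values can be checked directly against the instrument's configuration records.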

It is straightforward to conceive the next stage, i.e. predictive calibration (cf. Rosa 1995): contemporary values of the engineering parameters are used to provide best-guess dispersion relations, which may then be adjusted by small linear offsets using inexpensive control observations. Seemingly similar procedures are indeed implemented, e.g., in the HST pipeline calibration. It is important to emphasize, however, that in the latter case empirical calibration relations are used, which have only limited predictive power as regards the effect of varying individual engineering parameters. For example, an ambient-temperature effect on the dispersion relation is not reflected directly in any single coefficient of a 3rd-order polynomial fit to this relation.
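A schematic of this two-step procedure (all numbers invented; the closed-form "prediction" below merely stands in for evaluating a full instrument model with contemporary engineering parameters):

```python
import numpy as np

def predicted_lambda(x):
    """Best-guess dispersion relation from contemporary engineering parameters
    (a stand-in closed form; a real pipeline would evaluate the full model)."""
    return 164.7 + 0.33 * x - 2.0e-4 * x**2

# An inexpensive control observation: a handful of line centroids, from which
# only a small zero-point and scale correction is derived.
x_ctrl = np.array([-10.0, 0.0, 10.0])
lam_ctrl = predicted_lambda(x_ctrl) + (0.08 - 0.002 * x_ctrl)  # instrument drifted slightly

# Fit a linear offset to the residuals, not a new dispersion relation.
shift = np.polyfit(x_ctrl, lam_ctrl - predicted_lambda(x_ctrl), deg=1)

def calibrated_lambda(x):
    return predicted_lambda(x) + np.polyval(shift, x)

print("linear correction (slope, zero point):", shift)
```

The curvature of the relation is supplied entirely by the model; the control exposure only has to pin down the low-order drift, which is why it can remain cheap.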

One of the main objections to the "first principles" approach has been of a practical nature, namely the lack of control over the setup of ground-based astronomical instruments. However, more rigid configuration control is now being implemented in an increasing number of observatories. It is clear that data analysis methods based on physical models of such controlled configurations will enjoy a large gain over purely empirical methods. Furthermore, a physical-model-based approach removes the non-uniqueness of the relation between engineering parameters and the coefficients of empirical relations. This offers operational advantages such as methods for automatic configuration control and instrument check-out.

The observational process as a whole can therefore benefit from first-principles instrument models during all of its phases. First, the proposer can prepare observations using model-based exposure time estimators and data simulators, which helps in selecting the instrumental modes and exposure times best suited for optimal information return. Second, the observatory controls the instrumental configuration, tests data analysis procedures, and provides calibration solutions with the help of instrument models. Third, the interpretation of the data can be supported by simulating raw observational data for a range of target properties.
