Testing for synchrotron self-absorption
The multiwavelength fading counterparts of gamma-ray bursts (GRBs)
have been shown to agree with
the relativistic blast wave model
(Mészáros & Rees 1993).
More predictive variations of this model, such as the synchrotron
shock model
(Katz 1994; Tavani 1996),
are consistent with a small number of
time-integrated GRB spectra
(Cohen et al. 1997), but fail to explain
several time-resolved GRB spectra.
In particular, the asymptotic photon
slope (α) below the spectral break is predicted by the synchrotron shock
model to lie between −3/2 and −2/3.
This was shown to be inconsistent with time-resolved
GRB spectra fit with the
Band et al. (1993) GRB function
(Crider et al. 1997;
Crider et al. 1998).
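For reference, the Band et al. (1993) GRB function invoked in these fits is the standard empirical four-parameter photon spectrum, reproduced here from the literature; A is the normalization, E0 the break energy, and α and β the asymptotic low- and high-energy photon slopes (the α discussed above):

```latex
N_E(E) =
\begin{cases}
  A\left(\dfrac{E}{100\,\mathrm{keV}}\right)^{\alpha}
    \exp\!\left(-\dfrac{E}{E_0}\right),
    & E \le (\alpha-\beta)\,E_0,\\[1.2ex]
  A\left[\dfrac{(\alpha-\beta)\,E_0}{100\,\mathrm{keV}}\right]^{\alpha-\beta}
    e^{\,\beta-\alpha}
    \left(\dfrac{E}{100\,\mathrm{keV}}\right)^{\beta},
    & E > (\alpha-\beta)\,E_0.
\end{cases}
```

The two branches join smoothly at E = (α−β)E0, so below the break the spectrum tends to the power law E^α, which is the quantity compared against the synchrotron shock prediction.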
Fitting both the Band GRB function and a broken power law
to over 100 bursts,
Preece et al. (1998)
likewise found roughly a fourth of the
time-integrated spectra to be
inconsistent with the synchrotron shock predictions.
The observed high values of α
do not easily differentiate between the many possible
absorption mechanisms. However,
the evolution of α,
when spectra are fit with the Band GRB function,
may favor saturated Comptonization as the absorption mechanism
(Crider et al. 1997).
This may well be an artifact of
extracting the photon spectra under the assumption of a Band GRB function.
In this paper, we fit the time-resolved BATSE LAD spectra of GRB 970111
directly with a self-absorbed synchrotron shock function.
We choose this burst because it was very bright, it was
seen by many instruments including BeppoSAX and BATSE (trigger 5773),
and it had a very high α
(Crider et al. 1998).
Copyright The European Southern Observatory (ESO)