The two‐dimensional dispersion relation for the cold‐beam, free‐electron laser is derived by applying a Lorentz transformation to the Raman decay instability of a plasma. It is found that a low‐density, relativistic, free‐electron laser is actually a broad‐band amplifier; it can amplify waves throughout a frequency range of 4γ². Of these waves, the forward, high‐frequency wave has the largest temporal growth rate. However, the off‐axis, lower‐frequency waves have larger spatial growth rates. In fact, the backward waves are absolutely unstable. These low‐frequency, absolute instabilities can be very detrimental, because they have very large nonlinear saturation levels and, hence, can deplete the energy of the electron beam.
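For context, the 4γ² factor is the standard relativistic double‐Doppler upshift for backscatter from a beam with Lorentz factor γ ≫ 1; the sketch below assumes a pump of laboratory‐frame frequency ω₀ (notation not taken from the abstract itself):

```latex
% Double Doppler shift for backscatter from a relativistic beam:
% pump frequency in the beam frame, then re-emission back to the lab frame.
\omega' = \gamma (1+\beta)\,\omega_0
\quad\Longrightarrow\quad
\omega_s = \gamma (1+\beta)\,\omega'
         = \gamma^2 (1+\beta)^2 \,\omega_0
\;\xrightarrow{\;\beta \to 1\;}\;
4\gamma^2 \omega_0 .
```

This upshift sets the scale of the amplification band quoted above; waves emitted off axis or backward see a smaller effective Doppler factor and hence lower frequencies.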