Infrared astronomy is the branch of astronomy and astrophysics that studies astronomical objects visible in infrared (IR) radiation. The wavelength of infrared light ranges from 0.75 to 300 micrometers. Infrared falls in between visible radiation, which ranges from 380 to 750 nanometers, and submillimeter waves.
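The wavelength boundaries above can be expressed as a simple classifier. This is a minimal sketch using only the approximate band edges stated in the text (0.38–0.75 micrometers for visible light, 0.75–300 micrometers for infrared); the function name and exact cutoff conventions are illustrative assumptions, since different sources draw the band edges slightly differently.

```python
def classify_band(wavelength_um):
    """Classify radiation by wavelength in micrometers, using the
    approximate band edges given in the text. Conventions for the
    exact boundaries vary between sources."""
    if wavelength_um < 0.38:
        return "ultraviolet or shorter"
    elif wavelength_um < 0.75:
        return "visible"
    elif wavelength_um <= 300:
        return "infrared"
    else:
        return "submillimeter or longer"

# Example: green light at 0.55 micrometers falls in the visible band,
# while 10 micrometers (mid-infrared) falls in the infrared band.
print(classify_band(0.55))
print(classify_band(10.0))
```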
Infrared astronomy began in the 1830s, a few decades after the discovery of infrared light by William Herschel in 1800. Early progress was limited, and it was not until the early 20th century that conclusive detections of astronomical objects other than the Sun and Moon were made in infrared light. After a number of discoveries in radio astronomy in the 1950s and 1960s, astronomers realized how much information was available outside the visible wavelength range, and modern infrared astronomy was established.
Infrared and optical astronomy are often practiced using the same telescopes, as the same mirrors or lenses are usually effective over a wavelength range that includes both visible and infrared light. Both fields also use solid state detectors, though the specific types of detector used differ. Infrared light is absorbed at many wavelengths by water vapor in the Earth's atmosphere, so most infrared telescopes are located at high elevations in dry places, above as much of the atmosphere as possible. There are also infrared observatories in space, including the Spitzer Space Telescope and the Herschel Space Observatory.
The discovery of infrared radiation is attributed to William Herschel, who performed an experiment where he placed a thermometer in sunlight of different colors after it passed through a prism. He noticed that the temperature increase induced by sunlight was highest outside the visible spectrum, just beyond the red color. That the temperature increase was highest at infrared wavelengths was due to the spectral index of the prism rather than properties of the Sun, but the fact that there was any temperature increase at all prompted Herschel to deduce that there was invisible radiation from the Sun. He dubbed this radiation "calorific rays", and went on to show that it could be reflected, transmitted, and absorbed just like visible light.
Efforts were made starting in the 1830s and continuing through the 19th century to detect infrared radiation from other astronomical sources. Radiation from the Moon was first detected in 1873 by Laurence Parsons, 4th Earl of Rosse. Ernest Fox Nichols used a modified Crookes radiometer in an attempt to detect infrared radiation from Arcturus and Vega, but Nichols deemed the results inconclusive. Even so, the ratio of flux he reported for the two stars is consistent with the modern value, so George Rieke gives Nichols credit for the first detection of a star other than our own in the infrared.