Back to basics: Understanding infrared detectors
The rapid development of microtechnologies has revolutionized infrared detector technology. Today it is possible to manufacture sensors offering a combination of performance, convenience, shape, weight, and price that would have been unimaginable a few years ago.
Read on to gain a better understanding of how the different technologies compare.
About infrared radiation
Every object emits infrared electromagnetic radiation within a spectral range that varies according to the object’s temperature. The intensity of the radiation, however, depends on both the object’s temperature and its surface properties. The higher the temperature, the shorter the wavelength at which emission peaks; at room temperature, infrared emission peaks at around 10 µm. The following wavebands have been defined according to the atmospheric transmission windows:
• Band 1 (SWIR): from 1 µm to 2 µm
• Band 2 (MWIR): from 3 µm to 5 µm
• Band 3 (LWIR): from 8 µm to 14 µm
Therefore, because the atmosphere is highly transparent around 10 µm, objects at room temperature can be observed clearly in the LWIR band.
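The temperature-to-peak-wavelength relationship is Wien's displacement law, lambda_max = b / T with b of about 2898 µm·K. The article does not state the law explicitly, so the short Python sketch below is only a back-of-the-envelope check of the 10 µm figure:

```python
# Wien's displacement law: a blackbody's emission peaks at
# lambda_max = b / T, with b ~ 2898 um*K.

WIEN_B_UM_K = 2898.0  # Wien's displacement constant, um*K

def peak_wavelength_um(temp_kelvin: float) -> float:
    """Wavelength (um) at which blackbody emission peaks at the given temperature (K)."""
    return WIEN_B_UM_K / temp_kelvin

for label, temp_k in [("room temperature", 300.0), ("human skin", 310.0), ("hot engine part", 600.0)]:
    print(f"{label} ({temp_k:.0f} K): peak near {peak_wavelength_um(temp_k):.1f} um")
# room temperature (300 K): peak near 9.7 um, inside the 8-14 um LWIR window
```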
How infrared radiation is detected
As in the visible range, photodiodes or photo-MOS devices can be used to detect infrared radiation. However, these quantum detectors have to be adapted to the wavelength of interest. The energy of an LWIR photon is at least ten times smaller than that of a visible photon (about 0.1 eV compared with more than 1 eV). Therefore, at room temperature, the signal of interest is drowned out by a background current that carries no useful information except the temperature of the quantum detector itself. The solution is to cool the photodiode, which lowers the background current below the signal current. Cooled semiconductor sensor technology for detecting radiation at 10 µm offers excellent performance, but systems that use it are costly, limiting their use to military and space applications.
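The photon-energy comparison follows directly from E = hc / lambda. As a quick check, here is a minimal sketch using standard physical constants; the visible wavelength chosen (0.55 µm, mid-green) is an illustrative assumption:

```python
# Photon energy E = h*c / lambda, expressed in electron-volts.

H_JS = 6.626e-34      # Planck constant, J*s
C_MS = 2.998e8        # speed of light, m/s
J_PER_EV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(wavelength_um: float) -> float:
    """Energy (eV) of one photon at the given wavelength (um)."""
    return H_JS * C_MS / (wavelength_um * 1e-6) / J_PER_EV

print(f"visible, 0.55 um: {photon_energy_ev(0.55):.2f} eV")  # ~2.25 eV
print(f"LWIR,   10.0 um: {photon_energy_ev(10.0):.2f} eV")   # ~0.12 eV
```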
Another method is to use a thermal detector, which detects infrared energy rather than infrared photons. The incident infrared energy is captured by an absorber, and the resulting change in the absorber’s temperature is measured. Because a temperature variation is being measured, the sensor does not need to be cooled. This makes uncooled thermal sensors suitable for high-volume applications like thermographic cameras, surveillance cameras, and thermal weapon sights.
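To make the thermal-detection principle concrete, the following is a minimal lumped-element sketch of an absorber pixel. The heat capacity, thermal conductance, and absorbed power values are illustrative assumptions, not figures from the article:

```python
# Lumped thermal model of a single bolometer pixel:
#   C_th * dT/dt = P_absorbed - G_th * (T - T_ambient)
# Steady state: delta_T = P_absorbed / G_th; time constant: tau = C_th / G_th.

C_TH = 1e-9    # absorber heat capacity, J/K (illustrative)
G_TH = 1e-7    # thermal conductance to the substrate, W/K (illustrative)
P_ABS = 10e-9  # absorbed infrared power, W (illustrative)

tau_s = C_TH / G_TH    # thermal time constant, s
rise_k = P_ABS / G_TH  # steady-state temperature rise, K

print(f"thermal time constant: {tau_s * 1e3:.0f} ms")  # 10 ms
print(f"steady-state rise: {rise_k * 1e3:.0f} mK")     # 100 mK
```

A weaker thermal link to the substrate (smaller G_th) gives a larger temperature rise per watt absorbed, which is why microbolometer pixels are suspended on thin insulating legs.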
The amorphous silicon uncooled infrared microbolometer
Thermal detectors are classified by the type of thermometer used to measure the absorber temperature. The most popular type is the bolometer (or microbolometer, given the small pixel sizes now achievable). A bolometer is made from a resistive film whose electrical resistance changes with temperature; resistive vanadium oxide or amorphous silicon films are mainly used in this type of thermal sensor. Because it is a single material rather than a complex mixture of different vanadium oxides, amorphous silicon presents several advantages. Its composition does not vary, which brings two benefits at the pixel level: high spatial uniformity, and predictable temperature behavior that makes the sensor easier to operate in changing ambient temperatures.
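As an illustration of how the resistive film acts as a thermometer, the sketch below uses the common semiconductor activation model R(T) = R0 · exp(Ea / kT), from which the temperature coefficient of resistance (TCR) follows. The activation energy and nominal resistance are assumed, illustrative values, not specifications from the article:

```python
import math

# Semiconductor bolometer film: R(T) = R0 * exp(Ea / (k_B * T)).
# The temperature coefficient of resistance (TCR) is then
#   alpha = (1/R) * dR/dT = -Ea / (k_B * T^2), typically around -2 %/K.

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K
EA_EV = 0.16       # activation energy, eV (illustrative for an a-Si film)
T_REF = 300.0      # reference temperature, K
R_REF = 1.0e6      # film resistance at T_REF, ohms (illustrative)

def resistance_ohms(temp_k: float) -> float:
    """Film resistance (ohms), normalized so resistance_ohms(T_REF) == R_REF."""
    return R_REF * math.exp((EA_EV / K_B_EV) * (1.0 / temp_k - 1.0 / T_REF))

tcr_per_k = -EA_EV / (K_B_EV * T_REF**2)  # fractional resistance change per kelvin
print(f"TCR at 300 K: {tcr_per_k * 100:.1f} %/K")          # about -2.1 %/K
print(f"R at 300.1 K: {resistance_ohms(300.1):,.0f} ohms")  # ~0.2% below 1 Mohm
```

The readout circuit senses this small fractional resistance change; a smooth, well-characterized R(T) curve is what makes the sensor's response predictable as the ambient temperature drifts.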
Another advantage of amorphous silicon is its usefulness in low-power-consumption focal-plane arrays, which extend battery life in applications such as handheld cameras.
Finally, amorphous silicon enables the design of faster high-sensitivity sensors. The thermal time constant of an amorphous silicon sensor (about 10 ms) is at least 30% to 40% lower than that of a vanadium oxide sensor. This lower time constant lets the system operate at higher frame rates without degrading performance.
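The frame-rate benefit can be sketched with a first-order response model: within a frame period t, the pixel tracks a fraction 1 - exp(-t / tau) of a step change in the scene. Only the roughly 10 ms amorphous-silicon time constant comes from the text; the vanadium oxide value and the frame rates below are illustrative assumptions:

```python
import math

# First-order thermal response: within one frame period t, the absorber
# tracks a fraction 1 - exp(-t / tau) of a step change in the scene.
# A shorter tau means less residual lag (image smear) per frame.

def settled_fraction(frame_period_s: float, tau_s: float) -> float:
    """Fraction of a scene step tracked within one frame period."""
    return 1.0 - math.exp(-frame_period_s / tau_s)

for fps in (30, 60):
    period_s = 1.0 / fps
    for name, tau_s in (("a-Si, tau = 10 ms", 0.010), ("VOx, tau = 15 ms", 0.015)):
        pct = settled_fraction(period_s, tau_s) * 100
        print(f"{fps} Hz, {name}: {pct:.0f}% settled per frame")
```

At 60 Hz the shorter time constant makes a visible difference: roughly 81% of a scene step is tracked per frame at tau = 10 ms versus about 67% at tau = 15 ms, so the faster film smears moving objects less at the same frame rate.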