Camouflage-detection photography, developed during World War II, opened the era of remotely sensed images and the possibility of seeing more than the visible light spectrum that the human eye can see. On either side of the visible spectrum lie ultraviolet (UV) light and infrared light. Infrared is further subdivided into "near infrared" and the "red edge," a narrow part of the spectrum in the transition between visible red and near infrared wavelengths.
A common kind of imagery now is a four-band set of images covering blue, green, red, and near infrared light. Some imaging systems also detect red-edge "light" (radiation), and some add two kinds of blue light, a yellow band, or a second kind of near infrared.
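One way to picture such a four-band image set is as a stack of grids, one grid per band, sharing the same rows and columns. A minimal sketch in Python with NumPy follows; the tiny 2x2 grid and the reflectance values are made up for illustration, not taken from any real sensor:

```python
import numpy as np

# One 2-D grid per band, stacked into a (band, row, column) cube.
# Values are placeholder reflectances (fraction of light reflected).
rows, cols = 2, 2
blue  = np.full((rows, cols), 0.05)
green = np.full((rows, cols), 0.10)
red   = np.full((rows, cols), 0.08)
nir   = np.full((rows, cols), 0.45)  # vegetation reflects strongly in near infrared

image = np.stack([blue, green, red, nir])
print(image.shape)  # (4, 2, 2): four bands, each a 2x2 grid
```

Real sensors deliver the same idea at much larger grid sizes, and band order is a convention that varies from product to product.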
Beyond these UV, visible, red-edge, and near infrared parts of sunlight are even longer bands called, collectively, "shortwave infrared" (SWIR for short). Then we run out of sunlight: as the wavelengths get longer than SWIR, we enter the longwave (thermal) infrared, which is related to the temperature of the objects being imaged.
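The bands named so far sit at progressively longer wavelengths. A rough sketch of that layout, with approximate nanometer ranges that vary from sensor to sensor (these figures are illustrative, not any particular instrument's specification):

```python
# Approximate wavelength ranges in nanometers for the bands described
# in the text. Boundaries differ between instruments; these are typical.
BANDS_NM = {
    "ultraviolet": (100, 400),
    "blue": (450, 510),
    "green": (530, 590),
    "red": (640, 690),
    "red edge": (700, 740),
    "near infrared": (760, 1000),
    "shortwave infrared": (1000, 2500),
    "longwave (thermal) infrared": (8000, 14000),
}

def band_for(wavelength_nm: float) -> str:
    """Return the name of the band containing a wavelength, or 'other'."""
    for name, (lo, hi) in BANDS_NM.items():
        if lo <= wavelength_nm < hi:
            return name
    return "other"

print(band_for(850))  # a typical near-infrared wavelength
```

Note the gaps between some ranges (for example, between red and red edge); real sensors place their band edges differently, and some wavelengths fall between any two named bands.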
Some imaging systems do not rely on natural sunlight and skylight. LEDs or lasers might be used instead of the sun to illuminate the soil and the vegetation … even at night … and then measure the colors and reflections of the vegetation and soils with more precision.
Other active instruments include radar and LiDAR (short for "Light Detection And Ranging"), which send out their own pulses and measure the returns. Some imaging systems rely instead on the notion that plants will emit light (fluoresce) when stimulated by short-wavelength radiation, such as that from a UV laser.
In addition to these kinds of UV, visible, red-edge, near infrared, SWIR, and thermal cameras, there are spectrometers that sense hundreds of distinct wavelengths, whether measuring reflected natural solar radiation or light from artificial sources.