Imagery Basics: Terms and Acronyms


These photos demonstrate the stark difference between medium resolution (left) and very high resolution (right).

Field imagery in agriculture has been around for decades. But as the industry stands in “Crop Season 2017,” it still has a way to go toward providing actionable information rather than selling pixels and pictures. Service providers and farmers do not need imagery; they need information — actionable information. They need alerts that tell them something needs to be looked at now, and that there are actions they can take to improve profitability. But to get there, the imagery end user needs to start by understanding what is behind the dots on the page that make up a picture of the farm field.


Understanding Resolution

Resolution is categorized by the pixel size of the remotely sensed imagery. Since NASA’s 1972 launch of Landsat, we now have nearly 1,500 satellites, as well as thousands of airplanes and drones, capturing imagery at different resolutions. Today, we will consider three classes of resolution, measured in meters (m): Medium Resolution (5 m to 30 m), High Resolution (1 m to 5 m), and Very High Resolution (better than 1 m). For agricultural applications, a wide range of platforms provides imagery at multiple resolutions and frequencies at both global and local scales. The following sections discuss the sensors and sources of various satellite, aerial, and UAV platforms.
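The three classes above can be sketched as a small helper function. This is an illustrative snippet, not part of the article; the function name is hypothetical, and the boundary values follow the ranges given above (with an exact 5 m pixel treated as High Resolution).

```python
def resolution_class(gsd_m: float) -> str:
    """Map a ground sampling distance (GSD) in meters to a resolution class."""
    if gsd_m < 1:
        return "Very High Resolution"   # sub-meter: airplanes, drones
    elif gsd_m <= 5:
        return "High Resolution"        # 1 m to 5 m: small sats
    elif gsd_m <= 30:
        return "Medium Resolution"      # 5 m to 30 m: e.g., Landsat
    return "Low Resolution"             # coarser than the classes discussed here

print(resolution_class(30))    # Medium Resolution
print(resolution_class(0.2))   # Very High Resolution
```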

Why does resolution matter? It depends on what your need is. It’s all about aligning the right information with your problem.


Medium Resolution Imagery (5-30 meters)

NASA’s Landsat program has been imaging the planet for decades. At 30-meter multispectral resolution (15-meter panchromatic), it is best used to identify landscape-scale change; example uses are zone and drainage plans. Between Landsat 7 and Landsat 8, a new image of the entire planet is taken every eight days. For paid imagery, there is RapidEye, a 6.5-meter constellation now owned by Planet Labs.

MORE BY LISA PRASSACK

High Resolution Imagery (1-5 meters)

Companies such as Planet Labs and Skybox (acquired by Google) are launching fleets of micro satellites, or “small sats,” with a vision of making imagery cheap, frequent, and accessible. They plan to revisit a given place on the globe multiple times a day. While these micro satellites are designed for a short life of three years or less, the rapid replacement cycle lets the fleet take advantage of new sensor technology. Further, the constellation approach offers daily revisits for broad-area crop monitoring.

Very High Resolution Imagery (< 1 meter): Cue the Airplanes and Drones

This imagery range has been the domain of the airplane imagery provider — and over the last 3 years — drone providers. This imagery is ideally suited to identify and address in-field, in-season agriculture problems.

There are quite a number of airplane and drone imagery providers focused on the agriculture industry. Their greatest advantage is that airplanes and drones can collect information for specific fields at specific times in the growing season. Part of the challenge is overall inconsistency in the process: no standard camera types are used; the quality of pilots and drone operators varies; and so does how quickly and how well the imagery is processed and presented to inform the grower or adviser on what to do next.

Another source of very high resolution imagery is the smartphone and tablet. They can take very detailed pictures of plants, bugs, soil and more. Smartphones enabled with personal navigation can be used to cue humans to very specific locations to then capture sub-millimeter color imagery. Smartphone imagery can then be sent to experts and machines for interpretation and diagnosis.

Terminology Alphabet Soup

One of the symptoms of a maturing, competitive industry (and one of its biggest frustrations) is the evolving, swirling, confusing stew of terms and acronyms used to describe differences in technologies and offerings. Imagery terms such as NIR, VI, UV, TIR, SIR, and NDVI (among several more) are used to identify and explain the variations of imagery, but do little to help end users decide what will work in their particular applications.

In a perfect world, the industry would evolve to simpler, easier-to-understand designations and end the TLA (three-letter acronym) madness. What service providers and farmers care about is whether the imagery will tell them something when they review it.

Someday the industry will arrive at that point. In the meantime, these are some of the most common terms in imagery and their meanings and common uses.

  • GSD – Ground Sampling Distance, the pixel size of the remotely sensed imagery. Examples: 30 meters; 1 meter; 20 centimeters.
  • NDVI – Normalized Difference Vegetation Index, the difference between the near-infrared and red bands divided by their sum, used to identify and enhance the vegetation contribution in a digital remote sensing analysis.
  • VI – Vegetation Index, a general term for band combinations that highlight vegetation; the simplest form is a ratio of the near-infrared band to the red band, used to identify and enhance the vegetation contribution in a digital remote sensing analysis.
  • VRT – Variable-Rate Technology, refers to a system that varies the rate of agricultural inputs such as seed, fertilizer, and crop protection chemicals in response to changing local conditions.
  • TIR – Thermal Infrared is shown in gray tones to illustrate temperature. It measures radiation from the plant and soil surface.
  • NIR – Near-infrared false-color composite, displaying the near-infrared band as red, the red band as green, and the green band as blue; useful for seeing changes in plant health.
  • SIR – Shortwave-infrared false-color composite, displaying shortwave infrared as red, near infrared as green, and green as blue; used to show flooding or newly burned land.
  • Yield Mapping – refers to the process of collecting geo-referenced data on crop yield and characteristics, such as moisture content, while the crop is being harvested.
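The NDVI and ratio-style VI definitions above can be made concrete with a few lines of NumPy. This is an illustrative sketch: the reflectance values are made up for the example, not real sensor data.

```python
import numpy as np

# Synthetic red and near-infrared reflectances for three pixels
# (e.g., healthy crop, stressed crop, bare soil).
red = np.array([0.10, 0.20, 0.30])
nir = np.array([0.50, 0.40, 0.32])

# NDVI = (NIR - Red) / (NIR + Red); values range from -1 to 1,
# with dense, healthy vegetation typically well above 0.3.
ndvi = (nir - red) / (nir + red)

# Simple ratio vegetation index: NIR / Red.
rvi = nir / red

print(ndvi.round(3))  # [0.667 0.333 0.032]
print(rvi.round(2))   # [5.   2.   1.07]
```

Note that NDVI’s normalization keeps values in a fixed range, which makes maps from different dates and sensors easier to compare than a raw ratio.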


Gary Roberson says:

I am confused by the reference to colors in the NIR and SIR definitions. Can you clarify?