The concept of risk, especially in the context of a decision, is difficult to assess for a number of reasons. First, risk is based more on perception than on facts. Paul Slovic, a psychologist who has conducted landmark studies over the past three decades, identified key characteristics that define human perception of risk. He found that people do not tolerate risks perceived as uncontrollable, associated with dread, potentially catastrophic, fatal in their consequences, or inequitable in their distribution of risks and benefits.
The classic example of a perceived risk that is uncontrollable and potentially catastrophic is a nuclear power plant. While modern plants have numerous safeguards to prevent a meltdown, many individuals perceive them as highly dangerous because they have no personal control over them. Ironically, some of the same people who perceive nuclear power plants as high risk do not use seatbelts while driving a car or wear a helmet when riding a motorcycle. Even though it is well documented that seatbelts and helmets prevent serious injuries in an accident, these people believe they are in control of their vehicles and bikes at all times.
Second, risk can take the form of an extremely rare event. Nassim Taleb, in his book “The Black Swan,” defines a “black swan” as an event that is unexpected, extremely impactful, and difficult to predict. Black swans are so rare that people are unprepared for their occurrence. In some sense, the risk of a black swan is incalculable, yet its consequences are catastrophic.
An example of a black swan is the Big Thompson Flood, which occurred in Colorado in 1976. This flood was considered a one-in-1,000-year event and therefore beyond living memory. Its occurrence was unexpected, impactful, and unpredictable by the forecasting techniques of that time.
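A "one-in-1,000-year" label is easy to misread as "will not happen in my lifetime." A short calculation shows otherwise; the 70-year span below is an illustrative assumption, not a figure from the flood record.

```python
# Probability that a "1-in-1,000-year" event occurs at least once
# during an assumed 70-year human lifetime.
p_annual = 1 / 1000          # chance of the event in any single year
years = 70                   # assumed observation span (illustrative)

# Chance it never happens is (1 - p) compounded over each year;
# the complement is the chance it happens at least once.
p_at_least_once = 1 - (1 - p_annual) ** years

print(f"{p_at_least_once:.1%}")  # roughly a 6.8% chance
```

Even a "1,000-year" flood has nearly a 7 percent chance of striking within a single lifetime, which is one reason such events catch communities unprepared.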
Third, the concept of risk is shaped by our educational and cultural backgrounds. People fear what they do not understand and reject positions or practices that are not politically or socially acceptable. Education exposes individuals to new ways of thinking and to facts beyond those collected through experience. New technologies and novel solutions can challenge a position or practice.
Lastly, risk can be difficult to assess due to too little or too much data or information. In most cases, an individual may understand the underlying logic of the processes affecting a decision but lack the data to reasonably predict an outcome. In a few cases, an individual may be overwhelmed by too much data and suspend logical thinking when making a decision. In such cases, a person will focus on only one or two variables to assess the behavior of a complex system.
Remembering The Odds
Risk by definition is associated with loss, and everyone tries to avoid it. However, in order to take advantage of new technologies, products, and services, we accept certain calculated risks as part of our lifestyles and jobs. For example, it is almost a certainty that, over a lifetime, everyone will experience some type of non-fatal automobile accident.
The same is true for food poisoning, a burn, electric shock, or injury. This social acceptance of risk is well illustrated by the example of electricity. Michio Kaku, a popular physicist and futurist, noted that after the invention of electricity, many people resisted the building of an electrical infrastructure for fear of electrocution. And they were right. The risk of electrocution was real, and it still exists today. In fact, according to the American Burn Association, there are on average 400 deaths and 4,400 injuries each year due to electrical hazards. We, as a society, accept this risk because the benefits of an electrical infrastructure far outweigh the likelihood of an individual dying from electrocution.
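Putting the electrocution figure on a per-person basis makes the trade-off concrete. The population figure below is a rough assumption (about 330 million U.S. residents), used only to turn the total into individual odds.

```python
# Rough per-person annual odds of death by electrocution,
# using the article's figure of ~400 deaths per year.
deaths_per_year = 400
us_population = 330_000_000   # assumed, approximate U.S. population

annual_individual_risk = deaths_per_year / us_population
odds = 1 / annual_individual_risk

print(f"about 1 in {odds:,.0f}")  # about 1 in 825,000 per year
```

At odds near one in a million per year, the societal benefit of electricity easily outweighs the individual risk, which is why the risk is accepted rather than debated.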
While one can never rule out a “black swan” occurring in agriculture, most of our concerns about risk have to do with data and information, education, and cultural practices, because the decisions we make are under our control. With that in mind, there are a few steps we can take to minimize risk in our production decisions.
A Five-Step Approach
First, keep good records. In agriculture, as in other environmental systems, history repeats itself. There are predictable patterns in weather, crops, and pests.
Second, expect the unexpected. While a growing season may appear to be similar to a previous one, there will always be subtle differences. A selected hybrid may be less sensitive to environmental stresses but more susceptible to a new pest.
Third, whenever possible, have more than one solution to a problem. For example, weather conditions may result in a narrow planting window, and therefore additional equipment may be required to complete seeding by a desired date. Or a worker may become sick at a critical time, requiring the temporary hiring of additional personnel.
Fourth, understand the technology supporting a decision. Whether it's software or equipment, be sure everything works and has been calibrated to your specific field setting and operation.
Lastly, seek advice and learn from others. We are living in an information age, and help is just a keystroke or phone call away. While you may be facing a new event, such as a herbicide-resistant weed, you can be sure someone else in another location has experienced the same event in a previous year.