The digital revolution is changing the way agriculture is done.
Rapid population growth, shifting market demands, shrinking agricultural land, and significant changes in climate patterns, including much more frequent extreme events, are all pushing agriculture beyond its traditional limits and into the digital age. This trend is supported by governments worldwide.
Local know-how and practices are, in many cases, becoming almost irrelevant, and many growers find themselves having to adapt to changing conditions. This might even mean abandoning their current cultivation and shifting to other crops. To stay in business, farmers must become more efficient and produce more. Decisions based on hunches, intuition, personal experience, or guesswork are no longer sustainable.
Variety selection, planting dates, water and nutrient requirements, and pest and disease management are just a few of the decisions growers have to make. Each of these decisions is influenced by ever-changing environmental conditions and variability in the field. Plants growing in one section of the farm might grow under totally different conditions than those in other sections of the same farm.
Making data-driven decisions is, therefore, essential. However, taking all the variable factors into consideration, and doing so in real time, is an impossible mission for a human being.
Can machines and algorithms make better decisions?
The answer is yes, but they have to be taught how to do so.
A shift of agriculture towards AI (Artificial Intelligence) is inevitable, but for it to gain momentum, huge amounts of viable data must be collected and analyzed. Statistical models and algorithms are used to predict future events and behaviors. Analyzing historical data, such as yields, weather, soil trends, and fertilizer inputs, together with real-time data, can give the farmer powerful tools to make informed decisions and manage risks.
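As a minimal sketch of this kind of predictive modeling, the example below fits a linear least-squares model to a handful of historical season records and uses it to predict the coming season's yield. All numbers, column choices, and field names here are invented for illustration; a real model would use far more data and features.

```python
import numpy as np

# Illustrative historical records: one row per past season.
# Columns: rainfall (mm), nitrogen applied (kg/ha), mean temperature (deg C).
features = np.array([
    [520.0, 120.0, 18.2],
    [430.0, 100.0, 19.1],
    [610.0, 140.0, 17.5],
    [480.0, 110.0, 18.8],
    [550.0, 130.0, 18.0],
])
yields = np.array([7.1, 6.2, 8.0, 6.6, 7.4])  # observed yield, t/ha

# Fit yield ~ X @ w by least squares, with a trailing column of ones
# acting as the intercept term.
X = np.hstack([features, np.ones((features.shape[0], 1))])
w, *_ = np.linalg.lstsq(X, yields, rcond=None)

# Predict next season's yield from forecast conditions (same column order,
# plus the intercept's 1.0).
forecast = np.array([500.0, 125.0, 18.5, 1.0])
predicted_yield = float(forecast @ w)
print(f"predicted yield: {predicted_yield:.2f} t/ha")
```

The same fit-then-predict pattern extends directly to richer feature sets (soil tests, satellite indices, real-time sensor streams) and to nonlinear models.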
Three main questions arise: How can the data be collected? How can sense be made of the data collected? And can the data be validated?
Compared with other industries, data collection in agriculture lags behind. Many startup companies are developing decision-support tools, but they still struggle with data collection, as most farms lack the necessary technological infrastructure.
In recent years, data collection technologies have been developed: spatial data, drones, soil, water, and plant sensors, image recognition, and more. These technologies can collect large amounts of data that can be further analyzed and used for better decision making, for example, detecting a disease before it spreads in the field or identifying water and fertilizer stress in real time.
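As a toy illustration of real-time stress detection, the sketch below classifies soil-moisture sensor readings against fixed thresholds. The plot names and threshold values are invented; in practice, thresholds depend on crop, soil type, and growth stage.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str
    soil_moisture: float  # volumetric water content, %

# Illustrative thresholds (crop- and soil-dependent in reality).
MOISTURE_LOW = 18.0   # below this: likely water stress
MOISTURE_HIGH = 35.0  # above this: risk of waterlogging

def classify(reading: SensorReading) -> str:
    """Label a single reading as a stress alert or normal."""
    if reading.soil_moisture < MOISTURE_LOW:
        return "water-stress"
    if reading.soil_moisture > MOISTURE_HIGH:
        return "waterlogged"
    return "ok"

# A batch of incoming readings, as might arrive from field telemetry.
readings = [
    SensorReading("plot-A1", 15.2),
    SensorReading("plot-A2", 27.4),
    SensorReading("plot-B1", 38.9),
]
alerts = {r.sensor_id: classify(r) for r in readings}
print(alerts)  # {'plot-A1': 'water-stress', 'plot-A2': 'ok', 'plot-B1': 'waterlogged'}
```

A production system would replace the fixed thresholds with crop-specific models and stream readings continuously rather than in a batch.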
However, the limitations of currently available technology should be recognized. There is a limit to the number of parameters that can be measured with today's technologies, and the extent to which the data represents the whole field is still an open question:
- Can sensors that are installed on a relatively small number of plants provide data that represents the entire field?
- Optimizing crop nutrition requires taking dozens or even hundreds of parameters into account, while current technology allows measuring just two or three (e.g., nitrogen level, NDVI).
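NDVI, mentioned above, is itself a simple ratio computed per pixel from red and near-infrared reflectance: NDVI = (NIR − Red) / (NIR + Red). The sketch below computes it over a toy reflectance patch; the reflectance values are invented, but healthy vegetation characteristically reflects strongly in NIR and weakly in red, giving values near 1.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red)

# Toy 2x2 reflectance patches (fractions of incident light).
# Three "healthy" pixels and one sparse/bare pixel in the corner.
nir = np.array([[0.50, 0.48],
                [0.45, 0.20]])
red = np.array([[0.08, 0.10],
                [0.12, 0.18]])

print(np.round(ndvi(nir, red), 2))
```

Applied to drone or satellite imagery, the same per-pixel computation yields a vegetation-vigor map of the whole field.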
Making sense of the data is another challenge. In fact, the sheer volume of collected data is driving the evolution of a new scientific realm, governed by machine learning. New models that have never existed before must be developed to make the data useful and actionable. New insights arise, and a large portion of the research is now carried out by means of machine learning, displacing work traditionally done by researchers.
Adoption and validation processes are going to be lengthy, and many companies and technologies will rise and fall, but the revolution in agriculture is here to stay.