Eyeing AI-Powered Imagery Expansion, Taranis Raises $20 Million

“For the first time ever, farmers are seeing that our imagery is as good as walking in the field. Farmers get it because it’s so visual. It’s a wow moment. ‘You captured this from a plane? That’s pretty amazing,’” Ofir Schlam, Taranis CEO and co-founder, says.

It’s not just the visuals – captured via high-speed drone – that are grabbing the attention of the ag community (and investors, but we’ll get to that in a moment).

“We are essentially overloading farmers with too much data, and we don’t expect the client to go through each bit of imagery. We make it more actionable and create an automated report within a few hours to a maximum of 48 hours.”

Integrating the ultra-high-resolution (UHR) imagery of Mavrx, which Taranis acquired in May, into its AI2 product line has been the top priority for the past six months. Any doubts Mavrx customers had were soon put to rest.


“At the beginning of the season, it was very stressful for some clients who needed reassurance that they were going to be supplied with the goods, but I think they would now agree that they added a unique product,” Schlam tells PrecisionAg. “We’re just excited to work with them and grow the relationship.”

How unique? Taranis’ AI2 offers 1,400 times greater resolution than Mavrx’s UHR imagery or any other competitor’s on the market. Where Mavrx’s strength lies in providing broader-scope, expansive imagery of entire fields, Taranis samples the field. Taken together, the two platforms offer the best of both worlds.

“We use both, so we can see the problem in high resolution and then extrapolate to understand the condition of the whole field. We can see how big the problem is and decide whether to treat or wait. Crop protection companies are interested in this because it could help R&D plots to monitor control groups and treated plots,” he says, adding that Taranis has done some initial work with several top input companies.

Investors Take Note

Most imagery solutions start giving feedback 45 to 60 days after planting, after the crop is already established. With Taranis, a first flight is completed one to two weeks after crops such as soybeans, corn, or cotton are planted, and “we can already count an entire field, and create an estimate on whether it’s worth it to do a replanting in some areas,” Schlam says. Then a second flight aimed at weed management is completed, still before Mavrx or other competitors can provide similar information.
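The replant call Schlam describes can be sketched as a simple threshold rule. The 70% cutoff, zone names, and target population below are illustrative assumptions for the sketch, not Taranis’s actual decision logic:

```python
def replant_recommended(stand_count: int, target_population: int,
                        threshold: float = 0.70) -> bool:
    """Hypothetical decision rule: flag a zone for replanting when the
    emerged stand falls below a fraction of the target population.
    The 70% threshold is an illustrative assumption."""
    return stand_count / target_population < threshold

# Per-zone stand counts (plants/acre) against a 32,000 plants/acre corn target.
zones = {"north": 30500, "creek_bottom": 19800, "south": 28000}
flagged = [z for z, count in zones.items()
           if replant_recommended(count, 32000)]
# flagged -> ["creek_bottom"]
```

In practice such a rule would also weigh replant cost, calendar date, and yield-loss curves, but the core comparison is this simple.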

“We start to look at weeds. We can recognize species of weeds, and that is something that cannot be done with other imagery. Even for agronomists, it’s tough to differentiate between specific grass and broadleaf weeds, especially when very small. Our system is sensitive enough to capture images and do an analysis through our deep learning-trained AI on different weeds and detect them by name. If they know exactly the combinations of weeds they have on their field, they can choose more selective weed management programs and even variable rate if they want to.”

“It’s really a game-changer,” says Schlam.

Those are not empty words. On Nov. 6, Taranis announced the closing of a $20 million Series B round of financing led by Viola Ventures, with participation from existing investors Finistere Ventures, Vertex Ventures, OurCrowd, Eyal Gura, and Gal Yarden. The round also includes strategic investors Nutrien; Cavallo Ventures, the venture capital arm of Wilbur-Ellis; and Sumitomo Corp. Europe. The company has raised $30 million to date.

The company was co-founded in 2014 by Schlam, whose family has farmed in Israel for four generations, and his partner, Amihay Gornik, an Israeli military intelligence expert who spent 15 years working on fighter jet imagery for the Israeli aerospace firms Elbit and IAI. It took three long years of R&D and several patents to develop the AI2 platform.

“It’s been a very exciting process and a hard process. The idea is really being able to track the ground while moving. Our camera system can fly at very high speeds at low altitudes, close to the crop, and track the ground while taking photos, eliminating the usually inevitable motion blur that occurs when flying at 120 miles an hour at these altitudes.

“We currently use a combination of planes and drones. Drones cover 2,000 to 5,000 acres a day, and planes can do four times that. Thanks to the Mavrx acquisition, we have assets in more than 30 states, with the largest concentration in the Midwest and Mid-South,” says Schlam. “We’ll have about 40% more planes next year, and we’re setting them up right now.”

Taranis drones and planes also fly after herbicide applications to identify any resistant weeds left in the field. If they know about resistant weeds sooner, they can treat them before they propagate and limit their spread to neighboring fields.

Unexpected Uses

Besides the U.S. – by far its biggest market – the company now operates in Brazil, Western Canada and Ontario, Ukraine, and Russia, and just launched in Australia at the end of October. Each of these markets uses the technology differently.

“It’s interesting to see different cultures and how they adopt solutions. It’s interesting to see that although Brazil is advanced, everything related to variable rate application is not as advanced. A lot of the growers own the most advanced machinery, but team members don’t usually work with the technology. I’m sure they will get there, and we’ll help them,” Schlam says. “They are using it for scouting and better disease management, but they do a lot of the actual applications manually because they don’t necessarily know how to use the automation the machinery offers. In general, Brazil is good for us in terms of growth, as farms are much larger. Our average client in Brazil is close to 10,000 acres – a lot bigger than our U.S. clients.”

Russia and Ukraine, Schlam concedes, are complicated markets in which to operate. The regulatory constraints, along with greater complexity in contracts, service models, and business dealings in general, led Taranis to partner with top companies and field only a small sales force of its own.

“Ukraine is more in tune with new technology and best practices and is focused on agronomy. It offers the same value propositions as in Brazil and the U.S., and the yield increases and variable rate were well received,” he says. “Growth there has been exceptional. We had nothing in Ukraine in April, and now we have more than 260,000 acres of pretty big growers there, and we received a lot of coverage in local media.”

In any new market, the first year means a learning curve, and Taranis limits its commercial activity to a few select opinion-leader clients. Ukraine’s eager adoption of the platform took the company by surprise. “Overall it was a very successful launch – better than we expected,” he says.

It’s the Russians, though, who have gotten creative, finding more basic uses for the information that a highly complex precision ag intelligence platform can provide.

“AI2 for Russia is more about finding machinery problems and monitoring the workforce itself,” Schlam explains. “It’s a ‘side effect’ of our technology. Our focus is to increase yields and find problems earlier, but they were more interested in how to manage their fleet and manage what their agronomists were doing.”

What’s Next

Taranis has maximized the close-up view of the field, but it is working on improving spectral resolution and is experimenting with simultaneous multi- and hyper-spectral imaging, all made possible by its patented technology. Multi-spectral imagery, already available in its solution through Mavrx and satellite imagery, tracks the amount of chlorophyll in plants.

“Multi-spectral imagery allows us to see crop stress before the naked eye can see it. We can get early warnings on disease and for weeds, we can get stem counts. For nutrient deficiency and disease, we think it could bring even more value for growers, and we might even need hyper-spectral for that, so it’s definitely part of our R&D road map.”
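The article doesn’t name a specific index, but the most common multi-spectral proxy for chlorophyll is NDVI, computed from the red and near-infrared bands. A minimal sketch, with illustrative reflectance values rather than real imagery:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Healthy, chlorophyll-dense vegetation reflects strongly in the
    near-infrared band and absorbs red light, so higher NDVI means
    a healthier canopy. Values range from -1.0 to 1.0.
    """
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Illustrative per-pixel reflectance values, not Taranis data.
healthy = ndvi(nir=0.50, red=0.08)   # ~0.72: vigorous canopy
stressed = ndvi(nir=0.30, red=0.15)  # ~0.33: stressed or sparse crop
```

Because stress shows up in these bands before it is visible to the eye, a drop in an index like this can trigger the early warnings Schlam describes.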

“But our largest project is tackling this huge matrix of crops, more pests, diseases, more insects, and of course there are priorities according to the severity of issues in each market,” Schlam says.

“It’s a very complicated process and we are fine-tuning how we (collect data). Corn, soybean, cotton, wheat, canola, potato, and sugarbeet cover 90% of what we do. In diseases, there is a lot more work to be done, and a lot of interest from potential clients in specialty and high-value crops. But it’s not as big an opportunity as row crops.”

Taranis employs over 350 agronomists and annotation experts whose job is to annotate the data it captures: flea beetle in canola, late blight in potato, northern leaf blight in corn, and so on.

“Deep learning is software that learns as it goes. It’s like the human brain – a neural network. You show it enough good examples of the correct answer, so it can operate on its own afterward,” he explains. The software needs 500-plus examples – sometimes thousands when a problem is harder to spot – before the system can accurately detect it.
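As a toy illustration of the idea – labeled examples distilled into class models, then used to classify new samples – here is a nearest-centroid sketch on synthetic 1-D “features.” This is a deliberately simplified stand-in, not Taranis’s actual deep learning pipeline:

```python
import random

random.seed(0)

def make_examples(n: int, mean: float, spread: float = 1.0) -> list:
    """Synthetic 1-D 'image features' for one class (a toy stand-in
    for annotated field imagery)."""
    return [random.gauss(mean, spread) for _ in range(n)]

def train_centroids(healthy: list, diseased: list) -> tuple:
    """'Training' here is just averaging labeled examples per class --
    a toy stand-in for fitting a deep neural network."""
    return (sum(healthy) / len(healthy), sum(diseased) / len(diseased))

def classify(x: float, centroids: tuple) -> str:
    """Assign a new sample to whichever class model it is closest to."""
    c_healthy, c_diseased = centroids
    return "diseased" if abs(x - c_diseased) < abs(x - c_healthy) else "healthy"

def accuracy(n_examples: int) -> float:
    """Train on n labeled examples per class, score on 1,000 held-out samples."""
    centroids = train_centroids(make_examples(n_examples, mean=0.0),
                                make_examples(n_examples, mean=2.0))
    tests = ([(x, "healthy") for x in make_examples(500, 0.0)] +
             [(x, "diseased") for x in make_examples(500, 2.0)])
    return sum(classify(x, c := centroids) == label for x, label in tests) / len(tests)

acc = accuracy(500)  # with 500 examples per class, the model is reliable
```

The same intuition scales up: the more annotated examples per pest or disease, the more stable the learned model, which is why the annotation team matters as much as the network architecture.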

“With each new model we create and perfect, our solution becomes more valuable to the grower, and more of a competitive edge for us. We’re always thinking about the next crops,” Schlam says. “We hope to use this technology on as many acres as possible to bring value to the market.”
