Verdant Robotics unveils ‘multi-action’ technology for specialty crops

Disclosure: AFN’s parent company, AgFunder, is an investor in Verdant Robotics.

Verdant Robotics has officially launched its platform that combines data analysis and crop applications such as spraying and weeding into a single agricultural tool for specialty crop farmers. The company says its tool delivers better results for these farmers, ranging from bigger yields and cost savings to a better overview of what’s happening on the farm.

Verdant, based in Hayward, Calif., was founded in 2018 and over the past 18 months has been stealthily rolling out its product to farms of varying sizes across the United States.

According to co-founder and CEO Gabe Sibley, the company is committed to serving approximately 40% of the US carrot market exclusively for the next five years.

He also names plums, cherries, apples, onions, garlic, and peaches as other crops the Verdant system works with. “After extensive research, we found that specialty crops with fixed infrastructure were very suitable,” he told AFN. “Last year we really ramped up, with thousands of hours running every day, all day.”

To date, Verdant has raised $21.5 million from AgFunder, Autotech Ventures, Cavallo Ventures, DCVC Bio, and Future Ventures, among others. Building on its deployment to multiple US farms, the company plans to partner with more growers in the immediate future while commercializing a version of its system for fruit crops in 2023.

‘Multi-action’ technology

Verdant’s “robotics as a service” model, as the company calls it, combines computer vision, GPS navigation, artificial intelligence, and soil and plant science. It says its main differentiator is its ability to perform multiple tasks from the same machine – what the company calls “multi-action” technology. Currently, this includes high-speed weeding and the application of chemicals.

This is quite unique among the current array of agricultural robotics offerings, Sibley says, as most only perform a single task. Among the notable agricultural robotics deals last year were one for a rock-picking machine, one for a harvesting robot and one for a weeding system. In theory, at least, this could give Verdant a competitive edge in the agricultural robotics market, which is still in its early stages of adoption.

Verdant’s equipment is available as a six- or 12-row implement that attaches to the back of any tractor and can cover up to 4.2 acres per hour for weeding and spraying. A lighting and camera system inside the machine identifies and indexes plants in real time and provides updates to the farmer via an in-cab interface.

Simultaneously, the system uses its sensors and cameras to create a digital twin of the field so farmers can track it with millimeter precision.

It’s the kind of precision and data that’s only possible through computation, suggests Sibley.

“Computers are capable of things that people are not. They can count millions of things per acre,” he says.

“Google can index the web. What we do is index the physical space.”

Farm indexing

Sibley and his team, which includes COO Curtis Garner and CTO Lawrence Ibarria, share a collective background that includes self-driving cars, Mars rovers, simulators, and hands-on farming experience. He says the team spent six months talking to farmers and trying to understand their most pressing problems before finding their solution.

“Farmers told us not to give them more data, but to tell them what to do with the data they already have – or better yet, to just go do it for them,” says Sibley.

He points out that farmers will continue to play a crucial role in shaping the startup’s strategy as Verdant extends its system to other tasks on the farm. Verdant’s current capabilities are “just the tip of the iceberg” of what’s possible with agricultural robotics, he says.

“As the technology develops rapidly, we believe it’s extremely important to work closely with farmers, operating as a service. If you’re not out in the field getting your boots dirty, you won’t find the value.”