From Identifying Plant Pests to Picking Fruit, AI is Reinventing How Farmers Produce Your Food
January 10, 2018 | John H. Tibbetts
This article was originally published on ensia.com.
A sophisticated form of artificial intelligence known as deep learning could help make agriculture more efficient and environmentally friendly.
Sick crops? These Indian subsistence farmers know just what to do: Pull out their smartphones and photograph the ailing plants. The farmers then upload the images, with GPS locations, to a cloud-based artificial intelligence (AI) app named Plantix. The app identifies the crop type in the image and spits out a diagnosis of a disease, pest or nutrient deficiency. Plantix also recommends targeted biological or chemical treatments for ailing plants, reducing the volume of agrochemicals that can end up in groundwater and waterways through overuse or incorrect application of herbicides and pesticides.
“Nearly every household in India has a smartphone, and many want to see how Plantix works,” says Srikanth Rupavatharam, a digital agriculture scientist with the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) in Hyderabad, India, which is collaborating with Plantix’s developer to adapt the tool for Telugu and Hindi languages. For common crop diseases in India, Rupavatharam says, the app has an identification success rate of more than 90 percent.
Plantix is based on deep learning, one of today’s most powerful AI tools. Deep learning involves neural networks — digital imitations of the human brain’s system of neurons and synapses. Deep learning models are trained to look for certain patterns in giant datasets. They also go beyond basic pattern recognition to devise their own rules as they go, deciding how best to perform their jobs.
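The idea of a model that devises its own rules from data can be illustrated with a toy Python sketch. The "model" below is a single artificial neuron, far simpler than the deep networks behind Plantix, that learns its own weights from a handful of labeled examples rather than following hand-written rules. All numbers and feature names here are invented for illustration.

```python
import math
import random

# Toy "training set": each row is a plant image boiled down to three
# invented features (say, leaf yellowness, spot count, wilt score).
# Labels: 1 = diseased, 0 = healthy. All values are illustrative.
X = [[0.9, 0.8, 0.7],
     [0.8, 0.9, 0.6],
     [0.1, 0.2, 0.1],
     [0.2, 0.1, 0.3]]
y = [1, 1, 0, 0]

random.seed(0)
w = [random.gauss(0, 1) for _ in range(3)]  # weights the model will learn
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x):
    # Predicted probability that a plant is diseased.
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Gradient descent: the model repeatedly nudges its own weights to
# fit the labeled examples, instead of following hand-coded rules.
for _ in range(500):
    for x, label in zip(X, y):
        err = predict(x) - label
        for i in range(3):
            w[i] -= 0.5 * err * x[i]
        b -= 0.5 * err

labels = [1 if predict(x) > 0.5 else 0 for x in X]
print(labels)  # after training, matches y: [1, 1, 0, 0]
```

A real deep learning system stacks many such neurons into layers and trains on vast image datasets, but the core loop — predict, measure error, adjust weights — is the same.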
Deep learning isn’t a new idea; it’s been around for decades. But over the past five years, deep neural networks have advanced rapidly, thanks to faster processing hardware and the huge volumes of internet data now available to train models. Deep learning already reaches into our daily lives: Facebook, for instance, uses it to identify and tag the faces of our friends in posted photos. Now ag-tech companies are testing deep learning as a tool to boost productivity and reduce environmental impacts while helping farmers carry out tasks that people now perform.
As in every other industry, AI advances could lead to greater efficiencies but also job losses.
Opportunity and Challenge
The world must produce at least 60 percent more food relative to 2006 to feed an estimated global population of 9 billion by 2050, according to United Nations figures. But most arable land is already in production, so farmers must find ways to grow more food on the acres they have. To improve farmers’ efficiency, researchers are experimenting with deep learning models that help guide robots and drones in monitoring fields for pests and predators, tracking crop and livestock diseases, and many other tasks. Ag-tech companies are testing deep learning in robots that identify the “faces” of individual fruits and pick them more rapidly. Someday, a farmer in a developed country could remain in an office all day, remotely deploying AI-driven machinery.
As exciting as such applications are, they are not without challenges. Deep learning has proven successful in highly structured, predictable environments such as those involved in medical care. Farms, though, are far messier places.
“Agriculture is a little behind in adoption of advanced digital tools,” says Shriram Ramanathan, who heads the big data analytics practice at Lux Research, a technical innovation consulting firm. “Deep learning models have improved a long way, but once you go outdoors, there are so many factors that can affect their accuracy.” Lighting conditions, shadows, weather, dust and other factors outdoors can change by the hour. A plant can look quite different at dawn, noon, or evening. If poor-quality images are fed into a deep learning model, the algorithm is more likely to misidentify the plant or its ailment.
Relatively few small and medium-size U.S. farms are experimenting with new digital tools, while larger farms with more capital are willing to take on more risk, says Ramanathan. “If you are a farmer on a very tight budget with a low-margin crop such as corn, there’s no time to run a proof of concept with a digital technology.”
That said, two small companies, California-based Abundant Robotics and Israel-based FFRobotics, are racing to develop the world’s first apple-picking robot guided by deep learning. Mounted on tractors, these machines use cameras to recognize individual apples and robotic arms and other devices to gently pick them from trees.
Abundant Robotics field-tested its apple-picking machine in Washington state orchards in 2017. FFRobotics tested a machine in Israel over the past three years, and plans to test in Washington state in 2018.
Part of the rationale behind the rush is political. Apple growers face financial disaster if fruit isn’t picked on time. Nearly one-fourth of Washington’s seasonal agricultural workers arrive on H-2A (temporary agricultural worker) guest visas, mostly from Latin America. Growers worry that future federal policies will reduce their seasonal workforce, motivating interest in robotic replacements, says Karen Marie Lewis, a Washington State University tree-fruit horticulturist.
“Better machines and fewer workers—we’ve seen that in every other industry,” Lewis says. “Now it’s our turn.”
Weeds vs. Cotton
Some major agricultural companies and high-tech subsidiaries are betting on deep learning. John Deere Labs, which opened in 2017, spent US$305 million in September to purchase Blue River Technology, a startup based in Sunnyvale, California. Blue River has developed a tractor-mounted device called See & Spray that deploys two color cameras, computer vision and a deep-learning algorithm to detect weeds and cotton plants in milliseconds and spray herbicide just on the weeds.
To train its algorithm, Blue River Technology fed it accurately labeled, high-resolution images of cotton plants and weeds at various times of day. “The algorithm has to be very robust to account for different field conditions,” says Lee Redden, the company’s co-founder and chief technology officer.
The more images that the model “sees,” the more accurate it becomes in recognizing plants. The model’s first computational layers detect simple elements in plant images such as edges and corners. The next layers can bring together corners and edges to form features such as leaf margins. Each successive layer of the model builds on the knowledge of the previous layers. The software becomes increasingly expert at recognizing features that the programmers instruct it to look for. When the model misidentifies a plant, the programmers flag the result and adjust the model.
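The edge-detecting behavior of those first layers can be sketched in a few lines of Python: a small filter slides across a toy "image" and responds strongly wherever the pattern it encodes — here, a vertical edge — appears. The image and filter values below are invented; real models learn their filter values from training data rather than having them hand-set.

```python
# Toy 4x4 grayscale "image": dark (0) on the left, bright (1) on the
# right, so there is a vertical edge down the middle.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]

# Hand-set vertical-edge filter: negative on the left column,
# positive on the right, so it fires on dark-to-bright transitions.
kernel = [
    [-1, 1],
    [-1, 1],
]

def convolve(img, ker):
    """Slide the filter over the image, summing elementwise products."""
    kh, kw = len(ker), len(ker[0])
    out = []
    for r in range(len(img) - kh + 1):
        row = []
        for c in range(len(img[0]) - kw + 1):
            row.append(sum(ker[i][j] * img[r + i][c + j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out

response = convolve(image, kernel)
print(response)  # [[0, 2, 0], [0, 2, 0], [0, 2, 0]] — peaks at the edge
```

Later layers combine many such responses into larger features, such as the leaf margins mentioned above.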
If the training goes well, the model is tested. Can it accurately identify a plant in an image it has never seen before? If the model passes, the algorithm can be tested in the field.
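That holdout test can also be sketched in Python. Here a toy nearest-centroid "model" is fit on a few labeled training points and then scored on points it has never seen. The data, labels, and two-feature representation are all invented for illustration; a real evaluation would use thousands of held-out images.

```python
# Invented two-feature examples: (features, label) pairs.
train = [([0.9, 0.8], "weed"), ([0.8, 0.9], "weed"),
         ([0.1, 0.2], "cotton"), ([0.2, 0.1], "cotton")]
# Held-out examples the model never saw during training.
test = [([0.85, 0.75], "weed"), ([0.15, 0.25], "cotton")]

def centroid(points):
    """Average position of a list of feature vectors."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

# "Training": compute one centroid per class from the training data.
centroids = {label: centroid([x for x, l in train if l == label])
             for label in ("weed", "cotton")}

def classify(x):
    # Assign the class whose centroid is nearest (squared distance).
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist(x, centroids[label]))

# "Testing": accuracy on examples unseen during training.
accuracy = sum(classify(x) == label for x, label in test) / len(test)
print(accuracy)  # 1.0 on this toy holdout set
```

If held-out accuracy is high, the model has generalized beyond the images it memorized — the property field deployment depends on.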
The results for the See & Spray device are promising, says Redden. By differentiating weeds from cotton plants, the machine allows farmers to transition from broadcast spraying to more targeted application, reducing herbicide usage by 90 percent — saving money and reducing pollution. Blue River tested two machines in 2017 and plans to run 10 machines full time in commercial cotton fields, from Georgia to West Texas, during the 2018 growing season.
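For a rough sense of what a 90 percent reduction means at field scale, consider this back-of-envelope Python sketch. Only the 90 percent figure comes from Redden; the application rate and field size are hypothetical placeholders.

```python
# Hypothetical numbers: broadcast application rate and field size.
broadcast_rate = 1.0   # liters of herbicide per acre (assumed)
acres = 100            # field size (assumed)

# Targeted spraying cuts usage by 90 percent, per Redden.
targeted_rate = broadcast_rate * (1 - 0.90)

saved = (broadcast_rate - targeted_rate) * acres
print(saved)  # 90.0 liters saved across the field
```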
The Food and Agriculture Organization of the United Nations estimates that 20 to 40 percent of global crop yields are lost each year to pests and diseases. To protect their livelihoods, many Indian farmers douse fields with agrochemicals. Farmers often take plant samples to local traders for advice about which agrochemicals to apply, says Korbinian Hartberger, one of the cofounders of Progressive Environmental & Agricultural Technologies, a German start-up that developed Plantix. “The shopkeepers sell whatever they may have in the shop, telling farmers, ‘Give this a try, and this too.’”
“We see a lot of unnecessary spraying of pesticides and overuse of fertilizers,” says Rupavatharam of ICRISAT, which teaches agricultural extension officers how to use the Plantix app. The extension officers, in turn, reach out to farmers.
With Plantix as a guide, farmers can identify weeds and pests more accurately and earlier in the growing season, allowing them to slash insecticide and herbicide use, reduce damage to wildlife and human health, and minimize pesticide-resistance problems, Rupavatharam says.
Plantix has about 300,000 active users a month, primarily in India but also in Brazil and North Africa. Broadband access is not required to use Plantix. In the field, users can look up crop diseases and solutions stored in the smartphone app’s library. The app’s database holds more than 270,000 labeled images of some 400 diseases or pests, including about 20 that afflicted rice, peanut and pulse crops in central India in 2017.
“For extension officers, Plantix is like a great treasure in their hands, a refresher course for them,” says Rupavatharam.
Hartberger sees the app as validating extension workers’ role as well. “Farmers who were once cut off from products and services, who rarely saw an extension officer, can now see the benefit in getting [such] technical expertise,” he says. “Farmers want more good information and that can lead to more demand for technical advisory jobs and healthier crops and that can improve local economies.”
About the Author: John H. Tibbetts is a freelance writer in Charleston, South Carolina. His work has appeared in BioScience, Hakai, The Scientist, The Washington Post, Yale e360, Environmental Health Perspectives and others. For 25 years, he was editor of Coastal Heritage magazine, which focuses on the South Carolina lowcountry.