December 25, 2024 | 13:43 GMT +7
Fifteen years ago, when high-accuracy GPS became available for civilian use, farmers thought things would be simple: Put a GPS receiver station at the edge of the field, configure a route for a tractor or a combine harvester, and off you go, dear robot!
Practice has shown, however, that this kind of carefree field cultivation is inefficient and dangerous. It works only in ideal fields, which are almost never encountered in real life. If there's a log or a rock in the field, or a couple of village paramours dozing in the rye under the sun, the tractor will run right over them. And not all countries have reliable satellite coverage—in agricultural markets like Kazakhstan, coverage can be unstable. This is why, if you want safe and efficient farming, you need to equip your vehicle with sensors and an artificial intelligence that can see and understand its surroundings instead of blindly following GPS navigation instructions.
You might think that GPS navigation is ideal for automated agriculture, since the task facing the operator of a farm vehicle like a combine harvester is simply to drive around the field in a serpentine pattern, mowing down all the wheat or whatever crop it is filled with. But reality is far different. There are hundreds of things operators must watch even as they keep their eyes fastened to the edge of the field to ensure that they move alongside it with fine precision. An agricultural combine is not dissimilar to a church organ in terms of its operational complexity. When a combine operator works with an assistant, one of them steers along the crop edge, while the other controls the reel, the fan, the threshing drum, and the harvesting process in general. In Soviet times, there were two operators in a combine crew, but now there is only one. This means choosing between safe driving and efficient harvesting. And since you can't harvest grain without moving, driving becomes the top priority, and the efficiency of the harvesting process tends to suffer.
Harvesting efficiency is especially important in Eastern Europe, where farming is high risk and there is only one harvest a year. The season starts in March and farmers don't rest until the autumn, when they have only two weeks to harvest the crops. If something goes wrong, every day they miss may lead to a loss of 10 percent of the yield. If a driver does a poor job of harvesting or gets drunk and crashes the machine, precious time is lost—hours or even days. About 90 percent of the combine operator's time is spent making sure that the combine is driving exactly along the edge of the unharvested crop to maximize efficiency without missing any of the crop. But this is the most unpleasant part of the driving, and due to fatigue at the end of the shift, operators typically leave nearly a meter at the edge of each row uncut. These steering errors account for a 25 percent overall increase in harvesting time. Our technology allows combine operators to delegate the driving so that they can instead focus on optimizing harvesting quality.
Add to this the fact that the skilled combine operator is a dying breed. Professional education has declined, and the young people joining the labor force aren't up to the same standard. The same can be said of most manual trades, but in farming this shortage creates great demand for our robotic system, the Cognitive Agro Pilot.
Developing AI systems is in my genome. My father, Anatoly Uskov, was on the first team of AI program developers at the System Research Institute of the Russian Academy of Sciences. Their program, named Kaissa, became the world computer chess champion in 1974. Two decades later, after the collapse of the Soviet Union, the Systems Research Institute's AI laboratories formed the foundation of my company, Cognitive Technologies. Our first business was developing optical character recognition software used by companies including HP, Oracle, and Samsung, and our success allowed us to support an R&D team of mathematicians and programmers conducting fundamental research in the field of computer vision and adjacent areas.
In 2012, we added a group of mathematicians developing neural networks. Later that year, this group proudly introduced me to their creation: Vasya, a football-playing toy car with a camera for an eye. "One-eyed Vasya" could recognize a ball among other objects in our long office hallway, and push it around. The robot was a massive distraction for everyone working on that floor, as employees went out into the hallway and started "testing" the car by tripping it up and blocking its way to the ball with obstacles. Meanwhile, the algorithm showed stable performance. Politely swerving around obstacles, the car kept on looking for the ball and pushing it. It almost gave an impression of a living creature, and this was our "eureka" moment—why don't we try doing the same with something larger and more useful?
After initially experimenting with large heavy-duty trucks, we realized that the agricultural sector doesn't have the major legal and regulatory constraints that road transport has in Russia and elsewhere. Since our priority was to develop a commercially viable product, we set up a business unit called Cognitive Pilot that develops add-on autonomy for combine harvesters, which are the machines used to harvest the vast majority of grain crops (including corn, wheat, barley, oats, and rye) on large farms.
Just five years ago, it was impossible to use video-content analysis to operate agricultural machinery at this level of automation because there weren't any fully functional neural networks that could detect the borders of a crop strip or see any obstacles in it.
At first, we considered combining GPS with visual data analysis, but it didn't take us long to realize that visual analytics alone is enough. For a GPS steering system to work, you need to prepare a map in advance, install a base station for corrections, or purchase a package of signals. It also requires pressing a lot of buttons in a lot of menus, and combine operators have very little appreciation for user interfaces. What we offer is a camera and a box stuffed with processing power and neural networks. As soon as the camera and the box are mounted and connected to the combine's control system, we're good to go. Once in the field, the newly installed Cognitive Agro Pilot says: "Hurray, we're in the field," asks the driver for permission to take over, and starts driving. Five years from now, we predict that all combine harvesters will be equipped with a computer vision–based autopilot capable of controlling every aspect of harvesting crops.
Getting to this point has meant solving some fascinating challenges. We realized we would be facing an immense diversity of field scenes that our neural network must be trained to understand. While working with farmers in the early stages of the project, we found that the same crops can look completely different in different climatic zones. To prepare for mass production of our system, we compiled as diverse a data set as we could, covering many fields and crops, starting with videos filmed on several farms across Russia under different weather and lighting conditions. But it soon became evident that we needed a more adaptable solution.
We decided to use a coarse-to-fine approach to train our networks for autonomous driving. The initial version improves with each new client as we obtain additional data on different locations and crops. We use this data to make our networks more accurate and reliable, employing unsupervised domain adaptation to recalibrate them quickly, and adding carefully randomized noise and distortions to the training images to make the networks more robust. Humans are still needed to help with semantic segmentation of new varieties of crops. Thanks to this approach, we have obtained highly resilient, all-purpose networks suitable for use on over a dozen different crops grown across Eastern Europe.
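As an illustration of what those randomized distortions can look like in practice, here is a minimal sketch using torchvision; the specific transforms and parameter ranges of our production pipeline are not published, so everything below is an assumption.

```python
# Minimal sketch of randomized photometric and geometric augmentation of the
# kind described above. The transforms and parameter ranges are illustrative
# assumptions, not our published production pipeline.
import torch
from torchvision import transforms

augment = transforms.Compose([
    # Weather and lighting variation across climatic zones
    transforms.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.3, hue=0.05),
    # Small geometric jitter, as from vibration of the camera mount
    transforms.RandomAffine(degrees=3, translate=(0.02, 0.02), scale=(0.95, 1.05)),
    # Dust and focus blur
    transforms.GaussianBlur(kernel_size=5, sigma=(0.1, 2.0)),
    transforms.ToTensor(),
])

def add_sensor_noise(img: torch.Tensor, sigma: float = 0.02) -> torch.Tensor:
    """Additive Gaussian noise, clipped back to the valid [0, 1] pixel range."""
    return torch.clamp(img + sigma * torch.randn_like(img), 0.0, 1.0)
```

For semantic segmentation, the geometric transforms must of course be applied identically to the label masks, while the photometric ones touch only the image.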
The way the Cognitive Agro Pilot drives a combine is similar to how a human driver does it. That is, our unique competitive edge is the system's ability to see and understand the situation in the field much as a human would, so it maintains full efficiency in collaboration with human drivers. At the end of the day, it all comes down to economics. One human-driven combine can harvest around 20 hectares of crops during one shift. When the Cognitive Agro Pilot does the driving, the operators' workload is considerably lower: They don't get tired, make fewer stops, and take fewer breaks. In practical terms, it means harvesting around 25 to 30 hectares per shift. For a business owner, it means that two combines equipped with our system deliver the performance of three combines without it (at the top of that range, 2 × 30 hectares per shift matches 3 × 20 hectares).
There are already some partial solutions on the market from various agricultural-machinery companies, but each autonomous feature is implemented as a separate function: driving along a field edge, driving along a row, and so on. We haven't yet seen another industrial system that can drive entirely with computer vision, but one-eyed Vasya showed us that it was possible. So, with cost optimization in mind and aiming to solve the task with a minimum set of devices, we decided that for a farmer's AI-based robot assistant, one camera is enough.
The Cognitive Agro Pilot's primary sensor is a single 2-megapixel color video camera, mounted on a bracket near one of the combine's side mirrors, that sees a wide area in front of the vehicle. A control unit with an Nvidia Jetson TX2 computer module is mounted inside the cab. This unit contains the main stack of autonomy algorithms, processes the video feed, and issues commands to the combine's hydraulic systems to control steering, acceleration, and braking. An integrated display in the cab provides the driver interface, showing warnings and settings. We are not tied to any particular brand; our retrofit kit will work with any combine harvester model available in the farmer's fleet. For a combine more than five years old, interfacing with its control system may not be quite so easy (sometimes an additional steering-angle sensor is required), but installation and calibration can still usually be done within one day, and it takes just 10 minutes to train a new driver.
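To make the data flow concrete, here is a deliberately simplified sketch of a vision-based steering loop of the kind described above. Everything in it (the toy segmentation mask, the pixel-to-meter conversion, the proportional-control gain) is an illustrative assumption, not our production software.

```python
# Deliberately simplified sketch of a vision-based steering loop.
# All names, the toy mask, and the controller gain are illustrative
# assumptions; this is not Cognitive Pilot's actual software.
from typing import Optional

import numpy as np

def crop_edge_offset(mask: np.ndarray, meters_per_pixel: float = 0.01) -> Optional[float]:
    """Given a binary segmentation mask (1 = uncut crop), return the lateral
    offset in meters between the crop edge and the camera's center line."""
    cols = np.where(mask.any(axis=0))[0]      # image columns containing crop pixels
    if cols.size == 0:
        return None                           # edge lost: end of run or heavy dust
    edge_col = cols.min()                     # leftmost crop column = the crop edge
    return (edge_col - mask.shape[1] / 2) * meters_per_pixel

def steering_command(offset_m: float, kp: float = 0.8) -> float:
    """Proportional steering correction, clipped to the actuator range."""
    return float(np.clip(kp * offset_m, -1.0, 1.0))

# Toy frame: a 100 x 200 mask in which uncut crop fills the right half.
mask = np.zeros((100, 200), dtype=np.uint8)
mask[:, 110:] = 1
offset = crop_edge_offset(mask)               # 0.1 m to the right of center
print(offset, steering_command(offset))       # -> 0.1 0.08
```

A real system layers obstacle detection, trajectory forecasting, and hydraulic-level safety checks on top of a loop like this.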
Our vision-based system drives the combine, so the operator can focus on the harvest and adjusting the process to the specific features of the crop. The Cognitive Agro Pilot does all of the steering and maintains a precise distance between rows, minimizing gaps. It looks for obstacles, categorizes them, and forecasts their trajectory if they're moving. If there is time, it warns the driver to avoid the obstacles, or it decides to drive around them or slow down. It also coordinates its movement with a grain truck and with other combines when it is part of a formation. The only time that the operator is routinely required to drive is to turn the combine around at the end of a run. If you need to turn, go ahead—the Cognitive Agro Pilot releases the controls and starts looking for a new crop edge. As soon as it finds one, the robot says: "Let me do the driving, man." You push the button, and it takes over. Everything is simple and intuitive. And since a run is normally up to 5 kilometers long, these turns account for less than 1 percent of a driver's workload.
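The handover logic around those turns amounts to a three-state machine. The sketch below is a schematic reconstruction of the behavior just described; the state names and signals are illustrative assumptions.

```python
# Schematic reconstruction of the end-of-run handover described above, as a
# tiny state machine. State names and signals are illustrative assumptions.
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()     # system steers along the crop edge
    MANUAL_TURN = auto()    # driver turns the combine at the end of a run
    AWAIT_CONFIRM = auto()  # new edge found; waiting for the driver's button press

def step(mode: Mode, edge_visible: bool, button_pressed: bool) -> Mode:
    if mode is Mode.AUTONOMOUS and not edge_visible:
        return Mode.MANUAL_TURN      # release the controls for the turn
    if mode is Mode.MANUAL_TURN and edge_visible:
        return Mode.AWAIT_CONFIRM    # "Let me do the driving, man."
    if mode is Mode.AWAIT_CONFIRM and button_pressed:
        return Mode.AUTONOMOUS       # driver pushes the button; system takes over
    return mode
```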
During our pilot project last year, the yield from the same fields increased by 3 to 5 percent due to the ability of the harvester to maintain the cut width without leaving unharvested areas. It increased an additional 3 percent simply because the operators had time to more closely monitor what was going on in front of them, optimizing the harvesting performance. With our copilot, drivers' workloads are very low. They start the system, let go of the steering wheel, and can concentrate on controlling the machinery or checking commodity prices on their phones. Harvesting weeks are a real ordeal for combine drivers, who get no rest except for some sleep at night. In one month they need to earn enough for the upcoming six, so they are exhausted. However, the drivers who were using our solution realized they even had some energy left, and those who chose to work long hours said they could easily work 2 hours more than usual.
Gaining 10 or 15 percent more working hours over the course of the harvest may sound negligible, but it means that a driver has three extra days to harvest the crops. Consequently, if there are days of bad weather (like rain that causes the grain to germinate or the stalks to lodge), the probability of keeping the crop yield high is a lot greater. And since combine operators get paid by harvested volume, using our system helps them make more money. Ultimately, both drivers and managers say unanimously that harvesting has become easier, and typically the cost of the system (about US $10,000) is paid off in just one season. Combine drivers quickly get the hang of our technology: after the first few days, many drivers either start to trust in our robot as an almighty intelligence, or decide to test it to death. Some get the misconception that our robots think like humans and are a little disappointed to see that our system underperforms at night and has trouble driving in dust when multiple combines are driving in file. Humans have problems in these situations too, yet operators would still grumble: "How can it not see?" A human driver understands that the distance to the combine ahead is about 10 meters and that they are traveling at a constant speed. The dust cloud will blow away in a minute, and everything will be fine. No need to brake. Alex, the driver of the combine ahead, definitely won't brake. Or will he? Since the system hasn't spent years alongside Alex and cannot use life experience to predict his actions, it stops the combine and releases the controls. This is where human intelligence once again wins out over AI.
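That conservative fallback can be summarized in a few lines. The thresholds below are assumptions for illustration; the real system's decision logic is not public.

```python
# Illustrative sketch of the conservative fallback described above: when
# perception confidence collapses in a dust cloud, stop and hand over rather
# than extrapolate the lead combine's behavior. Thresholds are assumptions.
def should_stop(visibility_confidence: float,
                gap_to_lead_m: float,
                stopping_distance_m: float,
                min_confidence: float = 0.5) -> bool:
    """Stop and release the controls unless we can both see and brake in time."""
    can_see = visibility_confidence >= min_confidence
    can_brake_in_time = gap_to_lead_m > stopping_distance_m
    return not (can_see and can_brake_in_time)

# Dust cloud: confidence 0.2, lead combine about 10 m ahead, 8 m needed to stop.
print(should_stop(0.2, gap_to_lead_m=10.0, stopping_distance_m=8.0))  # True
```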
Turns at the end of each run are also left to human intelligence, for now. This feature never failed to amaze combine drivers but turned out to be the most challenging during tests: The immense width of the header means that a huge number of hypotheses about objects beyond the line of sight of our single camera need to be factored in. To automate this feature, we're waiting for the completion of tests on rugged terrain. We are also experimenting with our own synthetic-aperture radar technology, which can see crop edges and crop rows as radio-frequency images. This does not add much to the total solution cost, and we plan to use radar for advanced versions of our "agrodroids" intended for work in low visibility and at night.
During the summer and autumn of 2020, more than 350 autonomous combines equipped with the Cognitive Agro Pilot system drove across over 160,000 hectares of fields and helped their human supervisors harvest more than 720,000 tonnes of crops, from Kaliningrad on the Baltic Sea to Vladivostok in the Russian Far East. Our robots worked more than 230,000 hours last year, covering more than 950,000 kilometers autonomously. And by the end of 2021, our system will be available in the United States and South America.
Ordinary farmers, the end users of our solutions, may have heard about driverless cars in the news or seen the words "neural network" a couple of times, but that about sums up their AI experience. So it is fascinating to hear them say things like "Look how well the segmentation has worked!" or "The neural network is doing great!" in the driver's cab.
Changing the technological paradigm takes time, so we ensure the widest possible compatibility of our solutions with existing machinery. Undoubtedly, as farmers adapt to the current innovations, we will continuously increase the autonomy of all types of machinery for all kinds of tasks.
A few years ago, I studied the work of the United Nations mission in Rwanda dealing with the issues of chronic child malnutrition. I will never forget the photographs of emaciated children. It made me think of the famine that gripped besieged Leningrad during World War II. Some of my relatives died there, and their diaries are a testament to the fact that there are few endings more horrible than death from starvation. I believe that robotic automation and AI enhancement of agricultural machinery used in high-risk farming areas or regions with a shortage of skilled workers should be the highest priority for all governments concerned with providing an adequate response to the global food-security challenges.
(Spectrum)