Gesture-based robot control and a weed map are among the technology ideas set to make farmers’ lives easier
In the plains of Imathia, George has been picking peaches since dawn, and it is almost noon. His back and shoulders have started to ache from the long hours of work. A four-wheeled robot, equipped with special cameras and Artificial Intelligence (AI) software that receives data from the worker’s wearables, “perceives” his musculoskeletal strain. It autonomously heads to where George is and relieves him of the extra work of carrying the crate of peaches: it picks up the crate the worker has already filled and takes it to the loading station.
Elsewhere in the orchard, a drone hovers over rows of trees and photographs their canopy. These photos are compared with a time series of earlier photos of the same trees. If one of the dozens of trees in the orchard is not growing at the expected rate, the grower knows that something is wrong and can intervene with the necessary cultivation care.
George may be a fictional person, but the scenarios described above are not far from reality.
A group of researchers from the Institute of Bio-economy and Agricultural Technology (iBO) of the National Center for Research and Technological Development (EKETA) is working on such scenarios and will present a total of seven innovations for smart agriculture, using AI and robotics, at the 30th international AGROTICA exhibition.
“Automated agricultural machinery and autonomous robotic platforms, which in recent years have increasingly penetrated agriculture, enable precise management of resources, adaptation to changing conditions and increased productivity. Sensors and monitoring systems provide critical information about soil, weather and plant health. This technological progress enhances sustainability and reduces losses,” explains Dimitris Kateris, principal researcher (Researcher Grade B) at iBO/EKETA, to the Athens-Macedonian News Agency.
iBO will present its smart-agriculture innovations every day of the big agricultural fair, February 1-4, from 10 am to 7 pm, at Pavilion 11/Stand 1 of the TIF-Helexpo AE exhibition center.
Gesture-based robot control and a weed map
Today, farmers spend a lot of time – and therefore money – going over their fields to identify and map harmful weeds in their crops. What if a robotic ground platform (UGV) took on this task for them? As Mr. Kateris explains, the UGV created at iBO can navigate autonomously in the field, on a pre-planned course, and recognize and store the location of weeds using AI and depth cameras. Knowing in advance where weeds are concentrated in the crop, the producer can target them for removal, saving time. Visitors to AGROTICA will have the opportunity to see up close how this robot works, as the weed identification process will be simulated in the field with the help of artificial turf.
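The article does not describe iBO’s software, but the general idea of turning individual weed detections into a field map can be sketched in a few lines of Python. The detection step (depth camera plus AI model) is stubbed out, and all names, the grid resolution and the confidence threshold below are illustrative assumptions, not the iBO/EKETA implementation:

```python
# Minimal sketch: as the UGV drives its pre-planned course, each weed detection
# is stored against its field position so the grower gets a map of weed hotspots.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Detection:
    x_m: float        # position along the field, metres from a field origin
    y_m: float        # position across the field, metres
    confidence: float # detector confidence, 0..1

def build_weed_map(detections, cell_size_m=1.0, min_confidence=0.6):
    """Aggregate point detections into a coarse grid of weed counts."""
    weed_map = defaultdict(int)
    for det in detections:
        if det.confidence < min_confidence:
            continue  # ignore uncertain detections
        cell = (int(det.x_m // cell_size_m), int(det.y_m // cell_size_m))
        weed_map[cell] += 1
    return dict(weed_map)

# Example: three detections, two of them landing in the same 1 m grid cell.
observations = [Detection(3.2, 7.9, 0.91), Detection(3.6, 7.4, 0.83), Detection(12.0, 2.1, 0.40)]
print(build_weed_map(observations))   # {(3, 7): 2} -- a hotspot the grower can target
```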
Let’s look at another scenario. What if, instead of needing a remote control, the farmer could operate the UGV with gestures? “Today we have already managed to encode specific gestures with which the farm worker can direct a robot. For example, if he raises his right hand, the robot, using depth cameras and AI, ‘locks on’ to the farmer or worker and follows him wherever he goes. The next step is, through machine learning, for the robot to ‘learn’ to communicate with continuous gestural movements, for example, the human pointing to the front or to the right and the robot knowing which direction to follow,” notes the iBO/EKETA researcher. “At the exhibition, there will be a presentation of full remote control using the movements of the user’s hands, as well as of the process of identifying a dangerous situation (a worker falling) by the collaborative robotic platform,” underlines Mr. Kateris.
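As a rough illustration of the gesture-to-command step described in the quote, a minimal dispatcher might look like the sketch below. The gesture labels, command names and safe default are assumptions made for clarity; the actual gesture classification from the depth camera and AI model is outside the snippet:

```python
# Toy sketch: a perception module (not shown) classifies the worker's gesture,
# and a small dispatcher turns it into a robot command -- e.g. a raised right
# hand switches the platform into "follow the worker" mode.
FOLLOW, STOP, GO_LEFT, GO_RIGHT = "follow", "stop", "go_left", "go_right"

GESTURE_TO_COMMAND = {
    "raise_right_hand": FOLLOW,    # lock onto the worker and follow
    "raise_both_hands": STOP,      # emergency stop
    "point_left": GO_LEFT,
    "point_right": GO_RIGHT,
}

def dispatch(gesture_label: str) -> str:
    """Map a recognized gesture to a robot command; unknown gestures default to a safe stop."""
    return GESTURE_TO_COMMAND.get(gesture_label, STOP)

print(dispatch("raise_right_hand"))  # follow
print(dispatch("unknown_wave"))      # stop (safe default)
```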
The autonomous robotic platform’s unsupervised parking and departure from a parking space will also be demonstrated. “The ground robotic platform, through AI algorithms, will recognize the targets (parking points) that will be placed at the demonstration site. The robotic platform will navigate autonomously to the parking spot and, when it approaches it, will start a pre-planned sequence of movements that ensures its safe parking. The ultimate goal is for the robot, when it completes its work, to move autonomously to the charging position and charge with a special arm, so that it is ready to be used again,” notes the EKETA researcher, adding that a plant stress recognition application, combining a weather station and a drone with a thermal camera, will also be presented at Pavilion 11/Stand 1. In addition, a video on identifying diseases in a vineyard with a UGV and a drone (using multispectral and hyperspectral cameras) through Artificial Intelligence will be shown daily.
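The parking routine Mr. Kateris describes – spot the target, approach it, then run a pre-planned docking sequence – can be pictured as a small state machine. The sketch below is a hypothetical illustration, not EKETA’s controller; the states, distance threshold and test run are invented for clarity:

```python
# Simplified state machine for the autonomous parking/charging routine:
# look for the parking target, navigate towards it, and once close enough
# run a fixed, pre-planned docking sequence before charging.
SEARCH, APPROACH, DOCK, CHARGING = "search", "approach", "dock", "charging"

def step(state, target_visible, distance_m, dock_threshold_m=0.5):
    """Advance the parking state machine by one perception/control cycle."""
    if state == SEARCH:
        return APPROACH if target_visible else SEARCH
    if state == APPROACH:
        if not target_visible:
            return SEARCH                      # lost the target, look again
        return DOCK if distance_m <= dock_threshold_m else APPROACH
    if state == DOCK:
        return CHARGING                        # pre-planned maneuver finished
    return CHARGING

# One possible run: target spotted, approached, then docked for charging.
state = SEARCH
for target_visible, distance_m in [(False, None), (True, 4.0), (True, 2.0), (True, 0.4), (True, 0.0)]:
    state = step(state, target_visible, distance_m if distance_m is not None else float("inf"))
    print(state)   # search, approach, approach, dock, charging
```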
Source: Skai