Laser-Guided Robot Farmers Offer Hope for Shrinking Labor Force
Farms might soon have a new helper that never needs a lunch break. Agricultural robots are inching closer to becoming practical farm hands, thanks to a new navigation system developed by researchers in Japan. These robots can now autonomously travel between high-bed cultivation rows, like those used for growing strawberries, without relying on expensive infrastructure or perfect positioning systems.
According to research published in Computers and Electronics in Agriculture, this advancement could be transformative for small-scale farmers facing labor shortages. Rather than relying on expensive GPS systems or installing special markers in fields, these robots use a straightforward approach that mimics how humans navigate: they keep a consistent distance from the crop beds while moving, adjusting as they go.
Small-scale farms are significantly different from larger agricultural or industrial settings. Smaller operations typically can’t afford extensive infrastructure modifications, and conventional navigation methods designed for factories don’t work well in dynamic farm environments.
How do these farm bots work? When the robot needs to move from one general area to another, it uses “waypoint navigation” to reach predefined spots. Once it arrives near crop rows, it switches to “cultivation bed navigation,” where it uses a laser scanner to maintain a specific distance and orientation to the crop bed.
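The two-mode switch described above can be sketched as a tiny state machine. This is an illustrative sketch, not the paper's code; the 1.0-meter arrival threshold and all names are assumptions.

```python
from enum import Enum, auto

class Mode(Enum):
    WAYPOINT = auto()          # coarse travel toward a predefined spot
    CULTIVATION_BED = auto()   # precise bed-following along a crop row

def select_mode(dist_to_waypoint_m, bed_detected):
    """Pick a navigation mode in the spirit of the paper's hybrid approach.

    Switch to bed-following once the robot is near its target waypoint
    and the laser scanner actually sees a cultivation bed alongside it;
    otherwise keep (or fall back to) waypoint navigation.
    """
    NEAR_WAYPOINT_M = 1.0  # assumed arrival tolerance, not a paper value
    if dist_to_waypoint_m < NEAR_WAYPOINT_M and bed_detected:
        return Mode.CULTIVATION_BED
    return Mode.WAYPOINT
```

The fallback to waypoint mode when no bed is detected is what lets the robot recover after leaving the end of a row.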

This hybrid approach helps robots move accurately through farm rows, even in places where GPS and other positioning tools don’t work well. Between rows of strawberries, a robot’s sensors detect limited distinctive features, making it hard to know exactly where it is. By using the cultivation beds themselves as guides, the robot can move accurately without perfect self-localization.
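One common way to use the beds themselves as guides, in the spirit described above, is to fit a straight line to the laser returns coming off the bed face and read off the robot's lateral distance and misalignment angle. The sketch below assumes points are already expressed in the robot frame (x lateral, y forward); the paper's actual point-cloud processing may differ.

```python
import math

def bed_offset_and_angle(points):
    """Estimate lateral distance and heading relative to a cultivation bed.

    `points` are (x, y) laser returns from the bed face in the robot
    frame.  A least-squares line x = a*y + b is fitted: b is the lateral
    distance at the robot's position, and atan(a) is the misalignment
    angle in degrees.  Illustrative sketch, not the paper's code.
    """
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    syy = sum(p[1] ** 2 for p in points)
    a = (n * sxy - sx * sy) / (n * syy - sy ** 2)  # slope of bed face
    b = (sx - a * sy) / n                          # lateral offset (m)
    return b, math.degrees(math.atan(a))
```

Because the bed face is a long, straight feature, this estimate stays reliable even where distinctive landmarks for full self-localization are missing.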
During testing in both simulated and real-world environments, the system kept the robot within 0.05 meters (about 2 inches) of its target distance from the cultivation beds and held its orientation within 5 degrees—impressive precision for agriculture. Testing in a virtual environment first let the researcher refine the robot’s specifications and navigation algorithms without wasting time and resources in real fields.
Self-driving tractors are already rolling across big, open farms, but getting autonomous robots to work in smaller, trickier spaces has been a real challenge. This study brings us one step closer to making robots a real option for smaller operations where labor shortages hit hardest.
Testing the system in actual greenhouses proved more challenging. Unlike factories with flat floors and consistent lighting, farms have uneven terrain, changing light conditions, and even fluttering plastic sheets that can confuse sensors. Despite all this, the robot navigated successfully.
This new system was also far more efficient than the baseline navigation method. The traditional approach took nearly three times as long and often left the robot confused, stopping, rotating, or even reversing just to figure out where it was.
Today’s farmers are struggling to find enough workers to plant, tend, and harvest crops, especially on smaller farms. In many countries, aging rural populations, declining interest in agricultural work, and tighter immigration policies have made seasonal labor hard to come by. As a result, growers are increasingly turning to automation to fill the gap, hoping technologies like agricultural robots can help keep their operations running.
For the millions of small-scale farmers who grow a substantial portion of the world’s food on limited acreage, this new technology brings autonomous helpers one step closer to their fields.
Paper Summary
Methodology
The researcher developed a hybrid navigation method for agricultural robots operating in high-bed cultivation environments like strawberry farms. The system switches between two navigation modes: waypoint navigation to move the robot to predetermined locations, and cultivation bed navigation that uses real-time LiDAR (Light Detection and Ranging) data to maintain proper distance and orientation relative to the crop beds. The method was first tested in a virtual environment using the Gazebo simulation platform with a 3D model of the robot and a 2D map generated from real farm data. The same navigation system was then evaluated in a real strawberry farm with high-bed cultivation inside a greenhouse. The experiment involved multiple navigation trials where the robot had to maintain a target distance of 0.40 meters from cultivation beds while traveling between them.
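A minimal proportional controller illustrates how the 0.40-meter target distance and the orientation to the bed could be held during cultivation bed navigation. The gains and sign conventions below are assumptions for illustration, not values from the paper.

```python
def steering_command(distance_m, angle_deg,
                     target_m=0.40, k_dist=2.0, k_ang=0.05):
    """Proportional steering correction for bed-following.

    Combines the lateral-distance error (target 0.40 m, as in the
    experiments) and the orientation error into a single angular-
    velocity command: steer toward the bed when too far, away when
    too close, and counter any misalignment angle.  Gains are
    illustrative assumptions.
    """
    dist_error = target_m - distance_m        # positive when too close
    return k_dist * dist_error - k_ang * angle_deg
```

In practice such an output would feed the robot's angular-velocity command while forward speed stays constant, with LiDAR-derived distance and angle estimates updated each control cycle.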
Results
The hybrid navigation method successfully maintained the robot within ±0.05 meters of the target distance from cultivation beds and kept the orientation angle within ±5 degrees during cultivation bed navigation in both virtual and real environments. The proposed method significantly outperformed the baseline method (using only waypoint navigation) in the real environment. While the baseline method frequently required recovery behaviors (stopping, rotating, reversing) and took approximately 375 seconds to complete navigation tasks, the hybrid method completed the same tasks in approximately 135 seconds without recovery behaviors. The results demonstrated that the proposed method provides more stable and efficient navigation for agricultural robots in high-bed cultivation environments.
Limitations
The study identified several limitations of the current navigation method. First, it doesn’t account for obstacles that might appear during movement between cultivation beds, such as agricultural machinery or personnel. Second, the 2D LiDAR can be affected by sunlight penetrating the greenhouse, potentially rendering point cloud data unusable in certain conditions. Third, the system uses a static map that doesn’t update to reflect changes in the dynamic farm environment. Finally, the virtual environment doesn’t perfectly replicate real-world conditions, particularly surface properties and dynamic factors like wind-induced movement of plastic sheets covering the cultivation beds.
Funding/Disclosures
The research was supported by the Ozawa and Yoshikawa Memorial Electronics Research Foundation. The author acknowledged the cooperation of Mr. Seigo Tanaka of ichigokirari during the experiments.
Publication Information
This paper titled “Autonomous navigation method for agricultural robots in high-bed cultivation environments” was authored by Takuya Fujinaga from Osaka Metropolitan University, Japan. It was published in Computers and Electronics in Agriculture, volume 231, in February 2025 (available online February 13, 2025). The paper is an open access article under the CC BY-NC-ND license.