Using Autonomous Pathing Orchard Robots

Robot tractor in a field lane
Robot traversing the orchard. Photo: Deven Biehler and Achyut Paudel

I spent this summer working as an intern at the AgAID Institute with Washington State University, a research institute dedicated to advancing sustainable agriculture through innovative technologies and methodologies. I have been working toward a fully autonomous orchard robot by building on the same safety technology used in self-driving cars. For an agricultural robot to operate effectively in an orchard, it must be able to “see” its surroundings precisely. My robot safety research focused on computer vision, sensing technology, and robot localization algorithms; that is, I worked on how different sensors can come together to give the robot an accurate map of its surroundings. This is an essential step toward a fully autonomous orchard robot that can contribute to sustainable agriculture and farming.

Mapping the Robot’s Surroundings

The team is working with a Clearpath Warthog, a commercial-use rover sold by Clearpath Robotics. Our Warthog has a LiDAR (Light Detection and Ranging) sensor and a stereo vision camera attached to the front of its frame. The combination of these two sensors allows it to create a precise map of the orchard lane and any obstacles. The Warthog is also equipped with precision location estimation technologies: an IMU (Inertial Measurement Unit), wheel odometry, and GPS. These three devices are combined to produce an accurate position on Earth and within the orchard.
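To illustrate the idea of combining location sensors, here is a toy one-dimensional complementary filter that blends smooth-but-drifting wheel odometry with noisy-but-drift-free GPS readings. This is only a sketch of the weighting principle; the robot's actual localization stack is more sophisticated, and the 0.98 gain and the sample readings are illustrative values, not real calibration.

```python
# Toy 1-D complementary filter: trust odometry for smooth short-term
# motion, but pull the estimate toward GPS to cancel accumulated drift.
# The gain of 0.98 is an illustrative choice, not a tuned value.

def fuse_position(odometry_estimate: float, gps_reading: float,
                  gain: float = 0.98) -> float:
    """Blend an odometry-based position with a GPS fix."""
    return gain * odometry_estimate + (1.0 - gain) * gps_reading

# Simulate three one-meter odometry steps with slightly noisy GPS fixes.
position = 0.0
for odom_step, gps in [(1.0, 1.2), (1.0, 2.1), (1.0, 3.3)]:
    position = fuse_position(position + odom_step, gps)
    print(round(position, 3))
```

The high gain keeps the estimate smooth between GPS fixes, while the small GPS weight steadily corrects any drift the odometry accumulates over long lanes.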

The robot must have precision measurement devices to ensure that it acts safely. LiDAR is well suited for this because its precision allows the robot to notice anything larger than chicken wire. LiDAR is a remote sensing method that measures the time light takes to leave the sensor and bounce back off an object, from which the distance to the object can be calculated accurately. If you have ever seen terrain heights on Google Earth, that elevation data was gathered by aircraft-mounted LiDAR. I am using this technology to let the Warthog measure the width of orchard lanes, the distances between trees, and anything else that might appear in front of it, like a ladder left out or a worker in the lane.
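The time-of-flight calculation behind LiDAR is simple: the light travels to the object and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (real LiDAR drivers report ranges directly, so this function is purely illustrative):

```python
# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """One-way distance to an object from a LiDAR pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after about 66.7 nanoseconds corresponds to ~10 m.
print(round(tof_distance(66.7e-9), 2))
```

The tiny time scales involved (tens of nanoseconds for objects a few meters away) are why LiDAR units need very fast, precise timing electronics.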

Stereo vision is a method of using two cameras to create a 3D image of an environment. The cameras capture two images from slightly different angles, similar to how our two eyes let us judge the distance to an object just by looking at it. The system then uses this information to build a 3D image from which lengths and shapes can be measured. It even allows the robot to recognize locations it has visited before, giving a more accurate location estimate.
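The core geometry of stereo depth is that a point's depth is inversely proportional to its disparity, the horizontal pixel shift between the two camera views: Z = f * B / d, where f is the focal length in pixels and B is the baseline between the cameras. The parameter values below are illustrative, not the Warthog camera's actual calibration.

```python
# Depth from stereo disparity: Z = f * B / d.
# f: focal length in pixels, B: camera baseline in meters,
# d: disparity in pixels between the matched points.

def stereo_depth(focal_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Depth (meters) of a point matched in both stereo images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. f = 700 px, baseline = 0.12 m, disparity = 28 px gives roughly 3 m.
print(stereo_depth(700.0, 0.12, 28.0))
```

Because depth varies with 1/d, small disparities (distant objects) are measured less precisely than large ones, which is why stereo vision works best at the close ranges found in orchard lanes.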

The LiDAR and stereo vision technologies are combined to give the robot a more accurate sense of its surroundings, allowing it to move through orchard lanes with ease. The robot’s ability to know where it is in space and where nearby objects are is essential for its use as a farming tool. By better understanding its environment, the robot can navigate different terrains and avoid obstacles more effectively, resulting in a safer environment for the robot, the people in the orchard with it, and the trees. Once the robot can navigate effectively, it can be designed to perform various tasks, such as pruning, flower thinning, harvesting, and fertilizer or pesticide application. The current application targets apple and cherry trees, but the robot can generalize to any crop with lanes wider than the one-and-a-half-meter-wide Warthog.

Close up of motor on robot
LiDAR mounted on the Warthog’s frame. Photo: Deven Biehler and Achyut Paudel

Sustainability for Agriculture and Farms

Certain farming practices can sometimes negatively impact the environment or affect workers’ health and safety. Overapplication of pesticides and fertilizers can contaminate water sources. The use of heavy machinery can cause soil compaction. The robot’s ability to target the application of pesticides, fertilizers, and water can help reduce their use while maintaining crop yield and quality, making agriculture more sustainable. The robot can likely achieve these tasks at a lower cost than manual labor in the long run. 

One of the downsides of robots that needs to be considered is that they may take away from manual labor jobs. However, this is a complex issue, and other upsides may result as well:

  1. Automation may free up labor resources, allowing farmers to focus on value-added processes like food processing, packaging, and marketing, which can create jobs downstream in the supply chain.
  2. While automation can handle many tasks, people will still play a crucial role in decision-making, strategic planning, and ensuring overall farm success.
  3. The robot’s metal frame is far more resistant than people are to chemicals that are unsafe for workers, so it can take on applications that would otherwise expose workers to those risks.

In conclusion, my work as an intern at the Agricultural Automation and Robotics lab has led to significant progress in creating a fully autonomous orchard robot. By integrating LiDAR, stereo vision, and precision location technology, the robot now navigates orchards safely and efficiently. This innovation can promote sustainable agriculture by helping reduce resource usage and possibly creating new job opportunities. I believe that embracing automation offers a greener, more efficient future for farming, fostering a harmonious blend of technology and nature. With farmers and autonomy, we can create a more sustainable future for agriculture.

This internship is supported by the AI Research Institutes program supported by NSF and USDA-NIFA under the AI Institute: Agricultural AI for Transforming Workforce and Decision Support (AgAID) award No. 2021-67021-35344.