Article

UAV-Supported Route Planning for UGVs in Semi-Deterministic Agricultural Environments

by Dimitrios Katikaridis 1,2, Vasileios Moysiadis 1,2, Naoum Tsolakis 1,3, Patrizia Busato 4,*, Dimitrios Kateris 1, Simon Pearson 5, Claus Grøn Sørensen 6 and Dionysis Bochtis 1,7
1 Institute for Bio-Economy and Agri-Technology (IBO), Centre of Research and Technology-Hellas (CERTH), 6th km Charilaou-Thermi Rd., 57001 Thessaloniki, Greece
2 Department of Computer Science and Telecommunications, University of Thessaly, 35131 Lamia, Greece
3 Department of Supply Chain Management, International Hellenic University, 57001 Thessaloniki, Greece
4 Department of Agriculture, Forestry and Food Science (DISAFA), University of Turin, Largo Braccini 2, 10095 Turin, Italy
5 Lincoln Institute for Agri-Food Technology (LIAT), University of Lincoln, Lincoln LN6 7TS, UK
6 Department of Electrical and Computer Engineering, Aarhus University, 8000 Aarhus, Denmark
7 FarmB Digital Agriculture P.C., Doiranis 17, 54639 Thessaloniki, Greece
* Author to whom correspondence should be addressed.
Agronomy 2022, 12(8), 1937; https://doi.org/10.3390/agronomy12081937
Submission received: 11 July 2022 / Revised: 11 August 2022 / Accepted: 14 August 2022 / Published: 17 August 2022

Abstract
Automated agricultural operations must be planned and organized to reduce risk and failure potential while optimizing productivity and efficiency. However, the diversity of natural outdoor environments and the varied data types and volumes required to represent an agricultural setting pose critical challenges for the deployment of fully automated agricultural operations. In this regard, this study develops an integrated system enabling unmanned aerial vehicle (UAV)-supported route planning for unmanned ground vehicles (UGVs) in the semi-structured environment of orchards. The research focus is on the underpinning planning system components (i.e., world representation/map generation, perception, and path planning). In particular, the system comprises a digital platform that receives as input a geotagged depiction of an orchard obtained by a UAV. The pre-processed data define the agri-field’s tracks, which are transformed into a grid-based map capturing accessible areas. The grid map is then used to generate a topological path planning solution. Subsequently, the solution is translated into a sequence of coordinates that define the calculated optimal path for the UGV to traverse. The applicability of the developed system was validated in routing scenarios in a walnut orchard using a UGV. The contribution of the proposed system lies in the noise reduction techniques that accurately represent a semi-deterministic agricultural environment, enabling accurate route planning for the automated machinery utilized.

1. Introduction

During the last two decades, a series of autonomous mechanized systems have been developed for various agricultural field operations, such as seeding, spraying, weeding, and harvesting [1,2]. For such systems to operate, the corresponding automated agriculture-based processes must be planned and organized in a manner that, on the one hand, reduces prominent risks and, on the other hand, optimizes productivity and efficiency. However, the diversity of natural outdoor environments and the vast amount of diversified data types required to plan and deploy autonomous agri-field operations still comprise significant bottlenecks. Indicatively, challenges exist even at the seemingly superficial level of calculating an autonomous vehicle’s optimal route (i.e., generating a safe path between a source and a destination point) [3]. These are non-deterministic polynomial-time hard (NP-hard) problems with high complexity, particularly in large-scale agricultural areas involving many obstacles [4].
Regarding the navigation of mobile (ground) robots in semi-structured agricultural environments such as orchards, four main categories of route planning approaches are prevalent, namely: (i) graph search-based planners; (ii) sampling-based planners; (iii) interpolating curve planners; and (iv) numerical optimization approaches [5]. Owing to the dynamic nature of agricultural environments, scenery information is required for guiding the autonomous navigation of ground vehicles and the execution of in-field automated operations. To this end, route planning can be either: (i) deterministic (i.e., the entire path is generated, but the world representation occurs off-line); or (ii) sensor-based (i.e., the path is generated in real-time). However, both applications are characterized by inherent limitations. First, deterministic route planning can potentially lead to a global optimum, but the feasibility of the execution depends on the “quality” of the available a priori information [6]. Second, sensor-based route planning involves the time and space complexity of data processing, which requires substantial computational resources for informing the robots’ navigation (e.g., detecting obstacles for planning and executing avoidance maneuvers in real-time) [7]. Furthermore, due to orchards’ semi-structured environment, sources of noise in the retrieved data are also possible, distorting the received signals and potentially leading to inaccurate detection of orchard objects. GPS-based guidance is also challenging, as tree foliage can cause signal outages or introduce navigation errors [8].
In the Agriculture 4.0 field, the complementary role of unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs) has been recognized as catalytic to operations efficiency, human safety and health, and environmental stewardship [9]. Specifically, the authors in [9] articulated several scenarios and proposed the concept of heterogeneous cooperation between unmanned systems in vineyards. However, research on collaborative machinery for route planning is scarce, with real-world applications lacking. In this regard, this research develops a cooperative system between UAVs and UGVs in terms of data interfacing for route planning purposes. This concept has not been explored in the extant literature yet. Therefore, this research attempts to tackle the following research question: How can one enable UAV-supported route planning for UGVs in semi-deterministic agricultural environments?
To address the enunciated research query, this study developed an integrated system for the path planning of field robots in orchard operations. The system comprises: (i) a UAV that maps the orchard; (ii) a farm management information system (FMIS) that generates the UAV’s flight plan, receives and processes the UAV-gathered data, and generates the “initial” world representation that is transmitted to a UGV; and (iii) a UGV that receives data from the FMIS and calculates the optimal path while avoiding obstacles in the agri-field. The functionality of the developed digital-robotic platform was tested in navigation scenarios investigated in a real-world walnut orchard.
The rest of this paper is organized as follows: Section 2 outlines the materials and methods pertinent to this study. Section 3 briefly describes the relevant research background. Section 4 details the developed robotic platform, the involved data processing steps, and the system’s integrated functionality. Section 5 demonstrates the applicability of the proposed system by investigating different navigation scenarios. Finally, Section 6 presents conclusions, implications, limitations, and future research avenues.

2. Materials and Methods

This research adopted the methodology of [10] to develop and apply a path generation algorithm for a UGV in an agri-field. In particular, [10] identified the crop rows in an agricultural wheat holding by converting the mosaic to grayscale to prevent shading. Our study differs noticeably by transforming the retrieved crop images into the Hue Saturation Value (HSV) color space. In addition, our study used the initial Hough Transform approach to identify the cultivation lines. The use of the Hough Transform allows for recognizing vegetation overlap areas in linear cultivations. From a technology perspective, the real-world implementation of the generated path planning algorithm in our research involved engagement with a UGV, a UAV, and an FMIS. Specifically, a customized UAV was deployed, based on the DJI Spreading Wings S1000+ Professional Octocopter (DJI, Shenzhen, China) and equipped with an open-source Pixhawk Cube flight controller (registered trademark of Lorenz Meier), an SP80 RTK (i.e., Real-Time Kinematic) GPS (Trimble Inc., Sunnyvale, CA, USA), and a Sony RX100 III RGB (i.e., red, green, blue) camera (Sony, Minato, Tokyo, Japan). The research case study concerned a walnut orchard. Considering the agricultural digitalization landscape and following the work of [11] for efficiently planning operations on bespoke agri-field layouts, the generated path algorithm was then imported to the Robot Operating System (ROS), an open-source meta-framework used for mobile robotic vehicle navigation [12]. The analysis process to enable optimized route planning involves the following stages (Figure 1):
  • Area coverage plan generation
  • Orchard mapping
  • Orchard representation
  • Field tracks’ extraction
  • Path planning
  • System integration.

3. Research Background

A limited number of studies in the extant literature have technically investigated path planning based on imagery data. Indicatively, the authors of [13] investigated the generation of path navigation tracks in wheat crops. The photographic material was transformed to grayscale during image processing, and a filter was applied to isolate the areas containing shades of green. The discrete areas of interest (e.g., paths, cultivation lines) were then separated, and the linear Hough transform was applied.
Alternatively, methods exist to recognize trees from aerial photographs. Indicatively, the authors of [14] tested a method that utilizes the AdaBoost algorithm to classify aerial photographs [15]. In addition, the authors of [16] examined tree identification by employing the method of [17] to transform RGB photography on a grayscale by distinguishing areas of interest (i.e., cultivation lines). The points (i.e., pixels) within the cultivation lines were defined as a positive class and the remaining points as a negative class.
Following the findings of [18], the exploitation of point cloud data potentially offers satisfactory results. The differentiation of the altitude variations in an orchard in tandem with the object-based image analysis algorithm [19] produced satisfactory results in trees’ identification from aerial imagery.
Beyond the scope of autonomous vehicles but within the pathfinding field, the authors of [20] presented a solution for finding a walking path in nuclear facilities that minimizes workers’ radioactive exposure time. The developed implementation considered conflict detection and radiation measurement for humans. Furthermore, the authors of [21] implemented a generalized pathfinding algorithm for multiple robotic vehicles in planar environments. The implementation of the A* algorithm, particle swarm optimization, and the probabilistic roadmap method (PRM) generated multi-vehicle tracks and near-optimal four-robot path planning solutions. Additionally, the authors of [22] implemented a pathfinding method for robots in 3D space. In particular, the study considered a robotic arm working as a sewing machine, and the generated optimal path led to an accelerated sewing process. In the work of [23], reinforcement learning was combined with pathfinding, allowing the algorithm to self-learn and produce a movement path for an autonomous mobile robot within completely unknown areas. Notably, the authors of [24] proposed a novel way to avoid obstacles in path planning problems. The proposed approach creates a virtual frame around the mobile robot that extends to prevent collisions. Moreover, the authors of [25] presented a pathfinding algorithm for dynamically changing vehicle traffic networks. The particularity of the implementation was that the pathfinding process focused on reducing the vehicle’s bursts and fuel consumption rather than on the length of the path.

4. System Description

The architecture of the developed real-world system for UGV route planning in semi-deterministic agricultural environments supported by a UAV comprises three sub-systems, namely: (i) FMIS; (ii) UAV; and (iii) UGV. Figure 2 represents the suggested system’s architecture while indicating its structural components. The FMIS acts as the integration platform that allows communications and analytics, whereas the combinatory application of a UAV and a UGV leads to minimized distance and time required by the UGV to map the orchard area.

4.1. Area Coverage Plan Generation

The implemented FMIS (Figure 3) was the commercial FMIS farmB (i.e., farmB Digital Agriculture P.C., version 3.28.0, Greece). The backend and frontend infrastructures were provided by farmB for the project’s various development needs. The FMIS facilitated the communication between the two agricultural vehicles (i.e., UAV and UGV) through application programming interface (API) communications. For the generation of a flight path for the UAV, a range of parameters was imported to the FMIS, including: (i) front and side overlap; (ii) altitude/ground resolution; (iii) coverage flight direction; (iv) requirement for peripheral coverage; (v) type of turns; and (vi) sequence type of flight lines. Following the parameters’ specification, the automated area coverage plan was generated by importing the agricultural area of interest to the FMIS. Then, the FMIS generated the coverage mission plan and transmitted it to the ROS-enabled UAV via the transmission interface plugged into the computer that acted as the ground station for the UAV flight. A ROS-enabled UAV requires a Linux-running computer to implement the essential ground station functions, such as flight software (QGroundControl, Dronecode Project, Inc., a Linux Foundation Collaborative Project, version 4.1.1) and micro air vehicle link (MAVLink) protocols. In addition, the proposed system implements ROS executables to achieve communication between the UAV and the ground station computer.
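As a simplified illustration of the area coverage plan generation step, the sketch below produces lawnmower-pattern flight lines for a rectangular field, with line spacing derived from the camera footprint and the requested side overlap. The function name, parameters, and values are illustrative assumptions and do not reflect the FMIS farmB API.

```python
# Hypothetical sketch of boustrophedon (lawnmower) waypoint generation,
# assuming a rectangular field; the camera footprint stands in for the
# altitude/ground-resolution parameter imported to the FMIS.

def coverage_waypoints(width_m, height_m, footprint_m, side_overlap=0.7):
    """Generate lawnmower-pattern waypoints covering a width x height field.

    The spacing between flight lines shrinks as the requested side
    overlap grows, mirroring the overlap parameters of the coverage plan.
    """
    spacing = footprint_m * (1.0 - side_overlap)  # effective line spacing
    waypoints = []
    x, direction = 0.0, 1
    while x <= width_m:
        # Alternate the sweep direction on each line (type-of-turns analogue).
        start, end = (0.0, height_m) if direction == 1 else (height_m, 0.0)
        waypoints.append((x, start))
        waypoints.append((x, end))
        x += spacing
        direction *= -1
    return waypoints

# A 20 m x 50 m field, 10 m footprint, 70% side overlap -> 3 m line spacing.
wps = coverage_waypoints(width_m=20.0, height_m=50.0, footprint_m=10.0)
```

The two endpoints of each flight line become the waypoints transmitted in the coverage mission plan; real plans would additionally carry altitude and heading.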
Regarding the FMIS, Figure 3 depicts different route planning simulations. Specifically, Figure 3a considers the longest side of the agri-field for the orientation of the actual path that the UGV will traverse (path depicted in white). In this case, the track’s direction was horizontal, i.e., parallel to the longest side of the orchard. On the other hand, Figure 3b considers the shortest side of the agri-field for the UGV’s path. In this case, the orientation of the routing path was perpendicular to the longest side of the orchard. The path in red denotes the area that the UGV will traverse while entering/exiting the agri-field.

4.2. Orchard Mapping

The coverage mission plan guides the flight of the UAV over the indicated agri-field area. The utilized UAV, equipped with an RGB camera, captured geo-located images via an RTK GPS. At the end of the flight operations, the collected data were transmitted to the FMIS to generate the selected agri-field’s orthomosaic.
An optimized breadth-first search (BFS) algorithm was applied to classify the generated orthomosaic and extract the tracks that inform the automated navigation of the UGV in the orchard. Specifically, the resulting data were saved to the server of the FMIS. At the same time, a dedicated representational state transfer (REST) API was implemented to communicate the data to the UGV’s integrated computer and convert the metric map to a topological map to improve the path planning process (i.e., no use of GPS coordinates or pixels). Based on the parameters imported to the FMIS (Section 4.1), 354 images were gathered from the selected semi-deterministic agricultural field.
The node distance in the grid of the topological map did not affect the path planning process of the ground vehicle in orchard rows. The optimal path criteria depend on the steps to be executed on the topological grid. The grid’s nodes were equally weighted, while the grid rows were always three: top, medium, and bottom. The generated topological map correlated with the GPS coordinates of the metric map, thus creating a valid and robot-recognizable path. To this effect, the ground vehicle can navigate across unexplored terrains quickly. Any present obstacles in the created path were evaded using a corresponding real-time obstacle-avoidance module.

4.3. Orchard Representation

Orchard mapping was conducted via the UAV, which captured images of the trees in the orchard. To analyze the gathered imagery data, a series of processing steps were conducted to represent the orchard for informing the effective navigation of the ground vehicle, including: (i) image processing; (ii) trees’ identification; and (iii) trees’ clustering.

4.3.1. Image Processing

Agricultural fields are governed by dynamically changing environmental conditions and the emergence of non-cultivated vegetation (e.g., weeds). The latter phenomenon results in noisy imagery data, mainly due to a lack of orchard care. Therefore, to tackle this challenge, a three-step analysis was implemented, involving:
  • A mosaic transformation process to eliminate the noise by using a color space transformation.
  • A mask based on the green scale in the HSV color space to isolate the trees in the agricultural field.
  • A transformation of the masked areas in grayscale.
First, the input images were selected based on the color diversity from two months (i.e., August (Figure 4a(i)) and November (Figure 4b(i))) to choose the color space that generates better results. The color transformations into the HSV; Luminance, Chroma-Blue, Chroma-Red (YCbCr); and L*a*b* color spaces are illustrated in Figure 4a(ii–iv),b(ii–iv), respectively. The assigned color codes helped easily perceive the shading effect and the vegetation areas. Notably, the HSV transformation provided more accurate results than the other transformations (i.e., YCbCr and L*a*b*). For clarification purposes, L*a*b* is a color space defined by the Commission Internationale de l’Eclairage (CIE); L* indicates lightness, a* is the red/green coordinate, and b* is the yellow/blue coordinate in the color space.
Second, the appropriate color range was selected. The transformation of the orchard into the HSV color space was dominated by two colors (i.e., green and blue). The orchard trees were depicted in green, while the areas of low interest (i.e., the rest of the agri-field) were displayed in blue. The areas of high interest were isolated from the rest of the orchard (Figure 5a(i),b(i)). This procedure was performed by a function that receives an RGB image as input, converts the image to HSV, and isolates the areas of interest. The final steps of the image processing include: (i) noise removal (Figure 5a(ii),b(ii)); and (ii) transformation of the mosaic into grayscale (Figure 5a(iii),b(iii)). Figure 5c presents the mosaic of the agricultural field used in this study following the image processing.
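The HSV masking step can be sketched with the standard library alone. The hue band and saturation threshold below are illustrative assumptions, not the study’s values, and a production pipeline would process full orthomosaics with an image library rather than per-pixel Python.

```python
# Minimal sketch of RGB -> HSV conversion and green-band masking using only
# the standard library (colorsys). Thresholds are assumed, not the paper's.

import colorsys

def is_vegetation(rgb, hue_range=(0.20, 0.45)):
    """Return True if a pixel's hue falls in the (assumed) green band.

    colorsys expects RGB components in [0, 1]; hue is also in [0, 1],
    where ~0.33 corresponds to pure green. Low-saturation (gray) pixels
    are rejected regardless of hue.
    """
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return hue_range[0] <= h <= hue_range[1] and s > 0.2

def mask_image(pixels):
    """Keep vegetation pixels, zero out the rest (pre-grayscale mask)."""
    return [[rgb if is_vegetation(rgb) else (0, 0, 0) for rgb in row]
            for row in pixels]

# A green (tree) pixel is kept; a blue (ground/shadow) pixel is masked out.
masked = mask_image([[(40, 180, 60), (30, 40, 200)]])
```

The masked result would then be converted to grayscale, matching steps (ii) and (iii) of the analysis above.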

4.3.2. Trees’ Identification

The identification of the trees in the orchard was based on the circular Hough transform (CHT). To allow UGV maneuvers and avoid collisions, the various radii of the conceivable circles around each orchard tree were considered based on an implementation of the CHT, following established evidence in extant studies [26,27,28,29,30]. Specifically, an iterative method was used to determine all possible radii values of the orchard objects. An iteration loop was set with an upper limit equal to the mosaic’s hypotenuse, i.e., the distance between two opposite corners of the mosaic; this measure ensured that the maximum possible radius was considered in the iterative method. For each iteration, the CHT was executed to determine whether a point constitutes the center of a walnut tree. At the end of the abovementioned procedure, the recognized points were clustered.
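The voting scheme behind the CHT can be illustrated with a minimal pure-Python accumulator for a single radius; the real method sweeps over candidate radii as described above, and optimized implementations exist in common vision libraries. The function name and synthetic data are assumptions for demonstration.

```python
# Toy circular Hough transform (CHT) accumulator: every edge pixel votes
# for all candidate centres lying `radius` away; true centres collect
# votes from many edge pixels.

import math
from collections import Counter

def cht_best_center(edge_points, radius, n_angles=180):
    """Vote for candidate circle centres; return the cell with most votes."""
    votes = Counter()
    for (x, y) in edge_points:
        for k in range(n_angles):
            theta = 2 * math.pi * k / n_angles
            cx = round(x - radius * math.cos(theta))
            cy = round(y - radius * math.sin(theta))
            votes[(cx, cy)] += 1
    return votes.most_common(1)[0][0]

# Synthetic canopy outline: 60 edge pixels on a circle of radius 10
# centred at (50, 50); the accumulator should peak near that centre.
pts = [(50 + round(10 * math.cos(2 * math.pi * a / 60)),
        50 + round(10 * math.sin(2 * math.pi * a / 60)))
       for a in range(60)]
best = cht_best_center(pts, radius=10)
```

Because edge pixels are rounded to the grid, the peak may land a pixel off the exact centre; a real pipeline clusters such near-coincident detections, as described for the recognized points above.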
A function was developed to identify trees correctly in environments with increased noise, recognizing noise in identified clusters based on their area. It was observed that the cluster areas follow a normal distribution, with the land area of actual trees lying within the interval [µ − σ, µ + σ], where µ is the average value of the land area and σ is the standard deviation. In the corresponding schematic (Figure 6), the y-axis represents the number of trees in an area with the most variation in the canopy, and the x-axis represents the calculated canopy area in pixels. In this context, any areas outside the above interval were defined as noise and were excluded.
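The statistical filter described above can be sketched as follows; the cluster areas are invented values, and only the one-standard-deviation acceptance interval reflects the described method.

```python
# Sketch of the statistical noise filter: cluster areas outside one
# standard deviation of the mean are discarded as noise. The pixel areas
# below are made up for illustration.

from statistics import mean, pstdev

def filter_tree_clusters(areas):
    """Keep only cluster areas within one standard deviation of the mean."""
    mu = mean(areas)
    sigma = pstdev(areas)
    return [a for a in areas if mu - sigma <= a <= mu + sigma]

# Typical canopies around ~1000 px plus two noisy outliers (a weed patch
# and a merged blob), which the interval test removes.
areas = [980, 1010, 1005, 995, 1020, 40, 3500]
kept = filter_tree_clusters(areas)
```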
After noise removal (for August and November), the recognized trees are presented in Figure 7. The percentage of noise was significantly higher in orchard images in November due to environmental conditions. The noise at the upper and lower headlands of the orchard’s mosaic, due to the geomorphologic properties of the field, did not affect the results.

4.3.3. Trees’ Clustering

Considering that the analysis aimed to generate tracks within the orchard, the identified trees had to be classified based on their cultivated row. Empirically, it was decided to arrange the cultivation lines parallel to the field’s longest side to reduce the number of UGV turns during path navigation.
The geomorphological peculiarities of orchard fields can affect the accuracy of the generated tracks. Moreover, in many use cases, orchards’ cultivation lines are parallel to each other but not to the Earth’s latitude or longitude. To this end, to classify the identified trees into distinct rows based on the orchard’s geomorphological characteristics, the coordinates of the identified trees were converted into a relative coordinate system parallel to the Earth’s latitude and longitude.
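The conversion into a row-aligned relative coordinate system can be sketched as a translation plus rotation; the heading angle and tree positions below are invented for illustration.

```python
# Illustration of converting tree coordinates into a relative frame aligned
# with the cultivation rows: translate to a local origin and rotate by the
# row heading so that rows become axis-parallel.

import math

def to_row_aligned(points, origin, row_heading_rad):
    """Rotate points about `origin` by -row_heading so rows lie along x."""
    ox, oy = origin
    cos_a, sin_a = math.cos(-row_heading_rad), math.sin(-row_heading_rad)
    out = []
    for (x, y) in points:
        dx, dy = x - ox, y - oy
        out.append((dx * cos_a - dy * sin_a, dx * sin_a + dy * cos_a))
    return out

# Two trees on a row heading 30 degrees from east end up sharing y ~ 0,
# so row membership can be read off the aligned y coordinate.
heading = math.radians(30)
trees = [(0.0, 0.0), (math.cos(heading) * 10, math.sin(heading) * 10)]
aligned = to_row_aligned(trees, origin=(0.0, 0.0), row_heading_rad=heading)
```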

4.4. Field Tracks’ Extraction

The extraction of the orchard tracks considered the recognized trees along with the mean distance between trees at the same rows and across different cultivation lines. The disadvantage of this approach is the dependence on the identified trees. Therefore, two main challenges emerged: (i) failure to identify trees within the orchard; and (ii) failure to identify trees at the upper and lower boundaries of the orchard. In the first case, the cost of the recognition failure was negligible due to low values. In the second case, the cost of recognition failure was higher since the orchard paths for the UGV were reduced. In addition, most use cases involve orchards characterized by an irregular topological arrangement. Therefore, the areas that favor the movement of a UGV have different lengths. This study decomposed the selected orchard into two rectangular sections of different lengths and widths. Figure 8 depicts the identified trees (i.e., red points) and the calculated tracks (i.e., blue points and lines). From a technical point of view, we clarify that the green points in Figure 8 represent convex points from the route planning algorithm and do not represent any agent/entity/point in the physical world. The green points were used for debugging purposes to generate the tracks comprising the UGV’s route plan (i.e., blue points and lines). We purposefully inserted the green points in Figure 8 to demonstrate the elaborated technique and the complexity of the coding and programming efforts.
In order to extract both the coordinates of the identified trees and the calculated track, proper exploitation of the mosaic’s metadata was essential. In particular, two 2D arrays were produced based on the reference matrix in which the values were correlated with each pixel of the mosaic. The final coordinates were in the Universal Transverse Mercator (UTM) format.
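A minimal sketch of the pixel-to-UTM correlation, assuming a north-up mosaic with uniform ground resolution; the origin and pixel-size values are invented, not the study’s mosaic metadata.

```python
# Sketch of mapping mosaic pixels to UTM coordinates via the orthomosaic's
# georeference (an affine transform). Values are illustrative only.

def pixel_to_utm(col, row, origin_e, origin_n, px_size):
    """Convert a pixel (col, row) to UTM (easting, northing).

    Assumes a north-up mosaic: easting grows with columns, while northing
    decreases with rows, since image rows run from top to bottom.
    """
    easting = origin_e + col * px_size
    northing = origin_n - row * px_size
    return easting, northing

# Example: 5 cm ground resolution, mosaic origin at (411000 E, 4494000 N).
e, n = pixel_to_utm(200, 100, origin_e=411000.0, origin_n=4494000.0,
                    px_size=0.05)
```

The reference matrix mentioned above effectively tabulates this mapping for every pixel of the mosaic.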

4.5. Path Planning

The path planning was based on the method developed in [26] for the in-field path planning task of transport units, which was also expanded to include inter-field route planning [27] and implemented in controlled traffic farming (CTF) [28,29] and orchard environments [30]. A Cartesian grid (x, y) was considered for the UGV’s path planning based on the field orientation. The vertical y’y axis represents the orientation of the field’s lines and thus the motion direction of the vehicles. This axis’ value space is p = {−1, 0, 1}, where the top headland of the field is represented by ‘1’, the bottom headland by ‘−1’, and the rest of the points between the headlands by ‘0’. The horizontal x’x axis is indexed, A = {1, 2, 3, …, i}, representing the field’s working lines (Figure 9). Each working line in the field can be represented as a sequence of line segments, or N_i points Z_j^i, with i ∈ A and j ∈ {1, …, N_i}. This sequence of line segments can be represented as a vector {(Z_1x^i, Z_1y^i), (Z_2x^i, Z_2y^i), …, (Z_{N_i}x^i, Z_{N_i}y^i)}, where Z_jx^i and Z_jy^i indicate the x and y coordinates of the point Z_j^i.
The agri-field was represented as a topological grid with nodes. Each node denoted one of the following states: (i) reserved point/obstacle; (ii) free/accessible point; (iii) starting point of a track; and (iv) ending point of a track. Each node was associated with four movements within the space: Up, Down, Right, and Left. Also, there was a direct connection between every pair of adjacent nodes through which traffic was allowed (Figure 10). The algorithm’s required input was the number of cultivation lines, used to calculate the shortest path in the grid connecting the starting node to the ending node. Where more than one path was found, the distance of each path was calculated, and the path with the lowest cost was ultimately selected.
In this study, the BFS algorithm was used. The algorithm starts at the tree root and explores all neighboring nodes at the current depth level before moving on to the next depth level. BFS thus applies the opposite approach to depth-first search, which explores the highest-depth nodes before exploring the nearest ones. Provided that the produced grid consists of three rows (i.e., top, middle, bottom), the BFS produces a topological-based path by comparing and rejecting all possible alternative paths.
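A minimal BFS over such a three-row topological grid might look as follows; the grid layout and obstacle placement are illustrative assumptions, not the study’s field.

```python
# Minimal BFS on the three-row topological grid described above: nodes are
# (row, col) cells, 1 marks a reserved/obstacle cell, and the first path
# that reaches the goal is the shortest in number of Up/Down/Left/Right steps.

from collections import deque

def bfs_path(grid, start, goal):
    """Return the shortest 4-connected path on a 0/1 grid, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}  # doubles as the visited set
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            path = []
            while node is not None:  # walk predecessors back to the start
                path.append(node)
                node = prev[node]
            return path[::-1]
        r, c = node
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in prev):
                prev[nxt] = node
                queue.append(nxt)
    return None  # no accessible path

# Three rows (top/middle/bottom) with obstacles blocking the middle row:
grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
path = bfs_path(grid, start=(1, 0), goal=(1, 3))
```

The returned node sequence corresponds to the topological path that is subsequently correlated with GPS coordinates.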
After that, the generated path was converted to an actual GPS path by correlating the nodes of the graph with the actual coordinates extracted from the UAV’s captured data, while a keyhole markup language (KML) file was produced to enable interoperability with other platforms (Figure 11a). KML is a file format used to display geographic data in an Earth browser [31]. The input data consisted of the UTM coordinates that form the crop lines and the corresponding pixels of the coordinates on the images. The object class included the necessary techniques responsible for data integrity, eliminating any faulty coordinates or empty entities to produce smooth and reliable input data. After that, three class objects were created describing the grid-based map and the lines of the field (Figure 11b). Overall, Figure 11b shows the graph generated to algorithmically calculate the optimal UGV path: (i) the upper and lower green rectangles refer to headland points denoting the start and end points of the tracks connecting them; and (ii) the middle red rectangle captures the middle points of the respective tracks. Obstacle declaration is generally manual; in our case, the user had to state the location of every entity (i.e., actual obstacle, parked or used machinery, human). Considering the existing topology map, the nodes belonging to the UGV’s actual path were colored in blue, while obstacles were indicated in red (Figure 11c). Following the design stage of all nodes, a dashed blue line connecting these points was produced, representing the actual path the robot vehicle should traverse. In Figure 11c, the planned path is overlaid on the actual orchard layout.
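Exporting a computed path to KML can be sketched with the standard library; the coordinates below are placeholders, and the real system additionally correlates graph nodes with RTK GPS fixes before serialization.

```python
# Hedged sketch of serializing a path as a minimal KML LineString using
# xml.etree; coordinate values are placeholders, not field data.

import xml.etree.ElementTree as ET

def path_to_kml(coords):
    """Serialize a list of (lon, lat) tuples into a minimal KML document."""
    ns = "http://www.opengis.net/kml/2.2"
    kml = ET.Element("{%s}kml" % ns)
    doc = ET.SubElement(kml, "{%s}Document" % ns)
    placemark = ET.SubElement(doc, "{%s}Placemark" % ns)
    line = ET.SubElement(placemark, "{%s}LineString" % ns)
    coord_el = ET.SubElement(line, "{%s}coordinates" % ns)
    # KML expects lon,lat,alt triplets separated by whitespace.
    coord_el.text = " ".join("%.6f,%.6f,0" % (lon, lat) for lon, lat in coords)
    return ET.tostring(kml, encoding="unicode")

kml_text = path_to_kml([(22.9510, 40.6250), (22.9511, 40.6251)])
```

A file written this way can be opened in any Earth browser, which is precisely the interoperability motivation stated above.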

4.6. System Integration

The final map was inserted into ROS in the portable gray map (PGM) image format to navigate the autonomous ground vehicle in the orchard. The file type was encoded with 8 or 16 bits per pixel. The ‘costmap_2d’ core package, a part of ROS, was used to interconnect the generated map with ROS. The input of the package required the creation of a PGM image file to define the scale of the generated map. Notably, the ‘costmap_2d’ core package uses the vehicle’s dimensions to assign larger virtual dimensions to obstacles, thus making the UGV’s navigation more straightforward. In detail, the outline of each obstacle was divided into two distinct areas. The first area (touching the actual contour of the barrier) was defined as a forbidden area that the UGV could not access, irrespective of the commands received. The second area referred to the minimum distance at which the UGV could approach an obstacle. Figure 12 shows the generated map.
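The obstacle-inflation idea behind ‘costmap_2d’ (a forbidden core plus a minimum-approach band) can be illustrated on a toy grid. The radii and the Manhattan-distance metric are simplifying assumptions; the actual package inflates obstacles using the robot footprint and Euclidean distances.

```python
# Toy version of costmap-style obstacle inflation: cells within the
# forbidden radius of an obstacle become inaccessible, and a wider band
# marks the minimum approach distance. Radii and metric are assumptions.

def inflate(grid, forbidden_r, approach_r):
    """Return a cost grid: 2 = obstacle/forbidden, 1 = approach band, 0 = free."""
    rows, cols = len(grid), len(grid[0])
    obstacles = [(r, c) for r in range(rows) for c in range(cols)
                 if grid[r][c] == 1]
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Manhattan distance to the nearest declared obstacle.
            d = min((abs(r - orow) + abs(c - ocol)
                     for orow, ocol in obstacles), default=rows + cols)
            if d <= forbidden_r:
                out[r][c] = 2
            elif d <= approach_r:
                out[r][c] = 1
    return out

grid = [[0, 0, 0, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 0, 0, 0]]
cost = inflate(grid, forbidden_r=1, approach_r=2)
```

A planner would then refuse cells valued 2 and penalize cells valued 1, reproducing the two-area behavior described above.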
Furthermore, a GPS receiver was fitted to the robotic vehicle and connected via a serial interface. The KML file was transferred to the UGV so that it could be accessed by the robotic vehicle’s software. The Thorvald robotic vehicle had a built-in computer, as its functions were performed through ROS software [32]. Properly adapted ROS software packages recognize the coordinates in the KML file, and the computer issues the corresponding commands to mobilize the vehicle. In this research, the vehicle’s navigation in the field was recorded using the data generated by the software (Figure 13).

5. Navigation Scenarios and Route Planning Results

The route planning algorithm was tested in diverse use-case scenarios with a particular scope (i.e., to plan the optimal route of a UGV given a starting and an ending point). The scenarios differ in their initial and final points. Following the adopted methodological approach, the analysis in each scenario generated the following outputs:
  • A mosaic depicting the navigation path.
  • A topological map illustrating the UGV’s navigation path.
  • A graph indexing the optimal path points.
The route planning algorithm produced the optimal path by avoiding crossing the cultivation lines. The UGV should follow the first line and cross the headlands to the target point. The integrated UGV sensors enabled safe in-field navigation. In real-world cases, the vehicle recognizes any obstacles; hence, the navigation path can be dynamically changed. In Section 5.1 and Section 5.2, two scenarios are described in more detail.

5.1. Scenario #1

In Scenario #1, the path’s starting point was defined by index 4 and the final point by index 12, while an obstacle existed in the agri-field at the point with index 5 (Figure 14). The algorithm selected the cultivation line on the left to access and traverse the agri-field, as the current (second) cropping line was blocked by the obstacle. Therefore, the third cultivation line was crossed, and the UGV arrived at the ending point. Figure 14 presents the calculated path generated by the developed platform. The path consisted of the desired points that had to be parsed by the automated vehicle.

5.2. Scenario #2

In Scenario #2, the path’s starting point was defined by index 1 and the final point by index 10, while obstacles existed in the agri-field at the points with indices 4 and 5 (Figure 15). The algorithm selected the first cultivation line to access the agri-field, as an obstacle blocked the second cropping line. Therefore, the first cultivation line was followed until the path passed through the third cultivation line, and finally, the UGV arrived at the destination.

6. Conclusions

Developments and advancements in the agricultural landscape indicate that the automation of operations (e.g., harvesting) could support farmers in increasing efficiency, farm productivity, and welfare. However, the direct application of autonomous machinery to agri-field logistics can be challenging due to varied field orientation issues and random obstacles between the cultivation lines. In this regard, this research developed an integrated system that leverages the technical capabilities of a UAV, a UGV, and an FMIS for the optimal route planning of a ground vehicle in bespoke agri-field layouts while considering obstacles. The system combined mapping and image processing capabilities with routing algorithms to generate the optimal path for a UGV to navigate across an agri-field between two indicated points without disruptions.

6.1. Academic and Practical Implications

This research developed an integrated system for the UAV-supported route planning of UGVs in semi-deterministic agricultural environments. Our methodology differs from that of [13] in that the algorithm detects the paths on either side of the cultivation lines through the Hough Transform. Specifically, the developed approach considers and helps tackle an essential issue in automated agricultural operations in orchards, namely noise, which complicates the agri-field data-gathering process. Variability in weather conditions during a crop growing season and changes in the soil geomorphology around the cultivated plants distort the gathered data, thereby challenging the required operations-planning decisions. In addition, a lack of orchard care contributes to the growth of uncultivable plants and to vegetation cover, thus leading to noisy data.
Based on the literature investigation, our study findings indicate that machine learning algorithms offer satisfactory noise resistance but are associated with high computational costs for model training. In addition, large numbers of agri-field images must be captured to build a data set that avoids overfitting. To this end, our system applies pattern recognition algorithms and produces satisfactory results at a low computational cost without requiring large data sets. The applicability of the developed system was demonstrated in different real-world use-case scenarios.
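A representative low-cost, non-learning building block of such a pipeline is Otsu’s histogram thresholding [17], sketched here in pure NumPy on a synthetic grayscale patch (the image values are fabricated for the example; the real input would be a channel of the orchard mosaic):

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the threshold that maximizes the
    between-class variance of the 8-bit grayscale histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    w0 = sum0 = 0.0
    best_t, best_var = 0, 0.0
    for t in range(256):
        w0 += hist[t]                    # cumulative weight of class 0
        if w0 == 0:
            continue
        w1 = total - w0                  # weight of class 1
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                   # class means
        m1 = (sum_all - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Synthetic patch: dark soil (~40) with a brighter canopy block (~200).
rng = np.random.default_rng(0)
img = rng.normal(40, 5, (64, 64))
img[20:40, 20:40] = rng.normal(200, 5, (20, 20))
img = np.clip(img, 0, 255).astype(np.uint8)
t = otsu_threshold(img)
canopy = img > t  # binary vegetation mask; isolated noise pixels would
                  # then be removed with a morphological opening
```

The threshold lands between the two intensity clusters without any training, which is the computational advantage claimed over learned models.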

6.2. Limitations and Future Research

In conducting this research, some limitations became evident that provide interesting directions for future research. The system could consider additional route planning algorithms to generate truly optimal solutions; for example, the slope of the field and the soil conditions that can affect the path of the ground vehicle were not considered in this study.
In the future, as part of our ongoing research efforts, we aim to enable the real-time interaction of drones and ground vehicles for mapping and planning purposes. The synergistic operations of the two types of vehicles can provide flexibility that will allow addressing complex agri-field layouts and dynamically occurring challenges during operations. Human-robot synergy can help tackle key agricultural challenges regarding resource use efficiency and sustainability [33,34,35], providing promising research avenues.

Author Contributions

Conceptualization, D.K. (Dimitrios Katikaridis), D.B., S.P. and C.G.S.; methodology, D.K. (Dimitrios Katikaridis), V.M., N.T., P.B. and D.B.; software, D.K. (Dimitrios Katikaridis) and V.M.; validation, D.K. (Dimitrios Kateris), P.B. and V.M.; data curation, D.K. (Dimitrios Katikaridis) and V.M.; writing—original draft preparation, D.K. (Dimitrios Katikaridis), N.T. and P.B.; writing—review and editing, D.B., C.G.S., P.B., S.P. and N.T.; visualization, V.M., D.K. (Dimitrios Kateris) and N.T.; supervision, D.B., C.G.S. and S.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhang, K.; Lammers, K.; Chu, P.; Li, Z.; Lu, R. System design and control of an apple harvesting robot. Mechatronics 2021, 79, 102644. [Google Scholar] [CrossRef]
  2. Bak, T.; Jakobsen, H. Agricultural Robotic Platform with Four Wheel Steering for Weed Detection. Biosyst. Eng. 2004, 87, 125–136. [Google Scholar] [CrossRef]
  3. Yang, L.; Qi, J.; Song, D.; Xiao, J.; Han, J.; Xia, Y. Survey of Robot 3D Path Planning Algorithms. J. Control Sci. Eng. 2016, 2016, 1–22. [Google Scholar] [CrossRef]
  4. Kiani, F.; Seyyedabbasi, A.; Nematzadeh, S.; Candan, F.; Çevik, T.; Anka, F.A.; Randazzo, G.; Lanza, S.; Muzirafuti, A. Adaptive Metaheuristic-Based Methods for Autonomous Robot Path Planning: Sustainable Agricultural Applications. Appl. Sci. 2022, 12, 943. [Google Scholar] [CrossRef]
  5. González, D.; Pérez, J.; Milanés, V.; Nashashibi, F. A Review of Motion Planning Techniques for Automated Vehicles. IEEE Trans. Intell. Transp. Syst. 2016, 17, 1135–1145. [Google Scholar] [CrossRef]
  6. Ayadi, N.; Maalej, B.; Derbel, N. Optimal path planning of mobile robots: A comparison study. In Proceedings of the 2018 15th International Multi-Conference on Systems, Signals and Devices, SSD 2018, Yassmine Hammamet, Tunisia, 19–22 March 2018; Institute of Electrical and Electronics Engineers Inc.: Yassmine Hammamet, Tunisia, 2018; pp. 988–994. [Google Scholar]
  7. Gao, X.; Li, J.; Fan, L.; Zhou, Q.; Yin, K.; Wang, J.; Song, C.; Huang, L.; Wang, Z. Review of wheeled mobile robots’ navigation problems and application prospects in agriculture. IEEE Access 2018, 6, 49248–49268. [Google Scholar] [CrossRef]
  8. Durand-Petiteville, A.; Le Flecher, E.; Cadenat, V.; Sentenac, T.; Vougioukas, S. Tree Detection with Low-Cost Three-Dimensional Sensors for Autonomous Navigation in Orchards. IEEE Robot. Autom. Lett. 2018, 3, 3876–3883. [Google Scholar] [CrossRef]
  9. Mammarella, M.; Comba, L.; Biglia, A.; Dabbene, F.; Gay, P. Cooperation of unmanned systems for agricultural applications: A theoretical framework. Biosyst. Eng. 2021, in press. [Google Scholar] [CrossRef]
  10. Jiang, G.; Wang, X.; Wang, Z.; Liu, H. Wheat rows detection at the early growth stage based on Hough transform and vanishing point. Comput. Electron. Agric. 2016, 123, 211–223. [Google Scholar] [CrossRef]
  11. Tsolakis, N.; Bechtsis, D.; Bochtis, D. Agros: A robot operating system based emulation tool for agricultural robotics. Agronomy 2019, 9, 403. [Google Scholar] [CrossRef]
  12. Quigley, M.; Gerkey, B.P.; Conley, K.; Faust, J.; Foote, T.; Leibs, J.; Berger, E.; Wheeler, R.; Ng, A. ROS: An open-source Robot Operating System. In Proceedings of the ICRA Workshop on Open Source Software, IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 1–6. [Google Scholar]
  13. Zhang, H.; Chen, B.; Zhang, L. Detection Algorithm for Crop Multi-Centerlines Based on Machine Vision. Trans. ASABE 2008, 51, 1089–1097. [Google Scholar] [CrossRef]
  14. Yang, L.; Wu, X.; Praun, E.; Ma, X. Tree detection from aerial imagery. In Proceedings of the GIS: ACM International Symposium on Advances in Geographic Information Systems, Seattle, WA, USA, 4–6 November 2009; ACM Press: New York, NY, USA, 2009; pp. 131–137. [Google Scholar]
  15. Freund, Y.; Schapire, R.E. A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting. J. Comput. Syst. Sci. 1997, 55, 119–139. [Google Scholar] [CrossRef]
  16. Guerrero, J.M.; Pajares, G.; Montalvo, M.; Romeo, J.; Guijarro, M. Support Vector Machines for crop/weeds identification in maize fields. Expert Syst. Appl. 2012, 39, 11149–11155. [Google Scholar] [CrossRef]
  17. Otsu, N. Threshold selection method from gray-level histograms. IEEE Trans. Syst. Man. Cybern 1979, 9, 62–66. [Google Scholar] [CrossRef]
  18. Torres-Sánchez, J.; de Castro, A.I.; Peña, J.M.; Jiménez-Brenes, F.M.; Arquero, O.; Lovera, M.; López-Granados, F. Mapping the 3D structure of almond trees using UAV acquired photogrammetric point clouds and object-based image analysis. Biosyst. Eng. 2018, 176, 172–184. [Google Scholar] [CrossRef]
  19. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16. [Google Scholar] [CrossRef]
  20. Wang, Z.; Cai, J. Probabilistic roadmap method for path-planning in radioactive environment of nuclear facilities. Prog. Nucl. Energy 2018, 109, 113–120. [Google Scholar] [CrossRef]
  21. Nazarahari, M.; Khanmirza, E.; Doostie, S. Multi-objective multi-robot path planning in continuous environment using an enhanced genetic algorithm. Expert Syst. Appl. 2019, 115, 106–120. [Google Scholar] [CrossRef]
  22. Chen, X.; Zhang, Y.; Xie, J.; Du, P.; Chen, L. Robot needle-punching path planning for complex surface preforms. Robot. Comput. Integr. Manuf. 2018, 52, 24–34. [Google Scholar] [CrossRef]
  23. Low, E.S.; Ong, P.; Cheah, K.C. Solving the optimal path planning of a mobile robot using improved Q-learning. Rob. Auton. Syst. 2019, 115, 143–161. [Google Scholar] [CrossRef]
  24. Han, J.; Seo, Y. Mobile robot path planning with surrounding point set and path improvement. Appl. Soft Comput. 2017, 57, 35–47. [Google Scholar] [CrossRef]
  25. Guo, D.; Wang, J.; Zhao, J.B.; Sun, F.; Gao, S.; Li, C.D.; Li, M.H.; Li, C.C. A vehicle path planning method based on a dynamic traffic network that considers fuel consumption and emissions. Sci. Total Environ. 2019, 663, 935–943. [Google Scholar] [CrossRef] [PubMed]
  26. Bochtis, D.D.; Sørensen, C.G.; Vougioukas, S.G. Path planning for in-field navigation-aiding of service units. Comput. Electron. Agric. 2010, 74, 80–90. [Google Scholar] [CrossRef]
  27. Jensen, M.A.F.; Bochtis, D.; Sørensen, C.G.; Blas, M.R.; Lykkegaard, K.L. In-field and inter-field path planning for agricultural transport units. Comput. Ind. Eng. 2012, 63, 1054–1061. [Google Scholar] [CrossRef]
  28. Bochtis, D.D.; Sørensen, C.G.; Busato, P.; Hameed, I.A.; Rodias, E.; Green, O.; Papadakis, G. Tramline establishment in controlled traffic farming based on operational machinery cost. Biosyst. Eng. 2010, 107, 221–231. [Google Scholar] [CrossRef]
  29. Bochtis, D.D.; Sørensen, C.G.; Green, O.; Moshou, D.; Olesen, J. Effect of controlled traffic on field efficiency. Biosyst. Eng. 2010, 106, 14–25. [Google Scholar] [CrossRef]
  30. Bochtis, D.; Griepentrog, H.W.; Vougioukas, S.; Busato, P.; Berruto, R.; Zhou, K. Route planning for orchard operations. Comput. Electron. Agric. 2015, 113, 51–60. [Google Scholar] [CrossRef]
  31. Google Developers. Keyhole Markup Language. Available online: https://developers.google.com/kml (accessed on 5 August 2022).
  32. Grimstad, L.; From, P.J. The Thorvald II agricultural robotic system. Robotics 2017, 6, 24. [Google Scholar] [CrossRef]
  33. Anastasiadis, F.; Tsolakis, N.; Srai, J.S. Digital technologies towards resource efficiency in the agrifood sector: Key challenges in developing countries. Sustainability 2018, 10, 4850. [Google Scholar] [CrossRef]
  34. Bechtsis, D.; Tsolakis, N.; Vouzas, M.; Vlachos, D. Industry 4.0: Sustainable material handling processes in industrial environments. Comput. Aided Chem. Eng. 2017, 40, 2281–2286. [Google Scholar]
  35. Tsolakis, N.; Aivazidou, E.; Srai, J.S. Sensor applications in agrifood systems: Current trends and opportunities for water stewardship. Climate 2019, 7, 44. [Google Scholar] [CrossRef]
Figure 1. Research methodology steps.
Figure 2. System’s architecture.
Figure 3. The FMIS digital platform interface for the generation of the UAV area coverage plans: (a) UGV’s track’s direction parallel to the longest side of the orchard; (b) UGV’s path orientation vertical to the longest side of the orchard.
Figure 4. Color space representation and transformation: RGB [a(i),b(i)]; HSV [a(ii),b(ii)]; YCbCr [a(iii),b(iii)]; and L*a*b* [a(iv),b(iv)].
Figure 5. Image processing phases: isolated areas of interest [a(i),b(i)]; noise removal [a(ii),b(ii)]; grayscale transformation [a(iii),b(iii)]; and agri-field mosaic following the image processing (c).
Figure 6. Distribution of the identified trees.
Figure 7. Identified trees for two distinct phases of the orchard: (a) August; and (b) November.
Figure 8. Extracted UGV tracks in the orchard.
Figure 9. Path planning of UGV tracks in the orchard: (a) green lines indicate the field’s headlines and orange lines indicate the intermediate points between headlines; and (b) topological grid.
Figure 10. Agri-field map to topological grid representation (the enumerated nodes denote the upper, middle, and lower points that define navigation tracks).
Figure 11. Path planning technique: (a) structure and information/data flow; (b) UTM coordinates to the topological map; and (c) path generation.
Figure 12. Generated map based on the ‘costmap_2d’ ROS package.
Figure 13. Unmanned vehicle (type Thorvald) in the orchard.
Figure 14. Route planning, Scenario #1: (a) initial map; (b) topological grid; (c) graph; and (d) optimal path plan.
Figure 15. Route planning, Scenario #2: (a) initial map; (b) topological grid; (c) graph; and (d) optimal path plan.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
