FAR Planner uses a dynamically updated visibility graph for fast replanning. The planner models the environment with polygons and builds a global visibility graph incrementally during navigation. It handles both known and unknown environments: in a known environment, paths are planned based on a prior map; in an unknown environment, multiple paths are attempted to guide the vehicle to the goal based on the environment observed during navigation. When dynamic obstacles are present, FAR Planner disconnects visibility edges blocked by the obstacles and reconnects them after visibility is regained. The software implementation uses two CPU threads: one dynamically updates the visibility graph using about 20% of the thread, and the other performs path search, finding a path within 3 ms, as evaluated on an i7 computer.
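To make the disconnect/reconnect behavior concrete, below is a minimal, self-contained sketch of a visibility graph whose edges are cut while a dynamic obstacle blocks them and restored once the obstacle moves away. This is an illustration only, not FAR Planner's actual implementation; all class and function names here are hypothetical.

```python
import heapq
import math

class VisibilityGraph:
    """Toy visibility graph with dynamic edge blocking (illustrative only)."""

    def __init__(self):
        self.nodes = {}       # name -> (x, y)
        self.edges = {}       # frozenset({a, b}) -> edge length
        self.blocked = set()  # edges currently cut by a dynamic obstacle

    def add_node(self, name, x, y):
        self.nodes[name] = (x, y)

    def add_edge(self, a, b):
        ax, ay = self.nodes[a]
        bx, by = self.nodes[b]
        self.edges[frozenset((a, b))] = math.hypot(bx - ax, by - ay)

    def update_obstacle(self, cx, cy, r):
        """Disconnect edges passing within r of the obstacle at (cx, cy);
        edges that are clear again are implicitly reconnected."""
        self.blocked = {e for e in self.edges
                        if self._edge_hits_circle(e, cx, cy, r)}

    def _edge_hits_circle(self, e, cx, cy, r):
        (ax, ay), (bx, by) = (self.nodes[n] for n in e)
        dx, dy = bx - ax, by - ay
        # closest point on segment a-b to the circle center
        t = max(0.0, min(1.0, ((cx - ax) * dx + (cy - ay) * dy)
                         / (dx * dx + dy * dy)))
        px, py = ax + t * dx, ay + t * dy
        return math.hypot(cx - px, cy - py) <= r

    def shortest_path(self, start, goal):
        """Dijkstra search over the currently unblocked edges."""
        dist = {start: 0.0}
        prev = {}
        heap = [(0.0, start)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == goal:  # reconstruct path
                path = [u]
                while u in prev:
                    u = prev[u]
                    path.append(u)
                return list(reversed(path))
            for e, w in self.edges.items():
                if e in self.blocked or u not in e:
                    continue
                (v,) = e - {u}
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    prev[v] = u
                    heapq.heappush(heap, (nd, v))
        return None  # goal unreachable on the current graph
```

Because `update_obstacle` recomputes the blocked set from scratch each cycle, an edge that was cut by a passing obstacle becomes usable again on the next update, which is the reconnection behavior described above.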
FAR Planner was used by the CMU-OSU Team in the DARPA Subterranean Challenge. In the final competition, held in the Louisville Mega Cavern, Louisville, KY, the team's robots achieved the most complete traversal and mapping of the site (26 out of 28 sectors) among all teams, winning the "Most Sectors Explored Award".
FAR Planner in an unknown environment. Blue: vehicle trajectory. Cyan: visibility graph. A, C: dynamic obstacles. B, D, E, F: dead ends.
The repository has been tested on Ubuntu 18.04 with ROS Melodic and Ubuntu 20.04 with ROS Noetic. Follow the instructions in the Autonomous Exploration Development Environment to set up the development environment. Make sure to check out the branch that matches your computer setup, compile, and download the simulation environments.
To set up FAR Planner, clone our repository.
git clone https://github.com/MichaelFYang/far_planner
In a terminal, go to the folder and compile.
cd far_planner
catkin_make
To run the code, go to the development environment folder in a terminal, source the ROS workspace, and launch.
source devel/setup.sh
roslaunch vehicle_simulator system_indoor.launch
In another terminal, go to the FAR Planner folder, source the ROS workspace, and launch.
source devel/setup.sh
roslaunch far_planner far_planner.launch
Our ground vehicle platform and simulated aerial vehicle platform are equipped with a Velodyne Puck lidar used as the range sensor for navigation planning. The ground vehicle also carries a camera at 640×360 resolution and a MEMS-based IMU, coupled with the lidar for state estimation. The onboard autonomous system incorporates navigation modules from our development environment, e.g., terrain analysis and waypoint following based on kinodynamically feasible trajectories generated by the local planner, as fundamental navigation modules, and runs FAR Planner on top of the system. In the experiments, we compare our method to two search-based methods, A* and D* Lite, and three sampling-based methods, RRT*, BIT*, and SPARS. Here, BIT* is considered the state of the art in the RRT-based family, and RRT* is the classic method of the family. All methods run on a 4.1 GHz i7 computer. We configure FAR Planner to update the v-graph at 2.5 Hz and perform a path search for replanning at each v-graph update. The planner uses images at a resolution of 0.2 m/pixel to extract edge points for constructing polygons. The local layer of the v-graph is a 40 m × 40 m area with the vehicle at the center.
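As a rough illustration of the edge-point extraction step, the sketch below pulls obstacle boundary cells out of a binary occupancy image and scales them to meters at the 0.2 m/pixel resolution mentioned above. It is a simplified stand-in, not FAR Planner's actual polygon-extraction code; the function name and the "occupied cell with a free 4-neighbor" criterion are assumptions for the example.

```python
RESOLUTION = 0.2  # meters per pixel, matching the configuration above

def extract_edge_points(grid):
    """grid: list of rows, 1 = occupied, 0 = free.
    Returns (x, y) edge-point coordinates in meters. An edge point here
    is an occupied cell with at least one free (or out-of-image)
    4-neighbor, i.e. a cell on an obstacle boundary."""
    rows, cols = len(grid), len(grid[0])
    points = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 1:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if not (0 <= nr < rows and 0 <= nc < cols) or grid[nr][nc] == 0:
                    points.append((c * RESOLUTION, r * RESOLUTION))
                    break  # already known to be a boundary cell
    return points
```

The resulting boundary points would then be grouped and simplified into polygon vertices before being added to the v-graph; that step is omitted here for brevity.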
Indoor Environment
Campus Environment
Metrics
Fan Yang
CMU Robotics Institute
Chao Cao
CMU Robotics Institute
Hongbiao Zhu
CMU Robotics Institute
Jean Oh
CMU Robotics Institute
Ji Zhang
CMU NREC & Robotics Institute