• Introduction
    • Drone racing is gaining popularity both as a new-generation hobby and as a rising professional sport. Recent drone races showcase the agility of drones flying through zig-zagging, narrow, or confined racing circuits. An onboard camera and a head-mounted display provide pilots with a First Person View (FPV), through which pilots demonstrate remarkable control skills during a race. Autonomous drone flight through such a demanding environment at high speed remains an open challenge. Nonetheless, drones that can negotiate complex surroundings are not only sports vehicles but also a candidate killer application of future drones, which could fly through obstacles and, for example, search for survivors at an accident scene. We successfully organized the ADR (Autonomous Drone Racing) at IROS 2016 in Daejeon and IROS 2017 in Vancouver. We propose the third ADR at IROS 2018 in Madrid to establish the ADR as a signature test-bed for autonomous vision-based drone navigation.
  • Summary of ADR 2016 and 2017
    • The inaugural ADR (Autonomous Drone Racing) was held at IROS 2016 and continued at IROS 2017. Both were indoor autonomous flight challenges containing five test elements: high-speed flight on a straight path through open gates; sharp turns, a horizontal zig-zag path, and a spiral upward path through closed gates; and a dynamic obstacle. In ADR at IROS 2016, there were 4 open gates and 22 closed gates, for a total of 26 gates including one dynamic gate. Detailed information about the track is available online. To facilitate localization, each track gate carried a QR code containing the identification number of the gate. Fig. 1 shows the overall layout of ADR 2016 and a scene of the actual ADR environment.

      Fig. 1 (a) ADR 2016 – An overall layout of the ADR gates

      Fig. 1 (b) A scene of the actual racing In IROS 2016 Daejeon

    • The ADR racing track was revised for 2017 as shown in Fig. 2: the open gates were replaced by tree-like obstacles, and the 360-degree spiral-up gates of ADR 2016 were replaced by 90-degree spiral-up gates in ADR 2017. The new tree-like open gates were intended to test faster flight along a straight path. The number of closed gates was reduced to 9, including one dynamic gate. Detailed information about the ADR 2017 track is available online.

      Fig. 2. (a) ADR 2017 – Top view of ADR gate arrangement

      Fig. 2 (b) A scene of the actual racing area in IROS 2017 Vancouver

      In both competitions, the size of the robot was limited to 1 m x 1 m x 1 m with all components fully extended, but the vehicle type was not restricted to popular quadrotors. The drone was allowed to carry any type of onboard sensor, including vision (visible/IR), lidar, laser, radar, and ultrasound; however, no external sensors were allowed during the race. In 2016, a total of 11 teams registered for the competition, but due to issues such as fundraising, technical readiness, and visas, only three teams (Team KIRD of KAIST, Korea; Team MAV-lab of TU Delft, Netherlands; and Team ETHZ ASL/ADRL, Switzerland) finally competed in the arena. Instead of autonomous flight, Team Coanda of Sungkyunkwan University gave interesting demo runs of their UFO-like drone. In 2017, a total of 14 teams registered for the competition, but as in 2016 not all could attend: 7 teams (Team KAIST, Korea; Team MAV-lab of TU Delft, Netherlands; Team QuetzalC++ of INAOE, Mexico; Team First Commit, a group of enthusiasts from the Bay Area; Team Drone bot of UNIST, Korea; Team Robotics and Perception Group of the University of Zurich, Switzerland; and Team LOBO DRONE of the University of New Mexico, USA) showed up at the event, and 5 of them finally competed in the arena.

      The competition began by placing the drone at the take-off point with the system ready. Once the team declared "start", team members did not intervene with the drone in any way until it completed the attempt by landing. In other words, the drone operated completely autonomously, without any form of human intervention, during the trial. Each team was given thirty minutes. Within those thirty minutes, each team attempted to fly its drone through the gates sequentially from the starting position as many times as it wished. During each attempt, the ID of each gate passed and the passing time were recorded, and the team's official score was taken from its best attempt.
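      The per-attempt recording and best-attempt selection described above can be sketched as follows (an illustrative reconstruction in Python; the `Attempt` structure and the tie-break by shorter time are our assumptions, not the organizers' actual scoring code):

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Attempt:
    """One run from the starting position; `passes` holds
    (gate_id, elapsed_seconds) pairs in the order gates were cleared."""
    passes: List[Tuple[int, float]] = field(default_factory=list)

    @property
    def last_gate(self) -> int:
        # Furthest gate reached in this attempt (0 if none passed).
        return self.passes[-1][0] if self.passes else 0

    @property
    def final_time(self) -> float:
        # Time at which the furthest gate was passed.
        return self.passes[-1][1] if self.passes else float("inf")


def official_score(attempts: List[Attempt]) -> Attempt:
    # Best attempt: furthest gate reached, ties broken by shorter time
    # (the tie-break rule is our assumption).
    return max(attempts, key=lambda a: (a.last_gate, -a.final_time))
```

      A team's record over its thirty minutes is then simply a list of `Attempt` objects, and `official_score` picks the one that counts.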


      In 2016, during the competition, all the teams struggled to recognize the gates, due to drone pose-stabilization issues and the clutter of gates in a monotonous orange color. One team's strategy was stereo-vision-based direct visual servoing rather than global navigation. This approach suffered from the gate clutter, because a small offset from the planned safe path caused a collision with either the safety net or a neighboring gate. In addition, all the teams used visual recognition to find gates, and the camera's narrow field of view caused recognition failures after sharp turns. No team used the QR codes for localization. All the competing teams reached the sharp-turn section through the closed gates, and the best performance was reaching gate 10 with a lap time of 01:26.5. For comparison, a human pilot completed all 26 gates of the track in 01:31.1 on his second trial. Team KIRD passed through gate 7 at 01:10.35. All the participating teams felt that the racing track was very challenging, mainly due to the density of closed gates and the lack of visual cues.


      In 2017, teams were more experienced and organized, and participating teams implemented more advanced technologies for the competition. However, state-of-the-art visual SLAM could not be implemented successfully. Team First Commit reached gate 8 (the fourth closed gate) at 01:56.5. Team MAV-lab reached gate 7 at 00:25.7. Team Robotics and Perception Group reached gate 8 at 00:35.8. Team QuetzalC++ reached gate 9 at 03:11.6 and won the competition.


      There were some interesting findings in the 2017 event. First, the winning team used optic-flow- and timing-based tracking control rather than mapping and localization. This shows that the competition needs to be revised so that it better tests technologies for autonomous recognition, planning, and control. Second, Team MAV-lab successfully implemented Kalman-filter-like tracking algorithms, which allowed them to fly very fast through the open gates. Third, Team Robotics and Perception Group from the University of Zurich had the most advanced technologies in terms of the team's publication record, but could not deploy them for the ADR competition. This suggests that state-of-the-art laboratory technologies are not yet mature enough for more realistic out-of-lab situations. Finally, Team First Commit was not from a research institute but was a group of robot enthusiasts from the San Francisco Bay Area. As working engineers at high-tech companies, they showed that robotic technologies are no longer monopolized by research institutes.
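      Such Kalman-filter-like gate tracking can be illustrated with a minimal constant-velocity filter for one image coordinate of a detected gate (a generic sketch in Python; the state model, noise parameters, and class name are our assumptions, not MAV-lab's implementation):

```python
class GateTrackerKF:
    """Constant-velocity Kalman filter for one coordinate of a tracked
    gate (a generic sketch, not any team's actual code)."""

    def __init__(self, p0, q=1e-3, r=1.0):
        self.x = [p0, 0.0]                  # state: [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.q = q                          # process noise variance
        self.r = r                          # measurement noise variance

    def predict(self, dt):
        # Constant-velocity motion model: F = [[1, dt], [0, 1]]
        p, v = self.x
        self.x = [p + v * dt, v]
        P = self.P
        # P <- F P F^T + Q, expanded element-wise for the 2x2 case
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + self.q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + self.q
        self.P = [[p00, p01], [p10, p11]]

    def update(self, z):
        # Measurement model H = [1, 0]: we observe position only
        y = z - self.x[0]                   # innovation
        s = self.P[0][0] + self.r           # innovation variance
        k0 = self.P[0][0] / s               # Kalman gain (position)
        k1 = self.P[1][0] / s               # Kalman gain (velocity)
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        P = self.P
        # P <- (I - K H) P
        self.P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
                  [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
```

      Running `predict` each control step and `update` whenever the gate detector fires lets the drone coast through brief detection dropouts on the estimated trajectory, which is what enables fast flight through the open gates.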

    • From the previous two events, we have verified that autonomous drone navigation is possible when the environmental information is perfectly known and optic flow works properly. However, such conditions are unrealistic in many practical applications, such as first response at disaster sites or autonomous drone taxis in cities. At IROS 2018, we would like to revise the competition based on findings from our previous experience. The first change would be replacing the open gates with a new element that tests fast-flight capability. Second, we would like to make the map information fuzzier. In the previous events, we placed the gates at their designed locations as accurately as possible. At IROS 2018, we may specify a region for each gate rather than an exact location, or provide gate locations with a significant amount of error. This way, strategies of online planning and control based on real-time recognition would be rewarded over strategies of offline planning and tracking. Finally, we will discuss the coloring of the gates with the participants. We think that recognition technologies depend heavily on computing power, which has been improving drastically. Therefore, in our opinion, the recognition part can be eased in 2018 so that teams can focus on more stable and robust navigation.