Vision-Guided Flight for Micro Air Vehicles (MAVs)

S. Ettinger, P. Ifju and M. C. Nechyba

(Note: the videos below are provided in both QuickTime and MPEG formats; the QuickTime versions tend to be higher quality and are therefore recommended. If you do not have QuickTime installed, you can download the free player (Mac/Windows) from Apple.)
Researchers from the Departments of Aerospace & Mechanical Engineering and Electrical & Computer Engineering at the University of Florida are conducting ongoing work on vision-guided autonomous flight for Micro Air Vehicles (MAVs). This page briefly describes some of our results to date.


Micro Air Vehicles

Click on any of the thumbnails below for a larger picture of some of our recent six-inch and 24-inch MAV designs.

Six-inch MAV
24-inch UAVs
Sample flight video of six-inch MAV
(320 x 240, 2:40)

(high-quality, quicktime, 39.3 Mb)

(lower-quality, mpeg, 8.9 Mb)
Frisbee launch of six-inch MAV
(320 x 240, 0:15)

Our MAVs can be launched several ways. This video clip shows the stylish "frisbee" launch method for the six-inch MAV.


(high-quality, quicktime, 13.3 Mb)

(lower-quality, mpeg, 792 kb)
Automated launch of six-inch MAV
(320 x 240, 0:49)

Here the six-inch MAV is launched into flight using an automated launcher.


(high-quality, quicktime, 47.6 Mb)

(lower-quality, mpeg, 3 Mb)
Sample flight of 24-inch UAV
(320 x 240, 0:38)

(high-quality, quicktime, 32.7 Mb)

(lower-quality, mpeg, 2 Mb)
Acrobatic flight of 24-inch UAV
(320 x 240, 0:25)

(high-quality, quicktime, 20.2 Mb)

(lower-quality, mpeg, 1.3 Mb)
Two 24-inch UAVs flying concurrently
(320 x 240, 0:23)

Here, two UAVs are flown at the same time over the same arena. Since our UAVs are inexpensive and relatively quick to manufacture and set up, they could serve as a cost-effective platform for a cooperative-control testbed.


(high-quality, quicktime, 18.2 Mb)

(lower-quality, mpeg, 992 kb)
In the line of fire
(320 x 240, 15Hz, 0:09)

Here, the 24-inch UAV buzzes one member of our research team.


(high-quality, quicktime, 3.1 Mb)

(lower-quality, mpeg, 472 kb)


Vision-Based Horizon Tracking

As shown in the diagram below, our MAVs are equipped with small video cameras that transmit their NTSC video signals to a receiving antenna on the ground, which feeds into a PC equipped with a video capture card.
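For illustration, here is a minimal sketch of the ground-station side of this pipeline, written in Python with OpenCV; this is an assumption for clarity, not the original hardware or software, and the device index and frame size below are placeholders.

    import cv2  # assumption: the capture card appears as an OpenCV video device

    cap = cv2.VideoCapture(0)          # device index 0 is a placeholder
    while True:
        ok, frame = cap.read()         # one frame from the video receiver
        if not ok:
            break
        frame = cv2.resize(frame, (320, 240))   # match the 320 x 240 clips above
        # ... hand `frame` to the horizon-tracking algorithm described below ...
    cap.release()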

Using this setup, we have captured video of remotely piloted human-controlled flights, and have developed a vision-based horizon-tracking algorithm based on that data. Our algorithm identifies the horizon by finding the line which minimizes the intra-class variance of the pixels above and below the line (that is, the intra-class variance of the sky and ground pixels). The basic assumptions underlying this approach are (1) that the horizon can be approximated by a straight line, and (2) that the ground and sky have different appearance. Further details can be found in:

S. M. Ettinger, M. C. Nechyba, P. G. Ifju and M. Waszak, Vision-Guided Flight Stability and Control for Micro Air Vehicles, to appear in Proc. IEEE Int. Conference on Intelligent Robots and Systems, October 2002 (7 pages, 2.6 Mb).
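The sketch below illustrates the criterion described above in simplified form: it exhaustively searches a coarse grid of candidate horizon lines (bank angle and vertical offset) and scores each line by the summed covariance trace of the sky and ground pixel classes in RGB space. It is not the implementation from the paper; the line parameterization, search grid, and covariance-trace score are assumptions made for clarity.

    import numpy as np

    def intra_class_score(img, bank_deg, offset):
        # Score a candidate horizon line by the summed RGB covariance trace of
        # the pixels above (sky) and below (ground) the line; lower is better.
        h, w, _ = img.shape
        xs = np.arange(w) - w / 2.0                  # column coordinates, centered
        ys = np.arange(h)[:, None] - h / 2.0         # row coordinates, centered
        line = np.tan(np.radians(bank_deg)) * xs + offset * h
        above = ys < line                            # rows grow downward: above = sky
        sky, ground = img[above], img[~above]
        score = 0.0
        for cls in (sky, ground):
            if len(cls) > 1:
                score += np.trace(np.cov(cls.T.astype(float)))
        return score

    def find_horizon(img):
        # Exhaustive search over (bank angle, vertical offset) line parameters.
        banks = np.linspace(-60.0, 60.0, 61)         # degrees (grid is an assumption)
        offsets = np.linspace(-0.4, 0.4, 41)         # fraction of image height
        best = min((intra_class_score(img, b, o), b, o)
                   for b in banks for o in offsets)
        _, bank_deg, offset = best
        return bank_deg, offset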

Some still-frame and movie examples are shown below:

Still-frame examples
(click on the thumbnails for larger images)

Sample image with identified horizon
Corresponding optimization function in line-parameter space
Distribution of sky (blue crosses) and ground pixels (green circles) in RGB space
Horizon detection with uneven horizon
Horizon detection with severe video transmission noise
Distribution of sky (blue crosses) and ground pixels (green circles) in RGB space (noisy image)

 

Horizon tracking video
(click on the thumbnails to download/view the movies)

Human-piloted flight with horizon tracking
(320 x 240, 1:02)

(high-quality, quicktime, 27 Mb)

(lower-quality, mpeg, 3.5 Mb)
Detailed horizon tracking example
(640 x 560, 30Hz, 0:08)

This augmented video shows the optimization surface in line-parameter space and samples of the sky and ground color distributions for a short human-piloted video clip.


(high-quality, quicktime, 12.8 Mb)

(lower-quality, mpeg, 2.1 Mb)
Detailed horizon tracking example #2
(640 x 560, 30Hz, 0:09)

This augmented video demonstrates a newer horizon-detection algorithm on a short human-piloted video clip from the 6th international MAV competition near Provo, Utah. Instead of minimizing the intra-class variance, the newer algorithm maximizes the inter-class difference between the pixels above and below the line; a sketch of this alternative criterion follows this entry. (Note: the newer algorithm outperforms the earlier algorithm on this and other test cases.)


(high-quality, quicktime, 16.2 Mb)

(lower-quality, mpeg, 4 Mb)
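One way to picture the newer criterion is as a drop-in replacement for the scoring function in the earlier sketch. The exact inter-class measure used by the newer algorithm is not specified on this page, so the mean-color separation below is an assumption.

    import numpy as np

    def inter_class_score(img, bank_deg, offset):
        # Alternative criterion: separation between the mean RGB colors of the
        # sky and ground classes; negated so the same minimizing search applies.
        h, w, _ = img.shape
        xs = np.arange(w) - w / 2.0
        ys = np.arange(h)[:, None] - h / 2.0
        line = np.tan(np.radians(bank_deg)) * xs + offset * h
        above = ys < line
        sky, ground = img[above].astype(float), img[~above].astype(float)
        if len(sky) == 0 or len(ground) == 0:
            return np.inf
        return -float(np.linalg.norm(sky.mean(axis=0) - ground.mean(axis=0)))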


Vision-Based Flight Stability and Autonomy

Over the past year, we have placed our horizon-tracking algorithm in the control loop and have achieved self-stabilized flights based strictly on this vision-based algorithm. No other sensors were used during these flights. The examples below illustrate recent self-stabilized flights above the University of Florida campus by two novice pilots. For these flights, the human pilots controlled the MAV's heading, while the horizon-tracking algorithm stabilized the MAV along the desired heading. We are currently working towards completely autonomous MAV flights with the addition of on-board GPS for navigation.
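The sketch below indicates roughly what placing the tracker in the control loop entails, assuming simple proportional feedback on the tracked bank angle and pitch ratio. The actual control law, gains, and actuator interface are not described on this page; the gain values and send_commands are placeholders, and find_horizon refers to the sketch in the previous section.

    import numpy as np

    def pitch_ratio(shape, bank_deg, offset):
        # Fraction of the image above the detected horizon line (the "pitch
        # ratio" plotted in Figure 2 below).
        h, w = shape[0], shape[1]
        xs = np.arange(w) - w / 2.0
        ys = np.arange(h)[:, None] - h / 2.0
        line = np.tan(np.radians(bank_deg)) * xs + offset * h
        return float(np.mean(ys < line))

    K_ROLL, K_PITCH = 0.8, 2.0   # proportional gains -- placeholder values
    PITCH_REF = 0.5              # desired sky fraction, i.e. roughly level pitch

    def stabilize(frame, desired_bank_deg, send_commands):
        # One iteration of the self-stabilization loop: the human pilot supplies
        # desired_bank_deg (heading changes); the tracker supplies the feedback.
        bank_deg, offset = find_horizon(frame)             # earlier sketch
        ratio = pitch_ratio(frame.shape, bank_deg, offset)
        aileron = K_ROLL * (desired_bank_deg - bank_deg)   # roll toward commanded bank
        elevator = K_PITCH * (PITCH_REF - ratio)           # hold a level pitch attitude
        send_commands(aileron, elevator)                   # RC uplink -- placeholder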

 

Self-stabilized, vision-based MAV flight
(click on the thumbnails to download/view the movies)

Self-stabilized MAV flight over the UF campus with superimposed horizon
(320 x 240, 1:11)

(high-quality, quicktime, 33 Mb)

(lower-quality, mpeg, 3.6 Mb)
Second self-stabilized MAV flight over the UF campus (higher altitude)
(320 x 240, 1:36)

(high-quality, quicktime, 25 Mb)

(lower-quality, mpeg, 4.4 Mb)

 

The two figures below illustrate that self-stabilized flights tend to be much smoother than human-piloted flights. Figure 1 compares the bank angle of a typical human-piloted flight with that of a typical self-stabilized flight (over an approximately one-minute period). Figure 2 draws the same comparison for the pitch ratio (the fraction of the image that is sky).
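For reference, here is a sketch of how such per-frame traces could be extracted from a recorded clip, reusing the earlier sketches; the clip-reading details below are assumptions.

    import cv2
    import numpy as np

    def horizon_traces(clip_path):
        # Per-frame bank angle and pitch ratio for a recorded clip -- the two
        # quantities compared in Figures 1 and 2.
        cap = cv2.VideoCapture(clip_path)
        banks, ratios = [], []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            bank_deg, offset = find_horizon(frame)    # earlier sketches
            banks.append(bank_deg)
            ratios.append(pitch_ratio(frame.shape, bank_deg, offset))
        cap.release()
        return np.array(banks), np.array(ratios)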

Comparison between human-piloted and self-stabilized flights
(click on the thumbnails for larger plots)



Figure 1: bank angle comparison
Figure 2: pitch ratio comparison


Last updated September 2, 2002 by
Michael C. Nechyba