Cooperative Flight Guidance of Autonomous Unmanned Aerial Vehicles
Real-Time and Embedded Systems Lab (mLAB)
Document Type: Conference Paper
Date of this Version: 1-1-2011
Abstract
As robotic platforms and unmanned aerial vehicles (UAVs) increase in sophistication and complexity, the ability to determine the spatial orientation and placement of the platform in real time (localization) becomes an important issue. Detecting and extracting the locations of objects, barriers, and openings is required to ensure the overall effectiveness of the device. Current methods to achieve localization for UAVs require expensive external equipment and limit the overall applicable range of the platform. The system described herein incorporates leader-follower unmanned aerial vehicles using vision processing, radio-frequency data transmission, and additional sensors to achieve flocking behavior. This system targets search and rescue environments, employing controls, vision processing, and embedded systems to allow for easy deployment of multiple quadrotor UAVs while requiring the control of only one. The system demonstrates a relative localization scheme for UAVs in a leader-follower configuration, allowing for predictive maneuvers, including path following and estimation of the lead UAV's position in situations of limited or no line-of-sight.
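The abstract mentions estimating the lead UAV's position when line-of-sight is limited or lost. The paper's actual estimator is not specified here; as a minimal illustrative sketch (all names and the constant-velocity model are assumptions, not the authors' method), a follower could extrapolate the leader's last vision-derived relative position between measurements:

```python
class LeaderEstimator:
    """Hypothetical constant-velocity estimate of the leader's relative
    position, used to illustrate predicting the lead UAV's motion when
    vision measurements (line-of-sight) are temporarily unavailable."""

    def __init__(self):
        self.pos = None        # last measured relative position (x, y), meters
        self.vel = (0.0, 0.0)  # estimated relative velocity (m/s)
        self.t = None          # timestamp of last measurement (s)

    def update(self, t, pos):
        """Incorporate a new vision measurement of the leader's position."""
        if self.pos is not None and t > self.t:
            dt = t - self.t
            self.vel = ((pos[0] - self.pos[0]) / dt,
                        (pos[1] - self.pos[1]) / dt)
        self.pos, self.t = pos, t

    def predict(self, t):
        """Extrapolate the leader's position when no measurement arrives."""
        dt = t - self.t
        return (self.pos[0] + self.vel[0] * dt,
                self.pos[1] + self.vel[1] * dt)


if __name__ == "__main__":
    est = LeaderEstimator()
    est.update(0.0, (0.0, 0.0))   # first sighting of the leader
    est.update(1.0, (1.0, 2.0))   # leader moving at (1, 2) m/s relative
    print(est.predict(2.0))       # prediction one second after losing sight
```

A real system would replace this with a proper filter fusing the vehicle's other sensors, but the structure (measurement update plus dead-reckoned prediction) is the same.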
Citation Information: William H. Etter, Paul Martin, and Rahul Mangharam. "Cooperative Flight Guidance of Autonomous Unmanned Aerial Vehicles" (2011).
Available at: http://works.bepress.com/rahul_mangharam/28/