Cooperative Flight Guidance of Autonomous Unmanned Aerial Vehicles
Real-Time and Embedded Systems Lab (mLAB)
  • William H Etter, Jr, University of Pennsylvania
  • Paul Martin, University of Pennsylvania
  • Rahul Mangharam, University of Pennsylvania
Document Type
Conference Paper
Date of this Version
2011
Suggested Citation:
Etter, W., Martin, P., & Mangharam, R. (2011). Cooperative Flight Guidance of Autonomous Unmanned Aerial Vehicles. CPS Week Workshop on Networks of Cooperating Objects (CONET), April 11-14, 2011, Chicago, IL.

As robotic platforms and unmanned aerial vehicles (UAVs) grow in sophistication and complexity, determining the spatial orientation and position of the platform in real time (localization) becomes an important problem. Detecting and extracting the locations of objects, barriers, and openings is required to ensure the overall effectiveness of the platform. Current localization methods for UAVs require expensive external equipment and limit the platform's operating range. The system described herein achieves flocking behavior with leader-follower unmanned aerial vehicles using vision processing, radio-frequency data transmission, and additional onboard sensors. Targeting search and rescue environments, the system combines controls, vision processing, and embedded systems to allow easy deployment of multiple quadrotor UAVs while requiring control of only one. It demonstrates a relative localization scheme for UAVs in a leader-follower configuration, enabling predictive maneuvers such as path following and estimation of the lead UAV's position under limited or no line-of-sight.
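The leader-follower behavior described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual controller; the class name `FollowerGuidance`, the `standoff` parameter, and the bearing/range inputs are all assumptions made for illustration. The idea shown: while the leader is visible, the follower converts a vision-derived bearing and range into a waypoint that maintains a standoff distance, while recording the leader's relative positions; when line-of-sight is lost, the follower replays the recorded path, approximating path following toward the leader's last known trajectory.

```python
import math
from collections import deque


class FollowerGuidance:
    """Illustrative sketch of leader-follower guidance (not the paper's
    implementation). Positions are 2-D, expressed in the follower's frame."""

    def __init__(self, standoff=2.0, history=200):
        self.standoff = standoff            # desired following distance (m), assumed
        self.path = deque(maxlen=history)   # recorded leader positions (relative)

    def update_visual(self, bearing_rad, distance_m):
        """Leader in view: vision supplies a bearing and range to the leader.
        Record its relative position and return a waypoint toward it."""
        leader = (distance_m * math.cos(bearing_rad),
                  distance_m * math.sin(bearing_rad))
        self.path.append(leader)
        return self._waypoint(leader)

    def update_no_los(self):
        """No line-of-sight: steer toward the oldest recorded leader position,
        i.e. retrace the path the leader flew."""
        if self.path:
            return self._waypoint(self.path.popleft())
        return (0.0, 0.0)  # no history left: hold position

    def _waypoint(self, target):
        """Waypoint that closes the gap to `target` but stops at the
        standoff distance, to avoid flying into the leader."""
        x, y = target
        d = math.hypot(x, y)
        if d <= self.standoff:
            return (0.0, 0.0)
        scale = (d - self.standoff) / d
        return (x * scale, y * scale)
```

For example, with the leader sighted 5 m dead ahead and a 2 m standoff, `update_visual(0.0, 5.0)` yields the waypoint `(3.0, 0.0)`; if sight is then lost, `update_no_los()` replays the recorded sighting and returns the same waypoint.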