This paper concerns the problem of actively searching for and localizing ground features with a coordinated team of air and ground robotic sensor platforms. The approach builds on well-known Decentralized Data Fusion (DDF) methodology. In particular, it brings together established representations developed for identification and linearized estimation problems to jointly address feature detection and localization, providing transparent and scalable integration of sensor information from air and ground platforms. As in previous studies, an information-theoretic utility measure and a local control strategy drive the robots toward uncertainty-reducing team configurations. Analysis of the observation uncertainty of air and ground on-board cameras reveals complementary characteristics in coverage and accuracy. Implementation results for a detection and localization example indicate the ability of this approach to scalably and efficiently realize this collaborative potential.
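To illustrate the style of fusion the abstract refers to, the following is a minimal sketch of linearized estimation in information (canonical) form, where contributions from multiple platforms add linearly and the utility of a fused estimate can be scored by its entropy reduction. The sensor models, noise covariances, and feature location here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def info_contribution(H, R, z):
    """Information-matrix and information-vector contribution of one observation."""
    Rinv = np.linalg.inv(R)
    return H.T @ Rinv @ H, H.T @ Rinv @ z

feature = np.array([4.0, 2.0])       # assumed true 2-D feature position
H = np.eye(2)                        # assumed direct position observation model

# Broad Gaussian prior on the feature location, held in information form.
P0 = np.diag([25.0, 25.0])
Y = np.linalg.inv(P0)                # prior information matrix
y = Y @ np.zeros(2)                  # prior information vector
Y_prior = Y.copy()

# Assumed noise models: an air camera with wide coverage but coarse accuracy,
# and a ground camera with narrow coverage but high accuracy.
sensor_noises = [np.diag([4.0, 4.0]), np.diag([0.25, 0.25])]
rng = np.random.default_rng(0)
for R in sensor_noises:
    z = feature + rng.multivariate_normal(np.zeros(2), R)
    dY, dy = info_contribution(H, R, z)
    Y += dY                          # fusion is additive in information form
    y += dy

x_est = np.linalg.solve(Y, y)        # fused feature position estimate

# Information-theoretic utility: entropy reduction of the Gaussian estimate,
# which is what a local control strategy would seek to increase.
utility = 0.5 * np.log(np.linalg.det(Y) / np.linalg.det(Y_prior))
```

The additive structure is what makes the scheme decentralized and scalable: each platform can transmit only its `(dY, dy)` pair, and any node can fuse them in any order.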
Available at: http://works.bepress.com/george_pappas/122/