Detailed bathymetric maps of the sea floor with centimeter-level resolution can be produced by underwater vehicles using multibeam sonars and structured light laser imaging. Over spatial scales of up to tens of thousands of square meters it is possible to produce maps gridded to sub-centimeter levels. This level of accuracy demands detailed treatment of the sensor-relative data, the vehicle navigation data, and the vehicle-to-sensor positional and rotational offsets. The presented results will show comparisons between these two sensor modalities. Data have been collected during recent field programs to the Kolumbo volcanic crater in the southern Aegean Sea. Our data processing and map-making technique is based on the Simultaneous Localization and Mapping (SLAM) concept, an active research area in both the marine and land robotics communities. The SLAM method provides a common framework for addressing both sensor and navigation errors in a self-consistent manner. Using automated patch registration and filtering techniques, both the multibeam and laser data can be processed by the same algorithm. Structured light imaging has been a common machine vision technique for 3D shape estimation in industrial applications, but has had limited use underwater. By using a camera to image a laser line projected on the sea floor, it is possible to determine the 3D profile of the bottom with sub-centimeter resolution. Sequential images taken during a survey can be processed and merged into a bathymetric map in much the same manner as individual multibeam sonar pings. The resulting maps can be gridded down to 2.5 millimeter resolution and clearly show objects just a few centimeters in size. The structured light data have been compared to multibeam sonar data collected with BlueView Technologies sonars operating at 1375 kHz and 2250 kHz; these high-frequency sonars offer centimeter resolution over ranges of up to 30 and 10 meters, respectively.
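The laser-line profiling described above can be sketched as a plane-ray intersection: the laser sheet is modeled as a plane in the camera frame, and each detected laser pixel back-projects to a ray whose intersection with that plane gives a 3D sea floor point. The intrinsics and plane parameters below are hypothetical placeholders, not values from the actual system.

```python
import numpy as np

# Minimal sketch of structured-light triangulation (hypothetical parameters).
# The laser sheet is a plane n . X = d in the camera frame; each laser pixel
# defines a ray from the camera center through the image plane.

K = np.array([[800.0,   0.0, 320.0],   # hypothetical pinhole intrinsics
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
n = np.array([0.0, np.sin(np.radians(30.0)), np.cos(np.radians(30.0))])  # assumed plane normal
d = 1.5                                 # assumed plane offset from camera (m)

def triangulate(u, v):
    """Back-project pixel (u, v) and intersect the ray with the laser plane."""
    ray = np.linalg.solve(K, np.array([u, v, 1.0]))  # ray direction, camera frame
    t = d / (n @ ray)                                # scale so that n . (t*ray) = d
    return t * ray                                   # 3D point in camera frame

p = triangulate(320.0, 300.0)
```

Running `triangulate` along every laser pixel in one image yields a single bottom profile; accumulating profiles over sequential images, with vehicle navigation applied, builds up the bathymetric swath.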
The difference between the broader-footprint acoustic sensors and the structured light system is most noticeable at the edges of objects with complex shapes. Recent fieldwork mapping submerged archaeological sites has been valuable in this evaluation. These sites contain man-made sea floor features with simple, known geometries that make identifying errors easier than on natural, unstructured terrain. The calibration of these systems is also a significant part of their application. The design and execution of in-situ calibration exercises, such as patch tests, are critical to obtaining accurate results. To calibrate the laser system we have developed a stereo vision method that automatically selects points on the laser line and solves for the relative laser and camera geometry. This allows calibrations to be performed in the field using a calibrated stereo rig, without a complex tank setup. We also use iterative point matching schemes common in the terrestrial laser scanning literature to obtain the relative offsets between the sonars, cameras, and laser. Obtaining these offsets is critical for fusing the acoustic and photographic data in a common framework.
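The stereo calibration step admits a simple geometric core: laser-line points triangulated by the calibrated stereo rig all lie on the laser sheet, so a least-squares plane fit recovers the sheet's geometry relative to the camera. The sketch below shows only that plane fit, under the assumption that stereo triangulation has already produced the 3D samples; it is not the full selection-and-solve pipeline.

```python
import numpy as np

# Sketch: recover the laser plane n . X = d from 3D points triangulated on
# the laser line by a calibrated stereo rig (assumed to exist upstream).

def fit_plane(points):
    """Least-squares plane fit via SVD; returns unit normal n and offset d."""
    centroid = points.mean(axis=0)
    # the singular vector of least variance of the centered cloud is the normal
    _, _, Vt = np.linalg.svd(points - centroid)
    n = Vt[-1]
    return n, n @ centroid
```

With the plane known, single-camera triangulation of the laser line (plane-ray intersection) becomes possible without the second camera.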
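The iterative point matching used for the inter-sensor offsets can be sketched as a basic point-to-point ICP: alternate between nearest-neighbor correspondence and a closed-form rigid transform solve (Kabsch/SVD). This is a minimal illustration under idealized conditions; production pipelines add k-d tree search, outlier rejection, and robust weighting.

```python
import numpy as np

# Minimal point-to-point ICP sketch: align a source cloud to a destination
# cloud by alternating nearest-neighbor matching with a rigid-body solve.

def best_rigid_transform(A, B):
    """Least-squares R, t mapping rows of A onto rows of B (Kabsch/SVD)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    return R, cb - R @ ca

def icp(src, dst, iters=20):
    """Return R, t such that src @ R.T + t approximates dst."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbor in dst for every current point
        idx = np.argmin(((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1), axis=1)
        R, t = best_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

Applied to overlapping patches from two sensors, the recovered rigid transform is an estimate of the relative offset needed to fuse their data in a common frame.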
AGU session number OS11C-07.