Autonomous Systems



Marine Robotic Systems

at the ARC Centre of Excellence for Autonomous Systems

Stefan B. Williams, Oscar Pizarro, Ian Mahon, Paul Rigby, Matthew Johnson-Roberson

ARC Centre of Excellence for Autonomous Systems

School of Aerospace, Mechanical and Mechatronic Engineering

University of Sydney, Sydney, NSW 2006 Australia


1.  Introduction

The ARC Centre of Excellence for Autonomous Systems (CAS) was established in January 2003. The Centre brings together three leading research groups: the Australian Centre for Field Robotics (ACFR) at the University of Sydney, the Artificial Intelligence Group at the University of New South Wales (UNSW), and the Mechatronics and Intelligent Systems Group at the University of Technology, Sydney (UTS).  The objectives of the Centre are to undertake innovative fundamental research based around the four themes of perception, control, learning and systems, and to build persuasive experimental demonstrations of complete autonomous systems that integrate the results of these fundamental research themes.


Centre efforts in marine systems are led by the ACFR.  The main objectives of Centre activity in the marine area are ecological monitoring of structurally complex marine habitats such as the Great Barrier Reef; defence tasks such as autonomous navigation, mapping and search; and monitoring the effects of oil and gas exploration activities in Australian coastal waters. The core Centre research exploited in this area comprises developments in perception, sensing, data fusion and Simultaneous Localisation and Mapping (SLAM) navigation methods. The primary research focus of this program is the development of algorithms and methods capable of modelling natural reef environments using data collected by an autonomous robotic system that can survey and document change on the Great Barrier Reef.


2.  Autonomous Underwater Vehicle

The ACFR has an ocean-going Autonomous Underwater Vehicle (AUV) capable of undertaking high-resolution survey work (see Figure 1).  This experimental platform is a modified version of a mid-size robotic vehicle called Seabed built at the Woods Hole Oceanographic Institution [11].  As shown in Figure 2, the submersible is equipped with a mechanically scanned low-frequency terrain-aiding sonar, a depth sensor, a Doppler Velocity Log (DVL) with an integrated compass and roll and pitch sensors, an Ultra Short Baseline (USBL) acoustic positioning system, a forward-looking obstacle avoidance sonar, a conductivity/temperature sensor, and a high-resolution stereo camera pair with strobes.  The vehicle is controlled by an on-board, PC-104 based computing platform which samples sensor information and runs the vehicle's low-level control algorithms.  The platform is intended primarily as a research testbed for the novel navigation and sensing strategies being developed as part of this work.  The vehicle undertook its first engineering field trials on the Great Barrier Reef around the Heron Island Research Station and the University of Sydney's One Tree Island Research Station.
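The DVL and compass listed above form the basis of vehicle dead reckoning between acoustic position fixes. A minimal sketch (all function names and numbers hypothetical, 2D only) of integrating body-frame DVL velocities through a compass heading:

```python
import math

def dead_reckon(x, y, heading_rad, v_fwd, v_stbd, dt):
    """Propagate a 2D position estimate one step from body-frame
    DVL velocities (forward, starboard) and a compass heading.
    North is x, east is y; heading is measured clockwise from north."""
    # Rotate body-frame velocity into the north/east frame.
    vn = v_fwd * math.cos(heading_rad) - v_stbd * math.sin(heading_rad)
    ve = v_fwd * math.sin(heading_rad) + v_stbd * math.cos(heading_rad)
    return x + vn * dt, y + ve * dt

# One second travelling forward at 1.0 m/s while heading due east (90 deg):
x, y = dead_reckon(0.0, 0.0, math.pi / 2, 1.0, 0.0, 1.0)
```

In practice such an estimate drifts without bound, which is precisely what the terrain-aided navigation methods described below are designed to correct.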


The vehicle has recently returned from deployment on board the AIMS Research Vessel R/V Cape Ferguson.  A series of trials was undertaken to assess benthic habitats off Ningaloo Reef, WA.  These trials were aimed at evaluating the effectiveness of using an AUV for biodiversity assessment in waters beyond diver depths.  The particular focus of these deployments was on documenting sponge habitats at depths of 40 m and 80 m.  Sample images from one of the 80 m dives are shown in Figure 3.


The primary focus of our research work is on developing a theoretical framework with which to fuse information from a variety of sensors available in the underwater domain for the construction of detailed environmental models.  This requires the selection of a general solution to the problem of data fusion appropriate for use with subsea sensors.  Work to date in this area has concentrated on methods for building a model of the environment while simultaneously estimating the path taken by a vehicle based on observations taken with its on-board sensing systems [13]. 
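The joint estimation of a vehicle path and an environment model described above is the essence of SLAM. A heavily simplified one-dimensional sketch (hypothetical noise values, a single landmark, noise-free simulated measurements for illustration; not the Centre's actual formulation) showing how repeated observation of a fixed feature couples the vehicle and map estimates in a Kalman filter:

```python
import numpy as np

# Joint state: [vehicle position, landmark position], with covariance P.
x = np.array([0.0, 10.0])
P = np.diag([0.01, 100.0])   # vehicle initially well known, landmark not
Q, R = 0.1, 0.05             # assumed process and range-measurement noise

def predict(x, P, u):
    """Vehicle moves by u; only the vehicle entry gains process noise."""
    x = x + np.array([u, 0.0])
    P = P + np.diag([Q, 0.0])
    return x, P

def update(x, P, z):
    """Range observation z = landmark - vehicle."""
    H = np.array([[-1.0, 1.0]])
    S = H @ P @ H.T + R
    K = P @ H.T / S
    x = x + (K * (z - (x[1] - x[0]))).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

true_v, landmark = 0.0, 10.0
for _ in range(20):
    true_v += 1.0
    x, P = predict(x, P, u=1.0)
    x, P = update(x, P, z=landmark - true_v)
```

The key effect, even in this toy setting, is that the off-diagonal terms of P correlate the vehicle and landmark estimates, so each observation of the landmark constrains the vehicle state.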


Recent development efforts have focused on the integration of a high-resolution stereo camera pair into the vehicle's sensor suite.  The system consists of two high-resolution Prosilica cameras, one colour and one greyscale, interfaced via a FireWire bus to a PC-104 stack inside the housing.  The cameras are synchronised to a pair of high-speed strobes and capture images at up to 5 Hz, storing them locally on a hard drive.  Dense point maps are constructed from the information collected by the vision system.  A number of approaches to building these models have been investigated, including dense stereo vision algorithms [1][5] and feature-based methods [2][4].  Figure 4 shows the results of applying a dense feature-based method to reproduce a local terrain model.
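The dense point maps above rest on stereo triangulation: for a rectified pair, depth is focal length times baseline over disparity. A minimal sketch (hypothetical camera parameters, not our calibration) of back-projecting a disparity map to metric 3D points:

```python
import numpy as np

def disparity_to_points(disparity, f_px, baseline_m, cx, cy):
    """Back-project a rectified disparity map to 3D points (camera frame).
    Z = f * B / d;  X = (u - cx) * Z / f;  Y = (v - cy) * Z / f."""
    v, u = np.indices(disparity.shape)
    valid = disparity > 0                      # zero disparity = no match
    safe_d = np.where(valid, disparity, 1.0)   # avoid divide-by-zero
    Z = np.where(valid, f_px * baseline_m / safe_d, 0.0)
    X = (u - cx) * Z / f_px
    Y = (v - cy) * Z / f_px
    return np.stack([X, Y, Z], axis=-1)

# Hypothetical: 1000 px focal length, 7 cm baseline, 4 px disparity -> 17.5 m.
d = np.full((2, 2), 4.0)
pts = disparity_to_points(d, f_px=1000.0, baseline_m=0.07, cx=1.0, cy=1.0)
```

Small disparities at long range make depth error grow quadratically with distance, which is one reason the cameras are flown close to the terrain.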


The vehicle is equipped with a pair of sonars.  The first is a mechanically scanned imaging sonar operating at 675 kHz, which scans the sea floor below the vehicle and returns terrain profiles.  Methods for fusing this information with visual information are described in [13].  In addition to the scanning sonar, we have integrated a forward-looking obstacle avoidance sonar, used to estimate the altitude of the vehicle as well as the forward distance to obstacles.  This information is used to control the altitude and forward velocity of the vehicle to minimise the risk of colliding with the reef.  We have recently been awarded funds for the purchase and integration of a low-cost, interferometric multibeam sonar into the vehicle's sensor suite.  Combining the information from sonar and vision is one of our areas of active research, as shown in Figure 5.
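The altitude and speed behaviour described above can be sketched as a simple rule (gains, limits and function names hypothetical; a toy stand-in, not the vehicle's controller): hold a target altitude with a proportional vertical command, and ramp forward speed down as the sonar obstacle range shrinks:

```python
def altitude_speed_command(altitude_m, obstacle_range_m,
                           target_alt_m=2.0, cruise_speed=1.0,
                           kp=0.5, stop_range=2.0, slow_range=10.0):
    """Proportional altitude command plus speed scaling on obstacle range."""
    vertical_cmd = kp * (target_alt_m - altitude_m)   # positive = climb
    # Linearly ramp speed from full at slow_range down to zero at stop_range.
    scale = (obstacle_range_m - stop_range) / (slow_range - stop_range)
    forward_cmd = cruise_speed * min(1.0, max(0.0, scale))
    return vertical_cmd, forward_cmd

# Too low (1.5 m against a 2.0 m target) with reef 4 m ahead: climb, slow down.
vz, vx = altitude_speed_command(1.5, 4.0)
```

Even this crude policy captures the intent: the closer the reef looms in the forward sonar, the slower the vehicle advances.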


3.  Seafloor Classification

The coral reef environment is rich in both texture and colour. Many of these features are distinctive and can serve to identify different species of coral. Exploiting these intrinsic properties allows for the automatic segmentation and classification of coral in underwater images.  Work in this area has investigated methods for automatically classifying terrain based on the visual imagery collected by the vehicle.  Figure 6 shows an example of an image that has been automatically classified by a statistical learning algorithm.  The algorithm is trained to differentiate between different classes of coral, sand and rock based on colour and textural cues extracted from a series of training images.  By combining classified data with the detailed three-dimensional models of the reef, we anticipate being able to estimate the percent cover of various organisms on the reef.  We are refining these models to the point where we can monitor change in the composition of the reef over time.
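Classification from colour and textural cues can be sketched as follows (toy features, toy classes and synthetic patches; not the Centre's actual algorithm): extract per-patch colour and texture statistics from labelled training patches, then assign new patches to the nearest class prototype:

```python
import numpy as np

def patch_features(patch):
    """Toy descriptor: mean colour plus a grey-level variance texture cue."""
    mean_rgb = patch.reshape(-1, 3).mean(axis=0)
    texture = patch.mean(axis=2).var()
    return np.append(mean_rgb, texture)

def train(patches, labels):
    """Nearest-mean classifier: one prototype feature vector per class."""
    feats = np.array([patch_features(p) for p in patches])
    labels = np.array(labels)
    return {c: feats[labels == c].mean(axis=0) for c in set(labels.tolist())}

def classify(patch, prototypes):
    f = patch_features(patch)
    return min(prototypes, key=lambda c: np.linalg.norm(f - prototypes[c]))

# Synthetic data: bright uniform "sand" versus dark, highly textured "coral".
rng = np.random.default_rng(0)
sand = [np.full((8, 8, 3), 200.0) + rng.normal(0, 2, (8, 8, 3)) for _ in range(5)]
coral = [np.full((8, 8, 3), 60.0) + rng.normal(0, 30, (8, 8, 3)) for _ in range(5)]
proto = train(sand + coral, ["sand"] * 5 + ["coral"] * 5)
label = classify(np.full((8, 8, 3), 195.0), proto)   # near-sand test patch
```

Replacing the nearest-mean rule with a richer statistical learner, and the toy descriptor with proper colour and texture filter banks, is the step up from this sketch to a usable classifier.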


4.  Information Based Interest Features

Another notable outcome of this work has been the sheer volume of data collected with the AUV.  At present, we collect high-resolution stereo pairs at a rate of 2 Hz (although this can be varied through the software systems on board the vehicle).  This corresponds to approximately 10 MB of image data per second, or 36 GB of image data and 7,200 stereo pairs on disk per hour of mission time.  This volume of data makes it difficult to quickly scan the images to identify features of interest.  To facilitate quicker assessment of the data, we have been investigating techniques for automatically identifying features of interest within our data sets.  Figure 7 shows an example of the output of a system that calculates statistics based on the distribution of colour and textural cues within an image sequence.  By calculating the probability distribution function (PDF) over these image statistics, the system identifies those areas of the image sequence that are most distinctive and therefore carry the most information relative to the data set.
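The interest-feature idea can be sketched as a rarity score (a toy stand-in for the actual system, using a single scalar statistic per image): estimate the PDF of the statistic over the whole sequence with a histogram, then score each image by the negative log-probability of its statistic, so rare statistics score highest:

```python
import numpy as np

def rarity_scores(stats, n_bins=10):
    """Score each per-image statistic by -log of its estimated probability.
    A high score means a rare statistic, hence a potentially interesting image."""
    hist, edges = np.histogram(stats, bins=n_bins)
    pdf = hist / hist.sum()
    # Map each statistic to its histogram bin (interior edges only).
    idx = np.clip(np.digitize(stats, edges[1:-1]), 0, n_bins - 1)
    return -np.log(pdf[idx] + 1e-12)

# Toy sequence: mostly mid-grey images, one very bright outlier at the end.
stats = np.array([0.5] * 50 + [0.51] * 49 + [0.95])
scores = rarity_scores(stats)
most_interesting = int(np.argmax(scores))   # index of the outlier image
```

The real system works over joint colour and texture statistics rather than a single scalar, but the principle is the same: images whose statistics fall in low-probability regions of the sequence-wide PDF are flagged for a human analyst.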


5.  Conclusion

This document has presented the state of the marine robotics program at the ARC Centre of Excellence for Autonomous Systems.  Current work has focused on integrating a comprehensive suite of sensors onto the Autonomous Underwater Vehicle and on developing terrain-aided navigation algorithms suitable for deployment in unstructured underwater environments.


Future work will concentrate on the development of techniques suitable for generating higher resolution terrain models and tying these models to a priori, lower resolution models. This will require the development of methods capable of tracking and identifying large numbers of points within the SLAM framework. Alternatively, higher order features or richer underlying models might be used to increase the efficiency of the tracking process. In addition, we are exploring the unique requirements for coordinated navigation, mission planning and decision making for AUV systems. By providing the vehicle with the ability to reason about the information gathered by its sensors in real time, the proposed techniques will facilitate adaptive control and autonomous decision making during vehicle deployments.


The authors wish to acknowledge the support provided under the ARC Centre of Excellence Program by the Australian Research Council and the New South Wales government.  We would also like to thank DSTO, BAE Systems' Advanced Technology Centre and the Great Barrier Reef Research Foundation for their ongoing support of this program.



[1]   Birchfield, S. and Tomasi, C. (1999), "Depth Discontinuities by Pixel-to-Pixel Stereo", International Journal of Computer Vision, 35(3): 269-293

[2]   Bouguet, J.-Y. (2000), "Pyramidal Implementation of the Lucas Kanade Feature Tracker", OpenCV documentation.

[3]   Cover, T.M. and Thomas, J.A. (1991), "Elements of Information Theory", Wiley Series in Telecommunications, Wiley, New York.

[4]   Lucas, B.D., and Kanade, T. (1981) "An Iterative Image Registration Technique with an Application to Stereo Vision", Proceedings of 7th International Joint Conference on Artificial Intelligence, pages 674-679.

[5]   Konolige, K. (1997), "Small Vision Systems: Hardware and Implementation", Proceedings International Symposium on Robotics Research, pages 111-116.

[6]   Makarenko, A., Williams, S.B., Bourgault, F. and Durrant-Whyte, H.F. (2002) "An Experiment in Integrated Exploration", Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Vol. 1, pages 534-539

[7]   M

Figure: Specifications and sensor placement on the Seabed AUV

Vehicle Specifications

Depth rating:       700 m
Size:               2.0 m (L) x 1.5 m (H) x 1.5 m (W)
Mass:               200 kg
Maximum speed:      1.2 m/s
Batteries:          1.6 kWh Li-ion pack
Propulsion:         3 x 150 W brushless DC thrusters
Attitude/Heading:   Tilt (