EPSRC Reference: |
EP/J014990/1 |
Title: |
Constant-time wide-area monocular SLAM using absolute depth hinting |
Principal Investigator: |
Murray, Professor DW |
Other Investigators: |
|
Researcher Co-Investigators: |
|
Project Partners: |
|
Department: |
Engineering Science |
Organisation: |
University of Oxford |
Scheme: |
Standard Research |
Starts: |
01 December 2012 |
Ends: |
31 May 2016 |
Value (£): |
393,464 |
EPSRC Research Topic Classifications: |
|
EPSRC Industrial Sector Classifications: |
Aerospace, Defence and Marine |
Construction |
|
Related Grants: |
|
Panel History: |
Panel Date | Panel Name | Outcome |
07 Mar 2012 | EPSRC ICT Responsive Mode - Mar 2012 | Announced |
|
Summary on Grant Application Form |
Understanding the visual environment is key to allowing machines to interact with us and the space we occupy, whether the machine is to take fully autonomous action or just provide us with extra information and advice. A core competence is the ability to reconstruct a 3D representation of a moving camera's surroundings and to locate the camera relative to them from moment to moment. Over recent years, enormous practical strides have been made in this problem of visual simultaneous localization and mapping (visual SLAM), to the point now where robust live reconstruction is possible on modest hardware using stereo cameras and, more challengingly, using just a single camera.
We are concerned here with single camera visual SLAM, of particular importance when the payload and power manifest has to be kept small. The state of the art allows reconstructions containing several tens of thousands of 3D point locations imaged from a few hundred viewpoints to be optimized on the fly. Though this sounds large, in practice these numbers restrict the operational scope to modestly-sized environments. Our aim in this proposal is to tackle the two chief impediments to increasing that scope.
First is algorithmic computational complexity. Polynomial complexity in the number of map points and/or the number of camera positions gradually stifles live operation as the environment expands. This means that gains in processor speed do not lead to proportional gains in map size, and we cannot merely wait for CPU development to catch up. Here we will pursue our recent work on constant-time exploration in monocular SLAM, in which only a local region around the camera needs to be re-optimized frame-by-frame.
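The constant-time idea above can be illustrated with a minimal sketch (the window size, helper name, and nearest-keyframe selection are illustrative assumptions, not the project's actual algorithm): however large the full map grows, only a fixed-size window of keyframes near the camera is handed to the per-frame optimizer, so that cost stays bounded.

```python
# Hypothetical sketch: each frame, a fixed-size window of keyframes near the
# camera is selected for re-optimization, so per-frame optimization cost is
# independent of total map size. (The naive linear scan used for selection
# here is for clarity only; a real system would keep this local too, e.g.
# via a covisibility graph.)

def active_window(keyframe_positions, camera_pos, window_size=5):
    """Return indices of the window_size keyframes nearest the camera."""
    dists = [(sum((k - c) ** 2 for k, c in zip(kf, camera_pos)), i)
             for i, kf in enumerate(keyframe_positions)]
    dists.sort()
    return [i for _, i in dists[:window_size]]

# The map may hold thousands of keyframes...
keyframes = [(float(i), 0.0, 0.0) for i in range(10000)]
# ...but only a constant number are re-optimized each frame.
window = active_window(keyframes, camera_pos=(42.3, 0.0, 0.0))
print(sorted(window))  # prints [40, 41, 42, 43, 44]
```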
Second is monocular vision's inherent depth/speed scaling ambiguity. Using image motion alone, only relative depth, rather than absolute depth, is observable. As the camera moves around, uncertainty builds up not only in position and orientation, but also in the scale of the surroundings. This leads to added difficulty when returning to a previously visited location: not only do the surroundings appear translated and twisted, but they also appear the wrong size. But a human one-eyed observer does not suffer in the same way: whether deprived of stereo vision or not, we use other visual cues to maintain our sense of scale, cues from objects, object classes, and from low-level image traits. It is these cues that this project plans to exploit, providing partial information about absolute depth sufficient to remove that extra degree of freedom from the solution.
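The depth/speed ambiguity described above is a simple geometric fact, and can be checked numerically: under a pinhole camera model, scaling every scene depth and every camera translation by the same factor leaves the projected image motion unchanged, so image measurements alone cannot fix absolute scale. The function and scene below are an illustrative sketch, not the project's code.

```python
import numpy as np

def project(points, cam_t, f=500.0):
    """Pinhole projection of 3D points seen from a camera translated by cam_t
    (rotation omitted for simplicity; it does not affect the scale argument)."""
    rel = points - cam_t                       # points in the camera frame
    return f * rel[:, :2] / rel[:, 2:3]        # perspective divide by depth

rng = np.random.default_rng(0)
points = rng.uniform([-1, -1, 4], [1, 1, 8], size=(10, 3))  # random scene
t0 = np.array([0.0, 0.0, 0.0])
t1 = np.array([0.2, 0.0, 0.1])                # camera motion between frames

s = 3.7                                       # arbitrary unknown scale factor
motion_true = project(points, t1) - project(points, t0)
motion_scaled = project(s * points, s * t1) - project(s * points, s * t0)

print(np.allclose(motion_true, motion_scaled))  # prints True: scale unobservable
```

Because the scaled scene produces pixel-for-pixel identical image motion, any external hint about absolute depth (from recognized objects, object classes, or low-level image traits) directly resolves this otherwise free degree of freedom.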
We aim to produce a SLAM algorithm (i) that functions at video frame-rate; (ii) that functions in on-average constant time quite independently of the size of the map it is constructing; and (iii) that behaves gracefully whatever quality of depth information is provided to it.
|
Key Findings |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Potential use in non-academic contexts |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Impacts |
Description |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk |
Summary |
|
Date Materialised |
|
|
Sectors submitted by the Researcher |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Project URL: |
|
Further Information: |
|
Organisation Website: |
http://www.ox.ac.uk |