EPSRC Reference: |
EP/H012338/1 |
Title: |
Topology-based Motion Synthesis |
Principal Investigator: |
Komura, Professor T |
Other Investigators: |
|
Researcher Co-Investigators: |
|
Project Partners: |
|
Department: |
Inst. of Perception, Action and Behaviour |
Organisation: |
University of Edinburgh |
Scheme: |
Standard Research |
Starts: |
30 September 2010 |
Ends: |
28 February 2014 |
Value (£): |
458,934 |
EPSRC Research Topic Classifications: |
Artificial Intelligence |
Control Engineering |
Robotics & Autonomy |
|
|
EPSRC Industrial Sector Classifications: |
Manufacturing |
Healthcare |
Information Technologies |
|
|
Related Grants: |
|
Panel History: |
Panel Date | Panel Name | Outcome |
08 Sep 2009 | Materials, Mechanical and Medical Engineering | Announced |
Summary on Grant Application Form |
One of the major drivers of research in humanoid robotics is the desire to achieve motions involving close contact between robots and the environment or people, such as carrying an injured person or handling flexible objects such as the straps of a knapsack or clothes. Currently, these applications seem beyond the ability of existing motion synthesis techniques due to the underlying computational complexity in an open-ended environment. Traditional methods for motion synthesis suffer from two major bottlenecks. Firstly, a significant amount of computation is required for collision detection and obstacle avoidance in the presence of numerous close contacts between manipulator segments and objects. Secondly, any particular computed solution can easily become invalid as the environment changes. For instance, if the robot were handling an object such as a knapsack, even small deformations of this flexible object and minor changes in object dimensions (e.g., between an empty bag and a stuffed bag) might require complete re-planning under the current way of solving the problem.

Similar issues arise in computer animation, where there is a need for real-time control of characters, moving away from static sequences of pre-programmed motion. Although it may seem that this world is much more contained, as it is created by an animation designer, there is in fact a strong desire to create games and simulation systems where users interact with the world continually and expect the animation system to react accordingly. This calls for the same sort of advances in motion synthesis techniques as outlined above.

The fundamental problem lies in the representation of the state of the world and the robot. Typically, motion is synthesized in a complete configuration or state space represented at the level of generalized coordinates enumerating all joint angles and their 3D location/orientation with respect to some world reference frame. This implies the need for large amounts of collision-checking computation and randomized exploration in a very large search space. Moreover, it is very hard to encode higher-level, semantic specifications at this level of description, as the individual values of the generalized coordinates do not tell us anything unless further calculations are carried out to ensure satisfaction of relevant constraints. This is particularly inconvenient when searching for a motion in a large database.

The focus of this research is to alleviate these problems by developing methods that exploit the underlying topological structure in these problems, e.g., in the space of postures. This allows us to define a new search space where the coordinates are based on topological relationships, such as those between link segments. We refer to this space in terms of 'topology coordinates'. In preliminary work, we have shown the utility of this viewpoint for efficient motion synthesis with characters that are in close contact. We have also demonstrated that this approach is more efficient for categorizing semantically similar motions. In this project, we will develop a more general framework of such techniques that will be applicable to a large class of tasks carried out by autonomous humanoid robots and virtual animated characters. Moreover, we will implement our techniques on industrially relevant platforms, through our collaborators at Honda Research Institute Europe GmbH and Namco Bandai, Japan.
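As a rough illustration of what a topological relationship between two link segments can look like, the sketch below numerically approximates the Gauss linking integral (writhe) between two straight segments. The proposal does not specify this computation; the choice of writhe as the example quantity, the function name, and the midpoint sampling scheme are illustrative assumptions only.

```python
# Illustrative sketch (not from the proposal): one candidate "topology
# coordinate" between two straight link segments, computed as a numerical
# approximation of the Gauss linking integral (writhe).
import numpy as np

def gauss_linking_integral(p0, p1, q0, q1, n=50):
    """Approximate (1/4*pi) * double integral of the Gauss kernel over the
    segments p0->p1 and q0->q1, using n midpoint samples on each segment."""
    p0, p1, q0, q1 = (np.asarray(v, dtype=float) for v in (p0, p1, q0, q1))
    ts = (np.arange(n) + 0.5) / n          # midpoint sample parameters in [0, 1]
    dp = (p1 - p0) / n                      # differential element along segment 1
    dq = (q1 - q0) / n                      # differential element along segment 2
    cross_dpq = np.cross(dp, dq)            # constant for straight segments
    total = 0.0
    for t in ts:
        x = p0 + t * (p1 - p0)
        for s in ts:
            y = q0 + s * (q1 - q0)
            r = x - y
            dist = np.linalg.norm(r)
            if dist < 1e-9:
                continue                    # skip (near-)touching sample pairs
            total += np.dot(cross_dpq, r) / dist**3
    return total / (4.0 * np.pi)

# Example: two nearly orthogonal segments passing close to each other.
print(gauss_linking_integral([0, 0, 0], [1, 0, 0],
                             [0.5, -0.5, 0.1], [0.5, 0.5, 0.1]))
```

A quantity of this kind varies smoothly as two limbs wind around each other, which is why coordinates built from such relationships can describe close-contact postures more compactly than raw joint angles.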
|
Key Findings |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Potential use in non-academic contexts |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Impacts |
Description |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk |
Summary |
|
Date Materialised |
|
|
Sectors submitted by the Researcher |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Project URL: |
|
Further Information: |
|
Organisation Website: |
http://www.ed.ac.uk |