EPSRC Reference: |
EP/I032533/1 |
Title: |
Bioinspired Control of Electro-Active Polymers for Next Generation Soft Robots |
Principal Investigator: |
Porrill, Professor J |
Other Investigators: |
|
Researcher Co-Investigators: |
|
Project Partners: |
|
Department: |
Psychology |
Organisation: |
University of Sheffield |
Scheme: |
Standard Research |
Starts: |
03 January 2012 |
Ends: |
02 January 2016 |
Value (£): |
1,081,051 |
EPSRC Research Topic Classifications: |
Control Engineering |
Microsystems |
Robotics & Autonomy |
|
|
EPSRC Industrial Sector Classifications: |
No relevance to Underpinning Sectors |
|
|
Related Grants: |
|
Panel History: |
Panel Date | Panel Name | Outcome |
16 Feb 2011 | Materials, Mechanical and Medical Engineering | Announced |
Summary on Grant Application Form |
The current generation of robots is typically built of hard, inflexible and insensitive materials, in contrast to animal bodies, which use soft and elastic elements with the capacity for sensing and shape change. A revolution is beginning in robotics in which new soft-smart materials, more similar to animal tissues, are being introduced. One such material is the class of electroactive polymers (EAPs): flexible plastics that change shape when an electric current is passed through them and can therefore act rather like animal muscles. However, in order to use these new materials in robotics we need to understand how to control them. Soft-smart materials pose particular problems because their response to stimulation may be complex, their material properties can change over time, and they are often fabricated with wide tolerances. Since these are the same problems posed by animal tissues, it is natural to look to biology as a source of inspiration. In mammals, a part of the brain called the cerebellum, or 'little brain', is particularly associated with skilled control of movement, the integration of sensing with action, and the capacity to adapt to change. In this project we therefore seek to derive new ways of controlling soft materials from our knowledge of cerebellar function. We believe this work can unlock the potential of such materials, leading to machines with the fluidity and grace of movement we associate with animals. This change will have many knock-on effects. Robots with soft body parts and skilled movement will be able to go places that are currently impossible for them, for instance exploring disaster sites or remote worlds, picking their way through boulders and squeezing through crannies, whilst automatically adapting to wear and tear. Robots with softer bodies and flexible limbs will also be safer for humans to interact with, and can thus play a wider role in our homes, schools and hospitals.
The first step will be to explore how soft materials, in this case EAP actuators, can be used as artificial muscles on two robot platforms. One will acquire tactile information from its surroundings by copying the facial whiskers of animals such as mice and rats. These animals move their whiskers back and forth at high speed while exploring; in our robot, these accurate, high-velocity whisking movements will be performed by EAP actuators. Our second platform will have an EAP-controlled artificial eye designed to acquire visual information during robot movement. To provide a stable visual image, this robot needs to generate rapid changes in camera position that counteract movements of its body and head.
The second step is the development of cerebellar-inspired control algorithms. This will lead to a software cerebellar chip that can be plugged into different tasks without prior knowledge of all the requirements; the chip will learn online how best to control the soft materials. These algorithms will be shown to be effective in solving two fundamental problems for active sensing in our EAP-based robots: recognising spurious sensory signals caused by the robot's own movement, and controlling sensor position so as to maintain a steady gaze.
The final objective is to demonstrate the versatility of the cerebellar chip by applying it to a task that combines soft control of both tactile and visual sensing systems. Here we will train a new robot to orient accurately to novel targets whilst it also learns about the properties of its sensing and actuation systems.
The underlying representation will be a head-centred sensory map of the environment that is adaptively generated and modified by experience. Performing this task in a life-like way will not only demonstrate the capacity of cerebellar algorithms to effectively control smart materials but will also show that these two components combined can bring about a step-change in robotics that will lead to a future in which we have more versatile, agile, and user-friendly robots.
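As a purely illustrative aside (not part of the grant record), the kind of online learning described above, in which a controller learns to predict and cancel the sensory consequences of the robot's own movements, can be sketched as an adaptive filter trained with a simple LMS-style decorrelation rule. All names, signals and parameters below are assumptions chosen for the example; they do not describe the project's actual algorithms or hardware.

import numpy as np

# Minimal sketch: an adaptive filter learns online to predict the self-generated
# (reafferent) component of a sensor signal from the robot's own motor commands,
# so that subtracting the prediction leaves only externally caused signals.

rng = np.random.default_rng(0)

n_basis = 20          # length of the tapped delay line of past motor commands
learning_rate = 1e-3  # LMS-style learning rate (assumed value)
n_steps = 5000

weights = np.zeros(n_basis)

def basis_signals(command_history):
    # Expand the recent motor-command history into basis signals
    # (here simply the last n_basis commands).
    return np.asarray(command_history[-n_basis:])

command_history = [0.0] * n_basis
for t in range(n_steps):
    command = np.sin(0.05 * t)            # the robot's own (efferent) motor command
    command_history.append(command)

    p = basis_signals(command_history)    # inputs to the adaptive filter
    prediction = weights @ p              # estimate of the self-generated signal

    # The sensor measures external events plus a self-generated component
    # (assumed here to be a delayed, scaled copy of the command) plus noise.
    reafference = 0.8 * command_history[-3]
    external = 0.1 * rng.standard_normal()
    sensed = reafference + external

    # The residual after cancellation acts as the teaching signal.
    error = sensed - prediction

    # Decorrelation / LMS update: remove the predictable, movement-related component.
    weights += learning_rate * error * p

print("final residual magnitude:", abs(error))

After training, the residual is dominated by the external component, illustrating how an online-learning filter can strip out sensory signals caused by the robot's own movement without any prior model of its actuators.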
|
Key Findings |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Potential use in non-academic contexts |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Impacts |
Description |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk |
Summary |
|
Date Materialised |
|
|
Sectors submitted by the Researcher |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Project URL: |
|
Further Information: |
|
Organisation Website: |
http://www.shef.ac.uk |