
Details of Grant 

EPSRC Reference: EP/N03211X/1
Title: Morphological computation of perception and action
Principal Investigator: Nanayakkara, Dr T
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Active8 Robots, Shadow Robot Company Ltd, Southern Scientific
Department: Informatics
Organisation: King's College London
Scheme: Standard Research
Starts: 11 July 2016
Ends: 31 January 2017
Value (£): 307,919
EPSRC Research Topic Classifications:
Development (Biosciences)
Med.Instrument.Device & Equip.
Robotics & Autonomy
EPSRC Industrial Sector Classifications:
Healthcare
Related Grants:
Panel History:
Panel Date: 13 Apr 2016
Panel Name: Engineering Prioritisation Panel Meeting 13 April 2016
Outcome: Announced
Summary on Grant Application Form
Living beings share the same embodiment for sensing and action. For instance, the spindle sensors that provide the feeling of a joint's angle and speed are embedded in the muscles that actuate that joint. The tendon sensors that provide the feeling of force are likewise directly involved in actuating the joint. Does the function of these sensors change when the muscles are activated to take action? Does the co-activation of antagonistic muscles play a role not only in actuation, but also in perception? This project will investigate these questions through targeted experiments with human participants and with controllable-stiffness soft robots that provide greater access to internal variables.

Recent experiments we have conducted on localising hard nodules in soft tissues using soft robotic probes have shown that tuning the stiffness of the probe can maximise the information gain in perceiving the hard nodule. We have also noticed that human participants use distinct force-velocity modulation strategies in the same task when localising a hard nodule in a soft tissue with the index finger. This raises the question of whether we can find quantitative criteria for controlling the internal impedance of a soft robotic probe so as to maximise the efficacy of manipulating a soft object to perceive its hidden properties, as in the physical examination of a patient's abdomen.
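As an illustration of what such a quantitative criterion could look like, the following minimal Python sketch (an illustrative construction, not the project's method) picks, from a set of candidate probe stiffness values, the one that maximises the expected information gain about a hidden nodule's position. The measurement model, noise model, and all parameters are assumptions made for demonstration only.

import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_information_gain(stiffness, positions, prior, probe_pos=0.5):
    # Hypothetical measurement model: mean indentation force depends on how
    # close the probe is to the nodule and on the chosen probe stiffness.
    mean_force = stiffness * np.exp(-((positions - probe_pos) ** 2) / 0.02)
    noise_std = 0.05 + 0.02 * stiffness      # assumption: stiffer probe, noisier contact
    # Monte Carlo estimate of the mutual information between nodule position
    # and the force reading (expected entropy reduction of the posterior).
    rng = np.random.default_rng(0)
    gains = []
    for _ in range(200):
        true_idx = rng.choice(len(positions), p=prior)
        y = rng.normal(mean_force[true_idx], noise_std)
        lik = np.exp(-0.5 * ((y - mean_force) / noise_std) ** 2)
        post = prior * lik
        post /= post.sum()
        gains.append(entropy(prior) - entropy(post))
    return np.mean(gains)

positions = np.linspace(0.0, 1.0, 50)        # candidate nodule locations
prior = np.ones_like(positions) / len(positions)
candidate_stiffness = [0.2, 0.5, 1.0, 2.0]   # arbitrary probe stiffness settings
best = max(candidate_stiffness,
           key=lambda k: expected_information_gain(k, positions, prior))
print("stiffness with highest expected information gain:", best)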

In this project, we will therefore use carefully designed probing tasks, performed by both human participants and a soft robotic probe with controllable stiffness, to access measurable quantities such as muscle co-contraction and changes in speed and force, and to test several hypotheses about the role of internal impedance in perception and action. Finally, we will use a human-robot collaborative physical examination task to test the effectiveness of a new soft robotic probe with controllable stiffness, together with its stiffness and behaviour control algorithms. We will design and fabricate the novel soft robotic probe so that we can control the stiffness of its soft body, in which sensors will be embedded to obtain embodied haptic perception. We will also design and fabricate a novel soft abdomen phantom with controllable-stiffness internal organs to conduct palpation experiments. The innovation process for these two designs - the novel probe and the abdomen phantom - will be carried out in collaboration with three leading industrial partners in the respective areas. The new insights will make a paradigm shift in the way we design soft robots that share a controllable-stiffness embodiment for both perception and action, in applications such as remote medical intervention, robotic proxies for shopping, disaster response, games, museums, security screening, and manufacturing.
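By way of illustration only, the sketch below shows the kind of variable-impedance behaviour control such a probe might use (assumed, not the project's algorithm): a one-dimensional probe tip driven toward a reference indentation by a commanded virtual stiffness and damping, where the stiffness could be retuned online, for example from an information-gain criterion like the one above. All masses, gains, and the tissue model are hypothetical.

import numpy as np

def impedance_step(x, v, x_ref, K, D, f_ext, mass=0.1, dt=0.001):
    """One Euler step of a 1-D impedance-controlled probe tip."""
    f_cmd = K * (x_ref - x) - D * v      # virtual spring-damper toward the reference
    a = (f_cmd + f_ext) / mass           # f_ext: reaction force from the palpated tissue
    v = v + a * dt
    x = x + v * dt
    return x, v, f_cmd

# Example: probe commanded to indent 5 mm into tissue modelled as a linear spring.
x, v = 0.0, 0.0
tissue_stiffness = 300.0                 # hypothetical tissue stiffness (N/m)
for _ in range(2000):
    f_tissue = -tissue_stiffness * x if x > 0 else 0.0
    x, v, f = impedance_step(x, v, x_ref=0.005, K=800.0, D=20.0, f_ext=f_tissue)
print(f"steady-state indentation: {x*1000:.2f} mm, contact force: {tissue_stiffness*x:.2f} N")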
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: