EPSRC Reference: EP/V062506/1
Title: COHERENT: COllaborative HiErarchical Robotic ExplaNaTions
Principal Investigator: Coles, Dr AI
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Department: Informatics
Organisation: King's College London
Scheme: Standard Research - NR1
Starts: 01 April 2021
Ends: 31 March 2024
Value (£): 188,319
EPSRC Research Topic Classifications:
EPSRC Industrial Sector Classifications: No relevance to Underpinning Sectors

Related Grants:
Panel History:
Summary on Grant Application Form |
For robots to build trustworthy interactions with users, two capabilities will be crucial over the next decade: first, the ability to produce explainable decisions that combine reasons from all levels of the robotic architecture, from low to high; and second, the ability to communicate such decisions effectively and to re-plan in real time, during execution, in response to new user input.
COHERENT will develop a novel framework to combine explanations originating at the different robotic levels into a single explanation. This combination is not unique and may depend on several factors, including the current step in the action sequence and the temporal importance of each information source. Robotic tasks are distinctive in that they entail performing a sequence of actions, so the system must also be able to deliver these explanations during task execution, either because the user requests one or proactively when an unforeseen situation occurs. COHERENT will propose effective evaluation metrics tailored to the special case of explanations in HRI systems. The proposed measures, based on trustworthiness and acceptance, will be defined together with benchmark tasks that are repeatable and enable comparison of results across different explainable systems.
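The summary does not specify how this combination is computed. As a purely illustrative sketch, one could score candidate explanation fragments from each architectural level with context-dependent weights and merge the best-ranked ones; all names below (`ExplanationFragment`, `combine_explanations`, the weighting scheme) are hypothetical, not part of the proposal:

```python
from dataclasses import dataclass

# Hypothetical sketch: fragments produced by each architectural level
# (perception, motion, task planning, interaction) are scored by a
# level weight and a recency factor, then merged into one explanation.

@dataclass
class ExplanationFragment:
    level: str        # e.g. "perception", "task_planning"
    text: str         # human-readable reason for the decision
    timestamp: float  # when the underlying information was produced

def combine_explanations(fragments, step_index, level_weights, now):
    """Rank fragments by level weight and temporal relevance, then
    concatenate the top-ranked ones into a single explanation."""
    def score(f):
        recency = 1.0 / (1.0 + (now - f.timestamp))  # newer info weighs more
        return level_weights.get(f.level, 0.1) * recency
    ranked = sorted(fragments, key=score, reverse=True)
    reasons = "; ".join(f.text for f in ranked[:3])
    return f"At step {step_index}: {reasons}."

# Made-up usage: three fragments from three levels of the architecture.
frags = [
    ExplanationFragment("perception", "the garment corner was occluded", 9.0),
    ExplanationFragment("task_planning", "flattening must precede folding", 7.5),
    ExplanationFragment("motion", "a bi-manual grasp reduces slippage", 8.0),
]
weights = {"perception": 0.5, "task_planning": 0.3, "motion": 0.2}
print(combine_explanations(frags, step_index=2, level_weights=weights, now=10.0))
```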
We will demonstrate our framework for hierarchical explanation components through a manipulation task: assisting a human to fold clothes. Cloth manipulation is a very rich example, as it requires bi-manual manipulation, reasoning about environmental constraints, and perception of textiles for state estimation. The robot may even ask the user for help with difficult actions by providing relevant information, so the opportunities for interaction are numerous. We will build on previous results in cloth manipulation to develop explainable machine learning techniques spanning the perception, learned-movement, task planning and interaction layers, based on a novel generic representation, the Cohesion Graph, that is shared across the layers. The COHERENT framework will be integrated into the standard planning system ROSPlan to increase its visibility and adoption.
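The summary introduces the Cohesion Graph only at a conceptual level. A minimal sketch, assuming it is a graph whose nodes are tagged with the layer that produced them and whose edges let an explanation gather context across layers (the class and method names below are hypothetical):

```python
import collections

# Hypothetical sketch of a representation shared across layers: nodes
# carry the layer that produced them (perception, motion, task planning,
# interaction); edges link related information so an explanation of one
# decision can draw on every level.

class CohesionGraph:
    def __init__(self):
        self.nodes = {}                             # node_id -> (layer, payload)
        self.edges = collections.defaultdict(set)   # node_id -> linked node ids

    def add_node(self, node_id, layer, payload):
        self.nodes[node_id] = (layer, payload)

    def link(self, a, b):
        self.edges[a].add(b)
        self.edges[b].add(a)

    def trace(self, node_id):
        """Breadth-first walk collecting cross-layer context for one node,
        e.g. to assemble an explanation of a single planned action."""
        seen, queue, out = {node_id}, [node_id], []
        while queue:
            nid = queue.pop(0)
            layer, payload = self.nodes[nid]
            out.append((layer, payload))
            for nxt in self.edges[nid]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return out

# Made-up usage tying a planned action to the perception that supports it.
g = CohesionGraph()
g.add_node("grasp_corner", "task_planning", "the fold requires grasping a corner")
g.add_node("corner_pose", "perception", "corner detected at (0.41, 0.12)")
g.link("grasp_corner", "corner_pose")
print(g.trace("grasp_corner"))
```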
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:

Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Project URL:
Further Information:
Organisation Website: