
Details of Grant 

EPSRC Reference: EP/S033718/2
Title: HEAP: Human-Guided Learning and Benchmarking of Robotic Heap Sorting
Principal Investigator: Neumann, Professor G
Other Investigators:
Kucukyilmaz, Dr A
Researcher Co-Investigators:
Project Partners:
Department: School of Computer Science
Organisation: University of Nottingham
Scheme: Standard Research - NR1
Starts: 01 January 2020
Ends: 28 February 2022
Value (£): 403,985
EPSRC Research Topic Classifications:
Artificial Intelligence
Image & Vision Computing
Robotics & Autonomy
EPSRC Industrial Sector Classifications:
Information Technologies
Related Grants:
Panel History:  
Summary on Grant Application Form
This project will provide scientific advances in benchmarking, object recognition, manipulation and human-robot interaction. We focus on sorting a complex, unstructured heap of unknown objects -- resembling nuclear waste consisting of a set of broken, deformed bodies -- as an instance of an extremely complex manipulation task. The consortium aims to build an end-to-end benchmarking framework, which includes rigorous scientific methodology and experimental tools for application in realistic scenarios.

Benchmark scenarios will be developed with off-the-shelf manipulators and grippers, allowing us to create an affordable setup that can be easily reproduced both physically and in simulation. We will develop benchmark scenarios of varying complexity, i.e., grasping and pushing irregular objects, grasping selected objects from the heap, identifying all object instances and sorting the objects by placing them into corresponding bins. We will provide scanned CAD models of the objects that can be used for 3D printing in order to recreate our benchmark scenarios. Benchmarks with existing grasp planners and manipulation algorithms will be implemented as baseline controllers that are easily exchangeable via ROS.
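
As a rough illustration of what such an exchangeable baseline could look like in practice (a hedged sketch only: the topic names, message choices and the trivial planner stub below are assumptions for illustration, not part of the grant), a baseline grasp planner might be wrapped as a ROS node that consumes the heap point cloud and publishes a candidate grasp pose; swapping baselines then amounts to replacing the node behind the same topics.

    #!/usr/bin/env python
    # Hypothetical sketch: a baseline grasp planner wrapped as a ROS node.
    # Topic names ("/heap/points", "/heap/grasp_pose") and the dummy planner
    # logic are illustrative assumptions, not the project's actual interface.
    import rospy
    from sensor_msgs.msg import PointCloud2
    from geometry_msgs.msg import PoseStamped


    class BaselineGraspPlanner(object):
        def __init__(self):
            # Any alternative baseline only needs to speak the same two topics.
            self.grasp_pub = rospy.Publisher("/heap/grasp_pose", PoseStamped, queue_size=1)
            rospy.Subscriber("/heap/points", PointCloud2, self.on_cloud, queue_size=1)

        def on_cloud(self, cloud):
            # Placeholder: a real baseline would segment the heap and rank grasps.
            grasp = PoseStamped()
            grasp.header = cloud.header          # express the grasp in the sensor frame
            grasp.pose.position.z = 0.05         # dummy pre-grasp height above the heap
            grasp.pose.orientation.w = 1.0       # identity orientation as a stand-in
            self.grasp_pub.publish(grasp)


    if __name__ == "__main__":
        rospy.init_node("baseline_grasp_planner")
        BaselineGraspPlanner()
        rospy.spin()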

The ability of robots to fully autonomously handle dense clutter or a heap of unknown objects has been very limited due to challenges in scene understanding, grasping and decision making. Instead, we will rely on semi-autonomous approaches in which a human operator can interact with the system (e.g. via tele-operation, but not exclusively) and give high-level commands to complement the autonomous skill execution. The level of autonomy of our system will be adapted to the complexity of the situation. We will also benchmark our semi-autonomous task execution with different human operators and quantify the gap to the current state of the art in autonomous manipulation.

Building on our semi-autonomous control framework, we will develop a manipulation skill learning system that learns from demonstrations and corrections given by the human operator and can therefore learn complex manipulations in a data-efficient manner. To improve object recognition and segmentation in cluttered heaps, we will develop new perception algorithms and investigate interactive perception in order to improve the robot's understanding of the scene in terms of object instances, categories and properties.
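
One simple way to picture the adaptive autonomy described above is a shared-control arbiter that mixes the autonomous command with the operator's input according to the current autonomy level. The sketch below is purely illustrative: the linear blending rule, the Command type and the example values are assumptions, not the project's actual controller.

    # Hypothetical sketch of adjustable autonomy: blend an autonomous command with
    # an operator correction according to an autonomy level in [0, 1].
    # The blending rule and the chosen autonomy value are illustrative assumptions.
    from dataclasses import dataclass


    @dataclass
    class Command:
        x: float
        y: float
        z: float


    def blend(autonomous: Command, operator: Command, autonomy: float) -> Command:
        """autonomy=1.0 executes the autonomous command, 0.0 follows the operator."""
        a = max(0.0, min(1.0, autonomy))
        return Command(
            x=a * autonomous.x + (1 - a) * operator.x,
            y=a * autonomous.y + (1 - a) * operator.y,
            z=a * autonomous.z + (1 - a) * operator.z,
        )


    if __name__ == "__main__":
        # A more cluttered scene would lower the autonomy level, so the
        # operator's correction dominates the executed command.
        print(blend(Command(0.40, 0.10, 0.05), Command(0.35, 0.12, 0.05), autonomy=0.3))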

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.nottingham.ac.uk