
Details of Grant 

EPSRC Reference: EP/D506549/1
Title: Understanding Biological Motion using Moving Light Displays
Principal Investigator: Campbell, Dr N
Other Investigators:
Benton, Dr C; Gibson, Dr D; Scott-Samuel, Professor N
Researcher Co-Investigators:
Project Partners:
Department: Computer Science
Organisation: University of Bristol
Scheme: Standard Research (Pre-FEC)
Starts: 01 February 2006
Ends: 31 July 2009
Value (£): 354,778
EPSRC Research Topic Classifications:
Image & Vision Computing; Vision & Senses - ICT appl.
EPSRC Industrial Sector Classifications:
Creative Industries
Related Grants:
Panel History:  
Summary on Grant Application Form
The major inspiration for this work is that the automatic extraction and understanding of biological motion is largely an unsolved problem. Because of the complexity and variability of natural motions (and hence the image sequences that encode them), automatic computer-vision-based analysis of such data is under-developed. Current techniques rely on highly constrained data collection methods or intrusive physical interaction. Traditional motion capture relies on the placement of trackable locators (often affixed to lycra suits), which is often impractical and sometimes impossible. Apart from the expense and the need for a constrained filming environment, locators cannot easily be placed on wild or small creatures. The ability to automatically extract characteristic motion from existing film footage is desirable, since many applications could benefit from such information, including media database searching and understanding, accurate automated surveillance systems, medical gait analysis, computer animation and robotics.

We propose that a computer-vision-based approach to understanding the Moving Light Display phenomenon will improve, in a general sense, the interpretation of motions shown in image sequences of animated animals and humans. The MLD (also known as the Point Light Display, PLD) consists of a small set of point lights that describe an action or behaviour of a subject (walking or dancing, for example). A great deal of work has been carried out on the analysis of MLDs and the design of stimuli for psychophysical experiments. Psychologists have shown that high-level information can be extracted from the temporally varying point-light set, such that attributes including gait, personal identity, gender and even mood can be determined. It has also been shown that an average observer can cope with a significant amount of MLD perturbation, such as noise, occlusion, translation and rotation (in both 2D and 3D). This observation implies that a large amount of information can be extracted from a small set of moving point lights.

It is interesting to note that all the major advances in human tracking using computer vision over the last ten years have been based around the appearance (colour, texture, etc.) of the subject. This is not surprising: using a constrained or known background and making assumptions about coherent clothing, strong edges or skin colour, such approaches have led to some impressive algorithms and systems. One of the main claims here, however, is that in doing so the field at large has missed an important step. A small number of point lights representing an action, motion or emotion is easily perceived by humans, yet we still have almost no underlying theory as to why this should be so. Crucially, we have no 'manual' for how to transfer this human skill into an engineering solution for a range of important visual inspection tasks. This proposal aims to address this gap in our knowledge.
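As a purely illustrative aside (not taken from the grant itself), the sketch below shows one way a point-light display could be represented computationally: a toy 2D "walker" stored as an array of joint positions over time. The joint layout, the sinusoidal limb motion and all parameter values are assumptions made for demonstration only, not the project's actual stimuli, data or methods.

```python
# Illustrative sketch only: a toy point-light display (PLD) as an array of
# 2D joint positions over time. All joint positions and motion parameters
# are assumed values for demonstration.
import numpy as np

def toy_point_light_walker(n_frames=60, stride_hz=1.0, fps=30.0):
    """Return an array of shape (n_frames, 9, 2) for a crude 2D walker.

    Points: head, shoulders (2), hips (2), knees (2), ankles (2) = 9 points.
    Limb motion is modelled with simple anti-phase sinusoids.
    """
    t = np.arange(n_frames) / fps
    phase = 2 * np.pi * stride_hz * t                  # gait-cycle phase

    frames = np.zeros((n_frames, 9, 2))
    frames[:, 0] = [0.0, 1.7]                          # head
    frames[:, 1] = [-0.2, 1.4]                         # left shoulder
    frames[:, 2] = [0.2, 1.4]                          # right shoulder
    frames[:, 3] = [-0.1, 1.0]                         # left hip
    frames[:, 4] = [0.1, 1.0]                          # right hip

    swing = 0.25 * np.sin(phase)                       # leg swing offset
    frames[:, 5] = np.stack([-0.1 + swing, np.full_like(t, 0.5)], axis=1)         # left knee
    frames[:, 6] = np.stack([0.1 - swing, np.full_like(t, 0.5)], axis=1)          # right knee
    frames[:, 7] = np.stack([-0.1 + 1.5 * swing, np.full_like(t, 0.05)], axis=1)  # left ankle
    frames[:, 8] = np.stack([0.1 - 1.5 * swing, np.full_like(t, 0.05)], axis=1)   # right ankle
    return frames

if __name__ == "__main__":
    walker = toy_point_light_walker()
    print(walker.shape)   # (60, 9, 2): frames x points x (x, y)
```

A psychophysics stimulus generator would typically render each frame of such an array as bright dots on a dark background, while a tracking or recognition pipeline of the kind the proposal envisages would consume the same frames-by-points-by-coordinates structure as input.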
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.bris.ac.uk