EPSRC Reference: |
EP/G003017/1 |
Title: |
Complexity and Decidability in Unconventional Computational Models |
Principal Investigator: |
Coecke, Professor B |
Other Investigators: |
|
Researcher Co-Investigators: |
|
Project Partners: |
|
Department: |
Computer Science |
Organisation: |
University of Oxford |
Scheme: |
Standard Research |
Starts: |
01 October 2008 |
Ends: |
29 February 2012 |
Value (£): |
174,422 |
EPSRC Research Topic Classifications: |
Complexity Science |
Fundamentals of Computing |
New & Emerging Comp. Paradigms |
|
|
EPSRC Industrial Sector Classifications: |
No relevance to Underpinning Sectors |
|
|
Related Grants: |
|
Panel History: |
Panel Date | Panel Name | Outcome |
21 Apr 2008 | ICT Prioritisation Panel (April 2008) | Announced |
|
Summary on Grant Application Form |
While today's computing has been an amazing success story, there is a growing realization that it covers only a small region within the landscape of the computational potential offered to us by nature. The British computer science community strongly acknowledges this fact: the UK Computing Research Committee, in its recent Grand Challenges Exercise, put "Journeys in non-classical computation" forward as one of the key milestones for advances in knowledge and technology. However, once we enter the domain of unconventional computational paradigms, we find ourselves in largely unexplored territory where the well-established methods and standards of Turing computing may become inapplicable or insufficient.

An example that nicely illustrates this is Blakey's factorizing algorithm, which has constant space and time complexities and is to be implemented on an unconventional physical computing device. At first sight, this constant complexity seems to challenge the standard cryptographic schemes that are based on the hardness of factorization, a problem for which no known classical algorithm runs in polynomial time. The whole endeavour towards quantum computing would likewise be at stake, since it is strongly motivated by the fact that the run-time of Shor's quantum factorizing algorithm, implemented on a quantum computer, grows only polynomially. Of course there is a catch: the 'precision' with which Blakey's system must be initialized and measured increases as an exponential function of the size of the number being factorized. That Blakey's system enjoys time and space complexities that are both constant in the size of the input serves not to highlight the power of the method but to expose the fact that traditional complexity theory seems not to address all relevant resources.

Motivated by this example, and others, we aim to address the following questions. For a given model of computation, what are the relevant and required complexity measures to quantify the hardness of problems and the cost (with respect to various resources) of solution methods? How can the complexity of computing devices, possibly from differing computational models and possibly with respect to differing complexity measures, be meaningfully and consistently compared? Is there a general, abstract, compositional theory for adjoining additional resources that affect the complexity of performing certain tasks in various computational models?

The distinct computational models we wish to consider each have their own solution methods for given problems. To compare their respective efficiency in solving a particular problem (factorization, say), we need a general, model-independent scheme for assigning measures of complexity; this scheme should be sufficiently flexible to account for a variety of resources that have an impact on this complexity. Steps in this direction have already been taken by Blakey, in particular with regard to precision. There is also a variety of related work, usually addressing specific problems and computational models, which will be of use. We want in particular to test and apply this framework in areas where we have great expertise, including real-time and probabilistic automata, unconventional quantum computational architectures, and ground-state computing devices, among others. From a foundational viewpoint, an ultimate aim is to shed light on whether complexity is inherent in problems rather than in solution methods (regardless of whether the latter are physical or algorithmic).
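The toy Python sketch below is not part of the grant; it merely illustrates, under assumed resource profiles, the kind of model-independent accounting the summary argues for. The function names and the exponential growth rates are hypothetical placeholders: the point is only that once precision is tallied alongside time and space, Blakey's "constant complexity" device and an exponential-time classical method can both show an exponentially growing dominant cost.

    # Hypothetical sketch (not from the grant): a toy resource ledger in
    # which each computational model reports its cost per resource, and a
    # crude aggregate lets profiles from different models be compared.

    def resource_profile_blakey(n_bits):
        # Assumed profile for Blakey's analogue factorizer: constant time
        # and space, but precision (bits of accuracy needed to set up and
        # measure the device) growing exponentially in the input size.
        # The growth rate 2**n_bits is illustrative, not derived.
        return {"time": 1, "space": 1, "precision": 2 ** n_bits}

    def resource_profile_trial_division(n_bits):
        # Assumed profile for classical trial division: up to about
        # sqrt(N) ~ 2**(n_bits/2) candidate divisors are tried, with
        # constant precision (exact digital arithmetic).
        return {"time": 2 ** (n_bits // 2), "space": 1, "precision": 1}

    def dominant_cost(profile):
        # Crude aggregate: the most expensive resource dominates, so a
        # device counts as efficient only if every resource is cheap.
        return max(profile.values())

    if __name__ == "__main__":
        for n in (8, 16, 32):
            b = dominant_cost(resource_profile_blakey(n))
            t = dominant_cost(resource_profile_trial_division(n))
            print(f"n = {n:2d} bits: Blakey ~{b:.3g}, trial division ~{t:.3g}")

Under this (assumed) accounting, neither method escapes exponential cost in the bit-length of the input; they merely pay it in different currencies, which is precisely why a flexible, multi-resource complexity scheme is needed to compare them meaningfully.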
|
Key Findings |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Potential use in non-academic contexts |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Impacts |
Description |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk |
Summary |
|
Date Materialised |
|
|
Sectors submitted by the Researcher |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Project URL: |
|
Further Information: |
|
Organisation Website: |
http://www.ox.ac.uk |