EPSRC Reference: EP/G012261/1
Title: WG 2008: 34th International Workshop on Graph-Theoretic Concepts in Computer Science
Principal Investigator: Broersma, Professor H
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Department: Computer Science
Organisation: Durham, University of
Scheme: Standard Research
Starts: 30 June 2008
Ends: 29 December 2008
Value (£): 19,847
EPSRC Research Topic Classifications: Fundamentals of Computing
EPSRC Industrial Sector Classifications: No relevance to Underpinning Sectors
Related Grants:
Panel History:
Summary on Grant Application Form |
The annual international workshops on Graph-Theoretic Concepts in Computer Science, known within the community as WG, have a long tradition of bringing together researchers from (theoretical) computer science and discrete mathematics who are working on graph-theoretic concepts and their use in computer science.

Graphs in this context are abstract mathematical structures used to model pairwise relations between objects from a certain collection. A graph represents a collection of vertices and a collection of edges that join certain pairs of vertices. A graph may be undirected, meaning that the relations between pairs of vertices are symmetric, or its edges may be directed from one vertex to another; and there are many other variations in the types of graphs that are commonly considered. For instance, a graph structure can be extended by assigning a weight to each (directed) edge of the graph. Graphs with weights, or weighted graphs, are used to represent structures in which pairwise connections have properties that can be captured by numerical values. For example, if a graph represents a computer network, the weight of an edge could represent the capacity of the corresponding link. In fact, in the context of graph theory a directed graph with weighted edges is also often called a network.

Because the objects represented by the vertices of a graph can come from any plausible collection, structures that can be represented as graphs are ubiquitous, and therefore many problems of practical interest can be, and have been, modelled as graph problems. The number of concepts that can be defined on graphs is also very large, and many such concepts give rise to deep problems or famous conjectures (for instance the notorious Four Colour Problem). In fact, many of these concepts and theoretical questions arise from practical problems (and not just from the mathematicians' imagination) and from the urge to solve real-life problems modelled by graphs. Moreover, as these models often involve very large graphs and cannot be solved by hand, researchers in algorithmic graph theory try (if possible) to find efficient algorithms for solving these problems.

Within computer science the most obvious graph-theoretic model is that of a computer network, in which vertices represent computers and edges (or directed edges) represent bidirectional (or unidirectional) links between pairs of computers. This model and its variants have led to the development and application of many graph-theoretic concepts and graph algorithms, e.g. for routing, crawling and clustering, but also for structural measures of vulnerability and other performance measures. More recent applications can be found in the rich and popular areas of sensor networks, ad-hoc networks and dynamic networks, where physical links as well as wireless (and often temporary) connections can be modelled as edges, yielding graphs that change constantly over time. Another application of graph-theoretic concepts within computer science is related to the internet, in which the structure of websites can be represented by a directed graph: the vertices are the web pages available on the internet, and a directed edge from a page A to a page B exists if and only if A contains a hyperlink to B. The development of algorithms to handle (large) graphs is therefore of major interest in computer science. Studying web graphs gives insight into algorithms for crawling, searching or ranking web pages.
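To make the two models above concrete, the following is a minimal Python sketch (not part of the grant description) of a weighted network whose edge weights stand for link capacities, and of a directed web graph whose edge A -> B records that page A hyperlinks to page B. The machine names, page names and capacity values are illustrative assumptions only.

from collections import defaultdict

# Weighted undirected network: vertices are machines, edge weights are link capacities.
capacity = defaultdict(dict)

def add_link(u, v, cap):
    """Record an undirected link of capacity cap between machines u and v."""
    capacity[u][v] = cap
    capacity[v][u] = cap

add_link("router", "server", 10.0)
add_link("router", "laptop", 1.0)

# Directed web graph as an adjacency list: page -> pages it hyperlinks to.
web_graph = {
    "A": ["B", "C"],  # page A contains hyperlinks to pages B and C
    "B": ["C"],
    "C": ["A"],
}

print(capacity["router"])  # {'server': 10.0, 'laptop': 1.0}
print(web_graph["A"])      # ['B', 'C']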
Here is another example of an interesting connection between a very applied area of everyday life and a central issue in graph theory. In spectral graph theory, researchers try to understand, estimate and find eigenvectors and eigenvalues of graphs. The study of random walks on graphs was one of the first applications of spectral graph theory. A more recent application, which almost all of us use in our daily lives, is Google's PageRank algorithm, which is based on such random walks on the web graph.
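As a concrete illustration of that random-walk connection, the sketch below is one standard formulation of PageRank-style scoring by power iteration on a tiny web graph; it is an assumption for illustration, not taken from the grant text, and the toy graph, damping factor and iteration count are arbitrary choices. The damping factor d models a surfer who follows a random hyperlink with probability d and jumps to a uniformly random page otherwise.

def pagerank(graph, d=0.85, iterations=50):
    """Power iteration for PageRank-style scores on a directed graph
    given as {page: [pages it links to]}."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}                 # start from the uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - d) / n for p in pages}   # random-jump contribution
        for p in pages:
            links = graph[p]
            if links:
                share = d * rank[p] / len(links)       # follow one of p's out-links
                for q in links:
                    new_rank[q] += share
            else:
                for q in pages:                        # dangling page: jump uniformly
                    new_rank[q] += d * rank[p] / n
        rank = new_rank
    return rank

web_graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(web_graph))  # C, the most linked-to page, gets the largest score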
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:

Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Project URL:
Further Information:
Organisation Website: