Work Area: Algorithms for Parallelism and Efficiency
Keywords: randomised algorithms, randomised complexity of computational problems, derandomising algorithms, theory of computation
Start Date: 24 July 92 / Duration: 36 months / Status: running
[ participants / contact ]
Abstract: RAND addresses fundamental issues in the design of efficient randomised algorithms, as well as fundamental complexity questions of randomised computation: time and space efficiency, computation with limited randomness resources, and the deterministic simulation of randomised computation.
In recent years, randomised algorithms and randomised complexity theory have become major subjects of study in the design of computer algorithms and in the theory of computation. For some computational problems, it now appears that randomised or pseudo-randomised algorithms are more efficient than deterministic ones (in terms of hardware size, running time, circuit depth, transparent descriptions, etc). Recent and very striking examples are the new randomised approximation algorithms for counting perfect matchings in certain classes of graphs, for some enumeration problems over Boolean algebras and finite fields, and for estimating the volume of convex bodies. Solutions to these problems have applications ranging from circuit design and coding theory to statistical mechanics and quantum field theory.
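As a toy illustration of randomised counting of this flavour (a classical textbook estimator, not one of the project's own algorithms), the Godsil-Gutman estimator approximates the permanent of a 0/1 matrix, which for a bipartite graph's biadjacency matrix equals its number of perfect matchings: replacing each 1-entry by a random +/-1 sign and squaring the determinant gives an unbiased estimate. A minimal sketch, with all function names and the sample sizes chosen for this example:

```python
import itertools
import random

def permanent(a):
    """Exact permanent of a 0/1 matrix by brute force; for a bipartite
    graph's biadjacency matrix this counts its perfect matchings."""
    n = len(a)
    return sum(
        all(a[i][p[i]] for i in range(n))
        for p in itertools.permutations(range(n))
    )

def det(m):
    """Determinant by cofactor expansion (fine for small matrices)."""
    if len(m) == 1:
        return m[0][0]
    return sum(
        (-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
        for j in range(len(m[0]))
    )

def godsil_gutman(a, trials, rng):
    """Unbiased randomised estimate of perm(a): flip each 1-entry to a
    random +/-1 sign and average det(B)**2, since E[det(B)**2] = perm(a)."""
    n = len(a)
    total = 0
    for _ in range(trials):
        b = [[a[i][j] * rng.choice((-1, 1)) for j in range(n)]
             for i in range(n)]
        total += det(b) ** 2
    return total / trials

# K_{3,3} has 3! = 6 perfect matchings.
a = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
print(permanent(a))                               # exact count: 6
print(godsil_gutman(a, 20000, random.Random(0)))  # estimate close to 6
```

The estimator is unbiased but can have high variance; the project-era approximation schemes for counting matchings (e.g. Markov-chain Monte Carlo methods) were developed precisely to obtain provably good estimates in polynomial time.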
In this new and fast-growing area we are confronted with a number of very fundamental questions to which the classical theory of computation offers few answers: how much randomness is necessary to achieve a given level of efficiency in an algorithm, and how efficiently certain classes of randomised algorithms can be simulated deterministically.
The Group will address the above-mentioned problems and concentrate on: the design of efficient (both sequential and parallel) randomised algorithms for selected combinatorial, algebraic and geometric problems; foundations of the randomised complexity of computational problems; randomised approximation problems; computation with limited randomness resources (de-randomisation methods); and computational learning theory.
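A standard tool behind computation with limited randomness, offered here only as a generic illustration of the de-randomisation theme, is a pairwise-independent sample: the values X_i = (a*i + b) mod p, for random field elements a and b, are uniform and pairwise independent, so two random field elements stand in for n fully independent ones. A minimal sketch (the prime p = 101 and all names are choices made for this example):

```python
import random
from collections import Counter

P = 101  # a small prime, so arithmetic below is over the field Z_p

def pairwise_independent_sample(n, rng):
    """Produce n pairwise-independent, uniform values in Z_p from just
    two random field elements: X_i = (a*i + b) mod p. A fully independent
    sample would need n random elements; two suffice here, which is the
    kind of randomness saving de-randomisation methods exploit."""
    a = rng.randrange(P)
    b = rng.randrange(P)
    return [(a * i + b) % P for i in range(n)]

sample = pairwise_independent_sample(10, random.Random(0))

# Sanity check of pairwise uniformity for the pair (X_0, X_1) = (b, a + b):
# over all P**2 seeds (a, b), every value pair occurs exactly once, i.e.
# (X_0, X_1) is exactly uniform on Z_p x Z_p.
counts = Counter()
for a in range(P):
    for b in range(P):
        counts[(b % P, (a + b) % P)] += 1
print(len(counts) == P * P and set(counts.values()) == {1})  # True
```

Because many randomised algorithms only need pairwise (rather than full) independence in their analysis, the small seed space can then be searched exhaustively, turning the randomised algorithm into a deterministic one.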
A workshop on "Randomised Algorithms" was held in March '93 in Bonn and another on "Randomness and Computation" (partially supported by RAND) in July '93 in Edinburgh, together with a number of specialised research seminars. Exchanges of research visits between participating sites and other leading research centres have taken place, as have exchanges of postdoctoral students.
Substantial progress has been made in all the main research areas of RAND, especially in the design of efficient randomised algorithms, randomised approximation algorithms, foundations of randomised complexity theory, derandomising algorithms, and computational learning theory.
RAND's research should shed some light on the relative power of randomisation as a computational resource in the design of efficient (both sequential and parallel) algorithms, and should facilitate the design of efficient pseudo-random generators for selected algorithmic applications.
The proceedings of the RAND workshop have been published in the Research Report Series of the Department of Computer Science, University of Bonn, ISSN 0177-3623, as has the Annotated RAND Bibliography. The research papers have been published in conference proceedings and refereed international journals.
Universität Bonn - D
Institut für Informatik
D - 53117 BONN
CNRS-LRI-Université de Paris Sud - F
University of Lund - S
University of Oxford - UK
University of Edinburgh - UK
University of Leeds - UK
Prof. M. Karpinski
tel +49/228 550 224
fax +49/228 550 440
RAND - 7097, August 1994
please address enquiries to the ESPRIT Information Desk
HTML version of synopsis by Nick Cook