Work Area: Neural Networks
Keywords: neural networks, computational learning, learning algorithms and complexity
Start Date: to be announced / Status: starting
[ participants / contact ]
Abstract: The Working Group will address the key questions of the computational power of networks and the complexity of learning within a rigorous framework. Analysis of computational power provides a semantics that will give insight into which networks should be used to solve a particular problem, while computational learning furnishes the underlying theory required for applying the technology. Although the focus is on neural networks, the analysis of learning will also be applied to the inference of structure from data in constraint networks, Bayesian networks and recurrent networks.
The aim of NEUROCOLT is to develop a fundamental understanding of learning and of when and how it can be implemented algorithmically. Particular classes of adaptive systems include neural networks with discrete and continuous activations, genetic algorithms and other paradigms.
The group's activities will be divided into three areas. Foundations of Learning will investigate the frameworks within which a study of learning can be undertaken. There are already a number of different settings that have been analysed. Of particular interest will be representational issues, the influence of different probability distributions governing data occurrence, learning via queries, as well as relations between the different frameworks. The understanding of learning already developed has revealed the crucial role of finding compact descriptions of the data. This Minimum Description Length (MDL) principle will be studied in connection with genetic programming, as will Bayesian and stochastic methods for structural induction. In the area of Algorithmics, the emphasis will be on neural learning for continuous and other networks, also within the framework of agnostic on-line learning. In addition, the algorithmic implications of MDL and genetic programming, as well as the principled analysis of Bayesian and Constraint Networks, will be studied. Neural Networks and Continuous Complexity Theory will study neural network representational power, with the focus again on neural networks with real activations, their behaviour as recurrent networks, and their analysis through the development of a theory of complexity over the reals. This theory will also be applied to other analog models of computation.
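The MDL principle mentioned above can be illustrated with a minimal two-part code sketch: the total description length of a hypothesis is the number of bits needed to encode the model plus the bits needed to encode the data given the model, and the hypothesis minimising this sum is preferred. The function names, the toy data and the crude 8-bits-per-parameter model cost below are all hypothetical choices for illustration, not part of the Working Group's programme.

```python
import math

def description_length(residuals, n_params, bits_per_param=8):
    """Two-part MDL code length in bits: model cost plus data cost.

    Data cost uses the Shannon code length of the residuals under a
    Gaussian noise model; bits_per_param is a crude, fixed-precision
    cost per real-valued parameter (an illustrative assumption).
    """
    n = len(residuals)
    rss = sum(r * r for r in residuals)
    variance = max(rss / n, 1e-12)
    data_bits = 0.5 * n * math.log2(2 * math.pi * math.e * variance)
    model_bits = n_params * bits_per_param
    return model_bits + data_bits

# Toy data, roughly y = 2x with small noise.
xs = [0, 1, 2, 3, 4, 5]
ys = [0.1, 1.9, 4.1, 5.9, 8.1, 9.9]

# Model A: encode only the mean of y (1 parameter).
mean_y = sum(ys) / len(ys)
residuals_mean = [y - mean_y for y in ys]

# Model B: encode the line y = 2x (here taken as 2 parameters:
# slope and intercept).
residuals_line = [y - 2.0 * x for x, y in zip(xs, ys)]

dl_mean = description_length(residuals_mean, n_params=1)
dl_line = description_length(residuals_line, n_params=2)

# The line pays more for its parameters but compresses the data far
# better, so its total code length is shorter: MDL selects the line.
print(dl_line < dl_mean)
```

The design choice here is the classical trade-off MDL formalises: a richer model costs more bits to state but may save many more bits in describing the data, and learning amounts to finding the most compressive balance.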
These three work areas will bring together the wide range of research being undertaken within the Working Group. The main interactions will take place via group visits to collaborating sites focussing on a particular topic. There will also be a yearly week-long meeting, when work area groups and all project sites will share results and plan future activities. In addition, a Working Group Technical Report Series will be initiated and made publicly available via remote ftp. The group will organise one international conference in the area of Computational Learning Theory and Continuous Complexity.
In the short term the group's work will provide fundamental insights into the principles involved in developing adaptive learning systems, both in terms of the algorithmic implications of the adaptation process and in the design of systems with sufficient capabilities for particular tasks. These insights will allow the development of concrete applications in a wide range of domains, including those where analog values are involved.
University of London - UK
Royal Holloway and Bedford New College
UK - Egham TW20 0EX (Surrey)
Institut für Grundlagen der Informationsverarbeitung - A
University of Mons-Hainaut - B
RWTH Aachen - D
Universitat Pompeu Fabra - E
ENS-LYON - F
Universita degli Studi di Milano - I
CWI - NL
University of Helsinki - SF
London School of Economics - UK
Dr. John Shawe-Taylor
tel +44/784 443430
fax +44/ 784 443420
NEUROCOLT - 8556, August 1994
please address enquiries to the ESPRIT Information Desk
html version of synopsis by Nick Cook