What can and what cannot be computed, now or in the future? We are extending the classical theory of digital computation, broadening the conception of computation to cover any form of discrete or continuous data, any physical form of data representation, and any physical means of computation.
Theory of data and data-centric computing
Data are everywhere and data sets are vast. Data are collected and generated in the scientific analysis of the natural world; in the functioning of machines and social organizations; in the movements of sportspeople; in the files of the police and secret services, credit agencies and hospitals; in the customer records of supermarkets; in the packets and pages distributed across the internet and the phone networks; in the video of our streets from CCTV cameras and satellites; and in the discourse of our social media. Data are a commodity. In data-centric computation the primary object of scientific study is data.
At Swansea, research is underway on data and on developing the new paradigm of data-centric computation. Typical questions are: How do you collect data through measurement processes? How do you specify and represent data, both mathematically and physically? How do you compute and communicate with data? How do you elicit knowledge and information from data? How do you visualize data? Can the diverse types of data, with their distinct applications, be unified conceptually and practically? Surprisingly, data in its digital and analogue forms can be unified by a general mathematical theory of data based on algebra and logic. It is a theory with profound implications, useful software tools and many applications.
Computing with continuous and analogue data
The first obstacle when taking the step from classical computability over discrete data to computability over continuous data is the problem of how to represent the data. Continuous data can, in general, only be approximated. Thus, a successful theory of computation over continuous data must include a theory of approximations and a theory of representations of successively approximated data. Dr Jens Blanck has made significant foundational contributions to this theory. Using domain theory as a general theory of data approximation, he has investigated deeply the nature of representations and of reductions between them, giving insight into which operations are computable with respect to a given representation of the data.
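The idea of computing with successively approximated data can be made concrete with a small sketch (a toy illustration, not the domain-theoretic machinery itself). Assume a real number is represented by a function that, given a precision parameter n, returns a rational interval of width at most 2**-n containing it; an operation such as addition is then computable with respect to this representation by querying its arguments at a slightly higher precision:

```python
from fractions import Fraction

def sqrt2(n):
    """Interval representation of sqrt(2): a rational interval of
    width at most 2**-n containing it, found by bisection."""
    lo, hi = Fraction(1), Fraction(2)
    while hi - lo > Fraction(1, 2**n):
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return lo, hi

def add(x, y):
    """Addition is computable w.r.t. this representation: to get the
    sum to precision n, query each argument at precision n + 1."""
    def sum_rep(n):
        (a, b), (c, d) = x(n + 1), y(n + 1)
        return a + c, b + d
    return sum_rep

two_sqrt2 = add(sqrt2, sqrt2)
lo, hi = two_sqrt2(20)  # interval of width <= 2**-20 around 2*sqrt(2)
```

Which functions come out computable depends on the representation chosen; reductions between representations, of the kind Blanck studies, compare the computational strength of such choices.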
Another problem under consideration is the unification of computational theories, methods and tools for computing with analogue and digital data. Professors John Tucker and Jeff I Zucker (McMaster University, Canada) have developed an extensive and general theory of computable functions on algebras, which model any kind of data. Continuous data types such as the real numbers, waveforms and signals, spectra, infinite data streams, state spaces, function spaces, etc., can be modelled using topological algebras. The aim is to create a comprehensive computability theory for functions on topological algebras, from which exact methods for specifying, computing with and reasoning about real numbers can be derived.
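One of the continuous data types mentioned, the infinite data stream, can be illustrated by lazily evaluated sequences: a stream is never materialised in full, yet operations on streams remain computable because any finite observation needs only finitely many elements. The sketch below is only a programming caricature of the idea, not the topological-algebra formalism itself, and the operation names are invented for illustration:

```python
from itertools import islice, count

def stream_map(f, s):
    """Lift a function on elements to a function on infinite streams."""
    for x in s:
        yield f(x)

def stream_add(s, t):
    """Pointwise addition: one operation of a stream algebra over numbers."""
    for a, b in zip(s, t):
        yield a + b

# The infinite stream of squares, observed at finitely many points.
squares = stream_map(lambda n: n * n, count(0))
first_five = list(islice(squares, 5))

# Doubling the naturals by adding the stream of naturals to itself.
doubles = stream_add(count(0), count(0))
first_four = list(islice(doubles, 4))
```

The key point mirrors the theory: functions on streams are computable exactly when every finite part of the output depends only on a finite part of the input, which is a continuity condition in the topological-algebra setting.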
A third problem is: What are the technological limits of analogue computation? The team is working toward a theory of computing with analogue fields and has created a completely general network model of analogue computing systems distributed in space and operating in continuous time with data from a metric space. The model allows researchers to analyse analogue computation as an experimental process, develop case studies of mechanical and electronic systems, write equational specifications of analogue networks, and prove theorems about the existence, uniqueness and computability of solutions of well-posed analogue networks.
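As a crude discrete-time caricature of such a network (the actual model works in continuous time over a metric space; the parameters here are invented for illustration), consider an integrator module whose input is its own output fed back through a negative scaler, specified by the equation x'(t) = -k x(t):

```python
def simulate(k=1.0, x0=1.0, dt=1e-3, t_end=1.0):
    """Euler discretisation of a two-module feedback network:
    an integrator driven by its own output through a scaler -k,
    i.e. the equational specification x'(t) = -k * x(t)."""
    x, t = x0, 0.0
    while t < t_end:
        x += dt * (-k * x)  # integrator step on the fed-back, scaled signal
        t += dt
    return x

x_end = simulate()  # approximates the exact solution exp(-1) at t = 1
```

Well-posedness, meaning a unique solution depending continuously on the data, is exactly what the theorems about analogue networks address; for a well-posed network a discretisation like this converges to that unique solution as dt shrinks.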
Computability and physical systems
In the search for new physical foundations for computation and information processing, Dr Edwin Beggs (Mathematics Department) and Professor John Tucker are developing a theory of functions computable by experiments with physical systems. A fundamental problem is to identify what exactly an experimental procedure on analogue data is, how it can be used to compute, and how it compares with algorithmic procedures based on digital data. A methodology for tackling such questions has been developed. Examples of exotic low-dimensional mechanical systems have been created that can compute all functions and all real numbers. These results lead to an analysis of the construction and observation of physical systems in experiments, and to the use of new types of programming languages to express experimental procedures.
Building upon these studies of physical machines, with Professor Felix Costa (Lisbon), they have developed a theory of combining physical systems with algorithms. They add physical experiments to algorithms as oracles, generalising an idea of Alan Turing. In a large programme of research they have studied several experiments - mechanical, optical, electrical, atomic - and classified the boost in computational power the systems produce. Remarkably, they have found that (i) under the practical constraint of polynomial computation time, a wide range of experiments lead to the same computational class, known as P/log*; and (ii) this class shows that such systems routinely break the Turing Barrier, a theoretical limit on the power of digital computers discovered in 1936. This implies that new technologies may exist that go beyond those we know at present. The theory can also be used to investigate the nature of physical measurements. Instead of thinking of the system as boosting the algorithm, the algorithm can be thought of as controlling the system. New conceptions of the measurement process arise, showing that uncertainty is common and inherent in even simple experiments.
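A toy sketch conveys the oracle idea (this is an illustration in the spirit of the team's experiments, with an invented constant standing in for a physical quantity, not their actual apparatus): the experiment hides a real number y, each run answers one comparison query, and binary search driven by oracle calls reads off the bits of y. If y happens to be non-computable, the physical oracle supplies the algorithm with information no Turing machine could generate on its own:

```python
HIDDEN_Y = 0.6180339887  # stand-in for an arbitrary (possibly non-computable) constant

def experiment(test):
    """One run of the experiment: a single bit of physical information,
    answering whether the test value lies below the hidden quantity."""
    return test < HIDDEN_Y

def read_bits(n):
    """Binary search driven by n oracle calls: the first n bits of y."""
    lo, hi = 0.0, 1.0
    bits = []
    for _ in range(n):
        mid = (lo + hi) / 2
        if experiment(mid):
            bits.append(1)
            lo = mid
        else:
            bits.append(0)
            hi = mid
    return bits

prefix = read_bits(8)  # first 8 binary digits of HIDDEN_Y
```

On real equipment each query is answered only up to finite precision and in bounded time, which is one source of the inherent uncertainty mentioned above and of the polynomial-time constraint under which the class P/log* arises.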