![](pics/logo.gif)
MFCS'98 Tutorials
(August 23 and 29, 1998)
![](pics/fi-hr.gif)
August 23
Abstract state machines.
By E. Borger (Pisa) and Yu. Gurevich (Ann Arbor)
Abstract:
An abstract state machine (ASM) is a modern computation model: you no
longer have to encode your structures as strings. ASMs simulate abstract
algorithms and practical computer systems step by step at their natural
abstraction level (rather than implementing the algorithms at lower
abstraction levels).
It turns out that having a clean mathematical model goes a long way,
and ASMs have been used successfully to specify real-life programming
languages (e.g. Prolog, C++, VHDL, Java) and architectures (e.g. PVM,
Transputer, DLX, APE100, the Java VM), to validate standard language
implementations (e.g. of Prolog on the WAM, of Occam on the
Transputer), and to verify various distributed and real-time protocols
(e.g. Kermit, Kerberos). They have also been used in theoretical
computer science.
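To give a concrete flavor, here is a minimal sketch in Python of the basic
execution model (my own illustration, not the authors' notation): a state
maps locations to values, a program is a set of guarded update rules, and
one step fires all enabled rules and applies their updates simultaneously.

```python
# Minimal ASM-style interpreter sketch (illustration only; real ASM
# semantics additionally requires the collected updates to be consistent).

def asm_step(state, rules):
    """Fire every rule whose guard holds; apply all updates simultaneously."""
    updates = {}
    for guard, update in rules:
        if guard(state):
            updates.update(update(state))   # collect updates of enabled rules
    new_state = dict(state)
    new_state.update(updates)               # apply them in one atomic step
    return new_state

# Example: Euclid's algorithm described at its natural abstraction level.
gcd_rules = [
    (lambda s: s["a"] > s["b"], lambda s: {"a": s["a"] - s["b"]}),
    (lambda s: s["b"] > s["a"], lambda s: {"b": s["b"] - s["a"]}),
]

state = {"a": 12, "b": 18}
while state["a"] != state["b"]:
    state = asm_step(state, gcd_rules)
print(state["a"])   # prints 6, the gcd
```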
Plenty of information about all that can be found at the ASM home
pages
http://www.eecs.umich.edu/gasm/ and
http://www.uni-paderborn.de/cs/asm
In the tutorial we will explain the theory of ASMs (Yuri Gurevich) and
survey ASM applications (Egon Boerger).
![](pics/fi-hr.gif)
August 23
The Theorema system: An introduction with demos.
By B. Buchberger (RISC-Linz) and T. Jebelean (RISC-Linz)
Abstract:
The Theorema Project aims at integrating proving support into computer
algebra systems. The emphasis is on proof generation for routine parts of
proofs, structured proof presentation in natural language, and smooth
interaction with the existing solving and computing facilities of computer
algebra systems. Our present system frame is Mathematica 3.0.
We will first give an overview of the Theorema Project and then present
more details about the following aspects of the Theorema system:
- A predicate logic prover that imitates the proof style of humans (in
particular the proof style of the authors).
- A couple of special provers, currently being developed, for various
"areas" of mathematics (at present, for equalities over natural numbers,
lists, and polynomials), where an "area" is defined by a functor that
generates the domains in the area. The main tool is simplification
together with setting up the induction recursively over the universally
quantified variables.
- Automatic generation of knowledge bases by using the information
contained in the functors (a toy sketch of this idea is given below).
- Structured proof presentation in (technical) natural language by using the
nested cells feature of Mathematica.
- Theory generation versus theorem proving.
- Special proof techniques for proofs in analysis: The interplay between
quantifier proving and rewriting.
The tutorial will be accompanied by demos.
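To illustrate the "functor" idea mentioned above, here is a toy sketch in
Python (hypothetical names, not the actual Theorema/Mathematica syntax):
a functor takes a base domain and returns a new domain together with its
defining equations, which can then serve as an automatically generated
knowledge base for the induction prover.

```python
# Toy sketch of a domain-generating functor (illustration only).

from dataclasses import dataclass

@dataclass
class Domain:
    name: str
    operations: dict       # operation name -> executable implementation
    knowledge_base: list   # defining equations, usable as prover input

def lists_over(base: Domain) -> Domain:
    """Functor: from any base domain, build the domain of lists over it."""
    ops = {
        "nil": lambda: (),
        "cons": lambda x, xs: (x,) + xs,
        "append": lambda xs, ys: xs + ys,
    }
    kb = [  # defining equations, generated uniformly by the functor
        "append(nil, ys) = ys",
        "append(cons(x, xs), ys) = cons(x, append(xs, ys))",
    ]
    return Domain(f"Lists({base.name})", ops, kb)

nat = Domain("Nat",
             {"zero": lambda: 0, "succ": lambda n: n + 1},
             ["plus(zero, m) = m", "plus(succ(n), m) = succ(plus(n, m))"])

nat_lists = lists_over(nat)
print(nat_lists.name)                       # Lists(Nat)
print(*nat_lists.knowledge_base, sep="\n")  # the generated equations
```

An induction prover can then set up the induction over the universally
quantified list variables, using the generated equations as rewrite rules.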
![](pics/fi-hr.gif)
August 29
Approximation algorithms.
By J. Diaz (Rome) and A. Marchetti-Spaccamela (Rome).
- Approximation-preserving reductions and non-approximability
(by P. Crescenzi)
Comparing the complexity of different combinatorial optimization
problems has been an extremely active research area during the last 23
years. This has led to the definition of several approximation-preserving
reducibilities and to the development of powerful reduction techniques.
We first review the main approximation-preserving reducibilities that
have appeared in the literature and suggest which of them should be
used. We then give some hints on how to prove new non-approximability
results, emphasizing the most interesting techniques among those
developed in the last few years.
- Parallel approximation algorithms
(by J. Diaz)
Many problems in computer science are too hard to solve exactly within
realistic resource bounds and therefore have to be approximated. The
course presents a series of paradigms for approximating such problems
in parallel (in NC or RNC), among them: extremal graph properties,
rounding, interval partitioning and separation, primal-dual methods,
graph approximation, and non-approximability.
Reference: J. Diaz, M. J. Serna, P. Spirakis, J. Toran: Paradigms for
Fast Parallel Approximability. Cambridge International Series on
Parallel Computation, Cambridge University Press.
- On-line approximation algorithms for combinatorial optimization
problems
(by A. Marchetti-Spaccamela)
Recently, considerable attention has been devoted in the literature to
on-line problems, in which the input is disclosed one piece at a time.
We will concentrate on on-line algorithms for well-known combinatorially
hard problems, focusing on graph problems (a small First-Fit coloring
sketch is given after this list). The problems considered include
well-known problems such as graph coloring, independent set, and the
traveling salesman problem, as well as problems arising in the design
and management of high-speed optical networks (e.g. edge-disjoint paths,
Steiner tree, and suitable variations).
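As a small self-contained example of the on-line setting (my own
illustration, not material from the tutorial), here is the classical
First-Fit heuristic for on-line graph coloring in Python: vertices arrive
one at a time together with their edges to previously revealed vertices,
and each must be colored immediately and irrevocably.

```python
# First-Fit on-line graph coloring (illustration of the on-line model only).

def first_fit_coloring(online_vertices):
    """online_vertices yields pairs (vertex, neighbors among earlier vertices)."""
    color = {}
    for v, earlier_neighbors in online_vertices:
        used = {color[u] for u in earlier_neighbors}
        c = 0
        while c in used:      # smallest color unused by earlier neighbors
            c += 1
        color[v] = c          # the decision is final: no recoloring later
    return color

# The path 1-2-3-4 revealed in the order 1, 3, 2, 4
# (each vertex lists only its edges to earlier vertices).
stream = [(1, []), (3, []), (2, [1, 3]), (4, [3])]
print(first_fit_coloring(stream))   # {1: 0, 3: 0, 2: 1, 4: 1}
```

The quality of such an algorithm is typically measured by comparing the
number of colors it uses with the optimum achievable off-line.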
![](pics/fi-hr.gif)
August 29
Quantum logic and quantum computing
By C.H. Bennett (IBM Watson, Yorktown Heights) and K. Svozil (Wien)
- Quantum information
(by C.H. Bennett)
An expanded theory of information transmission and processing has
emerged over the past few years, encompassing the processing and
transmission of intact quantum states, the interaction of quantum and
classical information, the quantitative theory of entanglement, and
the use of quantum information processing to speed up certain
classical computations. We review this field, concentrating on the
parallels with and differences from classical information theory,
which is now best seen as a part of the new theory.
- Quantum logic/automaton logic
(by K. Svozil)
The audience for this course is anyone interested in the foundations of
quantum mechanics and, somewhat related to it, the behavior of finite,
discrete, and deterministic systems. The concepts are developed in an
expository, jargon-free style.
In the first part of the course, quantum logic is introduced as
pioneered by Garrett Birkhoff and John von Neumann in the thirties.
They organized it top-down: The starting point is von Neumann's
Hilbert space formalism of quantum mechanics. In a second step, certain
entities of Hilbert spaces are identified with propositions, partial
order relations and lattice operations - Birkhoff's field of expertise.
These relations and operations can then be associated with the logical
implication relation and with operations such as and, or, and not.
Thereby, a "nonclassical", nonboolean logical structure is induced which
originates in theoretical physics (a small illustration is given at the
end of this abstract). If theoretical physics is taken as a faithful
representation of our experience, such an "operational" logic derives
its justification from the phenomena themselves. In this sense, one of
the main ideas behind quantum logic is the quasi-inductive construction
of the logical and algebraic order of events from empirical facts.
This is very different from the "classical" logical approach, which is
also top-down: There, the system of symbols, the axioms, as well
as the rules of inference are mentally constructed, abstract objects of
our thought. Insofar as our thoughts can pretend to exist independently
of the physical Universe, such "classical" logical systems can be
conceived as totally independent of the world of phenomena.
The applicability of such mentally constructed objects of our thought
to the natural sciences often appears unreasonable. Quantum logic is an
example showing that it is indeed unreasonable to naively apply
abstractly invented concepts to the phenomena. As it turns out, neither
is "classical" Boolean logic a faithful representation of the relations
and operations among physical information, nor can it be expected
a priori that the "classical" logical tautologies correspond to any
physical truth.
The second part of the course deals with several quasiclassical
analogues of quantum logic, in particular Wright's generalized urn
model and the logical structure of automata and of complementarity games.
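As the small illustration referred to above (a standard example added
here for concreteness, not part of the original abstract): identify
propositions with closed subspaces of a Hilbert space, "and" with
intersection, "or" with the closed linear span, and "not" with the
orthogonal complement. Already in the plane the distributive law of
Boolean logic fails:

```latex
% a = span{(1,0)},  b = span{(0,1)},  c = span{(1,1)}  as subspaces of R^2
\[
a \wedge (b \vee c) = a \wedge \mathbb{R}^2 = a,
\qquad
(a \wedge b) \vee (a \wedge c) = \{0\} \vee \{0\} = \{0\}.
\]
```

Hence a ∧ (b ∨ c) ≠ (a ∧ b) ∨ (a ∧ c): the propositional structure
induced by the subspaces is nonboolean.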
![](pics/fi-hr.gif)
mfcs98@fi.muni.cz