The Complexity of Constraints: Dichotomies and Beyond

ABSTRACT. This talk is devoted to recent results on the Dichotomy Conjecture for the Constraint Satisfaction Problem
(CSP), posed by Feder and Vardi in 1993. We will briefly
review the research leading up to the dichotomy theorem for the CSP,
outline the solution algorithm, and discuss remaining open problems and
future research directions.

ABSTRACT. Additive
number theory is the study of the additive properties of sets of
natural numbers. One of the most famous results in this field is
Lagrange's theorem from 1770: every natural number is the sum of
four squares.
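As a quick sanity check of Lagrange's statement, here is a minimal brute-force sketch (the function name is mine, not from the talk) that verifies the four-squares property for small numbers:

```python
from itertools import product

def is_sum_of_four_squares(n):
    """Check whether n can be written as a^2 + b^2 + c^2 + d^2."""
    bound = int(n ** 0.5) + 1
    return any(a*a + b*b + c*c + d*d == n
               for a, b, c, d in product(range(bound), repeat=4))

# Lagrange's theorem predicts this holds for every natural number.
assert all(is_sum_of_four_squares(n) for n in range(200))
```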

In this talk I will show how logic and finite automata can be used to
mechanically construct proofs of theorems in additive number theory of
genuine interest to number theorists. For example, we can prove
that every natural number is the sum of four binary palindromes (numbers
whose base-2 representation reads the same forwards and backwards).
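The palindrome result can likewise be checked empirically for small inputs. The following sketch (helper names are mine; the actual proof in the talk is automata-theoretic, not a search) finds an explicit decomposition into four binary palindromes, counting 0 as a palindrome:

```python
def is_bin_palindrome(n):
    """True if the base-2 representation of n reads the same both ways."""
    s = bin(n)[2:]
    return s == s[::-1]

def sum_of_four_bin_palindromes(n):
    """Search for binary palindromes a, b, c, d with a + b + c + d == n."""
    pals = [p for p in range(n + 1) if is_bin_palindrome(p)]
    for a in pals:
        for b in pals:
            if a + b > n:
                break  # pals is sorted, so larger b values only overshoot
            for c in pals:
                rest = n - a - b - c
                if rest < 0:
                    break
                if is_bin_palindrome(rest):
                    return (a, b, c, rest)
    return None
```

For example, `sum_of_four_bin_palindromes(99)` returns a 4-tuple of binary palindromes summing to 99.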

I will give examples of what can be proven using this idea, mention some
open problems, and discuss future applications and limitations.
This represents joint work with Jason Bell, Dirk Nowotka, Tim Smith,
Aayush Rajasekaran, Finn Lidbetter, P. Madhusudan, Kathryn Hare, Daniel
Kane, and Carlo Sanna.

11:10

Amaury Pouly

Continuous Models of Computation: Computability, Complexity, Universality

ABSTRACT. In
1941, Claude Shannon introduced a continuous-time analog model of
computation, namely the General Purpose Analog Computer (GPAC). The GPAC
is a physically feasible model in the sense that it can be implemented
in practice through the use of analog electronics or mechanical devices.
It can be proved that the functions computed by a GPAC are precisely
the solutions of a special class of differential equations where
the right-hand side is a polynomial. Analog computers
have since been replaced by their digital counterparts. Nevertheless, one can wonder how the GPAC compares to Turing machines.
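To make the polynomial-ODE characterization concrete, here is a small sketch of my own (not from the talk): the sine function is GPAC-generable because it solves the polynomial system y1' = y2, y2' = -y1 with y1(0) = 0, y2(0) = 1, which we can integrate with a crude Euler method:

```python
import math

def euler_sine(t, steps=100_000):
    """Integrate y1' = y2, y2' = -y1, a system whose right-hand
    sides are polynomials in (y1, y2), as the GPAC characterization requires."""
    y1, y2 = 0.0, 1.0          # initial conditions selecting sin and cos
    h = t / steps
    for _ in range(steps):
        # simultaneous (forward Euler) update of both components
        y1, y2 = y1 + h * y2, y2 - h * y1
    return y1                   # approximates sin(t)

print(abs(euler_sine(1.0) - math.sin(1.0)))  # small discretization error
```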

A few years ago, it was shown that Turing-based paradigms and the GPAC
have the same computational power. However, this result did not
shed any light on what happens at the level of computational complexity. In
other words, analog computers make no difference to what can be
computed; but perhaps they can compute faster than a digital computer. A
fundamental difficulty with continuous-time models is defining a proper
notion of complexity. Indeed, a troubling problem is that many models
exhibit the so-called Zeno phenomenon, also known as space-time
contraction.

In this talk, I will present results from my thesis that make several
fundamental contributions to these questions. We show that the GPAC and
the Turing machine have the same computational power even at the
complexity level. As a side effect, we also obtain a purely analog,
machine-independent characterization of P and of Computable Analysis.

I will present some recent work on the universality of polynomial
differential equations. We show that when we impose no restrictions at
all on the system, it is possible to build a fixed equation that is
universal in the sense that it can approximate arbitrarily well any
continuous curve over R, simply by changing the initial condition of the
system.

If time allows, I will also mention a recent application of this work
showing that chemical reaction networks are strongly Turing complete
under the differential semantics.

11:50

Jan Křetínský

A Journey from LTL to Your Favourite Automaton

ABSTRACT.
The automata-theoretic approach to LTL model checking relies on
translations of LTL formulae into automata. The efficiency of these
translations, and the techniques they use, differ depending on the
target automata class. For instance, deterministic automata were traditionally produced
using Safra's construction, which is quite complex and practically not
very efficient. In this work, we present a unified translation of LTL
formulae into deterministic Rabin automata, limit-deterministic Büchi
automata, and nondeterministic Büchi automata. The translations yield
automata of asymptotically optimal size (double or single exponential,
respectively). All three translations are derived from one single Master
Theorem of purely logical nature. The Master Theorem decomposes the
language of a formula into a positive boolean combination of languages
that can be translated into automata by elementary means. In particular,
Safra's, ranking, and breakpoint constructions used in other
translations are not needed.
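As a toy illustration of the LTL-to-automata idea (my own example, and deliberately much simpler than the Master Theorem construction): the formula GF a, "a holds infinitely often", is recognized by a two-state deterministic Büchi automaton whose accepting state is entered exactly on reading 'a'. Acceptance of an ultimately periodic word u·v^ω then reduces to checking one pass over the loop v:

```python
def buchi_gfa_accepts(prefix, loop):
    """Deterministic Büchi automaton for the LTL formula GF a over {'a', 'b'}.

    State 1 (accepting) is entered exactly when 'a' is read, so the word
    prefix . loop^omega is accepted iff the accepting state is visited
    infinitely often, i.e. iff 'a' occurs somewhere in the loop."""
    def step(state, letter):
        return 1 if letter == 'a' else 0

    state = 0
    for letter in prefix:
        state = step(state, letter)
    # The run over loop^omega repeats the same state sequence on every
    # iteration of the loop, so one pass over the loop suffices.
    return any(step(state, letter) == 1 for letter in loop)

print(buchi_gfa_accepts("bb", "ab"))  # True: 'a' recurs forever
print(buchi_gfa_accepts("ab", "b"))   # False: 'a' occurs only finitely often
```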