Invited speakers

There will be four invited talks at FCT 2017, given by Thomas Colcombet, Martin Dietzfelbinger, Juraj Hromkovič, and Anca Muscholl.

There will also be one invited talk in memoriam of Zoltán Ésik, given by Jean-Éric Pin.

Thomas Colcombet
Thomas Colcombet is currently a full-time senior researcher at the CNRS, working in the Institut de Recherche en Informatique Fondamentale, Paris. After studying at the École Normale Supérieure de Lyon, he received a PhD degree from the University of Rennes. His research is in automata theory in a broad sense, and in particular its connections to algebra, category theory, topology, model theory, and algorithmic logic, as well as game theory.





Automata and Program Analysis
Based on joint work with Laure Daviaud and Florian Zuleger.
We show how recent results concerning quantitative forms of automata help provide a refined understanding of the properties of a system (for instance, a program). In particular, combining size-change abstraction with results concerning the asymptotic behavior of tropical automata, we shall see how to obtain an extremely fine complexity analysis of some pieces of code.
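To make the phrase "quantitative forms of automata" concrete, here is a minimal Python sketch (not from the talk; the function name and encoding are illustrative assumptions) of how a tropical, here max-plus, automaton assigns a numeric weight to a word instead of merely accepting or rejecting it: the weight of a word is the maximum, over all runs, of the initial weight plus the sum of transition weights plus the final weight.

```python
NEG_INF = float("-inf")

def maxplus_weight(word, states, init, trans, final):
    """Weight of `word` in a max-plus ("tropical") automaton.
    Hypothetical encoding:
      init, final : dict state -> weight (absent means not initial/final)
      trans       : dict letter -> dict (source, target) -> weight
    """
    # best[q] = best weight of a run reaching state q on the prefix read so far
    best = {q: init.get(q, NEG_INF) for q in states}
    for letter in word:
        nxt = {q: NEG_INF for q in states}
        for (p, q), w in trans.get(letter, {}).items():
            if best[p] > NEG_INF:
                nxt[q] = max(nxt[q], best[p] + w)
        best = nxt
    accepting = [best[q] + final[q] for q in final if best[q] > NEG_INF]
    return max(accepting) if accepting else NEG_INF

# Toy example: a one-state automaton whose weight counts the occurrences of 'a',
# a quantitative behaviour that a classical acceptor cannot express.
states = {0}
init, final = {0: 0}, {0: 0}
trans = {"a": {(0, 0): 1}, "b": {(0, 0): 0}}
assert maxplus_weight("abaab", states, init, trans, final) == 3
```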






Martin Dietzfelbinger
Martin Dietzfelbinger is a full professor of Computer Science (for complexity theory and efficient algorithms) at Technische Universität Ilmenau. He received a Diplom (Master's degree) in Mathematics from the Ludwig-Maximilians-Universität Munich in 1983, was awarded a PhD degree in Computer Science from the University of Illinois at Chicago in 1987, and completed his habilitation at the University of Paderborn in 1992. Before moving to Ilmenau in 1998, he was a professor in the Computer Science Department of the University of Dortmund for several years. Nowadays, his main research interests lie in understanding the power of randomization in data structures and algorithms. A substantial part of his work deals with different aspects of the foundations and applications of hashing in data structures and algorithmics.

Optimal Dual-Pivot Quicksort: Exact Comparison Count
Based on joint work with Martin Aumüller, Daniel Krenn, Clemens Heuberger, and Helmut Prodinger.

Quicksort, proposed by Hoare in 1961, is a venerable sorting algorithm: it has been thoroughly analyzed, it is taught in basic algorithms classes, and it is routinely used in practice. Can there be anything new about Quicksort today? Dual-pivot quicksort refers to variants of classical quicksort in which the partitioning step uses two pivots to split the input into three segments. Algorithms of this type had been studied by Sedgewick (1975) and by Hennequin (1991), with no further consequences. They received new attention starting in 2009, when a dual-pivot algorithm due to Yaroslavskiy, Bentley, and Bloch replaced the well-engineered quicksort algorithm in Oracle's Java 7 runtime library. An analysis of a variant of this algorithm by Nebel and Wild from 2012, where the two pivots are chosen randomly, showed that it uses about \(1.9n \ln n\) comparisons on average for \(n\) input numbers. (Other works ensued. Standard quicksort has \(2n \ln n\) expected comparisons. It should be noted that on modern computers parameters other than the comparison count will determine the running time.)

At the center of the analysis is the partitioning procedure. Given two pivots, it splits the input keys into "small" (smaller than the small pivot), "medium" (between the two pivots), and "large" (larger than the large pivot). We identify a partitioning strategy with the minimum average number of key comparisons in the case where the pivots are chosen from a random sample. The strategy keeps count of how many large and small elements have been seen so far and prefers the corresponding pivot. The comparison count is closely related to a "random walk" on the integers that keeps track of the difference between the numbers of large and small elements seen so far. An alternative way of understanding what is going on is a Pólya urn with three colors. For the fine analysis it is essential to understand the expected number of times this random walk hits zero. The expected number of comparisons can be determined exactly and, as a formula up to lower-order terms, it is \(1.8n \ln n + 2.38\ldots n + 1.675 \ln n + O(1)\). Extensions to larger numbers of pivots will be discussed.
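To make the counting strategy concrete, here is a minimal Python sketch of a single partitioning step (the function name and encoding are ours, not the algorithm as presented in the talk): each key is compared first against the pivot of whichever class, small or large, has been seen more often so far, so keys of the currently dominant class need only one comparison.

```python
def partition_count_strategy(keys, p, q):
    """Split `keys` into small/medium/large segments w.r.t. pivots p <= q,
    comparing each key first against the pivot whose class has occurred
    more often so far.  Returns the three segments and the number of key
    comparisons performed.  Illustrative sketch only; keys assumed distinct
    from the pivots."""
    assert p <= q
    small, medium, large = [], [], []
    comparisons = 0
    for x in keys:
        if len(large) > len(small):
            # "Large" is the majority class so far: test the large pivot first.
            comparisons += 1
            if x > q:
                large.append(x)
            else:
                comparisons += 1
                (medium if x > p else small).append(x)
        else:
            # Otherwise test the small pivot first.
            comparisons += 1
            if x < p:
                small.append(x)
            else:
                comparisons += 1
                (large if x > q else medium).append(x)
    return small, medium, large, comparisons
```

In a complete dual-pivot quicksort the three segments would then be sorted recursively; the difference between len(large) and len(small) maintained implicitly here is exactly the "random walk" on the integers mentioned in the abstract.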






Juraj Hromkovič
Juraj Hromkovič has been professor of Information Technology and Education at the Department of Computer Science at ETH Zurich since January 2004. Born in Bratislava in 1958, he studied computer science at Comenius University, where he received his PhD in 1986 and his habilitation in 1989. From 1990 to 1994 he was visiting professor at the University of Paderborn, from 1994 to 1997 professor for parallel computing at Christian Albrechts University Kiel, and from 1997 to 2003 professor for algorithmics and complexity at RWTH Aachen. In 2001, he was elected a member of the Slovak Academic Society. Since 2010, he has been a member of Academia Europaea. In 2015, he was honored with the Slovak state award Goodwill Envoy, and in 2017 he received the Pribina Cross of the first order from the President of the Slovak Republic. His research and teaching interests focus on informatics education, algorithmics for hard problems, and complexity theory, with special emphasis on the relationship between determinism, randomness, and nondeterminism. One of his main activities is writing textbooks that make complex recent discoveries and methods accessible to students and practitioners, thereby speeding up the transformation of new paradigmatic research results into educational folklore. In order to introduce informatics as a school subject, he founded the Centre for Computer Science Education in 2005. He is responsible for the master program Lehrdiplom Informatik at ETH, devoted to the education of computer science teachers.

What one has to know when attacking P vs. NP
Based on joint work with Peter Rossmanith.

Mathematics was developed as a strong research instrument with fully verifiable argumentation. We call any consistent and sufficiently powerful formal theory in which one can verify algorithmically, for any given text, whether or not it is a proof, algorithmically verifiable mathematics (AV-mathematics for short). We say that a decision problem $L \subseteq \Sigma^\ast$ is almost everywhere solvable if, for all but finitely many inputs $x \in \Sigma^\ast$, one can prove either “$x \in L$” or “$x \not\in L$” in AV-mathematics.

First, we formalize and prove Rice's theorem for unprovability, stating that no nontrivial semantic problem about programs is almost everywhere solvable in AV-mathematics. Using this, we show that there are infinitely many algorithms (programs that are provably algorithms) for which there exists neither a proof that they work in polynomial time nor a proof that they do not. We can prove the same for linear time, or indeed for any time-constructible function.

Note that, if $\textsf{P}\ne \textsf{NP}$ is provable in AV-mathematics, then for each algorithm $A$ it is provable that “$A$ does not solve SATISFIABILITY or $A$ does not work in polynomial time”. Interestingly, there exist algorithms for which it is neither provable that they do not work in polynomial time, nor that they do not solve SATISFIABILITY. Moreover, there is an algorithm solving SATISFIABILITY for which one cannot prove in AV-mathematics that it does not work in polynomial time.
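One way to spell out the first observation (a sketch of the reasoning, using our own notation $\vdash_{\mathrm{AV}}$ for provability in AV-mathematics, not taken verbatim from the talk):
\[
\begin{aligned}
&\vdash_{\mathrm{AV}} \textsf{P} \ne \textsf{NP},\\
&\vdash_{\mathrm{AV}} \bigl(A \text{ solves SATISFIABILITY} \wedge A \text{ works in polynomial time}\bigr) \rightarrow \textsf{P} = \textsf{NP} \quad\text{for each algorithm } A,\\
&\text{hence}\quad \vdash_{\mathrm{AV}} \neg\,(A \text{ solves SATISFIABILITY}) \vee \neg\,(A \text{ works in polynomial time}).
\end{aligned}
\]
The subtlety highlighted above is that this disjunction can be provable as a whole even when neither disjunct is provable on its own.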

Furthermore, we show that $\textsf{P}=\textsf{NP}$ implies the existence of algorithms $X$ for which the true claim “$X$ solves SATISFIABILITY in polynomial time” is not provable in AV-mathematics. Analogously, if the multiplication of two decimal numbers is solvable in linear time, one cannot decide in AV-mathematics for infinitely many algorithms $X$ whether “$X$ solves multiplication in linear time”.






Anca Muscholl
Anca Muscholl has been Professor of Computer Science at the University of Bordeaux, France, since 2007. Before moving to Bordeaux, she was a professor at the University of Paris VII. She received her PhD in 1994 from the University of Stuttgart and her habilitation in 1999. Her research interests lie in the area of automata, logic, verification, and control of concurrent systems. She spent the academic year 2015/16 as a Hans Fischer Senior Fellow of the Institute for Advanced Study at the Technical University of Munich.




A tour of recent results on word transducers
Based on joint work with Félix Baschenis, Olivier Gauwin and Gabriele Puppis.
Regular word transductions extend the robust notion of regular languages from acceptors to transformers. They were already considered in early papers of formal language theory, but turned out to be much more challenging. The last decade has brought considerable research on various transducer models, aiming to achieve a robustness similar to that of automata and languages.

In this talk we survey recent results on regular word transducers. We discuss how some of the classical connections between automata, logic and algebra extend to transducers, as well as some genuine definability questions.







Jean-Éric Pin
Jean-Éric Pin is currently a director of research at the CNRS (Centre National de la Recherche Scientifique), working at IRIF (Institut de Recherche en Informatique Fondamentale), a joint research unit supported by the CNRS and University Paris Diderot. He is a leading scientist in the algebraic theory of automata and languages, in connection with logic, topology, and combinatorics, but his research has also been influential in other areas. He has written a number of articles in semigroup theory, most of them motivated by problems arising from automata theory. He is the author of two reference books in automata theory: Varieties of Formal Languages and the monograph Infinite Words, co-authored with D. Perrin. He is also a fellow of the EATCS.




Some results of Zoltán Ésik on regular languages
Talk in memory of Zoltán Ésik.

Zoltán Ésik authored 2 books, edited 32 books, and published over 250 scientific papers in journals, book chapters, and conference proceedings. It is of course impossible to survey such an impressive list of results, and in this lecture I will focus on only a very small portion of Zoltán's scientific work. The first topic will be a result from 1998, obtained by Zoltán jointly with Imre Simon, in which he solved an old conjecture about the shuffle operation. The second topic will be his algebraic study of various fragments of logic on words. Finally, I will briefly describe the results of a new article by Zoltán, Jorge Almeida, and myself on commutative languages.