# LCC'13 Program

## 1 Program

**9h.** *Patrick Baillot*, "Characterizing polynomial and exponential complexity classes in elementary lambda-calculus"

**9h45.** *Daniel Leivant*, "Evolving graph structures and their implicit computational complexity"

**10h30-11h.** Coffee Break

**11h.** *Amélie Gheerbrant*, "Pattern-based XML queries"

**11h45.** *Hubie Chen*, "Decomposing First-Order Logic"

**12h15.** *Simone Bova*, "The Complexity of Minimization in Existential Positive Logic"

**12h45-14h15.** Lunch

**14h15.** *Martin Avanzini*, "Automated Complexity Analysis via Term Rewriting Techniques"

**15h.** *Marco Gaboardi*, "Algebra and Coalgebra in Light Linear Logic"

**15h45.** *Clément Aubert*, "Logarithmic Space and Permutations"

**16h15-16h45.** Coffee Break

**16h45.** *Juha Kontinen*, "Computational Aspects of Dependence Logic"

**17h30.** *Reinhard Kahle and Isabel Oitavem*, "Reading off functions from proofs"

**18h.** End of the workshop

## 2 Summary of talks

- Abstract (Patrick Baillot) :

Elementary linear logic is a simple subsystem of linear logic that was introduced to characterize elementary complexity, that is, computation in time bounded by a tower of exponentials of fixed height. More recently it has been shown that the same logic, extended with type fixpoints, can actually be used to capture the complexity classes P and k-EXPTIME, for k >= 1. However, these results relied on the use of proof-nets and on a delicate analysis of their normalisation procedure. In the present work we take up this investigation in the simpler setting of the lambda-calculus. To that end we define an elementary lambda-calculus, which is typable in elementary logic. We then prove that the hierarchy of complexity classes k-EXPTIME is characterized by a hierarchy of types, and extend this result from the classes of problems to the corresponding classes of functions.

Joint work with Erika De Benedetti and Simona Ronchi Della Rocca.
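The bound mentioned in the abstract, "a tower of exponentials of fixed height", can be made concrete with a minimal sketch (an illustration of the standard tower function, not code from the talk): elementary time is time bounded by `tower(k, p(n))` for some fixed height k and polynomial p.

```python
def tower(k, n):
    """Tower of exponentials of height k:
    tower(0, n) = n, tower(k+1, n) = 2 ** tower(k, n)."""
    for _ in range(k):
        n = 2 ** n
    return n

# tower(1, 4) = 2^4 = 16; tower(2, 4) = 2^16 = 65536
```

In this notation, k-EXPTIME corresponds to time bounded by `tower(k, p(n))` for a fixed k, while elementary complexity takes the union over all fixed heights k.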

- Abstract (Daniel Leivant) :

Dynamic data structures are ubiquitous in programming, and they rely extensively on underlying directed multigraph structures, such as labeled trees, DAGs, and objects. This paper adapts well-established static analysis methods, namely data ramification and language-based information flow security, to programs over such graph structures. Our programs support the creation, deletion, and update of both vertices and edges, and are related to pointer machines. The main result states that a function over graph structures is computable in polynomial time iff it is computed by a terminating program whose graph manipulation is ramified, provided all edges that are both created and read in a loop have the same label.

Joint work with Jean-Yves Marion.

- Abstract (Amélie Gheerbrant) :

A variety of analogs of conjunctive queries over XML documents have been studied in the literature. We focus on languages based on tree patterns, which follow the structure of tree documents. They arise naturally in many problems related to integrating and exchanging data and to handling incomplete information. In this talk, we concentrate on the problem of finding certain answers to queries, and on static analysis, in particular query containment.

The problem of computing certain answers arises when one queries incompletely specified databases, i.e., databases with missing information. As often happens, the complexity of the problem jumps when one moves from relations to XML. Nonetheless, we identified large relevant classes of queries for which efficient algorithms can be developed. Curiously, for some of these classes, no analogous results existed in the relational world. In fact, we uncovered a well-behaved class of relational queries which had been overlooked so far.

Testing for query containment is a fundamental task in query optimization and data integration. In the relational case, classical homomorphism-based tools lead to algorithms of reasonable complexity. In XML, such techniques can be applied only to very simple queries. Beyond those classes of queries, they can either be adapted by using more sophisticated data structures, or, provably, they cannot be adapted at all.

The talk will be based on papers from ICDT’12 and ICDT’13.

- Abstract (Hubie Chen) :

We propose a talk in which we survey recent advances on understanding the complexity of model checking first-order sentences.

- Abstract (Simone Bova) :

We present recent work on the complexity of the problem of minimizing the number of variables in existential first-order logic: given an existential sentence and a number k, is the sentence logically equivalent to a k-variable sentence?

Joint work with Hubie Chen.

- Abstract (Martin Avanzini) :

TcT, the Tyrolean Complexity Tool, is a tool for automatically proving polynomial upper bounds on the runtime complexity of term rewriting systems. TcT features a majority of the known techniques for the characterisation of polynomial complexity of rewrite systems.

In this talk, I will briefly outline the main techniques employed in TcT. I will also present the complexity framework underlying TcT, which allows the formalisation, and also the combination, of the implemented techniques.
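As a toy illustration of the underlying notion (not TcT itself): the runtime complexity of a term rewriting system measures the maximal number of rewrite steps as a function of the size of the start term. The sketch below, under the assumption of the textbook system for addition on unary numerals, counts innermost rewrite steps.

```python
# Rewrite system for addition on unary numerals:
#   add(0, y)    -> y
#   add(s(x), y) -> s(add(x, y))
# Terms are nested tuples: ('0',), ('s', t), ('add', t1, t2).

def num(n):
    """The unary numeral s^n(0)."""
    t = ('0',)
    for _ in range(n):
        t = ('s', t)
    return t

def normalize(t):
    """Innermost rewriting to normal form; returns (normal form, step count)."""
    if t[0] == '0':
        return t, 0
    if t[0] == 's':
        nf, steps = normalize(t[1])
        return ('s', nf), steps
    # t = ('add', x, y): normalize the arguments first (innermost strategy)
    x, sx = normalize(t[1])
    y, sy = normalize(t[2])
    steps = sx + sy
    while x[0] == 's':                 # apply add(s(x), y) -> s(add(x, y))
        x, y, steps = x[1], ('s', y), steps + 1
    return y, steps + 1                # apply add(0, y) -> y

# normalize(('add', num(3), num(2))) reaches num(5) in 4 steps:
# the step count grows linearly in the first argument, so this
# system has linear runtime complexity.
```

Tools like TcT establish such bounds automatically and symbolically, for arbitrary rewrite systems, rather than by executing the system as done here.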

- Abstract (Marco Gaboardi) :

Algebras and coalgebras are widely used to model data types in functional programming languages and proof assistants. They make it possible to better structure computations and to enhance the expressivity of a language or of a proof system. The use of algebras and coalgebras in proof assistants relies on the possibility of integrating them into strongly normalizing languages such as System F or the calculus of constructions without losing the good logical properties of the calculus. In fact, in those systems it is also possible to define weak (initial) algebras and weak (final) coalgebras by using parametric polymorphism.

In this work we study the problem of defining algebras and coalgebras in Light Linear Logic, a system characterizing the complexity class FPTIME. Due to the stratified nature of Light Linear Logic, we need to consider stratified versions of (weak) algebras and coalgebras. Thanks to these notions we are able to define standard algebraic and coalgebraic data types in our language.

- Abstract (Clément Aubert) :

In recent work, Girard proposed a new and innovative approach to computational complexity based on the proofs-as-programs correspondence. The authors recently showed [1, 2] how Girard's proposal succeeds in obtaining characterizations of the languages in L and co-NL as sets of operators in the hyperfinite factor of type II₁.

Joint work with Thomas Seiller.

- Abstract (Juha Kontinen) :

Dependence Logic, introduced by Jouko Väänänen in 2007, is a new logic incorporating the concept of dependence into first-order logic. The expressive power of dependence logic coincides with that of existential second-order logic, and the complexity class NP over finite structures. In the past few years, dependence logic has grown into a new framework in which various notions of dependence and independence can be formalized. We review recent results regarding the computational aspects of dependence logic and its variants.

- Abstract (Reinhard Kahle, Isabel Oitavem) :

We discuss different forms of reading off functions of restricted computational complexity from theories with restricted forms of induction.

Date: 2013-07-17 15:27:59 CEST
