Here you can find all information about the mini-workshop for theoretical computer science.
MiniWorkshop
Theoretical Computer Science in Dortmund
A twice-yearly mini-workshop for theoretical computer science at the University of Dortmund. Each session features 2-3 conference-length (25 min.) talks.
Next session
Time:  16.15, July 16, 2018 
Place:  OH123.031 
Organizers:  Thomas Schwentick and Thomas Zeume 
Program:
 Nils Vortmeier: Reachability and Distances under Multiple Changes

Abstract: Recently it was shown that the transitive closure of a directed graph can be updated using first-order formulas after insertions and deletions of single edges in the dynamic descriptive complexity framework of Dong, Su, and Topor, and of Patnaik and Immerman. In other words, Reachability is in DynFO. In this talk we extend the framework to changes of multiple edges at a time, and study the Reachability and Distance queries under such changes. We show that the former problem can be maintained in DynFO(+, ×) under changes affecting O(log n / log log n) nodes, for graphs with n nodes. If the update formulas may use a majority quantifier, then both Reachability and Distance can be maintained under changes that affect O(log^c n) nodes, for any fixed natural number c.
 based on joint work with Samir Datta, Anish Mukherjee, and Thomas Zeume, presented at ICALP 2018

Anish Mukherjee: Shortest k-Disjoint Paths via Determinants

Abstract: The well-known \(k\)-disjoint path problem (\(k\)-DPP) asks for pairwise vertex-disjoint paths between \(k\) specified pairs of vertices in a given graph. The shortest \(k\)-DPP asks to find such paths of shortest total length. We restrict attention to shortest \(k\)-DPP instances on undirected planar graphs where all sources and sinks lie on a single face. We provide efficient sequential and parallel algorithms for the problem, for any fixed \(k\), which go via counting such solutions. This partly answers an open question of Colin de Verdière and Schrijver ’08. Previously, efficient algorithms were known only for the well-ordered case and for the general configuration when \(k \le 4\). The algorithms are based on a bijection between shortest \(k\)-tuples of disjoint paths and cycle covers. This allows us to non-trivially modify techniques relating the counting of cycle covers to determinant computation. We further need a controlled inclusion-exclusion to produce a sum of determinants that allows us to count only the ``good'' cycle covers.
 based on joint work with Samir Datta, Raghav Kulkarni, and Siddharth Iyer.

Gaetano Geck: Introduction to Iltis: An Interactive, Web-Based System for Teaching Logic

Abstract: Logic is a foundation for many modern areas of computer science: modelling scenarios using logical formalisms and inferring new knowledge are important skills for future computer scientists. The Iltis project aims at providing a web-based, interactive system that supports the teaching of logical methods. In particular, the system shall (a) support learning to model knowledge and to infer new knowledge using propositional logic, modal logic and first-order logic, and (b) provide immediate feedback and support to students. In this talk, we present a prototypical system that currently supports the above tasks for propositional logic. We report first impressions of its use in a second-year logic course for computer science students.
 based on joint work with Artur Ljulin, Sebastian Peter, Jonas Schmidt, Fabian Vehlken, and Thomas Zeume
Previous sessions
MW 32, January 29, 2018
 Dominik Köppl: Indexing the Bijective BWT

Abstract: A text index is a data structure built over an input text supporting efficient retrieval of all occurrences of a given pattern without traversing the whole text. Many text indices contain sufficient information to retrieve a substring of the text, so that they can replace the text itself to reduce space consumption. One efficient kind of these indices is the FM-index, which is built on the traditional Burrows-Wheeler transform (BWT). We propose a self-index that works like the FM-index but is built on the bijective BWT instead of the (traditional) BWT. Like the FM-index, this index supports efficient backward searching.
 joint work with Hideo Bannai, Juha Kärkkäinen, and Marcin Piatkowski
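For a point of comparison, backward search on the traditional BWT (the mechanism the proposed self-index adapts to the bijective BWT) can be sketched as follows. This is a minimal illustration only: the naive rotation-sorting construction and the linear-scan rank are stand-ins for the succinct structures a real FM-index would use.

```python
def bwt(text):
    """Burrows-Wheeler transform via sorted rotations (naive construction)."""
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(r[-1] for r in rotations)

def backward_search(bwt_str, pattern):
    """Count occurrences of pattern using FM-index-style backward search."""
    # C[c] = number of characters in the text strictly smaller than c
    counts = {}
    for ch in bwt_str:
        counts[ch] = counts.get(ch, 0) + 1
    C, total = {}, 0
    for ch in sorted(counts):
        C[ch] = total
        total += counts[ch]
    rank = lambda ch, i: bwt_str[:i].count(ch)  # naive rank; real indices use wavelet trees
    lo, hi = 0, len(bwt_str)
    for ch in reversed(pattern):       # process the pattern right to left
        if ch not in C:
            return 0
        lo = C[ch] + rank(ch, lo)
        hi = C[ch] + rank(ch, hi)
        if lo >= hi:
            return 0
    return hi - lo                     # size of the suffix-array interval

occ = backward_search(bwt("banana$"), "ana")   # "ana" occurs twice in "banana"
```

Each step shrinks an interval of the (conceptual) suffix array; the pattern is consumed from its last character to its first, which is why the search is called "backward".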


Ioannis Kokkinis: The Expected Duration of Sequential Gossiping
 Abstract: A gossip protocol aims at arriving, by means of point-to-point communications (or telephone calls), at a situation in which every agent knows all the information initially present in the network. If it is impossible to have more than one call at the same time, the protocol is called sequential. When the calls happen at random, it makes sense to study the expected duration of the protocols. In this talk we study the expected duration of 5 sequential gossip protocols: we present an algorithm that calculates the exact value of the expected duration for a small number of agents, and we show how bounds on the asymptotic behavior of the expected duration can be computed. To obtain our results we apply techniques from random graph theory.
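A minimal simulation of one such protocol, the fully random sequential protocol in which any pair of agents may call, illustrates what "expected duration" measures. This sketch only estimates the expectation empirically, rather than computing its exact value as in the talk.

```python
import random

def gossip_calls(n, rng):
    """One run of the random sequential protocol: in each step a uniformly
    random pair of agents calls, and both learn everything the other knows.
    Returns the number of calls until every agent knows all n secrets."""
    know = [1 << i for i in range(n)]   # bitmask of secrets agent i knows
    target = (1 << n) - 1
    calls = 0
    while any(k != target for k in know):
        a, b = rng.sample(range(n), 2)  # a random call between two distinct agents
        know[a] = know[b] = know[a] | know[b]
        calls += 1
    return calls

rng = random.Random(42)
runs = 2000
est = sum(gossip_calls(4, rng) for _ in range(runs)) / runs  # estimate for n = 4
```

With 2 agents a single call always suffices, and with n agents at least 2n - 4 calls are needed (for n >= 4), so any estimate for n = 4 must exceed 4.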
 Jian-Jia Chen: Open problem: Exact worst-case response time analysis for self-suspending sporadic tasks
 Abstract: In computing systems, an execution entity (job/process/task) may suspend itself when it has to wait for some activities to continue/finish its execution. Our recent study in RTSS 2016 shows that, given a priority ordering of sporadic real-time tasks, validating whether the lowest-priority task can meet its deadline or not is coNP-hard in the strong sense. There were several attempts in the literature using mixed integer linear programming (MILP) to solve this problem. However, none of them can provide exact solutions even with exponential time complexity. This talk will provide a short introduction to the above-mentioned problem and explain why the existing solutions are in fact over-approximations. The talk hopes to bring collaborations to solve this long-standing problem in real-time systems (regardless of time complexity).
MW 31, July 24, 2017, Chair 1
 Johannes Fischer: Lempel-Ziv Compression in a Sliding Window
 Abstract: We present new algorithms for the sliding window Lempel-Ziv (LZ77) problem and the approximate rightmost LZ77 parsing problem. Our main result is a new and surprisingly simple algorithm that computes the sliding window LZ77 parse in O(w) space and either O(n) expected time or O(n log log w + z log log σ) deterministic time. Here, w is the window size, n is the size of the input string, z is the number of phrases in the parse, and σ is the size of the alphabet. This matches the space and time bounds of previous results while removing constant size restrictions on the alphabet size. To achieve our result, we combine a simple modification and augmentation of the suffix tree with periodicity properties of sliding windows. We also apply this new technique to obtain an algorithm for the approximate rightmost LZ77 problem that uses O(n(log z + log log n)) time and O(n) space and produces a (1+ε)-approximation of the rightmost parsing (for any constant ε>0). While this does not improve the best known time-space trade-offs for exact rightmost parsing, our algorithm is significantly simpler and exposes a direct connection between sliding window parsing and the approximate rightmost matching problem.
 Thomas Liebig: Dynamic Transfer Patterns for Fast Multimodal Route Planning
 Abstract: Route planning makes direct use of geographic data and provides beneficial recommendations to the public. In the real world, the schedules of transit vehicles are dynamic and delays occur. Incorporating these dynamic schedule changes into multimodal route computation is difficult and requires a lot of computational resources. Our approach extends the state of the art for static transit schedules, Transfer Patterns, to the dynamic case. To this end, we amend the patterns with additional edges that cover the dynamics. Our approach is implemented in the open-source routing framework OpenTripPlanner and compared to existing methods in the city of Warsaw. Our results are an order of magnitude faster than existing methods.
 Joint work with Sebastian Peter, Maciej Grzenda and Konstanty Junosza-Szaniawski
 Nils Vortmeier: A Strategy for Dynamic Programs: Start over and Muddle through
 Abstract: A strategy for constructing dynamic programs is introduced that utilises periodic computation of auxiliary data from scratch and the ability to maintain a query for a limited number of change steps. It is established that if some program can maintain a query for log n change steps after an AC^1-computable initialisation, it can be maintained by a first-order dynamic program as well, i.e., in DynFO. As an application, it is shown that decision and optimisation problems defined by monadic second-order (MSO) and guarded second-order logic (GSO) formulas are in DynFO, if only change sequences that produce graphs of bounded treewidth are allowed. To establish this result, Feferman–Vaught-type composition theorems for MSO and GSO are established that might be useful in their own right.
 Joint work with Samir Datta, Anish Mukherjee, Thomas Schwentick and Thomas Zeume
MW 30, January 19, 2017, Chair 1
 Daan Apeldoorn: Exploiting Explicit Knowledge in the Context of Learning Agents
 Abstract: According to psychological models, learned knowledge can be distinguished into implicit and explicit knowledge. The former can be exploited (e.g., to solve a task), but it cannot be verbalized easily (e.g., to explain it to another person). The latter is accessible in an explicit form: it can comprise generalized, rule-based knowledge which can be verbalized and explained to others. During a learning process, the learned knowledge starts in implicit form and becomes explicit as the learning process progresses; humans can benefit from exploiting such generalized, rule-based knowledge when learning. In this talk, the incorporation of implicit and explicit knowledge is investigated in the context of learning agents. It will be shown experimentally that learning agents are also able to benefit from explicit knowledge, even in early phases of a learning process.
 Joint work with Gabriele Kern-Isberner
 Henrik Björklund (Umeå Universitet, Sweden): Restricted Hyperedge Replacement Grammars and Parsing Complexity
 Abstract: Hyperedge Replacement Grammars are a useful and expressive formalism for generating graph languages. Unfortunately, the uniform parsing problem for such grammars is NP-hard. We investigate restrictions which allow polynomial-time parsing while still retaining enough expressive power to generate interesting languages. In particular, our search for suitable restrictions is guided by applications in natural language processing.
 Dominik Köppl: Computing All Distinct Squares in Linear Time for Integer Alphabets
 Abstract: Given a string over an integer alphabet, we present an algorithm that computes the set of all distinct squares occurring in this string in time linear in the string length. As an application, we show how to compute the tree topology of the minimal augmented suffix tree in linear time. Aside from that, we elaborate an algorithm computing the longest previous factor table in a succinct representation using compressed working space.
 Joint work with Hideo Bannai and Shunsuke Inenaga
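For illustration, the objects in question are easy to enumerate naively: a square is a factor of the form xx. The cubic-time sketch below contrasts with the linear-time algorithm of the talk and is only meant to pin down the definition.

```python
def distinct_squares(s):
    """Naive enumeration of all distinct squares (factors of the form xx)
    in s. Roughly cubic time; the talk's algorithm achieves linear time
    on integer alphabets."""
    found = set()
    n = len(s)
    for i in range(n):
        for half in range(1, (n - i) // 2 + 1):
            # a square of root length `half` starting at position i?
            if s[i:i + half] == s[i + half:i + 2 * half]:
                found.add(s[i:i + 2 * half])
    return found

# "abaaba" contains exactly two distinct squares: "aa" and "abaaba"
squares = distinct_squares("abaaba")
```

A classic fact motivating the problem is that a string of length n contains at most O(n) distinct squares even though it may contain Θ(n²) square occurrences.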
MW 29, June 27, 2016, Chair 1
 Andre Droschinsky: Faster Algorithms for the Maximum Common Subtree Isomorphism Problem
 Abstract: The maximum common subtree isomorphism problem asks for the largest possible isomorphism between subtrees of two given input trees. This problem is a natural restriction of the maximum common subgraph problem, which is NP-hard in general graphs. Confining the problem to trees renders polynomial-time algorithms possible and is of fundamental importance for approaches to more general graph classes. Various variants of this problem in trees have been intensively studied. We consider the general case, where trees are neither rooted nor ordered and the isomorphism is maximum w.r.t. a weight function on the mapped vertices and edges. For trees of order n and maximum degree D our algorithm achieves a running time of O(n^2 D) by exploiting the structure of the matching instances arising as subproblems. Thus our algorithm outperforms the best previously known approaches. No faster algorithm is possible for trees of bounded degree, and for trees of unbounded degree we show that a further reduction of the running time would directly improve the best known approach to the assignment problem. Combining a polynomial-delay algorithm for the enumeration of all maximum common subtree isomorphisms with central ideas of our new algorithm leads to an improvement of its running time from O(n^6 + T n^2) to O(n^3 + T n D), where n is the order of the larger tree, T is the number of different solutions, and D is the minimum of the maximum degrees of the input trees. Our theoretical results are supplemented by an experimental evaluation on synthetic and real-world instances.
 Joint work with Nils M. Kriege and Petra Mutzel
 Anish Mukherjee (Chennai Mathematical Institute): Space-efficient Approximation Scheme for Maximum Matching in Sparse Graphs
 Abstract: We present a Logspace Approximation Scheme (LSAS), i.e. an approximation algorithm for maximum matching in planar graphs (not necessarily bipartite) that achieves an approximation ratio arbitrarily close to one, using only logarithmic space. This deviates from the well-known Baker's approach for approximation in planar graphs by avoiding the use of distance computation, which is not known to be in Logspace. Our algorithm actually works for any "recursively sparse" graph class that contains a linear-size matching, and also for certain other classes like bounded-genus graphs.
The scheme is based on an LSAS in bounded-degree graphs, which are not known to be amenable to Baker's method. We solve the bounded-degree case by parallel augmentation of short augmenting paths. Finding a large number of such disjoint paths can, in turn, be reduced to finding a large independent set in a bounded-degree graph. The bounded-degree assumption allows us to obtain a Logspace algorithm.
 Joint work with Samir Datta and Raghav Kulkarni
 Martin Schuster: Transducer-based Rewriting Games for Active XML

Abstract: Context-free games are two-player rewriting games that are played on nested strings representing XML documents with embedded function symbols. These games were introduced to model rewriting processes for intensional documents in the Active XML framework, where input documents are to be rewritten into a given target schema by calls to external services.
This talk is based on a paper that studies the setting where dependencies between inputs and outputs of service calls are modelled by transducers. The paper defines transducer models operating on nested words and studies their properties, as well as the computational complexity of the winning problem for transducer-based context-free games in several scenarios. While the complexity of this problem is quite high in most settings (ranging from NP-complete to undecidable), some tractable restrictions are also identified.
MW 28, December 10, 2015, Chair 1
 Christian Eichhorn: Sceptical Inference Based on C-Representations and its Characterization as a Constraint Satisfaction Problem

Abstract: The axiomatic system P is an important standard for plausible, non-monotonic inferences that is, however, known to be too weak to solve benchmark problems like irrelevance or subclass inheritance (the so-called drowning problem). Spohn's ranking functions, which provide a semantic base for system P, have often been used to design stronger inference relations, like Pearl's system Z, or c-representations. While each c-representation shows excellent inference properties and handles irrelevance and subclass inheritance properly, it is still an open problem which c-representation is the best. In this paper, we focus on the generic properties of c-representations and consider the sceptical inference relation (c-inference) that is obtained by taking all c-representations of a given knowledge base into account. In particular, we show that c-inference preserves the properties of solving irrelevance and subclass inheritance, which are met by every single c-representation. Moreover, we characterize sceptical c-inference as a constraint satisfaction problem so that constraint solvers can be used for its implementation.
 Joint work with Christoph Beierle and Gabriele Kern-Isberner
 Joachim Biskup: Optimality and Complexity of InferenceProof Data Filtering and CQE
 Abstract: The ample literature on confidentiality-preserving data publishing, and controlled query evaluation (CQE) in particular, leaves several questions open. Are the greedy data-filtering algorithms adopted in the literature maximally cooperative? Can novel secure view formats or answer distortion methods improve security or cooperativeness? What is the inherent complexity of confidentiality-preserving data publishing under different constraints, such as cooperativeness and availability? Can the theoretical results on CQE be systematically extended to more general settings? In this paper we answer the above questions using a completely generic, abstract data filtering framework, independent of any syntactic details and data source encodings, and compatible with all possible distortion methods. Some of the main results are: refusal-based filterings can be adopted as a normal form for all kinds of filterings; greedy refusal-based filterings are optimal; cooperativeness checks and some availability checks are coNP-hard in the simplest case.
 Joint work with Piero A. Bonatti, Clemente Galdi, and Luigi Sauro
 Dominik Köppl: Efficiently Finding All Maximal α-gapped Repeats

Abstract: For α ≥ 1, an α-gapped repeat in a word w is a factor uvu of w such that |uv| ≤ α|u|; the two factors u in such a repeat are called arms, while the factor v is called the gap. Such a repeat is called maximal if its arms cannot be extended simultaneously with the same symbol to the right or, respectively, to the left. In this paper we show that the number of maximal α-gapped repeats that may occur in a word of length n is upper-bounded by 18αn. This allows us to construct an algorithm finding all maximal α-gapped repeats of a word in O(αn) time; this is optimal, in the worst case, as there are words that have Θ(αn) maximal α-gapped repeats. Our techniques can be extended to obtain comparable results in the case of α-gapped palindromes, i.e., factors uvu^T with |uv| ≤ α|u|.
 Joint work with Paweł Gawrychowski, Tomohiro I, Shunsuke Inenaga, and Florin Manea
MW 27, June 10th, 2015, Chair 1
 Claudio Moraga: Design of circuits for Reversible Computing
 Abstract: Reversible digital circuits are characterized by lower heat dissipation compared with classical digital circuits. They are therefore attracting increasing interest as Moore's Law approaches saturation. Furthermore, in the quantum-mechanical formalism all quantum computing operations must be reversible: elementary circuits ("gates") realize unitary matrices. Advances in the design of reversible computing circuits are therefore also important from the perspective of quantum computing, in the hope that a "quantum technology" will some day be available. Since it has been shown that deciding whether a reversible circuit with a bounded number of auxiliary lines is realizable with a given number of gates is NP-complete, heuristics for the design are being developed. Particular aspects of the design of reversible digital circuits will be covered in the talk.
 Johannes Fischer: Computing and Approximating the Lempel-Ziv 77 Factorization in Small Space
 Abstract: The Lempel-Ziv 77 (LZ77) algorithm greedily factorizes a text of length n into z maximal substrings that have previous occurrences, which is particularly useful for text compression. We review two recent algorithms for this task:
 A linear-time algorithm using essentially only one integer array of length n in addition to the text.
 An even more space-conscious algorithm using O(z) space, computing a 2-approximation of the LZ77 parse in O(n lg n) time w.h.p.
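The greedy factorization itself is easy to state; the sketch below is a naive quadratic scan, nothing like the space-efficient algorithms above, and is only meant to fix what "LZ77 parse" means. Note that a phrase may overlap its own source.

```python
def lz77(text):
    """Greedy LZ77 factorization: each phrase is the longest prefix of the
    remaining suffix that occurs starting at an earlier position, or a
    single fresh character. Naive O(n^2)-ish scan for illustration only."""
    phrases, i, n = [], 0, len(text)
    while i < n:
        best_len, best_src = 0, -1
        for j in range(i):                      # try every earlier start position
            l = 0
            # self-referential matching: j + l may run past i, as in real LZ77
            while i + l < n and text[j + l] == text[i + l]:
                l += 1
            if l > best_len:
                best_len, best_src = l, j
        if best_len == 0:
            phrases.append((text[i],))          # literal character
            i += 1
        else:
            phrases.append((best_src, best_len))  # (source position, length)
            i += best_len
    return phrases

parse = lz77("abab")   # two literals, then one phrase copying "ab"
```

The number of phrases produced by this greedy rule is exactly the z appearing in the space and time bounds above.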


 Gaetano Geck: Parallel-Correctness and Transferability for Conjunctive Queries

Abstract: A dominant cost for query evaluation in modern massively distributed systems is the number of communication rounds. For this reason, there is a growing interest in single-round multiway join algorithms where data is first reshuffled over many servers and then evaluated in a parallel but communication-free way. The reshuffling itself is specified as a distribution policy. We introduce a correctness condition, called parallel-correctness, for the evaluation of queries w.r.t. a distribution policy. We study the complexity of parallel-correctness for conjunctive queries as well as transferability of parallel-correctness between queries. We also investigate the complexity of transferability for certain families of distribution policies, including, for instance, the Hypercube distribution.
MW 26, December 8th, 2014, Chair 1
 Christian Eichhorn: LEG networks for ranking functions
(joint work with Gabriele Kern-Isberner) Abstract: When using representations of plausibility for semantical frameworks, the required storage capacity is usually exponential in the number of variables. Therefore, network-based approaches that decompose the semantical space have proven to be fruitful in environments with probabilistic information. For applications where qualitative information is preferable to quantitative information, ordinal conditional functions (OCFs) offer a convenient methodology. Here, Bayesian-like networks have been proposed for ranking functions, so-called OCF-networks. Not only do these networks suffer from problems similar to those of Bayesian networks, in particular allowing only restricted classes of conditional relationships; it has also recently been found that problems with admissibility may arise. In this paper we propose LEG networks for ranking functions, again carrying over an idea from probabilistics. OCF-LEG networks can be built for any conditional knowledge base and filled with local OCFs that can be found by inductive reasoning. A global OCF is set up from the local ones, and it is shown that the global OCF is admissible with respect to the underlying knowledge base.
 Henrik Björklund: Abstract Meaning Representations and Graph Grammars
 Abstract: Abstract Meaning Representation (AMR) is a formalism for describing the semantics of natural language sentences as directed acyclic graphs. We discuss various types of graph grammars for generating AMRs and the computational complexity of their corresponding parsing problems.
 Thomas Zeume: Maintaining Reachability in Dynamically Changing Graphs using First-Order Logic
(joint work with Samir Datta, Raghav Kulkarni, Anish Mukherjee and Thomas Schwentick) Abstract: In this talk we study the dynamic complexity of Reachability. As elementary change operations we allow insertion and deletion of edges of a graph. We show that Reachability is in DynFO, where DynFO allows first-order definable updates of a polynomial-size auxiliary data structure. This result confirms a two-decade-old conjecture of Patnaik and Immerman (1997).
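For intuition, the easy half of this picture, maintaining reachability under edge insertions only, has a well-known first-order update rule: after inserting (u, v), x reaches y iff it already did, or x reached u and v reached y. Handling deletions is the difficulty the DynFO result overcomes. A sketch of the insertion rule, evaluating the first-order formula over all pairs:

```python
def insert_edge(reach, u, v, nodes):
    """First-order update of the reachability relation after inserting
    edge (u, v): reach'(x, y) = reach(x, y) OR (reach(x, u) AND reach(v, y)).
    Insertions only; deletions need the full DynFO machinery."""
    return {(x, y) for x in nodes for y in nodes
            if (x, y) in reach or ((x, u) in reach and (v, y) in reach)}

nodes = range(3)
reach = {(x, x) for x in nodes}        # empty graph: only trivial reachability
reach = insert_edge(reach, 0, 1, nodes)
reach = insert_edge(reach, 1, 2, nodes)  # now 0 reaches 2 through node 1
```

The update formula uses only the old auxiliary relation and the inserted edge, never recomputes paths from scratch, which is exactly the sense in which the query is "maintained" by first-order means.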
MW 25, June 30th, 2014, Chair 1
Wim Martens: Efficient Separability of Regular Languages by Subsequences and Suffixes
(joint work with Wojciech Czerwiński and Tomáš Masopust)
 Abstract: When can two regular word languages K and L be separated by a simple language? We investigate this question and consider separation by piecewise- and suffix-testable languages and variants thereof. We give characterizations of when two languages can be separated and present an overview of when these problems can be decided in polynomial time if K and L are given by nondeterministic automata.
Samir Datta: Dynamic Complexity of Reachability and Matching
(joint work with William Hesse and Raghav Kulkarni)
 Abstract: We study some wellknown graph problems such as reachability and matching in the context of dynamic complexity. In particular, we show the maintainability of:
(a) maximum matching in nonuniform DynTC^0,
(b) digraph reachability in nonuniform DynAC^0[2], and,
(c) embedded planar digraph reachability in DynFO.
For (a) we show that the technique from [Hesse] to maintain directed reachability in DynTC^0 can in fact be generalized, using the characterization of determinants by [Mahajan-Vinay], to maintain the determinant of a matrix in DynTC^0.
For (b) we extend this technique with the help of two more ingredients, namely isolation (cf. [Allender-Reinhardt-Zhou]) and truncated approximation using rational polynomials, to achieve a DynAC^0[2] bound.
For (c) we use the duality between cuts and cycles in embedded planar graphs to maintain the number of crossings between carefully chosen primal and dual paths.
Sven Rahmann: NonNegative Matrix Factorization: A Need for Theory and Exact Algorithms
 Abstract: see here
MW 24, November 18th, 2013, Chair 1
 Matthias Niewerth: Reasoning about XML Constraints based on XML-to-relational mappings
(joint work with Thomas Schwentick) Abstract: We introduce a simple framework for the specification of constraints for XML documents in which constraints are specified by (1) a mapping that extracts a relation from every XML document and (2) a relational constraint on the resulting relation. The mapping has to be generic with respect to the actual data values, and the relational constraints can be of any kind. Besides giving a general undecidability result for first-order definable mappings and a general decidability result for MSO-definable mappings for restricted functional dependencies, we also study the complexity of the implication problem for XML constraints that are specified by tree pattern queries and functional dependencies. Furthermore, we highlight how other specification languages for XML constraints can be formulated in the framework.
 Jakob Rehof: Subtype entailment
 Abstract: The subtype entailment problem, which has been open since 1996, is the question of decidability of the following decision problem: Given a finite set C of subtyping constraints (partial order constraints between simple type expressions) and a subtyping constraint c, does the set C entail the constraint c (in the sense that every valuation satisfying all constraints in C must also satisfy c)? Entailment is considered over a particular partial order (subtyping relation) on trees, and valuations map type variables to trees. In the talk we will focus on a reduction of the entailment problem to the universality problem for a special pushdown automaton model.
 Kristian Kersting: Towards Spectral Color Refinement (joint work with Martin Mladenov and Martin Grohe)
 Abstract: Colour refinement is a basic algorithmic routine for graph isomorphism testing that can be used as a subroutine in almost all practical isomorphism solvers. It partitions the vertices of a graph into "colour classes" in such a way that all vertices in the same colour class have the same number of neighbours in every colour class. Recently it has been shown to also have major implications for inference in probabilistic models. For instance, applied to loopy belief propagation (LBP), it partitions the random variables of the graphical model in such a way that all variables in the same colour class have the same computation trees. Consequently, one can construct a lifted model, where each node represents a set of random variables that all pass the same messages during LBP. Running LBP on this network can be significantly faster.
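The refinement loop itself is short. A minimal sketch of the routine (iterating until the partition stops getting finer) on an undirected graph given as an adjacency list:

```python
def colour_refinement(adj):
    """Colour refinement (1-dimensional Weisfeiler-Leman): repeatedly split
    colour classes until all vertices of one colour have the same multiset
    of neighbour colours. Returns the stable colouring."""
    colour = {v: 0 for v in adj}
    while True:
        # signature: own colour plus sorted multiset of neighbour colours
        sig = {v: (colour[v], tuple(sorted(colour[u] for u in adj[v]))) for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        new = {v: palette[sig[v]] for v in adj}
        if len(set(new.values())) == len(set(colour.values())):
            return new                  # no class was split: stable partition
        colour = new

# path graph 0 - 1 - 2: the two endpoints end up in one class, the middle in another
path = {0: [1], 1: [0, 2], 2: [1]}
colours = colour_refinement(path)
```

Each iteration can only refine the partition, so the loop runs at most |V| times; practical implementations achieve O((|V| + |E|) log |V|) with careful bookkeeping.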
MW 23, June 24th, 2013, Chair 1
 Ahmet Kara: Dynamic Communicating Automata and Branching High-Level MSCs
(joint work with Benedikt Bollig, Aiswarya Cyriac, Loïc Hélouët and Thomas Schwentick) Abstract: We study dynamic communicating automata (DCA), an extension of classical communicating finite-state machines that allows for dynamic creation of processes. The behavior of a DCA can be described as a set of message sequence charts (MSCs). While DCA serve as a model of an implementation, we propose branching high-level MSCs (bHMSCs) on the specification side. Our focus is on the implementability problem: given a bHMSC, can one construct an equivalent DCA? As this problem is undecidable, we introduce the notion of executability, a decidable necessary criterion for implementability. We show that executability of bHMSCs is EXPTIME-complete. We then identify a class of bHMSCs for which executability effectively implies implementability.
 Marc Gillé: OBDD-Based Representation of Interval Graphs
 Abstract: Symbolic/implicit OBDD-based graph algorithms deal with the characteristic function χ_E of the edge set of a graph G = (V, E), given as an OBDD, to get a (hopefully) compact representation and to solve graph optimization problems mainly by using functional operations. While the representation size cannot be small in general, the idea is that it can be provably small for special graph classes and then also lead to fast algorithms. In this paper, we show that the OBDD size of unit interval graphs is O(|V| / log |V|) and the OBDD size of interval graphs is O(|V| log |V|), which both improve a known result of Nunkesser and Woelfel [23]. Furthermore, we show that, using our variable order and node labeling for interval graphs, the worst-case OBDD size is Ω(|V| log |V|). We use the structure of the adjacency matrices to prove these bounds. This method may be of independent interest and can be applied to other graph classes. We also develop a maximum matching algorithm on unit interval graphs using O(log |V|) operations and evaluate this algorithm empirically.
 Marco Wilhelm: Probabilistic Knowledge Representation Using Gröbner Basis Theory
(joint work with Gabriele Kern-Isberner and Christoph Beierle) Abstract: An often-used methodology for reasoning with probabilistic conditional knowledge bases is provided by the principle of maximum entropy (the so-called MaxEnt principle), which realises an idea of informational economy. We exploit the fact that MaxEnt distributions can be computed by solving nonlinear equation systems that reflect the conditional logical structure of these distributions. Further, we apply the theory of Gröbner bases, well known from computational algebra, to the polynomial system associated with a MaxEnt distribution, in order to obtain results for reasoning with maximum entropy. We present a necessary condition for knowledge bases to be consistent, as well as approaches to answering MaxEnt queries. Finally, we discuss computational methods to establish general MaxEnt inference rules.
MW 22, February 20th, 2013, Chair 1
 Melanie Schmidt: Coresets, k-means Clustering and Projective Clustering
 Abstract: Coresets are a technique to compress huge amounts of data in a way that still allows us to solve a specific optimization problem. The k-means clustering problem is a geometric clustering problem where we are given a set of points P in Euclidean space and ask for k centers that minimize the sum of the squared distances of every point to its closest center. Here, a coreset is a (small) weighted set of points that has approximately the same cost as the original input for every set of centers. This implies that solving the problem on the coreset yields an approximation for the original problem. The coreset presented in this talk is of constant size; more specifically, the number of points in the coreset depends only on k and the precision parameter.
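For the special case k = 1 the coreset idea is exact and tiny: by the standard bias-variance identity, the mean, the point count and the cost at the mean summarize the input perfectly, since cost(P, c) = cost(P, μ) + |P|·(μ − c)² for every candidate center c. A small numerical check in one dimension:

```python
def cost(points, c):
    """k-means cost of a single centre c for 1-D points."""
    return sum((x - c) ** 2 for x in points)

points = [1.0, 2.0, 4.0, 7.0]
n = len(points)
mu = sum(points) / n          # optimal 1-mean centre is the mean
base = cost(points, mu)
# The triple (mu, n, base) is an exact summary of P for k = 1:
#   cost(P, c) == base + n * (mu - c)**2  for every centre c.
```

For k > 1 no such exact constant-size summary exists, which is why coresets settle for approximating the cost up to a (1 ± ε) factor for every set of centers.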
 Moritz Martens: NP-Completeness of Intersection Type Matching
 Abstract: Type matching problems occur in a number of contexts, including library search, component composition, and inhabitation. The intersection type matching problem under the standard notion of subtyping for intersection types will be considered: given types tau and sigma, where sigma is a constant type, does there exist a type substitution S such that S(tau) is a subtype of sigma? We show that the matching problem is NP-complete. NP-hardness holds already for the restriction to atomic substitutions. Membership in NP is challenging, and we provide an algorithm engineered for efficiency. Our algorithm is based on a nondeterministic polynomial-time normalization procedure for subtype constraint systems with intersection types.
 Thomas Zeume: On the quantifier-free dynamic complexity of Reachability
(joint work with Thomas Schwentick) Abstract: The dynamic complexity of the reachability query is studied in the dynamic complexity framework of Patnaik and Immerman, restricted to quantifier-free update formulas. We show that, with this restriction, the reachability query cannot be dynamically maintained, neither with binary auxiliary relations nor with unary auxiliary functions, and that ternary auxiliary relations are more powerful with respect to graph queries than binary auxiliary relations. We also establish that Reachability cannot be maintained with auxiliary relations or functions of arbitrary arity in the scenario where update sequences are applied to a non-empty graph and the initialization of the auxiliary relations is permutation-invariant. Furthermore, the inexpressibility results are transferred to some other queries, and some normal forms are established.
MW 21, November 26th, 2012, Chair 1
 Joachim Biskup: Signature-Based Inference-Usability Confinement for Relational Databases under Functional and Join Dependencies
 Abstract: Inference control of queries for relational databases confines the information content and thus the usability of data returned to a client, aiming to keep some pieces of information confidential as specified in a policy, in particular for the sake of privacy. In general, there is a trade-off between the following factors: on the one hand, the expressiveness offered to administrators to declare a schema, a confidentiality policy and assumptions about a client's a priori knowledge; on the other hand, the computational complexity of a provably confidentiality-preserving enforcement mechanism. We propose and investigate a new balanced solution for a widely applicable situation: we admit relational schemas with functional and join dependencies, which are also treated as a priori knowledge, and select-project sentences for policies and queries; we design an efficient signature-based enforcement mechanism that we implement for an Oracle/SQL system. At declaration time, the inference signatures are compiled from an analysis of all possible crucial inferences, and at run time they are employed as in the field of intrusion detection.
 Martin Schuster: On optimum left-to-right strategies for context-free games
 Abstract: Active context-free games are two-player games on strings over finite alphabets with one player trying to rewrite the input string to match a target specification. These games have been investigated in the context of exchanging Active XML (AXML) data. While it was known that the rewriting problem is undecidable in general, it is shown here that it is EXPSPACE-complete to decide, for a given context-free game, whether all safely rewritable strings can be safely rewritten in a left-to-right manner, a problem that was previously considered by Abiteboul et al. Furthermore, it is shown that the corresponding problem for games with finite replacement languages is EXPTIME-complete.
 Ingo Battenfeld: Observationally induced effects in cartesian closed categories
 Abstract: Alex Simpson has suggested an observationally-induced approach towards obtaining monads for computational effects in denotational semantics. The underlying idea of this approach is to use a single observation algebra as computational prototype and to obtain a computational monad as a free algebra construction derived from this prototype. Recently, it has been shown that free observationally-induced algebras exist in the category of continuous maps between topological spaces for arbitrary pre-chosen computational prototypes. In this work we transfer these results to cartesian closed categories. In particular, we show that, provided the category under consideration satisfies suitable completeness conditions, it supports a free observationally-induced algebra construction for arbitrary computational prototypes. We also show that the free algebras are obtained as certain subobjects of double exponentials involving the computational prototype as result type. Finally, we apply these results to show that in topological domain theory an observationally-induced lower powerspace construction over a QCB-space X is given by the space of non-empty closed subsets of X, topologised suitably.
MW 20, June 20th, 2012, Chair 1

Gabriele Kern-Isberner: A Constructive Approach to Independent and Evidence Retaining Belief Revision by General Information Sets (joint work with Patrick Krümpelmann)
 Abstract: Recent years have seen a lot of work towards extending the established AGM belief revision theory with respect to iterating revision, preserving conditional beliefs, and handling sets of propositions as new information. In particular, novel postulates like independence and evidence retainment have been brought forth as new standards for revising epistemic states by (sets of) propositional information. In this paper, we propose a constructive approach for revising epistemic states by sets of (propositional and conditional) beliefs that combines ideas from nonmonotonic reasoning with conditional belief revision. We also propose a novel principle called enforcement that covers both independence and evidence retainment, and we show our revision operator to comply with major postulates from the literature. Moreover, we point out the relevance of our approach for default reasoning.

Boris Düdder: Bounded Combinatory Logic (joint work with Moritz Martens, Jakob Rehof, and Paweł Urzyczyn)
 Abstract: In combinatory logic one usually assumes a fixed set of basic combinators (axiom schemes), usually K and S. In this setting the set of provable formulas (inhabited types) is PSPACE-complete in simple types and undecidable in intersection types. When arbitrary sets of axiom schemes are considered, the inhabitation problem is undecidable even in simple types (this is known as the Linial-Post theorem). Bounded combinatory logic (BCL_k) arises from combinatory logic by imposing the bound k on the depth of types (formulae) which may be substituted for type variables in axiom schemes. We consider the inhabitation (provability) problem for BCL_k: given an arbitrary set of typed combinators and a type, is there a combinatory term of this type in k-bounded combinatory logic? Our main result is that the problem is (k+2)-EXPTIME-complete for BCL_k with intersection types, for every fixed k (and hence non-elementary when k is a parameter). We also show that the problem is EXPTIME-complete for simple types, for all k. Theoretically, our results give new insight into the expressive power of intersection types. From an application perspective, our results are useful as a foundation for composition synthesis based on combinatory logic.

Marianna D'Addario: Designing q-Unique DNA Sequences with Integer Linear Programs and Euler Tours in De Bruijn Graphs
 Abstract: DNA nanoarchitectures require carefully designed oligonucleotides with certain non-hybridization guarantees, which can be formalized as the q-uniqueness property on the sequence level. We study the optimization problem of finding a longest q-unique DNA sequence. We first present a convenient formulation as an integer linear program on the underlying De Bruijn graph that allows us to flexibly incorporate a variety of constraints; solution times for practically relevant values of q are short. We then provide additional insights into the problem structure using the quotient graph of the De Bruijn graph with respect to the equivalence relation of reverse complementarity. Specifically, for odd q the quotient graph is Eulerian, and finding a longest q-unique sequence is equivalent to finding an Euler tour, hence solved in linear time (with respect to the output string length). For even q, self-complementary edges complicate the problem, and the graph has to be Eulerized by deleting a minimum number of edges. Two subcases arise, for one of which we present a complete solution, while the other one remains open.
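The Euler-tour connection can be made concrete with a short Python sketch. It deliberately ignores reverse complementarity (and hence the quotient-graph construction of the talk): for q ≥ 2, an Euler tour of the order-(q−1) De Bruijn graph over {A,C,G,T}, found here with Hierholzer's algorithm, spells a sequence in which every q-mer occurs exactly once.

```python
import itertools

def de_bruijn_sequence(q, alphabet="ACGT"):
    """Hierholzer's algorithm on the order-(q-1) De Bruijn graph (q >= 2):
    nodes are (q-1)-mers, edges are q-mers; an Euler tour spells a
    sequence containing every q-mer exactly once."""
    k = q - 1
    # remaining unused outgoing edges per node, as characters to append
    out = {"".join(p): list(alphabet)
           for p in itertools.product(alphabet, repeat=k)}
    start = alphabet[0] * k
    stack, tour = [start], []
    while stack:
        node = stack[-1]
        if out[node]:                       # follow an unused edge
            stack.append(node[1:] + out[node].pop())
        else:                               # dead end: node joins the tour
            tour.append(stack.pop())
    tour.reverse()
    # linearize: first node plus the last character of every further node
    return tour[0] + "".join(node[-1] for node in tour[1:])
```

For q = 3 this yields a string of length 4³ + 2 = 66 whose 64 length-3 windows are pairwise distinct, matching the linear-time claim for the (odd-q) Eulerian case, minus the complementarity bookkeeping.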
MW 19, February 22nd, 2012, Chair 1
 Chris Schwiegelshohn: Solving the minimum string cover problem
 Abstract: A string cover C of a set of strings S is a set of substrings from S such that every string in S can be written as a concatenation of the strings in C. Given costs assigned to each substring from S, the Minimum String Cover (MSC) problem asks for a cover of minimum total cost. This NP-hard problem has so far only been approached from a purely theoretical perspective. We propose a flow-based ILP formulation, whose structure is particularly favorable for a Lagrangian relaxation approach. By making use of the strong bounds obtained through a repeated shortest path computation in a branch-and-bound manner, we show for the first time that non-trivial MSC instances can be solved to provable optimality in reasonable time.
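For intuition about the problem itself (not the talk's ILP/Lagrangian method), here is a small brute-force Python sketch: a dynamic program over prefixes decides whether a candidate set covers a string, and an exhaustive search over subsets finds a minimum-cost cover on toy instances. The example instance is made up for illustration.

```python
from itertools import combinations

def covers(string, cover):
    """DP over prefixes: can `string` be written as a concatenation
    of strings from `cover`?"""
    ok = [True] + [False] * len(string)
    for i in range(1, len(string) + 1):
        ok[i] = any(ok[i - len(c)]
                    for c in cover
                    if len(c) <= i and string.startswith(c, i - len(c)))
    return ok[-1]

def min_string_cover(strings, cost):
    """Exhaustive search over subsets of candidate substrings;
    `cost` maps each candidate substring to its cost.
    Returns (total cost, cover) or None if no cover exists."""
    subs = sorted(cost)
    best = None
    for r in range(1, len(subs) + 1):
        for cand in combinations(subs, r):
            if all(covers(s, cand) for s in strings):
                total = sum(cost[c] for c in cand)
                if best is None or total < best[0]:
                    best = (total, cand)
    return best
```

For example, `min_string_cover(["abab", "ab"], {"a": 1, "b": 1, "ab": 1, "abab": 2})` selects the single substring "ab" at cost 1. The subset enumeration is exponential, which is exactly why the branch-and-bound approach with Lagrangian bounds matters on non-trivial instances.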
 Sven Rahmann: What challenges does genome analysis pose for theoretical computer science?
 Abstract: We present current problems in genome informatics and discuss which fundamental theoretical questions (may) lie behind them.
 Ahmet Kara: Decidability Results for Infinite State Systems

Abstract: The positive achievements in the field of automatic verification of finite state systems have motivated research on algorithmic methods for infinite state systems.
In this talk we will be concerned with general conditions on infinite state systems which are sufficient for the termination of several algorithmic procedures. We will introduce well-structured transition systems (WSTSs) as defined by Finkel and Schnoebelen. These are infinite state systems with a well-quasi-ordering on the set of states which is compatible with the transition relation. For WSTSs it can be shown that algorithmic problems like reachability, i.e. the question whether a certain set of states is reachable from an initial state, are decidable. Many transition systems from various fields of computer science, like Petri nets and lossy channel systems, can be seen as WSTSs. We will give some examples from current research where the concept of WSTSs helps to obtain decidability results.
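As a concrete instance of this decidability recipe, the following Python sketch runs the standard backward coverability algorithm on a toy vector addition system (a Petri-net-like WSTS). The wqo is componentwise ≤ on markings; upward-closed sets are represented by their finite bases of minimal elements, and Dickson's lemma guarantees that the basis stabilizes. The example net is made up for illustration.

```python
def leq(a, b):
    """Componentwise order on markings: the wqo of this WSTS."""
    return all(x <= y for x, y in zip(a, b))

def minimize(basis):
    """Keep only minimal elements; they generate the same upward-closed set."""
    return [m for m in basis
            if not any(leq(o, m) and o != m for o in basis)]

def backward_coverability(init, target, transitions):
    """Backward search for coverability: transitions are (consume, produce)
    vectors. Returns True iff some marking >= target is reachable from init."""
    basis = [tuple(target)]
    while True:
        new = []
        for t in basis:
            for consume, produce in transitions:
                # minimal marking that can fire the transition and then
                # cover t: componentwise max of the guard and t shifted back
                m = tuple(max(c, x + c - p)
                          for x, c, p in zip(t, consume, produce))
                if not any(leq(b, m) for b in basis + new):
                    new.append(m)
        if not new:          # fixpoint: Dickson's lemma ensures we get here
            break
        basis = minimize(basis + new)
    return any(leq(b, init) for b in basis)
```

With a single transition that moves a token from place 1 to place 2, one initial token suffices to cover one token on place 2 but never two, and the algorithm terminates on both queries.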

MW 18, August 9th, 2011, Chair 1

Gabriele Kern-Isberner: A Default Logical Semantics for Defeasible Argumentation
(joint work with Guillermo Simari)
Abstract: Defeasible argumentation and default reasoning are usually perceived as two similar, but distinct approaches to commonsense reasoning. In this paper, we combine these two fields by viewing (defeasible resp. default) rules as a common crucial part of both areas. We will make use of possible worlds semantics from default reasoning to provide examples for arguments, and carry over the notion of plausibility to the argumentative framework. Moreover, we base a priority relation between arguments on the tolerance partitioning of system Z and obtain a criterion phrased in system Z terms that ensures warrant in defeasible argumentation.


Melanie Schmidt: Adaptive Sampling for Clustering Problems
 Abstract: Clustering is the problem of partitioning a given set of objects into subsets, called clusters, such that objects in the same cluster are similar and objects in different clusters are dissimilar. The k-means problem is a well-known clustering problem where similarity is measured by squared Euclidean distance. The k-means algorithm is a very popular heuristic for this problem which unfortunately does not provide any quality guarantee. In this talk, we review recent improvements of the k-means algorithm by Arthur and Vassilvitskii and then consider possible directions for further research. In particular, we are interested in applications to a generalization of the k-means problem.
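The Arthur-Vassilvitskii improvement referenced here is the k-means++ seeding step, which replaces uniform sampling of the initial centers by D² ("adaptive") sampling. A minimal Python sketch for 2-D points, without the subsequent Lloyd iterations:

```python
import random

def kmeans_pp_seeding(points, k, seed=0):
    """k-means++ seeding: pick the first center uniformly at random, then
    draw each further center with probability proportional to its squared
    distance to the nearest center chosen so far."""
    rng = random.Random(seed)
    centers = [rng.choice(points)]
    while len(centers) < k:
        # squared distance of every point to its nearest current center
        d2 = [min((px - cx) ** 2 + (py - cy) ** 2 for cx, cy in centers)
              for px, py in points]
        centers.append(rng.choices(points, weights=d2, k=1)[0])
    return centers
```

Already-chosen centers get weight 0, so the k seeds are distinct whenever the input contains at least k distinct points; this spreading-out of the seeds is what yields the O(log k) expected approximation guarantee of k-means++.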

Peter Padawitz: Algebraic, relational and order-based co/induction

Abstract: Algebraic induction is a proof rule for showing properties of initial models. Its soundness follows from the fact that initial models do not have proper substructures. Algebraic coinduction is a proof rule for showing equations in final models. Its soundness follows from the fact that final models do not have proper quotients. A proof by algebraic co/induction can be carried out by relational co/induction, which is a proof rule for showing properties of least resp. greatest (not only unary or binary) relations in arbitrary models. Relational co/induction specializes order-based co/induction to the lattice of interpretations of the predicates of the underlying signature. Order-based co/induction works in any complete lattice or co/chain-complete poset. As an example of the use of algebraic coinduction, we show how it simplifies proofs of language equivalence, in particular of conjectures stating that a given context-free grammar generates a given language.
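The closing example concerns context-free grammars; the simpler regular-language analogue of a coinductive equivalence proof is easy to run. The following Python sketch decides equivalence of regular expressions by trying to grow a bisimulation over Brzozowski derivatives. The light simplification in `alt`/`cat` keeps the derivative set finite for the small examples below; full termination in general requires normalization up to associativity, commutativity and idempotence.

```python
# regular expressions as tuples: ("empty",), ("eps",), ("chr", c),
# ("alt", r, s), ("cat", r, s), ("star", r)

def alt(r, s):  # simplifying constructors tame derivative growth
    if r == ("empty",): return s
    if s == ("empty",): return r
    return r if r == s else ("alt", r, s)

def cat(r, s):
    if ("empty",) in (r, s): return ("empty",)
    if r == ("eps",): return s
    if s == ("eps",): return r
    return ("cat", r, s)

def nullable(r):  # does r accept the empty word?
    tag = r[0]
    if tag in ("eps", "star"): return True
    if tag in ("chr", "empty"): return False
    if tag == "alt": return nullable(r[1]) or nullable(r[2])
    return nullable(r[1]) and nullable(r[2])  # cat

def deriv(r, c):  # Brzozowski derivative: the words w with c.w accepted by r
    tag = r[0]
    if tag in ("eps", "empty"): return ("empty",)
    if tag == "chr": return ("eps",) if r[1] == c else ("empty",)
    if tag == "alt": return alt(deriv(r[1], c), deriv(r[2], c))
    if tag == "star": return cat(deriv(r[1], c), r)
    d = cat(deriv(r[1], c), r[2])  # cat
    return alt(d, deriv(r[2], c)) if nullable(r[1]) else d

def equivalent(r, s, alphabet="ab"):
    """Coinductive check: grow a bisimulation of derivative pairs; a pair
    with different nullability is a counterexample to equivalence."""
    seen, todo = set(), [(r, s)]
    while todo:
        pair = todo.pop()
        if pair in seen:
            continue
        a, b = pair
        if nullable(a) != nullable(b):
            return False
        seen.add(pair)
        for c in alphabet:
            todo.append((deriv(a, c), deriv(b, c)))
    return True
```

For instance, the check confirms a·a* ≡ a*·a and refutes a* ≡ (ab)*: the proof succeeds once the set of derivative pairs closes, with no induction on word length, mirroring the coinductive style advocated in the talk.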

MW 17, May 4th, 2011, Chair 1

Thomas Schwentick: Two-variable logic and Key Constraints on Data Words

Abstract: We introduce key constraints for data words and show that it is decidable whether, for a given two-variable sentence \(\varphi\) that can refer to the successor relation on positions and a set K of key constraints, there is a data string w that satisfies \(\varphi\) and respects K. Here, the formula is allowed to refer to the successor relation but not to the linear order on the positions of the word. As a byproduct, a self-contained exposition of an algorithm that decides satisfiability of such formulas (without key constraints) in 2NEXPTIME is given.


Johannes Köster: Protein Hypernetworks

Abstract: Protein-protein interactions are the fundamental building blocks of biochemical reaction systems underlying cellular functions. Importantly, the complexity and functionality of such systems emerge not from the protein interactions themselves but from the dependencies between these interactions. However, a comprehensive approach for large-scale integration of such dependencies is still lacking.
We present an approach for endowing protein networks with interaction dependencies using propositional logic, thereby obtaining protein hypernetworks. We illustrate their potential by demonstrating that they improve protein complex prediction in yeast. By modeling the effects of protein perturbations in hypernetworks we derive a perturbation impact score, a novel measure of a protein's functional importance.


Patrick Krümpelmann: On Influence and Contractions in Defeasible Logic Programming

Abstract: We investigate the problem of contraction in Defeasible Logic Programming (DeLP), a logic-based approach to defeasible argumentation. We develop different notions of contraction based on both the different forms of entailment implicitly present in argumentation-based formalisms and the influence that literals exhibit in the reasoning process. We give translations of widely accepted rationality postulates for belief contraction to our framework. Moreover, we discuss the applicability of contraction for defeasible argumentation and the role of influence in this matter.

MW 16, January 19th, 2011, Chair 1
Speaker  Subject  
Thomas Zeume, LS 1  Temporal Logics on Words with Multiple Data Values  abstract  
Henrik Björklund  Recognizing Shuffled Languages  abstract  
Peter Padawitz, LS 1  How to benefit from algebra when reasoning about languages  abstract 
MW 15, July 14th, 2010, Chair 1
Speaker  Subject  
Wim Martens, LS 1  Simplifying XML Schema: Single-Type Approximations of Regular Tree Languages  abstract  
Jan-Hendrik Lochner, LS 6  Efficient Inference Control for Open Relational Database Queries  abstract  
Matthias Thimm, LS 1  Classification and Strategical Issues of Argumentation Games on Structured Argumentation Frameworks  abstract 
MW 14, July 14th, 2010, Chair 1
Speaker  Subject  
Frank Hellweg, LS 2  Testing Euclidean Spanners  abstract  
Thomas Schwentick, LS 1  Two-Variable Logic with Two Order Relations  abstract  
Christoph Schubert, LS 10  Expressivity of modal logics over general measurable spaces  abstract 
MW 13, February 23rd, 2010, Chair 1:
Speaker  Subject  
Matthias Thimm, LS 1  Relational Probabilistic Conditionals and Maximum Entropy  abstract  
Tobias Marschall, LS 11  Exact Analysis of Horspool's and Sunday's Pattern Matching Algorithms with Probabilistic Arithmetic Automata  abstract  
Wim Martens, LS 1  Schema Design for XML Repositories: Complexity and Tractability  abstract 
MW 12, December 17th, 2008, Chair 1:
Speaker  Subject  
Sven Rahmann, LS 11  Efficient exhaustive motif discovery  abstract 
Chunlai Zhou, Peking  Finite approximation of labelled Markov processes through filtration  
Patrick Krümpelmann, LS 6  Towards a general framework for updates in logic programming. 
MW 11, July 9th, 2008, Chair 1:
Speaker  Subject  
Lena Wiese, LS 6  preCQE: preprocessing for controlled query evaluation. Inference control in predicate-logic databases.  abstract 
Ernst-Erich Doberkat, LS 11  Bisimilarity of distributionally equivalent Markov transition systems.  abstract 
Marcel Marquardt, LS 1  Dynamic complexity of formal languages. 
MW 10, May 7th, 2008, Chair 1:
Speaker  Subject  
Felix Klaedtke, ETH Zürich  On the automata size for linear arithmetics  abstract (ca. 1 hour talk) 
R. Ramanujam, IMSC Chennai  Verifying that an e-election is "Free and Fair"  abstract 
MW 9, April 9th, 2008, Chair 1:
Speaker  Subject 
Ingo Battenfeld  Computational effects in topological domain theory 
Peter Padawitz  Hyperdocuments are coalgebraic 
Matthias Thimm  On the relationship of defeasible argumentation and answer set programming 
MW 8, November 21st, 2007, Chair 1:
Speaker  Subject 
Manuela Ritterskamp  Using preference fusion in default reasoning 
Wim Martens  Conjunctive query containment over trees 
MW 7, June 27th, 2007, Chair 2:
Speaker  Subject 
Christoph Schubert  Bisimilarity, behavioral and logical equivalence for stochastic right coalgebras 
Torben Weibert  Confidentiality policies for controlled query evaluation 
Henrik Björklund  Bounded depth data trees 
MW 6, February 28th, 2007, Chair 1:
Speaker  Subject 
Horst Wedde  Concurrency in distributed systems under autonomous and enforced actions 
Ernst-Erich Doberkat  A remarkable generating function 
Wim Martens  Conjunctive query containment over trees 
MW 5, December 20th, 2006, Chair 10
Speaker  Subject  
Peter Buchholz  Bisimulation for weighted automata  
Gabriele Kern-Isberner  Probabilistic Abduction without priors  abstract 
Henrik Björklund  Towards regular data languages  abstract 
MW 4, September 4th, 2006, math. dep. chair of Prof. Skutella:
Speaker  Subject  
Lena Wiese  On finding an inferenceproof complete database for controlled query evaluation  
Volker Weber  Boundedvariable fragments of hybrid logics  slides 
Martin Skutella  Solving evacuation problems efficiently: earliest arrival flows with multiple sources  slides 
MW 3, July 10th, 2006, Chair 1:
The workshop took place directly after a guest lecture by Dr. Thomas Eiter (TU Wien), a guest of Prof. Petra Mutzel. Title: Data integration and answer set programming. abstract
Speaker  Subject  
Carsten Witt  Runtime analyses of randomized search heuristics for the partition problem  slides 
Peter Padawitz  Dialgebraic picture generation: A case study in multilevel data abstraction  slides 
MW 2, May 22nd, 2006, Chair 11:
Speaker  Subject 
E.E. Doberkat  Randomized morphisms for action-labelled Markov transition systems 
Martin Sauerhoff  Approximate counting and frequency moments of long data streams 
Gabriele KernIsberner  A note on comparing semantics for conditionals 
MW 1, March 20th, 2006, Chair 1:
Speaker  Subject  
Joachim Biskup  Controlled query evaluation with open queries for a decidable relational submodel  slides 
Markus Chimani  Non-planar core reduction  
Thomas Schwentick  Pebble automata on trees  slides 