
Answer Set Solving with Bounded Treewidth Revisited

This is the author's self-archived copy including detailed proofs. A preliminary version of the paper was presented at the workshop TAASP'16. Research was supported by the Austrian Science Fund (FWF), Grant Y698.

Johannes K. Fichte (also affiliated with the Institute of Computer Science and Computational Science at University of Potsdam, Germany), Michael Morak, Markus Hecher and Stefan Woltran
TU Wien, Austria
lastname@dbai.tuwien.ac.at
Abstract

Parameterized algorithms are a way to solve hard problems more efficiently, given that a specific parameter of the input is small. In this paper, we apply this idea to the field of answer set programming (ASP). To this end, we propose two kinds of graph representations of programs to exploit their treewidth as a parameter. Treewidth roughly measures the extent to which the internal structure of a program resembles a tree. Our main contribution is the design of parameterized dynamic programming algorithms, which run in linear time if the treewidth and weights of the given program are bounded. Compared to previous work, our algorithms handle the full syntax of ASP. Finally, we report on an empirical evaluation that shows good runtime behaviour for benchmark instances of low treewidth, especially for counting answer sets.


1 Introduction

Parameterized algorithms [14, 5] have attracted considerable interest in recent years and make it possible to tackle hard problems by directly exploiting a small parameter of the input problem. One particular goal in this field is to find guarantees that the runtime is exponential exclusively in the parameter, and polynomial in the input size (so-called fixed-parameter tractable algorithms). A parameter that has been researched extensively is treewidth [16, 2]. Generally speaking, treewidth measures the closeness of a graph to a tree, based on the observation that problems on trees are often easier to solve than on arbitrary graphs. A parameterized algorithm exploiting small treewidth takes a tree decomposition, which is an arrangement of a graph into a tree, and evaluates the problem in parts, via dynamic programming (DP) on the tree decomposition.

ASP [3, 13] is a logic-based declarative modelling language and problem solving framework where solutions, so-called answer sets, of a given logic program directly represent the solutions of the modelled problem. Jakl et al. [11] give a DP algorithm for disjunctive rules only, whose runtime is linear in the input size of the program and double exponential in the treewidth of a particular graph representation of the program structure. However, modern ASP systems allow for an extended syntax that includes, among others, weight rules and choice rules. Pichler et al. [15] investigated the complexity of programs with weight rules. They also presented DP algorithms for programs with cardinality rules (i.e., a restricted version of weight rules), but without disjunction.

In this paper, we propose DP algorithms for finding answer sets that are able to directly treat all kinds of ASP rules. While such rules can be transformed into disjunctive rules, we avoid the resulting polynomial overhead with our algorithms. In particular, we present two approaches based on two different types of graphs representing the program structure. Firstly, we consider the primal graph, which allows for an intuitive algorithm that also treats the extended ASP rules. While for a given disjunctive program the treewidth of the primal graph may be larger than the treewidth of the graph representation used by Jakl et al. [11], our algorithm uses simpler data structures and lays the foundations for understanding how we can also handle extended rules. Our second graph representation is the incidence graph, a generalization of the representation used by Jakl et al. Algorithms for this graph representation are more sophisticated, since weight and choice rules can no longer be completely evaluated in the same computation step. Our algorithms yield upper bounds that are linear in the program size, double-exponential in the treewidth, and single-exponential in the maximum weights. We extend both algorithms to count optimal answer sets. For this particular task, experiments show that we are able to outperform existing systems from multiple domains, given input instances of low treewidth, both randomly generated and obtained from real-world graphs of traffic networks. Our system is publicly available on GitHub (see https://github.com/daajoe/dynasp).

2 Formal Background

2.1 Answer Set Programming (ASP)

ASP is a declarative modeling and problem solving framework; for a full introduction, see, e.g., [3, 13]. State-of-the-art ASP grounders support the full ASP-Core-2 language [4] and output the smodels input format [19], which we will use for our algorithms. Let $\ell$, $m$, $n$ be non-negative integers such that $\ell \leq m \leq n$, let $a_1, \ldots, a_n$ be distinct propositional atoms, let $w, w_1, \ldots, w_n$ be non-negative integers, and let $l \in \{0,1\}$. A choice rule is an expression of the form $\{a_1; \ldots; a_\ell\} \leftarrow a_{\ell+1}, \ldots, a_m, \neg a_{m+1}, \ldots, \neg a_n$, a disjunctive rule is of the form $a_1 \vee \cdots \vee a_\ell \leftarrow a_{\ell+1}, \ldots, a_m, \neg a_{m+1}, \ldots, \neg a_n$, and a weight rule is of the form $a_1 \leftarrow w \leq \{a_2 = w_2, \ldots, a_m = w_m, \neg a_{m+1} = w_{m+1}, \ldots, \neg a_n = w_n\}$. Finally, an optimization rule is an expression of the form $\mathrm{minimize}\{\neg^{l}\, a_1 = w\}$, where $\neg^{0} a_1 = a_1$ and $\neg^{1} a_1 = \neg a_1$. A rule is either a disjunctive, a choice, a weight, or an optimization rule.

For a choice, disjunctive, or weight rule $r$, let $H(r)$ denote the set of its head atoms, $B^{+}(r)$ the set of atoms occurring positively in its body, and $B^{-}(r)$ the set of atoms occurring negatively in its body. For a weight rule $r$, let $\mathrm{wght}(r, a)$ map an atom $a$ to its corresponding weight $w_a$ in rule $r$ if $a \in B^{+}(r) \cup B^{-}(r)$ and to $0$ otherwise, let $\mathrm{wght}(r, A) := \sum_{a \in A} \mathrm{wght}(r, a)$ for a set $A$ of atoms, and let $\mathrm{bnd}(r) := w$ be its bound. For an optimization rule $r$, let $\mathrm{wght}(r, a_1) := w$, and if $l = 0$, let $B^{+}(r) := \{a_1\}$ and $B^{-}(r) := \emptyset$; or if $l = 1$, let $B^{+}(r) := \emptyset$ and $B^{-}(r) := \{a_1\}$. For a rule $r$, let $\mathrm{at}(r) := H(r) \cup B^{+}(r) \cup B^{-}(r)$ denote its atoms and $B(r) := B^{+}(r) \cup B^{-}(r)$ its body. A program $\Pi$ is a set of rules. Let $\mathrm{at}(\Pi) := \bigcup_{r \in \Pi} \mathrm{at}(r)$, and let $\mathrm{CH}(\Pi)$, $\mathrm{DISJ}(\Pi)$, $\mathrm{OPT}(\Pi)$, and $\mathrm{WGT}(\Pi)$ denote the sets of all choice, disjunctive, optimization, and weight rules in $\Pi$, respectively.

A set $M$ of atoms satisfies a rule $r$ if (i) $H(r) \cap M \neq \emptyset$ or $(B^{+}(r) \setminus M) \cup (B^{-}(r) \cap M) \neq \emptyset$ for $r \in \mathrm{DISJ}(\Pi)$, (ii) $H(r) \cap M \neq \emptyset$ or $\mathrm{wght}(r, (B^{+}(r) \cap M) \cup (B^{-}(r) \setminus M)) < \mathrm{bnd}(r)$ for $r \in \mathrm{WGT}(\Pi)$, or (iii) $r \in \mathrm{CH}(\Pi) \cup \mathrm{OPT}(\Pi)$, since choice and optimization rules impose no restriction on models. $M$ is a model of $\Pi$, denoted by $M \models \Pi$, if $M$ satisfies every rule $r \in \Pi$. Further, we write $M \models R$ for a set $R$ of rules if $M$ satisfies every rule $r \in R$.

The reduct $r^{M}$ (i) of a choice rule $r$ is the set of rules $\{a \leftarrow B^{+}(r) : a \in H(r) \cap M\}$ if $B^{-}(r) \cap M = \emptyset$ and $\emptyset$ otherwise, (ii) of a disjunctive rule $r$ is the singleton $\{H(r) \leftarrow B^{+}(r)\}$ if $B^{-}(r) \cap M = \emptyset$ and $\emptyset$ otherwise, and (iii) of a weight rule $r$ is the singleton $\{a_1 \leftarrow w' \leq \{a = \mathrm{wght}(r,a) : a \in B^{+}(r)\}\}$ where $w' = \mathrm{bnd}(r) - \mathrm{wght}(r, B^{-}(r) \setminus M)$. $\Pi^{M} := \bigcup_{r \in \Pi} r^{M}$ is called the GL reduct of $\Pi$ with respect to $M$. A set $M$ is an answer set of program $\Pi$ if (i) $M \models \Pi$ and (ii) there is no $M' \subsetneq M$ such that $M' \models \Pi^{M}$, that is, $M$ is subset minimal with respect to $\Pi^{M}$.
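To make this reduct-based semantics concrete, the following Python sketch (an illustration of ours, not part of the algorithms of this paper) checks the answer-set property for programs consisting of disjunctive rules only, by brute-force enumeration of subsets; the representation of a rule as a (head, positive body, negative body) triple of frozensets is an assumption made purely for this sketch.

from itertools import combinations

# A disjunctive rule is modelled as a triple (head, pos, neg) of frozensets
# of atoms; this representation is our own and covers disjunctive rules only.
def satisfies(m, rule):
    head, pos, neg = rule
    body_holds = pos <= m and not (neg & m)
    return not body_holds or bool(head & m)

def is_model(m, program):
    return all(satisfies(m, r) for r in program)

def reduct(program, m):
    # GL reduct for disjunctive rules: drop every rule whose negative body
    # intersects M and remove the negative body from the remaining rules.
    return [(head, pos, frozenset())
            for head, pos, neg in program if not (neg & m)]

def is_answer_set(m, program):
    if not is_model(m, program):
        return False
    red = reduct(program, m)
    # Subset minimality: no proper subset of M may be a model of the reduct.
    return not any(is_model(frozenset(sub), red)
                   for k in range(len(m))
                   for sub in combinations(sorted(m), k))

if __name__ == "__main__":
    # Program: a v b <- .   c <- a, not b.
    prog = [(frozenset("ab"), frozenset(), frozenset()),
            (frozenset("c"), frozenset("a"), frozenset("b"))]
    for cand in map(frozenset, ["a", "ac", "ab"]):
        print(sorted(cand), is_answer_set(cand, prog))

Running the example prints that {a, c} is an answer set of the sketched program, whereas {a} and {a, b} are not.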

We call the sum of the weights $\mathrm{wght}(r, a_1)$ over all optimization rules $r \in \mathrm{OPT}(\Pi)$ whose literal is satisfied by $M$ (that is, $a_1 \in M$ if $l = 0$, and $a_1 \notin M$ if $l = 1$) the cost of model $M$ for $\Pi$ with respect to the set $\mathrm{OPT}(\Pi)$. An answer set $M$ of $\Pi$ is optimal if its cost is minimal over all answer sets.

Figure 1: Graph  with a TD of (left) and graph  with a TD of (right).
Example 1.

Let . Then, the sets , and are answer sets of .

Given a program $\Pi$, we consider the problems of computing an answer set (called AS) and outputting the number of optimal answer sets (called #AspO).

Next, we show that, under standard complexity-theoretic assumptions, #Asp, the problem of counting all answer sets of a given program, is strictly harder than #SAT.

Theorem 1.

#Asp for programs without optimization rules is $\#{\cdot}\mathrm{coNP}$-complete.

Proof.

Observe that programs containing choice and weight rules can be compiled to disjunctive ones (normalization) without these rule types (see [8]) using a polynomial number (in the original program size) of rules. Membership follows from the fact that, given such a normalized program $\Pi$ and an interpretation $M$, checking whether $M$ is an answer set of $\Pi$ is coNP-complete, see e.g., [12]. Hardness is a direct consequence of $\#{\cdot}\mathrm{coNP}$-hardness for the problem of counting subset minimal models of a CNF formula [6], since answer sets of negation-free programs and subset-minimal models of CNF formulas are essentially the same objects. ∎

Remark 1.

The counting complexity of #Asp including optimization rules (i.e., where only optimal answer sets are counted) is slightly higher; exact results can be established employing hardness results from other sources [10].

2.2 Tree Decompositions

Let $G = (V, E)$ be a graph, $T = (N, F)$ a rooted tree with root $n$, and $\chi$ a function that maps each node $t \in N$ to a set of vertices $\chi(t) \subseteq V$. We call the sets $\chi(t)$ bags and $N$ the set of nodes. Then, the pair $\mathcal{T} = (T, \chi)$ is a tree decomposition (TD) of $G$ if the following conditions hold: (i) all vertices occur in some bag, that is, for every vertex $v \in V$ there is a node $t \in N$ with $v \in \chi(t)$; (ii) all edges occur in some bag, that is, for every edge $e \in E$ there is a node $t \in N$ with $e \subseteq \chi(t)$; and (iii) the connectedness condition: for any three nodes $t_1, t_2, t_3 \in N$, if $t_2$ lies on the unique path from $t_1$ to $t_3$, then $\chi(t_1) \cap \chi(t_3) \subseteq \chi(t_2)$. We call $\max_{t \in N} |\chi(t)| - 1$ the width of the TD. The treewidth $\mathrm{tw}(G)$ of a graph $G$ is the minimum width over all possible TDs of $G$.

Note that each graph $G$ has a trivial TD consisting of a tree with a single node $t$ and the mapping $\chi(t) = V$. It is well known that the treewidth of a tree is $1$, and a graph containing a clique of size $k$ has treewidth at least $k-1$. For some arbitrary but fixed integer $k$ and a graph of treewidth at most $k$, we can compute a TD of width at most $k$ in time $2^{\mathcal{O}(k^3)} \cdot |V|$ [2]. Given a TD $(T, \chi)$ with $T = (N, F)$, for a node $t \in N$ we say that $\mathrm{type}(t)$ is leaf if $t$ has no children; join if $t$ has children $t'$ and $t''$ with $t' \neq t''$ and $\chi(t) = \chi(t') = \chi(t'')$; int ("introduce") if $t$ has a single child $t'$, $\chi(t') \subseteq \chi(t)$ and $|\chi(t)| = |\chi(t')| + 1$; rem ("removal") if $t$ has a single child $t'$, $\chi(t) \subseteq \chi(t')$ and $|\chi(t')| = |\chi(t)| + 1$. If every node has at most two children, $\mathrm{type}(t) \in \{\mathrm{leaf}, \mathrm{join}, \mathrm{int}, \mathrm{rem}\}$ for every node $t$, and bags of leaf nodes and the root are empty, then the TD is called nice. For every TD, we can compute a nice TD in linear time without increasing the width [2]. In our algorithms, we will traverse a TD bottom up, therefore, let $\mathrm{post\text{-}order}(T, t)$ be the sequence of nodes in post-order of the induced subtree of $T$ rooted at $t$.
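For illustration, the following Python sketch (our own lightweight encoding, assuming a TD is given by a dict of children and a dict of bags) checks the three TD conditions and classifies the node types of a nice TD.

def is_td(vertices, edges, children, bags):
    # children: node -> list of child nodes; bags: node -> set of vertices
    nodes = set(bags)
    parent = {c: p for p, cs in children.items() for c in cs}
    # (i) every vertex occurs in some bag
    if any(all(v not in bags[t] for t in nodes) for v in vertices):
        return False
    # (ii) every edge is contained in some bag
    if any(all(not set(e) <= bags[t] for t in nodes) for e in edges):
        return False
    # (iii) connectedness: the nodes whose bags contain a vertex v
    # must induce a connected subtree
    for v in vertices:
        occ = {t for t in nodes if v in bags[t]}
        seen, stack = set(), [next(iter(occ))]
        while stack:
            t = stack.pop()
            seen.add(t)
            for u in list(children.get(t, [])) + [parent.get(t)]:
                if u in occ and u not in seen:
                    stack.append(u)
        if seen != occ:
            return False
    return True

def width(bags):
    return max(len(b) for b in bags.values()) - 1

def node_type(t, children, bags):
    # Classify a node of a nice TD as leaf, join, int, or rem.
    cs = children.get(t, [])
    if not cs:
        return "leaf"
    if len(cs) == 2:
        return "join"   # a nice TD additionally requires equal bags here
    (c,) = cs
    if bags[c] <= bags[t] and len(bags[t]) == len(bags[c]) + 1:
        return "int"    # exactly one vertex is introduced
    if bags[t] <= bags[c] and len(bags[c]) == len(bags[t]) + 1:
        return "rem"    # exactly one vertex is removed
    raise ValueError("not a nice TD at node {0}".format(t))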

Example 2.

Figure 1 (left) shows a graph  together with a TD of  that is of width . Note that has treewidth , since it contains a clique on the vertices . Further, the TD in Figure 2 is a nice TD of .

2.3 Graph Representations of Programs

In order to use TDs for ASP solving, we need dedicated graph representations of ASP programs. The primal graph of a program $\Pi$ has the atoms of $\Pi$ as vertices and an edge $\{a, b\}$ if there exists a rule $r \in \Pi$ with $a, b \in \mathrm{at}(r)$. The incidence graph of $\Pi$ is the bipartite graph that has the atoms and rules of $\Pi$ as vertices and an edge $\{a, r\}$ if $a \in \mathrm{at}(r)$ for some rule $r \in \Pi$. These definitions adapt similar concepts from SAT [17].
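As an illustration of the two constructions, consider the following Python sketch; representing each rule simply by its set of atoms $\mathrm{at}(r)$ is an assumption we make here, since heads and bodies do not matter for either graph.

from itertools import combinations

def primal_graph(program):
    # program: list of rules, each given as the set of its atoms at(r)
    vertices = set().union(*program) if program else set()
    edges = {frozenset(p) for atoms in program
             for p in combinations(sorted(atoms), 2)}
    return vertices, edges

def incidence_graph(program):
    # bipartite graph; rule vertices are represented by ("r", index)
    atoms = set().union(*program) if program else set()
    rules = {("r", i) for i in range(len(program))}
    edges = {frozenset({a, ("r", i)})
             for i, atom_set in enumerate(program) for a in atom_set}
    return atoms | rules, edges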

In: Table algorithm $\mathbb{A}$, nice TD $\mathcal{T} = (T, \chi)$ with root $n$ of a graph representation $G$ of program $\Pi$.
Out: Tables: maps each TD node $t$ to some computed table $\tau_t$.
1 for iterate $t$ in post-order($T$, $n$) do
2       Child-Tabs := the tables Tables[$t'$] of the children $t'$ of $t$ in $T$
3       $\tau_t$ := $\mathbb{A}$($t$, $\chi(t)$, $\Pi_t$, Child-Tabs)
4       Tables[$t$] := $\tau_t$
Algorithm 1 Algorithm for Dynamic Programming on TD $\mathcal{T}$ for ASP.
Example 3.

Recall program $\Pi$ of Example 1. We observe that the graph in the left (right) part of Figure 1 is the primal (incidence) graph of $\Pi$.

2.4 Sub-Programs

Let $\mathcal{T} = (T, \chi)$ be a nice TD of a graph representation $G$ of a program $\Pi$. Further, let $t \in N$ be a node of $T$ and $n$ the root of $T$. The bag-rules are defined as $\Pi_t := \{r \in \Pi : \mathrm{at}(r) \subseteq \chi(t)\}$ if $G$ is the primal graph and as $\Pi_t := \Pi \cap \chi(t)$ if $G$ is the incidence graph. Further, the set $\mathrm{at}_{\leq t} := \bigcup_{s \in \mathrm{post\text{-}order}(T, t)} \chi(s) \cap \mathrm{at}(\Pi)$ is called atoms below $t$, the program below $t$ is defined as $\Pi_{\leq t} := \bigcup_{s \in \mathrm{post\text{-}order}(T, t)} \Pi_s$, and the program strictly below $t$ is $\Pi_{<t} := \Pi_{\leq t} \setminus \Pi_t$. It holds that $\Pi_{\leq n} = \Pi$ and $\mathrm{at}_{\leq n} = \mathrm{at}(\Pi)$.
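Following these definitions, a small Python sketch of the sub-program notions for the primal graph representation looks as follows; rules are again represented by their atom sets, a simplification of ours.

def bag_rules_primal(program, bag):
    # rules whose atoms are fully contained in the bag chi(t)
    return [r for r in program if set(r) <= bag]

def subtree_nodes(t, children):
    out, stack = [], [t]
    while stack:
        u = stack.pop()
        out.append(u)
        stack.extend(children.get(u, []))
    return out

def atoms_below(t, children, bags, atoms):
    # union of all bags in the subtree rooted at t, restricted to atoms
    return set().union(*(bags[u] for u in subtree_nodes(t, children))) & atoms

def program_below(program, t, children, bags):
    # union of the bag-rules over all nodes of the subtree rooted at t
    below = set()
    for u in subtree_nodes(t, children):
        below.update(map(frozenset, bag_rules_primal(program, bags[u])))
    return below

def program_strictly_below(program, t, children, bags):
    here = set(map(frozenset, bag_rules_primal(program, bags[t])))
    return program_below(program, t, children, bags) - here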

Example 4.

Intuitively, TDs of Figure 1 enable us to evaluate by analyzing sub-programs ( and ) and combining results agreeing on . Indeed, for the given TD of Figure 1 (left), , and . For the TD of Figure 1 (right), we have and , as well as and . Moreover, for TD of Figure 2, , and .

3 ASP via Dynamic Programming on TDs

In the next two sections, we propose two dynamic programming (DP) algorithms for ASP without optimization rules, based on two different graph representations, namely the primal and the incidence graph. Both algorithms make use of the fact that answer sets of a given program $\Pi$ are (i) models of $\Pi$ and (ii) subset minimal with respect to the GL reduct $\Pi^{M}$. Intuitively, our algorithms compute, for each TD node $t$, (i) sets of atoms, called (local) witnesses, representing parts of potential models of $\Pi_{\leq t}$, and (ii) for each local witness $M$ subsets of $M$, called (local) counterwitnesses, representing subsets of potential models of $\Pi_{\leq t}$ which (locally) contradict that $M$ can be extended to an answer set of $\Pi$. We give the basis of our algorithms in Algorithm 1, which sketches the general DP scheme for ASP solving on TDs. Roughly, the algorithm splits the search space based on a given nice TD and evaluates the input program $\Pi$ in parts. The results are stored in so-called tables, that is, sets of all possible tuples of witnesses and counterwitnesses for a given TD node. To this end, we define two table algorithms, which compute tables for a node $t$ of the TD using the primal graph and the incidence graph, respectively. To be more concrete, given a table algorithm $\mathbb{A}$, Algorithm 1 visits every node $t$ in post-order; then, based on $\Pi_t$, it computes a table $\tau_t$ for node $t$ from the tables of the children of $t$, and stores $\tau_t$ in Tables[$t$].
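A minimal Python sketch of this traversal scheme, in the spirit of Algorithm 1, is given below; the encoding of a nice TD by children and bag dictionaries and the callable interface of the table algorithm are assumptions of this sketch, not part of the paper.

def post_order(root, children):
    # nodes of the subtree rooted at `root` in post-order
    order, stack = [], [(root, False)]
    while stack:
        t, expanded = stack.pop()
        if expanded:
            order.append(t)
        else:
            stack.append((t, True))
            stack.extend((c, False) for c in children.get(t, []))
    return order

def dp_on_td(table_algorithm, root, children, bags, bag_rules):
    # visit the nice TD bottom-up and compute one table per node
    # from the tables of its children, as sketched in Algorithm 1
    tables = {}
    for t in post_order(root, children):
        child_tabs = [tables[c] for c in children.get(t, [])]
        tables[t] = table_algorithm(t, bags[t], bag_rules(t), child_tabs)
    return tables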

3.1 Using Decompositions of Primal Graphs

In: Bag $\chi(t)$, bag-rules $\Pi_t$, and child tables Child-Tabs of node $t$.
Out: Table $\tau_t$.
1 if $\mathrm{type}(t) = \mathrm{leaf}$ then ...
2 else if $\mathrm{type}(t) = \mathrm{int}$ and atom $a$ is introduced then
3       ...
4 else if $\mathrm{type}(t) = \mathrm{rem}$ and atom $a$ is removed then
5       ...
6 else if $\mathrm{type}(t) = \mathrm{join}$ and $t$ has children $t'$, $t''$ with $t' \neq t''$ then
7       ...
Algorithm 2 Table algorithm for the primal graph.

In this section, we present our table algorithm for the primal graph in two parts: (i) finding models of $\Pi$ and (ii) finding models which are subset minimal with respect to the GL reduct. For the sake of clarity, we first present only the first tuple positions (red parts) of Algorithm 2 to solve (i). We call the resulting restriction the model-finding part of the table algorithm.

Example 5.

Consider program  from Example 1 and in Figure 2 (left) TD  of  and the tables , , , which illustrate computation results obtained during post-order traversal of by . Table  as . Since , we construct table  from  by taking  and for each  (corresponding to a guess on ). Then, introduces and introduces . , but since we have for . In consequence, for each  of table , we have since enforces satisfiability of in node . We derive tables  to similarly. Since , we remove atom  from all elements in to construct . Note that we have already seen all rules where occurs and hence can no longer affect witnesses during the remaining traversal. We similarly construct . Since , we construct table  by taking the intersection . Intuitively, this combines witnesses agreeing on . Node  is again of type rem. By definition (primal graph and TDs) for every , atoms  occur together in at least one common bag. Hence, and since , we can construct a model of  from the tables. For example, we obtain the model .
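The following Python sketch captures only this model-finding part for disjunctive rules, using one table per node that stores the local witness sets; the data representation (tables as sets of frozensets, rules as (head, positive body, negative body) triples of frozensets, node kind and introduced or removed atom passed explicitly) is our own and omits choice and weight rules as well as the counterwitnesses needed for answer sets.

def satisfies(m, rule):
    # disjunctive rule as a (head, pos, neg) triple of frozensets
    head, pos, neg = rule
    return bool(head & m) or not (pos <= m) or bool(neg & m)

def prim_model_table(bag_rules, child_tabs, kind, atom=None):
    # computes the table of one TD node from the child tables;
    # kind is one of "leaf", "int", "rem", "join"
    if kind == "leaf":
        table = {frozenset()}
    elif kind == "int":
        # guess the truth value of the introduced atom
        table = {m | extra for m in child_tabs[0]
                 for extra in (frozenset(), frozenset({atom}))}
    elif kind == "rem":
        # project the removed atom away; by Observation 1 every rule
        # containing it has already been checked in some bag
        table = {m - {atom} for m in child_tabs[0]}
    else:
        # join: keep witnesses that occur in both child tables
        table = child_tabs[0] & child_tabs[1]
    # keep only witnesses satisfying all rules covered by the current bag
    return {m for m in table if all(satisfies(m, r) for r in bag_rules)}

Plugged into a bottom-up traversal of a nice TD (with the node kind and the introduced or removed atom read off the decomposition before each call), a non-empty table at the empty root bag indicates that the disjunctive part of the program has a model.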

Observation 1.

Let $\Pi$ be a program and $\mathcal{T}$ a TD of the primal graph of $\Pi$. Then, for every rule $r \in \Pi$ there is at least one bag in $\mathcal{T}$ containing all atoms of $r$.

Proof.

By definition, the primal graph contains a clique on all atoms participating in a rule $r$. Since a TD must contain each edge of the original graph in some bag and satisfies the connectedness condition, every clique of the graph is fully contained in some bag; in particular, there is at least one bag containing all (clique) atoms of $r$. ∎
