In the multicoloring problem, also known as (a:b)-coloring or b-fold coloring, we are given a graph G and a set of a colors, and the task is to assign a subset of b colors to each vertex of G so that adjacent vertices receive disjoint color subsets. This natural generalization of the classic coloring problem (the b = 1 case) is equivalent to finding a homomorphism to the Kneser graph KG_{a,b}, and gives relaxations approaching the fractional chromatic number.
We study the complexity of determining whether a graph has an (a:b)-coloring. Our main result is that this problem does not admit an algorithm with running time f(b)·2^{o(log b)·n}, for any computable function f, unless the Exponential Time Hypothesis (ETH) fails. A (b+1)^n·n^{O(1)}-time algorithm due to Nederlof [2008] shows that this is tight. A direct corollary of our result is that the graph homomorphism problem does not admit an algorithm with running time 2^{O(n+h)}, where n and h denote the numbers of vertices of the input and the target graph, unless ETH fails, even if the target graph is required to be a Kneser graph. This refines the understanding given by the recent lower bound of Cygan et al. [SODA 2016].
The crucial ingredient in our hardness reduction is the usage of detecting matrices of Lindström [Canad. Math. Bull., 1965], a combinatorial tool that, to the best of our knowledge, has not yet been used for proving complexity lower bounds. As a side result, we prove that the running times of the algorithms of Abasi et al. [MFCS 2014] and of Gabizon et al. [ESA 2015] for the monomial detection problem are optimal under ETH.
1 Introduction
The complexity of determining the chromatic number of a graph is undoubtedly among the most intensively studied computational problems. Countless variants, extensions, and generalizations of graph colorings have been introduced and investigated. Here, we focus on multicolorings, also known as (a:b)-colorings. In this setting, we are given a graph G, a palette of a colors, and a number b. An (a:b)-coloring of G is any assignment of b distinct colors to each vertex so that adjacent vertices receive disjoint subsets of colors. The (a:b)-coloring problem asks whether G admits an (a:b)-coloring. Note that for b = 1 we obtain the classic graph coloring problem. The smallest a for which an (a:b)-coloring exists is called the b-fold chromatic number, denoted by χ_b(G).
The motivation behind (a:b)-colorings can be perhaps best explained by showing the connection with the fractional chromatic number. The fractional chromatic number of a graph G, denoted χ_f(G), is the optimum value of the natural LP relaxation of the problem of computing the chromatic number of G, expressed as finding a cover of the vertex set using the minimum possible number of independent sets. It can be easily seen that by relaxing the standard coloring problem by allowing b times more colors while requiring that every vertex receives b colors and adjacent vertices receive disjoint subsets, with increasing b we approximate the fractional chromatic number better and better. Consequently, χ_f(G) = inf_{b ≥ 1} χ_b(G)/b = lim_{b→∞} χ_b(G)/b.
Another interesting connection concerns Kneser graphs. Recall that for positive integers a, b with a ≥ 2b, the Kneser graph KG_{a,b} has all b-element subsets of [a] as vertices, and two subsets are considered adjacent if and only if they are disjoint. For instance, KG_{5,2} is the well-known Petersen graph (see Fig. 1, right). Thus, an (a:b)-coloring of a graph G can be interpreted as a homomorphism from G to the Kneser graph KG_{a,b} (see Fig. 1). Kneser graphs are well studied in the context of graph colorings, mostly due to the celebrated result of Lovász [29], who determined their chromatic number, initiating the field of topological combinatorics.
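The equivalence between (a:b)-colorings and homomorphisms into KG_{a,b} can be checked directly on small graphs. The following brute-force sketch (our own illustration, not from the paper; the function name is ours) searches for a homomorphism into the Kneser graph and confirms that the 5-cycle, whose fractional chromatic number is 5/2, is (5:2)-colorable but not (4:2)-colorable.

```python
from itertools import combinations, product

def is_ab_colorable(graph, a, b):
    """Brute-force test for an (a:b)-coloring of `graph` (an adjacency dict),
    i.e., a homomorphism into the Kneser graph KG_{a,b}: every vertex gets a
    b-subset of [a], and adjacent vertices get disjoint subsets."""
    verts = sorted(graph)
    b_subsets = [frozenset(s) for s in combinations(range(a), b)]
    for assignment in product(b_subsets, repeat=len(verts)):
        col = dict(zip(verts, assignment))
        if all(col[u].isdisjoint(col[v]) for u in verts for v in graph[u] if u < v):
            return True
    return False

# The 5-cycle: chi(C5) = 3, but chi_f(C5) = 5/2, so it is (5:2)-colorable,
# i.e., C5 maps homomorphically to KG_{5,2}, the Petersen graph.
c5 = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
print(is_ab_colorable(c5, 5, 2))   # True
print(is_ab_colorable(c5, 4, 2))   # False: 4 colors are too few for b = 2
```

This exhaustive search is of course only viable on toy instances; the point of the paper is precisely how far the exponential dependence can be improved.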
Multicolorings and (a:b)-colorings have been studied both from combinatorial [7, 12, 27] and algorithmic [5, 19, 20, 25, 26, 30, 31, 34] points of view. The main real-life motivation comes from the problem of assigning frequencies to nodes in a cellular network so that adjacent nodes receive disjoint sets of frequencies on which they can operate. This makes (near-)planar and distributed settings particularly interesting for practical applications. We refer to the survey of Halldórsson and Kortsarz [18] for a broader discussion.
In this paper we focus on the paradigm of exact exponential-time algorithms: given a graph G on n vertices and numbers a and b, we would like to determine whether G is (a:b)-colorable as quickly as possible. Since the problem is already NP-hard for a = 3 and b = 1, we do not expect it to be solvable in polynomial time, and hence look for an efficient exponential-time algorithm. A straightforward dynamic programming approach yields an algorithm with running time O*((b+1)^n · 2^n) = 2^{O(n log b)} as follows (the O*(·) notation hides factors polynomial in the input size). For each function f: V(G) → {0, 1, …, b} and each j = 0, 1, …, a, we create one boolean entry D[f, j] denoting whether one can choose j independent sets in G so that every vertex v is covered exactly f(v) times. Then the value D[f, j] can be computed as a disjunction of the values D[f′, j − 1] over all f′ obtained from f by subtracting 1 on the vertices of some independent set in G.
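The dynamic programming just described can be replayed on toy instances. The sketch below is our own illustrative reimplementation (names are ours): states are coverage profiles f: V → {0,…,b}, and each round adds one more color class (independent set).

```python
from itertools import combinations

def ab_colorable_dp(graph, a, b):
    """Profile DP sketch: a state is f: V -> {0,...,b} (a coverage profile);
    f is reachable after j rounds iff j independent sets can be chosen so that
    each vertex v is covered exactly f(v) times. The answer is YES iff the
    all-b profile is reachable using at most a sets (color classes)."""
    verts = sorted(graph)
    n = len(verts)
    ind_sets = []                           # all independent sets, as bitmasks
    for mask in range(1 << n):
        chosen = [verts[i] for i in range(n) if mask >> i & 1]
        if all(v not in graph[u] for u, v in combinations(chosen, 2)):
            ind_sets.append(mask)
    target = tuple([b] * n)
    reachable = {tuple([0] * n)}
    for _ in range(a):                      # one more color class per round
        nxt = set(reachable)
        for f in reachable:
            for mask in ind_sets:
                g = tuple(f[i] + (mask >> i & 1) for i in range(n))
                if max(g) <= b:
                    nxt.add(g)
        reachable = nxt
        if target in reachable:
            return True
    return target in reachable

c5 = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
print(ab_colorable_dp(c5, 5, 2))   # True
print(ab_colorable_dp(c5, 4, 2))   # False
```

The number of profiles is (b+1)^n, which is exactly where the 2^{O(n log b)} behavior comes from.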
This simple algorithm can be improved by finding an appropriate algebraic formula for the number of (a:b)-colorings of the graph and using the inclusion–exclusion principle to compute it quickly, similarly as in the case of standard colorings [2]. Such an algebraic formula was given by Nederlof [33, Theorem 3.5] in the context of the more general Multi Set Cover problem. Nederlof also observed that in the case of (a:b)-coloring, a simple application of the inclusion–exclusion principle to compute the formula yields an O*((b+1)^n)-time exponential-space algorithm. Hua et al. [22] noted that the formulation of Nederlof [33] for Multi Set Cover can also be used to obtain a polynomial-space algorithm for this problem. By taking all maximal independent sets to be the family in the Multi Set Cover problem, and applying the classic Moon–Moser upper bound of 3^{n/3} on their number [32], we obtain an algorithm for (a:b)-coloring that runs in time O*((b+1)^n · 3^{n/3}) and uses polynomial space. Note that by plugging b = 1 into the results above, we obtain algorithms for the standard coloring problem with running time O*(2^n) and exponential space usage, and with running time O*(2^n · 3^{n/3}) and polynomial space usage, which almost match the fastest known procedures [2].
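For the classic b = 1 case, the inclusion–exclusion principle alluded to here takes a particularly simple form [2]: G is j-colorable iff Σ_{X ⊆ V} (−1)^{|V|−|X|} · i(X)^j > 0, where i(X) counts independent sets contained in X. The following small sketch (ours; it trades the fast subset-sum machinery of [2] for naive counting, so it is slow but transparent) evaluates exactly this formula.

```python
from itertools import combinations

def count_ind_subsets(graph, subset):
    """Number of independent sets (including the empty one) inside `subset`."""
    total = 0
    sub = list(subset)
    for r in range(len(sub) + 1):
        for s in combinations(sub, r):
            if all(v not in graph[u] for u, v in combinations(s, 2)):
                total += 1
    return total

def chromatic_number(graph):
    """G is j-colorable iff sum_{X <= V} (-1)^{|V|-|X|} * i(X)^j > 0: the sum
    counts, by inclusion-exclusion, the j-tuples of independent sets whose
    union covers every vertex of G."""
    verts = sorted(graph)
    n = len(verts)
    for j in range(1, n + 1):
        covers = 0
        for mask in range(1 << n):
            x = [verts[i] for i in range(n) if mask >> i & 1]
            covers += (-1) ** (n - len(x)) * count_ind_subsets(graph, x) ** j
        if covers > 0:
            return j
    return n

triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(chromatic_number(triangle))   # 3
```

Nederlof's formula generalizes this idea to covers in which every vertex must be hit exactly b times, which is where the (b+1)^n factor arises.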
The complexity of (a:b)-coloring becomes particularly interesting in the context of the Graph Homomorphism problem: given graphs G and H, with n and h vertices respectively, determine whether G admits a homomorphism to H. By the celebrated result of Hell and Nešetřil [21], the problem is polynomial-time solvable if H is bipartite, and NP-complete otherwise. For quite a while it was open whether there is an algorithm for Graph Homomorphism running in time 2^{O(n+h)}. This was recently answered in the negative by Cygan et al. [9]; more precisely, they proved that an algorithm with running time 2^{o(n log h)} would contradict the Exponential Time Hypothesis (ETH) of Impagliazzo et al. [23]. However, Graph Homomorphism is a very general problem, hence researchers try to uncover a more fine-grained picture and identify families F of target graphs such that the problem can be solved more efficiently whenever H ∈ F. For example, Fomin, Heggernes and Kratsch [13] showed that when H is of treewidth at most t, then Graph Homomorphism can be solved in time O*((t+3)^n). This was later extended to graphs of cliquewidth bounded by w, with an O*((2w+1)^n) time bound, by Wahlström [36]. On the other hand, H need not be sparse to admit efficient homomorphism testing: the family of cliques admits the running time O*(2^n), as shown by Björklund et al. [2]. As noted above, this generalizes to Kneser graphs KG_{a,b}, by the O*((b+1)^n)-time algorithm of Nederlof. In this context, the natural question is whether the appearance of b in the base of the exponent is necessary, or whether there is an algorithm running in time O*(c^n) for some universal constant c independent of b.
Our contribution.
We show that the algorithms for (a:b)-coloring mentioned above are essentially optimal under the Exponential Time Hypothesis. Specifically, we prove the following results:
Theorem 1.
If there is an algorithm for (a:b)-coloring that runs in time f(b)·2^{o(log b)·n}, for some computable function f, then ETH fails. This holds even if the algorithm is only required to work on instances where b = Θ(log n).
Corollary 2.
If there is an algorithm for Graph Homomorphism that runs in time f(h)·2^{o(n·log log h)}, for some computable function f, then ETH fails. This holds even if the algorithm is only required to work on instances where the target H is a Kneser graph with h ≤ n.
The lower bound of Theorem 1 is tight, as already the straightforward dynamic programming algorithm runs in time 2^{O(n log b)}. At first glance, one might have suspected that (a:b)-coloring, as an interpolation between classical coloring and fractional coloring, both solvable in time 2^{O(n)} [17], should be just as easy; Theorem 1 refutes this suspicion.
Corollary 2 in particular excludes any algorithm for testing homomorphisms into Kneser graphs with running time 2^{O(n+h)}. It cannot give a tight lower bound matching the result of Cygan et al. [9] for general homomorphisms, because the number of vertices of KG_{a,b} is not polynomial in b. On the other hand, it exhibits the first explicit family of graphs for which the complexity of Graph Homomorphism provably increases with h.
In our proof, we first show a lower bound for the list variant of the problem, where every vertex is given a list of colors that can be assigned to it (see Section 2 for formal definitions). The list version is reduced to the standard version by introducing a large Kneser graph into the construction; we need a and b to be really small so that the size of this Kneser graph does not dwarf the size of the rest of the construction. However, this is not necessary for the list version, where we obtain lower bounds for a much wider range of functions b = b(n).
Theorem 3.
If there is an algorithm for List (a:b)-coloring that runs in time 2^{o(log b)·n}, then ETH fails. This holds even if the algorithm is only required to work on instances where b = b(n) and a = O(b), for an arbitrarily chosen polynomial-time computable function b: ℕ → ℕ such that b(n) = ω(1) and b(n) ≤ n.
The crucial ingredient in the proof of Theorem 1 is the usage of detecting matrices, introduced by Lindström [28]. We choose to work with their combinatorial formulation, hence we shall talk about detecting families. Suppose we are given some universe U and there is an unknown function f: U → {0, 1, …, d}, for some fixed positive integer d. One may think of U as consisting of coins of unknown weights that are integers between 0 and d. We would like to learn f (the weight of every coin) by asking a small number of queries of the following form: for a subset S ⊆ U, what is Σ_{x∈S} f(x) (the total weight of the coins in S)? A set of queries sufficient for determining all the values of an arbitrary f is called a detecting family. Of course f can be learned by asking |U| questions about single coins, but it turns out that significantly fewer questions are needed: there is a detecting family of size O(|U|/log |U|), for every fixed d [28]. The logarithmic factor in the denominator will be crucial for deriving our lower bound.
Let us now sketch how detecting families are used in the proof of Theorem 1. Given an instance φ of 3-SAT with n variables and m = O(n) clauses, and a number b, we will construct an instance of List (a:b)-coloring for some a = O(b). This instance will have a positive answer if and only if φ is satisfiable, and the constructed graph will have O(n/log b) vertices. It can be easily seen that this will yield the promised lower bound.
Partition the clause set of φ into groups C_1, …, C_q, each of size roughly √b; thus q = O(m/√b). Similarly, partition the variable set of φ into groups V_1, …, V_p, each of size roughly log b; thus p = O(n/log b). In the output instance we create one vertex per each variable group—hence we have O(n/log b) such vertices—and one block of vertices per each clause group, whose size will be determined in a moment. Our construction ensures that the set of colors assigned to a vertex created for a variable group V_i misses exactly one color from some prescribed subset of colors. The choice of the missing color corresponds to one of the 2^{|V_i|} possible boolean assignments to the variables of the group.
Take any vertex x from the block of vertices created for some clause group C_j. We make it adjacent to the vertices constructed for precisely those variable groups V_i for which there is some variable in V_i that occurs in some clause of C_j. This way, x can only take a subset of the above missing colors, corresponding to the chosen assignment on the variables relevant to C_j. By carefully selecting the list of x, and some additional technical gadgeteering, we can express a constraint of the following form: the total number of satisfied literals in some subset of the clauses of C_j is exactly some number. Thus, we could verify that every clause of C_j is satisfied by creating a block of |C_j| vertices, each checking one clause. However, the whole graph output by the reduction would then have Ω(n) vertices, and we would not obtain any nontrivial lower bound. Instead, we create one vertex per each question in a detecting family on the universe C_j, which has size O(|C_j|/log |C_j|). Then, the total number of vertices in the constructed graph will be O(n/log b), as intended.
Finally, we observe that from our main result one can infer a lower bound for the complexity of the Monomial Testing problem. Recall that in this problem we are given an arithmetic circuit that evaluates a homogeneous polynomial P over some field F; here, a polynomial is homogeneous if all its monomials have the same total degree k. The task is to verify whether P has some monomial in which every variable has individual degree not larger than d, for a given parameter d. Abasi et al. [1] gave a randomized algorithm solving this problem in time O*(2^{O((k/d)·log d)}), where k is the degree of the polynomial, assuming that F = ℤ_p for a prime p > d. This algorithm was later derandomized by Gabizon et al. [14] within the same running time, but under the assumption that the circuit is non-cancelling: it has only input, addition, and multiplication gates. Abasi et al. [1] and Gabizon et al. [14] gave a number of applications of low-degree monomial detection to concrete problems. For instance, r-Simple k-Path, the problem of finding a walk of length k that visits every vertex at most r times, can be solved in time O*(2^{O((k/r)·log r)}). However, for r-Simple k-Path, as well as other problems that can be tackled using this technique, the best known lower bounds under ETH exclude only algorithms with running time 2^{o(k/r)}. Whether the log r factor in the exponent is necessary was left open by Abasi et al. and Gabizon et al.
We observe that the List (a:b)-coloring problem can be reduced to Monomial Testing over the field ℤ_2 in such a way that a 2^{o((k/d)·log d)}-time algorithm for the latter would imply a 2^{o(log b)·n}-time algorithm for the former, which would contradict ETH by Theorem 3. Thus, we show that the known algorithms for Monomial Testing most probably cannot be sped up in general; nevertheless, the question of lower bounds for specific applications remains open. However, going through List (a:b)-coloring to establish a lower bound for Monomial Testing is actually quite a detour, because the latter problem has a much larger expressive power. Therefore, we also give a more straightforward reduction that starts from a convenient form of Subset Sum; this reduction also proves the lower bound for a wider range of the parameter d, expressed as a function of k.
Outline.
In Section 2 we set up the notation as well as recall definitions and well-known facts. We also discuss detecting families, the main combinatorial tool used in our reduction. In Section 3 we prove the lower bound for the list version of the problem, i.e., Theorem 3. In Section 4 we give a reduction from the list version to the standard version, thereby proving Theorem 1. Section 5 is devoted to deriving lower bounds for low-degree monomial testing.
2 Preliminaries
Notation.
We use standard graph notation, see e.g. [10, 11]. All graphs we consider in this paper are simple and undirected. For an integer n, we denote [n] = {1, 2, …, n}. By ⊎ we denote disjoint union, i.e., by A ⊎ B we mean A ∪ B with the indication that A and B are disjoint. If I_1 and I_2 are instances of decision problems P_1 and P_2, respectively, then we say that I_1 and I_2 are equivalent if either both I_1 and I_2 are YES-instances of the respective problems, or both are NO-instances.
Exponential-Time Hypothesis.
The Exponential Time Hypothesis (ETH) of Impagliazzo et al. [23] states that there exists a constant δ > 0 such that there is no algorithm solving 3-SAT in time O(2^{δn}), where n is the number of variables. In recent years, ETH became the central conjecture used for proving tight bounds on the complexity of various problems. One of the most important results connected to ETH is the Sparsification Lemma [24], which essentially gives a reduction from an arbitrary instance of 3-SAT to an instance where the number of clauses is linear in the number of variables. The following well-known corollary can be derived by combining ETH with the Sparsification Lemma.
Theorem 4 (see e.g. Theorem 14.4 in [10]).
Unless ETH fails, there is no algorithm for 3-SAT that runs in time 2^{o(n+m)}, where n and m denote the numbers of variables and clauses of the input formula, respectively.
We need the following regularization result of Tovey [35]. Following Tovey, by (3,4)-SAT we denote the variant of 3-SAT in which each clause of the input formula contains exactly 3 different variables, and each variable occurs in at most 4 clauses.
Lemma 5 ([35]).
Given a 3-SAT formula φ with n variables and m clauses, one can transform it in polynomial time into an equivalent (3,4)-SAT instance with O(n + m) variables and O(n + m) clauses.
Corollary 6.
Unless ETH fails, there is no algorithm for (3,4)-SAT that runs in time 2^{o(n)}, where n denotes the number of variables of the input formula.
List and nonuniform list (a:b)-coloring
For integers a ≥ b ≥ 1 and a graph G with a function L: V(G) → 2^{[a]} (assigning a list of colors to every vertex), an (L:b)-coloring of G is an assignment of exactly b colors from L(v) to each vertex v ∈ V(G), such that adjacent vertices get disjoint color sets. The List (a:b)-coloring problem asks, given (G, L), whether an (L:b)-coloring of G exists.
As an intermediary step of our reduction, we will use the following generalization of list colorings in which the number of demanded colors varies from vertex to vertex. For integers a ≥ b ≥ 1, a graph G with a function L: V(G) → 2^{[a]} and a demand function β: V(G) → {0, 1, …, b}, an (L:β)-coloring of G is an assignment of exactly β(v) colors from L(v) to each vertex v ∈ V(G), such that adjacent vertices get disjoint color sets. Nonuniform List (a:b)-coloring is then the problem in which, given (G, L, β), we ask if an (L:β)-coloring of G exists.
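On small instances both variants can be decided by straightforward backtracking; the sketch below (our own illustration; names and the toy instance are ours) follows the definition of an (L:β)-coloring literally.

```python
from itertools import combinations

def nonuniform_list_coloring(graph, lists, demand):
    """Backtracking check for an (L:beta)-coloring: give each vertex v exactly
    demand[v] colors from lists[v], with adjacent vertices receiving disjoint
    color sets. Returns True iff such a coloring exists."""
    verts = sorted(graph)

    def extend(idx, chosen):
        if idx == len(verts):
            return True
        v = verts[idx]
        forbidden = set()
        for u in graph[v]:          # colors taken by already-colored neighbors
            if u in chosen:
                forbidden |= chosen[u]
        for pick in combinations(sorted(set(lists[v]) - forbidden), demand[v]):
            chosen[v] = set(pick)
            if extend(idx + 1, chosen):
                return True
            del chosen[v]
        return False

    return extend(0, {})

# A triangle with demands (2, 1, 1): feasible for these lists...
tri = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(nonuniform_list_coloring(tri, {0: {1, 2, 3}, 1: {1, 2}, 2: {3, 4}},
                               {0: 2, 1: 1, 2: 1}))   # True
# ...but not if vertex 2 must also draw from {2, 3}:
print(nonuniform_list_coloring(tri, {0: {1, 2, 3}, 1: {1, 2}, 2: {2, 3}},
                               {0: 2, 1: 1, 2: 1}))   # False
```

The uniform List (a:b)-coloring problem is the special case with constant demand b.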
d-detecting families.
In our reductions the following notion plays a crucial role.
Definition 7.
A d-detecting family for a finite set U is a family F of subsets of U such that for every two functions f, f′: U → {0, 1, …, d} with f ≠ f′, there is a set S in the family such that Σ_{x∈S} f(x) ≠ Σ_{x∈S} f′(x).
A deterministic construction of d-detecting families of sublinear size was given by Lindström [28], together with a proof that even the constant factor 2 in the family size cannot be improved.
Theorem 8 ([28]).
For every constant d and every finite set U, there is a d-detecting family F on U of size O(|U|/log |U|), where the hidden constant depends only on d. Furthermore, F can be constructed in time polynomial in |U|.
Other constructions, generalizations, and discussion of similar results can be found in Grebinski and Kucherov [16], and in Bshouty [3]. Note that the expression Σ_{x∈S} f(x) is just the product of f, viewed as a vector in ℤ^U, with the characteristic vector of S. Hence, instead of subset families, Lindström speaks of detecting vectors, while later works see them as detecting matrices, that is, 0–1 matrices with these vectors as rows (which define an injection on {0, 1, …, d}^U despite having few rows). Similar definitions appear in the study of query complexity, e.g., as in the popular Mastermind game [6].
While the known polynomial-time deterministic constructions of detecting families involve some number theory or Fourier analysis, their existence can be argued with an elementary probabilistic argument. Intuitively, a uniformly random subset S ⊆ U will distinguish two distinct functions f, f′ (meaning Σ_{x∈S} f(x) ≠ Σ_{x∈S} f′(x)) with probability at least 1/2. This is because any fixed element x on which f and f′ disagree is taken or not taken into S with probability 1/2 each, while the sums over S cannot agree in both cases simultaneously, as they differ by f(x) and f′(x), respectively. There are at most (d+1)^{2|U|} function pairs to be distinguished. In any subset of pairs, at least half are distinguished by a random set in expectation, thus at least one set distinguishing half of them exists. Repeatedly finding such a set for the still undistinguished pairs, we get O(|U| log(d+1)) = O(|U|) sets that distinguish all functions. More strongly though, when two functions differ on more values, the probability of distinguishing them increases significantly. Hence we need fewer random sets to distinguish all pairs of distant functions. On the other hand, there are few function pairs that are close, so we need few random sets to distinguish them all as well. This allows one to show that in fact O(|U|/log |U|) random sets are enough to form a detecting family with positive probability [16].
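The halving argument above is easy to replay experimentally. The sketch below (our own; not one of the deterministic constructions) keeps drawing uniformly random subsets until they form a d-detecting family, which is verified by brute force on a tiny universe.

```python
import random
from itertools import product

def is_detecting(family, universe, d):
    """Brute-force check: the tuple of query sums over `family` must separate
    every pair of distinct functions f: universe -> {0,...,d}."""
    seen = set()
    for f in product(range(d + 1), repeat=len(universe)):
        key = tuple(sum(f[i] for i, x in enumerate(universe) if x in s)
                    for s in family)
        if key in seen:             # two functions share all query answers
            return False
        seen.add(key)
    return True

random.seed(7)
universe = list(range(6))
d = 2
family = []
while not is_detecting(family, universe, d):    # keep adding random subsets
    family.append({x for x in universe if random.random() < 0.5})
print(len(family), "random subsets pin down every f: U -> {0,1,2} on |U| = 6")
```

On such a small universe random sets do not beat single-coin queries; the O(|U|/log |U|) savings only kick in asymptotically.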
3 Hardness of List (a:b)-coloring
In this section we show our main technical contribution: an ETH-based lower bound for List (a:b)-coloring. The key part is reducing an n-variable instance of (3,4)-SAT to an instance of Nonuniform List (a:b)-coloring with only O(n/log b) vertices. Next, it is rather easy to reduce Nonuniform List (a:b)-coloring to List (a:b)-coloring. We proceed with the first, key part.
3.1 The nonuniform case
We prove the following theorem through the remaining part of this section.
Theorem 9.
For any instance φ of (3,4)-SAT with n variables and any integer b with 16 ≤ b ≤ n, there is an equivalent instance (G, L, β) of Nonuniform List (a:b)-coloring such that a = O(b), |V(G)| = O(n/log b), and G is 3-colorable. Moreover, the instance and the 3-coloring of G can be constructed in time polynomial in n.
Consider an instance φ of (3,4)-SAT, so each variable appears in at most four clauses. Let V be the set of its variables and C be the set of its clauses, and let n = |V| and m = |C|. Note that m ≤ 4n/3. Let h = ⌊¼ log_2 b⌋ and z = ⌈√b⌉. We shall construct, for some integers p, q, and s:
- a partition V = V_1 ⊎ V_2 ⊎ … ⊎ V_p of the variables into groups of size at most h,
- a partition C = C_1 ⊎ C_2 ⊎ … ⊎ C_q of the clauses into groups of size at most z,
- a function g: [p] → [s],
such that the following condition holds:
For any j ∈ [q], the variables occurring in the clauses of C_j are all different and they all belong to pairwise different variable groups. Moreover, the indices of these groups are mapped to pairwise different values by g.  (★)
In other words, any two literals of clauses in C_j have different variables, and if these variables belong to V_i and V_{i′} with i ≠ i′, then g(i) ≠ g(i′).
Lemma 10.
Partitions V_1, …, V_p and C_1, …, C_q and a function g satisfying (★) can be found in time polynomial in n.
Proof.
We first group the variables in such a way that the following holds: (P1) the variables occurring in any clause are different and belong to different variable groups. To this end, consider the graph P with the variables as vertices and with edges between any two variables that occur in a common clause (i.e., the primal graph of φ). Since no clause contains repeated variables, P has no loops. Since every variable of φ occurs in at most four clauses, and each of those clauses contains at most two other variables, the maximum degree of P is at most 8. Hence P can be greedily colored with 9 colors. Then, we refine the partition given by the colors so as to make every group have size at most h, producing in total p ≤ n/h + 9 groups V_1, …, V_p. (P1) holds, because any two variables occurring in a common clause are adjacent in P, and thus get different colors, and thus are assigned to different groups.
Next, we group the clauses in such a way that: (P2) the variables occurring in the clauses of a single group are all different and belong to different variable groups. For this, consider the graph H with the clauses as vertices, and with an edge between two clauses if they share a variable or contain two different variables from the same variable group. By (P1), H has no loops. Since every clause contains exactly 3 variables, each variable is in a group with at most h − 1 others, and every such variable occurs in at most 4 clauses, the maximum degree of H is at most 3·h·4 = 12h. We can therefore color H greedily with 12h + 1 colors. Similarly as before, we partition the clauses into monochromatic groups of size at most z each, producing q ≤ m/z + 12h + 1 groups C_1, …, C_q. Then (P2) holds by the construction of the coloring.
Finally, consider the graph Q with the variable groups as vertices, and with an edge between two variable groups if they contain two different variables occurring in clauses from a common clause group. More precisely, V_i and V_{i′} are adjacent if there are two different variables x ∈ V_i and x′ ∈ V_{i′}, and a clause group C_j with clauses c and c′ (possibly c = c′), such that x occurs in c and x′ occurs in c′. By (P2), Q has no loops. Since a variable has at most h − 1 other variables in its group, each of these variables occurs in at most 4 clauses, each of these clauses has at most z − 1 other clauses in its group, and each of these contains exactly 3 variables, the maximum degree of Q is at most h·4·z·3 = 12hz. We can therefore color Q greedily with s := 12hz + 1 colors. Let g: [p] → [s] be the resulting coloring. By (P2) and the construction of this coloring, (★) holds.
The three colorings can be found in linear time using standard techniques. Note that we have p ≤ n/h + 9 = O(n/log b) and q ≤ m/z + 12h + 1. Moreover, since m ≤ 4n/3 and b ≤ n, we get q = O(n/z + h) and hence p + q = O(n/log b). ∎
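All three colorings in the proof above rely only on the textbook fact that a graph of maximum degree D can be greedily colored with at most D + 1 colors, found in linear time. A minimal sketch (ours, for illustration):

```python
def greedy_coloring(graph):
    """Greedy coloring: a graph of maximum degree D receives at most D+1
    colors, since a vertex sees at most D occupied colors when processed."""
    color = {}
    for v in sorted(graph):
        taken = {color[u] for u in graph[v] if u in color}
        color[v] = next(c for c in range(len(graph)) if c not in taken)
    return color

# The 5-cycle has maximum degree 2, so at most 3 colors are used:
c5 = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
col = greedy_coloring(c5)
assert all(col[u] != col[v] for u in c5 for v in c5[u])
print(max(col.values()) + 1)   # 3
```

Any processing order works for the D + 1 guarantee; the proof only needs the existence of such a coloring, not its quality.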
For every i ∈ [p], the set of variables V_i admits 2^{|V_i|} different assignments. We will therefore say that each assignment on V_i is given by an integer t ∈ [2^h], for example by interpreting the first |V_i| bits of the binary representation of t as the truth values of the variables in V_i. Note that when |V_i| < h, different integers from [2^h] may give the same assignment on V_i.
For j ∈ [q], let G_j ⊆ [p] be the set of indices of the variable groups that contain some variable occurring in the clauses of C_j. Since every clause contains exactly three literals, property (★) implies that |G_j| = 3|C_j| and that g is injective on each G_j. See Fig. 2.
For j ∈ [q], let D_j = {D_j^1, …, D_j^d} be a 4-detecting family of subsets of C_j, for some d = O(z/log z) (we can assume that d does not depend on j by adding arbitrary sets when |D_j| < d). Without loss of generality we also assume that C_j ∈ D_j and that D_j is closed under complementation within C_j; adding the missing sets at most triples the size of the family and cannot spoil the 4-detecting property. For every r ∈ [d], let D_j^r denote the r-th set of D_j.
We are now ready to build the graph G, the demand function β, and the list assignment L as follows.
- For i ∈ [p], create a vertex u_i with L(u_i) = {(g(i), t) : t ∈ [2^h]} and β(u_i) = 2^h − 1.
- For j ∈ [q] and r ∈ [d], create a vertex x_j^r adjacent to each u_i for i ∈ G_j. Let L(x_j^r) = {(g(i), t) : i ∈ G_j, t ∈ [2^h], and the assignment given by t on V_i satisfies some clause of D_j^r} and β(x_j^r) = |D_j^r|.
- For j ∈ [q], create a vertex w_j, adjacent to each u_i for i ∈ G_j and to each x_j^r (r ∈ [d]). Let L(w_j) = {(g(i), t) : i ∈ G_j, t ∈ [2^h]} and β(w_j) = 2|C_j|.
Before giving a detailed proof of the correctness, let us describe the reduction in intuitive terms. Note that vertices of type u_i get all but one color from their list; this missing color, say (g(i), t_i) for some t_i ∈ [2^h], defines an assignment on V_i. For every j ∈ [q], the goal of the gadget consisting of w_j and the x_j^r vertices is to express the constraint that every clause in C_j has a literal satisfied by this assignment. Since w_j and the x_j^r are adjacent to all vertices in {u_i : i ∈ G_j}, they may only use the missing colors (of the form (g(i), t_i), where i ∈ G_j). Since |G_j| = 3|C_j|, there are 3|C_j| such colors, and 2|C_j| of them go to w_j. This leaves exactly |C_j| colors for the vertices of type x_j^r, corresponding to a choice of |C_j| satisfied literals from the 3|C_j| literals in the clauses of C_j. The lists and demands of the x_j^r vertices guarantee that exactly |D_j^r| of the chosen satisfied literals occur in clauses of D_j^r. The properties of 4-detecting families will ensure that every clause has exactly one chosen, satisfied literal, and hence at least one satisfied literal. We proceed with formal proofs.
Lemma 11.
If φ is satisfiable, then G is (L:β)-colorable.
Proof.
Consider a satisfying assignment ψ of φ. For i ∈ [p], let t_i ∈ [2^h] be an integer giving the same assignment on V_i as ψ. For every clause c of φ, choose one literal satisfied by ψ in it, and let ι(c) ∈ [p] be the index of the variable group containing the literal's variable. Let α be the (L:β)-coloring of G defined as follows, for i ∈ [p], j ∈ [q], r ∈ [d]:
- α(u_i) = L(u_i) ∖ {(g(i), t_i)},
- α(x_j^r) = {(g(ι(c)), t_{ι(c)}) : c ∈ D_j^r},
- α(w_j) = {(g(i), t_i) : i ∈ G_j} ∖ {(g(ι(c)), t_{ι(c)}) : c ∈ C_j}.
Let us first check that every vertex gets colors from its list only. This is immediate for the vertices u_i and w_j, while for x_j^r it follows from the fact that t_{ι(c)} gives a partial assignment to V_{ι(c)} that satisfies the clause c ∈ D_j^r.
Now let us check that for every vertex v, the coloring α assigns exactly β(v) colors to v. For u_i this follows from the fact that |L(u_i)| = 2^h and β(u_i) = 2^h − 1. Since, by property (★), the map c ↦ g(ι(c)) is injective on C_j, and thus on D_j^r, we have |α(x_j^r)| = |D_j^r| = β(x_j^r). Similarly, since g is injective on G_j and |G_j| = 3|C_j|, we get |α(w_j)| = 3|C_j| − |C_j| = 2|C_j| = β(w_j).
It remains to argue that the sets assigned to any two adjacent vertices are disjoint. There are three types of edges in the graph, namely u_i x_j^r, u_i w_j, and x_j^r w_j. The disjointness of α(u_i) from α(x_j^r) and from α(w_j) is immediate from the definition of α, since every color of α(x_j^r) ∪ α(w_j) with first coordinate g(i) equals (g(i), t_i), which is exactly the color missing from α(u_i). Fix j ∈ [q]. Since g is injective on G_j, for any two different i, i′ ∈ G_j we have (g(i), t_i) ≠ (g(i′), t_{i′}). Hence,
α(x_j^r) ⊆ {(g(ι(c)), t_{ι(c)}) : c ∈ C_j}.
Since α(w_j) is, by definition, disjoint from {(g(ι(c)), t_{ι(c)}) : c ∈ C_j}, it follows that the edges of type x_j^r w_j also received disjoint sets of colors on their endpoints, concluding the proof. ∎
Lemma 12.
If G is (L:β)-colorable, then φ is satisfiable.
Proof.
Assume that G is (L:β)-colorable, and let α be the corresponding coloring.
For i ∈ [p], we have |L(u_i)| = 2^h and β(u_i) = 2^h − 1, so u_i misses exactly one color from its list. Let (g(i), t_i), for some t_i ∈ [2^h], be the missing color. We want to argue that the assignment ψ for φ given by t_i on each V_i satisfies φ.
Consider any clause group C_j, for j ∈ [q]. Every vertex in {w_j, x_j^1, …, x_j^d} contains {u_i : i ∈ G_j} in its neighborhood. Therefore, the sets α(w_j) and α(x_j^r) are disjoint from ⋃_{i∈G_j} α(u_i). Since L(w_j), L(x_j^r) ⊆ ⋃_{i∈G_j} L(u_i), we get that α(w_j) and each α(x_j^r) are contained in the set of missing colors {(g(i), t_i) : i ∈ G_j} (corresponding to the chosen assignment). By property (★), this set has exactly |G_j| = 3|C_j| different colors. Of these, exactly 2|C_j| are contained in α(w_j). Let the remaining |C_j| colors be (g(i), t_i) for i ∈ S, for some subset of indices S ⊆ G_j.
Since α(x_j^r) is disjoint from α(w_j), we have α(x_j^r) ⊆ {(g(i), t_i) : i ∈ S} for all r ∈ [d]. By the definition of G_j, for every i ∈ S there is a variable in V_i that appears in some clause of C_j. By property (★), it can occur in only one such clause, so let l_i be the literal of this variable in the clause of C_j where it appears. For every color (g(i), t_i) ∈ α(x_j^r), by the definition of the lists we know that t_i gives a partial assignment to V_i that satisfies some clause of D_j^r. This means that ψ makes the literal l_i true and that l_i occurs in a clause of D_j^r. Therefore, for each r ∈ [d], at least β(x_j^r) = |D_j^r| literals from the set {l_i : i ∈ S} occur in clauses of D_j^r and are made true by the assignment ψ.
Let f: C_j → ℕ be the function assigning to each clause c the number of literals of {l_i : i ∈ S} occurring in c. By the above, Σ_{c∈D_j^r} f(c) ≥ |D_j^r| for every r ∈ [d]. Since each literal l_i belongs to some clause of C_j, we also have Σ_{c∈C_j} f(c) ≤ |S| = |C_j|. As the family D_j is closed under complementation within C_j, for every r ∈ [d] we get
|C_j| ≥ Σ_{c∈C_j} f(c) = Σ_{c∈D_j^r} f(c) + Σ_{c∈C_j∖D_j^r} f(c) ≥ |D_j^r| + |C_j ∖ D_j^r| = |C_j|.
Hence Σ_{c∈D_j^r} f(c) = |D_j^r| for every r ∈ [d]. Let 1 denote the constant function on C_j with value 1. Note that
Σ_{c∈D_j^r} 1(c) = |D_j^r| = Σ_{c∈D_j^r} f(c) for every r ∈ [d].
Since f takes values in {0, 1, 2, 3} and D_j is a 4-detecting family, this implies that f = 1. Thus, for every clause c of C_j we have f(c) = 1, meaning that there is a literal from the set {l_i : i ∈ S} in this clause. All these literals are made true by the assignment ψ, therefore all clauses of C_j are satisfied. Since j was arbitrary, this concludes the proof that ψ is a satisfying assignment for φ. ∎
The construction can clearly be performed in time polynomial in n, the total number of vertices is p + q(d + 1) = O(n/log b) + O(n/z + h)·O(z/log z) = O(n/log b), and the total number of colors is s·2^h = O(hz·2^h) = O(b). Moreover, we get a proper 3-coloring of G by coloring the vertices of type u_i with color 1, the vertices of type x_j^r with color 2, and the vertices of type w_j with color 3. By Lemmas 11 and 12, this concludes the proof of Theorem 9.
3.2 The uniform case
In this section we reduce the nonuniform case to the uniform one, and state the resulting lower bound on the complexity of List (a:b)-coloring.
Lemma 13.
For any instance (G, L, β) of Nonuniform List (a:b)-coloring where the graph G is c-colorable, there is an equivalent instance (G, L′) of List (a + cb : b)-coloring. Moreover, given a c-coloring of G, the instance can be constructed in time polynomial in |V(G)| + a + cb.
Proof.
Let σ: V(G) → [c] be the given proper c-coloring of G. For every vertex v, define a set of filling colors F(v) = {(σ(v), i) : i ∈ [b − β(v)]}, taken from a palette of cb fresh colors disjoint from [a], and put L′(v) = L(v) ∪ F(v).
Let α be an (L:β)-coloring of G. We define a coloring α′ by setting α′(v) = α(v) ∪ F(v) for every vertex v. Observe that α′(v) ⊆ L′(v) and |α′(v)| = β(v) + (b − β(v)) = b. Since α was a proper (L:β)-coloring, adjacent vertices can only share filling colors. However, the lists of adjacent vertices contain disjoint subsets of filling colors, since these vertices are colored differently by σ. It follows that α′ is an (L′:b)-coloring of G.
Conversely, let α′ be an (L′:b)-coloring of G. For every vertex v, we have |α′(v) ∩ L(v)| ≥ b − |F(v)| = β(v). Define α(v) to be any subset of α′(v) ∩ L(v) of cardinality β(v). It is immediate to check that α is an (L:β)-coloring of G. ∎
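The list-padding step of this proof is mechanical; the sketch below (our own naming, with filling colors encoded as tagged tuples so they cannot collide with the original palette) performs it on a toy instance.

```python
def uniformize(graph, lists, demand, sigma, b):
    """Pad each list with 'filling colors' so every demand becomes exactly b.
    Vertices in the same class of the proper coloring `sigma` draw filling
    colors from a shared pool; adjacent vertices lie in different classes,
    so they can never share a filling color."""
    new_lists = {}
    for v in sorted(graph):
        filling = {("fill", sigma[v], i) for i in range(b - demand[v])}
        new_lists[v] = set(lists[v]) | filling
    return new_lists

# An edge with demands 1 and 2, lifted to uniform demand b = 2:
edge = {0: [1], 1: [0]}
out = uniformize(edge, {0: {1, 2}, 1: {2, 3}}, {0: 1, 1: 2},
                 sigma={0: 0, 1: 1}, b=2)
print(out[0])   # vertex 0 gained the filling color ('fill', 0, 0)
print(out[1])   # vertex 1 already demanded b colors, so its list is unchanged
```

Any (L:β)-coloring extends to the padded instance by adding each vertex's filling colors, and conversely, exactly as argued above.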
We are now ready to prove one of our main results.
Theorem 3 (restated).
If there is an algorithm for List (a:b)-coloring that runs in time 2^{o(log b)·n}, then ETH fails. This holds even if the algorithm is only required to work on instances where b = b(n) and a = O(b), for an arbitrarily chosen polynomial-time computable function b: ℕ → ℕ such that b(n) = ω(1) and b(n) ≤ n.
Proof.
Let b: ℕ → ℕ be a function as in the statement. We can assume w.l.o.g. that b(n) ≥ C for all n, for otherwise we can replace b with the function n ↦ max(b(n), C) in the reasoning below, where C is a big enough constant; note that the modified function still satisfies the requirements of the statement. Fix a function g with g(b) = o(log b) and assume there is an algorithm for List (a:b)-coloring that runs in time 2^{g(b)·n} whenever b = b(n) and a = O(b). Consider an instance φ of (3,4)-SAT with n variables. Let b̂ = b(n). By Theorem 9, in time polynomial in n we get an equivalent instance (G, L, β) of Nonuniform List (a:b̂)-coloring such that a = O(b̂), |V(G)| = O(n/log b̂), and G is 3-colorable, together with a 3-coloring of G. Next, by Lemma 13, in time polynomial in n we get an equivalent instance of List (