Primitivity, Uniform Minimality, and State Complexity of Boolean Operations

Sylvie Davies
University of Waterloo, Department of Pure Mathematics
Email: sldavies@uwaterloo.ca
Abstract

A minimal deterministic finite automaton (DFA) is uniformly minimal if it always remains minimal when the final state set is replaced by a non-empty proper subset of the state set. We prove that a permutation DFA is uniformly minimal if and only if its transition monoid is a primitive group. We use this to study boolean operations on group languages, which are recognized by direct products of permutation DFAs. A direct product cannot be uniformly minimal, except in the trivial case where one of the DFAs in the product is a one-state DFA. However, non-trivial direct products can satisfy a weaker condition we call uniform boolean minimality, where only final state sets used to recognize boolean operations are considered. We give sufficient conditions for a direct product of two DFAs to be uniformly boolean minimal, which in turn gives sufficient conditions for pairs of group languages to have maximal state complexity under all binary boolean operations (“maximal boolean complexity”). In the case of permutation DFAs with one final state, we give necessary and sufficient conditions for pairs of group languages to have maximal boolean complexity. Our results demonstrate a connection between primitive groups and automata with strong minimality properties.

1 Introduction

Formal definitions are postponed until Section 2.

The state complexity of a regular language is the minimal number of states needed to recognize the language with a deterministic finite automaton. It is well-known that if L and L′ are regular languages over a common alphabet with state complexity m and n respectively, then the state complexity of L ∪ L′ is at most mn, and this bound is tight for all m, n ≥ 1. The upper bound follows from the standard “direct product” automaton construction for recognizing unions of regular languages. Examples which meet the bound were given by Maslov in 1970 [15], and independently by Yu, Zhuang and Salomaa in 1994 [21], who noted that the same bound holds for intersection.
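The direct product construction behind the mn upper bound is easy to sketch in code. The following is a minimal illustration (the function and variable names are ours, not from the literature), with a DFA's transition function represented as a dict keyed by (state, letter) pairs:

```python
# Sketch of the direct product construction for boolean operations.
# A DFA is given by its state set, a transition dict keyed by
# (state, letter), an initial state, and a set of final states.

def product_dfa(states_a, states_b, delta_a, delta_b,
                init_a, init_b, final_a, final_b, alphabet):
    """Build the direct product DFA; final states chosen here for union."""
    states = [(p, q) for p in states_a for q in states_b]
    delta = {((p, q), c): (delta_a[p, c], delta_b[q, c])
             for (p, q) in states for c in alphabet}
    # For union, (p, q) is final when p is final in A or q is final in B.
    # Other boolean operations only change this choice of final states.
    final = {(p, q) for (p, q) in states if p in final_a or q in final_b}
    return states, delta, (init_a, init_b), final

def accepts(delta, init, final, word):
    state = init
    for c in word:
        state = delta[state, c]
    return state in final
```

The mn bound comes from the product having |states_a| · |states_b| states; for intersection one would instead make (p, q) final when both components are final.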

More generally, if ∘ is a binary boolean operation on languages over Σ, then L ∘ L′ has state complexity at most mn, and this bound is tight for all m, n ≥ 2 if and only if ∘ is proper, that is, not a constant function (returning ∅ or Σ* for all arguments) or a function that depends on only one argument (for example, (L, L′) ↦ L′). This was proved by Brzozowski in 2009 [5], who gave examples showing that mn is a tight bound for symmetric difference, and noted that the examples for union and symmetric difference (together with their complements) suffice to prove mn is a tight upper bound for all proper binary boolean operations.

To prove a lower bound on the worst-case state complexity of a regular operation, it suffices to give just one family of examples that meet the bound. Such families are called witnesses. Witnesses are known for most commonly used unary and binary operations on regular languages. However, there are several directions of research in state complexity which necessitate finding new witnesses for operations that have previously been studied. For example, sometimes the first witnesses found for an operation are not optimal in terms of alphabet size, so researchers will look for new witnesses over a smaller alphabet. When studying the k-ary versions of binary operations, such as the union of k languages, or more generally combined operations [13, 18], such as the star of a union of languages, again new witnesses are needed. It is also interesting to consider families of languages that are simultaneous witnesses for multiple operations; it is not generally the case that a witness for one operation will work for others. Brzozowski found a family of languages which is a simultaneous witness for reversal, star, concatenation and all binary boolean operations [6]. Each of the problems just mentioned, as well as the fundamental problem of determining the worst-case state complexity of an operation, may also be studied in subclasses of the regular languages, such as the star-free languages [9] or ideal languages [8]. Often the known witnesses do not lie in the subclass, so new witnesses must be found.

In some cases, new witnesses can be found by making slight modifications to known witnesses, but this is not always successful. Furthermore, this technique does little to advance our understanding of why particular witnesses work. For these reasons, it is desirable to have results which describe the general landscape of witnesses for a particular operation. By this we mean results that give necessary conditions for witnesses, revealing common structural properties that all witnesses share, or sufficient conditions allowing one to easily generate examples of witnesses or check whether a candidate family is a witness. For example, Salomaa, Wood and Yu proved that a regular language of state complexity n is a witness for the reversal operation if the transition monoid of its minimal DFA has the maximal possible size nⁿ [19]; this gives a general sufficient condition for a language to be a witness for reversal. Ideally, collecting results of this sort would eventually lead to a complete classification of witnesses for commonly used operations. In reality, we suspect the problem of fully classifying witnesses is only tractable in very special cases, but even results which take small steps in this direction can be quite useful and enlightening.

The main inspiration for this work is a paper of Bell, Brzozowski, Moreira, and Reis [3], which considers the following question: for which pairs of languages (L, L′) (with state complexities m and n respectively) does L ∘ L′ reach the maximal state complexity mn for every proper binary boolean operation ∘? Bell et al. give sufficient conditions for this to occur. The conditions are based on the transition monoids of the minimal deterministic automata of L and L′; essentially, if the transition monoids contain the symmetric groups S_m and S_n, then “usually” (i.e., excluding a known class of counterexamples) the language L ∘ L′ will have state complexity mn. We obtain a refinement of this result: we prove that if the transition monoids contain 2-transitive groups, then “usually” L ∘ L′ has state complexity mn (though our notion of “usually” is more restrictive than that of Bell et al.).

We also obtain necessary and sufficient conditions for L ∘ L′ to have state complexity mn in the special case where the minimal automata for L and L′ have exactly one final state, and their transition monoids contain a transitive permutation group. We can view this result as solving a particular special case of the problem of characterizing witnesses for boolean operations.

To obtain these results, we exploit a connection between a certain class of permutation groups called primitive groups, and the notion of uniformly minimal automata introduced by Restivo and Vaglica [16]. A minimal deterministic finite automaton (DFA) is uniformly minimal if it always remains minimal when the final state set is replaced by a non-empty proper subset of the state set. For a permutation DFA (that is, a DFA whose transition monoid is a permutation group), uniform minimality is equivalent to primitivity of the transition monoid. Although uniform minimality played an important role in the paper of Bell et al., this connection with primitive groups was not used in their paper. Primitive groups are an important and well-studied class of permutation groups; there are deep results on their structure, and large libraries of primitive groups are available in computer algebra systems such as GAP [14] and Magma [4]. Uniformly minimal DFAs have received comparatively little study; thus this connection has significant implications for the theory of uniformly minimal DFAs.

The paper is structured as follows. Section 2 contains background material needed to understand the paper. Section 3 discusses the relationship between primitive groups and uniformly minimal permutation DFAs. Section 4 contains our main results on witnesses for the maximal state complexity of boolean operations. Section 5 concludes the paper by giving a summary of our results and stating some open problems.

2 Definitions and Notation

For a function f: X → Y, we typically write the symbol for the function to the right of its arguments. For example, if the image of x under f is y, we write xf = y. Functions are composed from left to right, and composition is denoted by juxtaposition: if f: X → Y and g: Y → Z, then fg denotes the composition of f and g, and x(fg) = (xf)g is an element of Z.

Let 2^X denote the power set of X, that is, the set of all subsets of X. Given f: X → Y we may extend f by union to obtain a function from 2^X to 2^Y, defined by Sf = {xf : x ∈ S} for S ⊆ X. We denote the extension by the same symbol as the original function. Note that for convenience, we often make no distinction between an element of a set and the singleton containing the element; so if xf = y we may also write {x}f = y.

2.1 Monoids, Groups and Actions

A monoid is a set M equipped with an associative binary operation ⋅ and an identity element e such that e ⋅ m = m ⋅ e = m for all m ∈ M. Typically we omit the symbol for the operation; so the previous equation could be written as em = me = m. For n ≥ 1 we write mⁿ for the n-fold product of m with itself, and define m⁰ = e for all m ∈ M. If for each m ∈ M, there exists m′ ∈ M such that mm′ = m′m = e, then M is called a group, and m′ is called the inverse of m and denoted m⁻¹. The order of an element g of a group is the least integer n ≥ 1 such that gⁿ = e.

A submonoid of M is a subset which is closed under the operation and contains the identity of M. If additionally a submonoid is a group, it is called a subgroup of M; we write H ≤ G to mean that H is a subgroup of G. Note that we do not allow submonoids or subgroups of M to have an identity element different from that of M. If g₁, …, g_k are elements of a group G, then ⟨g₁, …, g_k⟩ denotes the group generated by g₁, …, g_k, the smallest subgroup of G containing these elements.

Let M and N be monoids with identity elements e and f respectively. A homomorphism from M to N is a function φ: M → N such that (mm′)φ = (mφ)(m′φ) for all m, m′ ∈ M and eφ = f. A bijective homomorphism is called an isomorphism, and two monoids are said to be isomorphic if there exists an isomorphism from one to the other. We write M ≅ N to mean that M and N are isomorphic. If G and H are groups and φ: G → H is a homomorphism, the kernel of φ is the set {g ∈ G : gφ is the identity of H}, that is, the set of elements of G that map to the identity of H. If G is a group, N ≤ G, and g⁻¹ng ∈ N for all n ∈ N and g ∈ G, we say N is a normal subgroup of G. A group G is simple if it has no non-trivial proper normal subgroups, that is, the only normal subgroups of G are G itself and the trivial group (containing just the identity element of G). The kernel of a homomorphism from G to another group is always a normal subgroup of G. We occasionally use the following elementary facts about normal subgroups and homomorphisms:

  • If φ: G → H is a homomorphism and the kernel of φ is the trivial one-element subgroup of G, then φ is injective.

  • If φ: G → H is a surjective homomorphism and N is a normal subgroup of G, then Nφ is a normal subgroup of H.

A monoid action of M on a set X is a function ∘: X × M → X such that x ∘ (mn) = (x ∘ m) ∘ n and x ∘ e = x for all x ∈ X and m, n ∈ M. Equivalently, it is a family of functions f_m: X → X, one for each m ∈ M, such that f_m f_n = f_{mn} for all m, n ∈ M and f_e is the identity map on X. The map f_m is called the action of m. To simplify the notation, we often omit the action symbol and just write xm instead of x ∘ m or x f_m. Furthermore, we typically avoid assigning a symbol to the action at all; rather than “let ∘ be a monoid action of M on X” we write “let M be a monoid acting on X”, meaning that M has a specific but nameless action on X associated with it. If Σ generates the monoid M, a monoid action is completely determined by its values on elements of Σ. If M is a group, we use the term group action rather than monoid action.

Let G be a group acting on X. For x ∈ X, the stabilizer subgroup or simply stabilizer of x is the subgroup {g ∈ G : xg = x}. For S ⊆ X, the setwise stabilizer of S is the subgroup {g ∈ G : Sg = S}. Elements of the setwise stabilizer need not fix every element of S; for example, if S = {1, 2} and g = (1, 2), then g is in the setwise stabilizer of S.

Let X be a finite set. A function f: X → X is called a transformation of X. The set of all transformations of X is a monoid under composition called the full transformation monoid T_X. A submonoid of T_X is called a transformation monoid on X. The degree of a transformation monoid on X is the size of X. If M is a transformation monoid on X, the monoid action given by x ∘ f = xf for f ∈ M is called the natural action of M. If X = {1, …, n} we write T_n for T_X.

A bijective transformation of X is called a permutation of X. We can describe any particular permutation of X using cycle notation as follows. For distinct elements x₁, …, x_k ∈ X, we write (x₁, x₂, …, x_k) for the permutation that sends x_i to x_{i+1} for 1 ≤ i < k, sends x_k to x₁, and fixes all other elements of X. This permutation is called a cycle of length k, or simply a k-cycle. All permutations that are not cycles can be expressed as a product of cycles. The identity permutation is denoted by an empty cycle, i.e., “()”. Cycle notation conflicts with the notation we use for ordered k-tuples, but this should not cause confusion. We mainly use cycle notation when giving concrete examples of permutations.
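Cycle notation is easy to mechanize; the following helper (our own illustration, with permutations represented as dicts mapping each point to its image) builds a permutation from a list of cycles and composes permutations left to right, matching the convention above:

```python
# Build a permutation of a finite set from cycle notation.
# Permutations are represented as dicts mapping each point to its image.

def perm_from_cycles(cycles, domain):
    perm = {x: x for x in domain}            # start from the identity
    for cycle in cycles:
        for i, x in enumerate(cycle):
            perm[x] = cycle[(i + 1) % len(cycle)]
    return perm

def compose(f, g):
    # Left-to-right composition, matching the convention x(fg) = (xf)g.
    return {x: g[f[x]] for x in f}
```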

The set of all permutations of X is a subgroup of T_X called the symmetric group S_X. A subgroup of S_X is called a permutation group on X; this is a special type of transformation monoid, and we have the same notions of degree and natural action. The alternating group A_X is the subgroup of S_X consisting of all permutations that can be expressed as a product of an even number of 2-cycles. If X = {1, …, n} we write S_n for S_X and A_n for A_X.

Let G be a group acting on X. We say that the action of G is transitive, or that G acts transitively on X, if for all x, y ∈ X, there exists g ∈ G such that xg = y. We say the action of G is k-transitive, or that G acts k-transitively on X, if for all pairs of k-tuples (x₁, …, x_k) and (y₁, …, y_k) of distinct elements of X, there exists g ∈ G such that x_i g = y_i for 1 ≤ i ≤ k; informally, k-transitive means “transitive on k-tuples”.

A non-empty set B ⊆ X is called a block for G if for all g ∈ G, either Bg = B (equivalently, B ⊆ Bg) or Bg ∩ B = ∅. A block is trivial if it is a singleton or the entire set X. We say the action of G is primitive, or that G acts primitively on X, if it is transitive and all of its blocks are trivial. Equivalently, a transitive group action of G is primitive if for every set S ⊊ X with at least two elements, there exists g ∈ G such that Sg ≠ S and Sg ∩ S ≠ ∅.

If G is a permutation group and the natural action of G is transitive (k-transitive, primitive), then we say G is a transitive group (k-transitive group, primitive group). For example, the cyclic group ⟨(1, 2, 3)⟩ is a transitive group, since its natural action on {1, 2, 3} is transitive. This terminology can cause confusion, since transitivity, k-transitivity and primitivity are properties of actions and not groups; statements like “G is transitive” or “G is primitive” are statements about a particular action of G (the natural action) rather than the abstract group itself. In particular, these properties are not preserved under isomorphism; for example, the group ⟨(1, 2)⟩ viewed as a permutation group on {1, 2, 3} is not transitive, but it is isomorphic to the transitive group ⟨(1, 2)⟩ on {1, 2}.
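For small groups, transitivity and primitivity can be checked by brute force directly from the definitions. The following sketch (our own code, exponential in the degree and suitable only for small examples like those below) generates a group from its generators and searches for non-trivial blocks:

```python
# Brute-force transitivity and primitivity checks for small permutation
# groups, given by generators (permutations as dicts on a finite domain).
from itertools import combinations

def generated_group(gens, domain):
    """Close the generators under (left-to-right) composition."""
    identity = {x: x for x in domain}
    seen = {tuple(sorted(identity.items()))}
    frontier = [identity]
    while frontier:
        f = frontier.pop()
        for g in gens:
            h = {x: g[f[x]] for x in domain}
            key = tuple(sorted(h.items()))
            if key not in seen:
                seen.add(key)
                frontier.append(h)
    return [dict(k) for k in seen]

def is_transitive(group, domain):
    x0 = next(iter(domain))
    return {g[x0] for g in group} == set(domain)   # orbit of one point

def is_block(group, subset):
    """B is a block if Bg = B or Bg is disjoint from B, for every g."""
    b = set(subset)
    for g in group:
        image = {g[x] for x in b}
        if image != b and image & b:
            return False
    return True

def is_primitive(group, domain):
    """Primitive = transitive, with no blocks of size 2..|domain|-1."""
    if not is_transitive(group, domain):
        return False
    for size in range(2, len(domain)):
        for subset in combinations(sorted(domain), size):
            if is_block(group, subset):
                return False
    return True
```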

As the notions of transitivity and primitivity are central to this paper, we give numerous examples to illustrate them below.

Example 1

Consider the group G = ⟨(1, 2, 3, 4, 5, 6)⟩. This group is clearly transitive, since its natural action on {1, …, 6} is transitive. However, it is imprimitive, since {1, 3, 5} and {2, 4, 6} are non-trivial blocks. Indeed, if we let g = (1, 2, 3, 4, 5, 6), then {1, 3, 5}g = {2, 4, 6} and {2, 4, 6}g = {1, 3, 5}. Hence for all h ∈ G, we either have {1, 3, 5}h = {1, 3, 5} or {1, 3, 5}h ∩ {1, 3, 5} = ∅, and similarly for {2, 4, 6}. One may also verify that {1, 4}, {2, 5} and {3, 6} are non-trivial blocks, and that there are no blocks of size 4 or 5.

Example 2

Consider the group G = ⟨(1, 2, 3, 4, 5)⟩. This group is clearly transitive, and it is also primitive. To see this, suppose for a contradiction that B is a non-trivial block. Let g = (1, 2, 3, 4, 5) and let x, y ∈ B, where x and y are distinct elements of {1, …, 5}, and write y = xg^d with 1 ≤ d ≤ 4. Then y ∈ B ∩ Bg^d, so we must have Bg^d = B since B is a block. Thus for each z ∈ B, we have zg^d ∈ B, and thus yg^d = xg^{2d} ∈ B. Then since Bg^d = B, we have Bg^{2d} = B, and thus xg^{3d} ∈ B. By induction it follows that xg^{kd} ∈ B for all k ≥ 0. We claim B = {1, …, 5}, which contradicts the fact that B is a non-trivial block. Indeed, for z ∈ {1, …, 5}, we have z = xg^{kd} if and only if z ≡ x + kd (mod 5). Since 5 is prime and 1 ≤ d ≤ 4, we see that d is coprime with 5. Hence by elementary number theory, there exists k such that kd ≡ z − x (mod 5), and so z = xg^{kd} ∈ B as required. Hence z ∈ B for all z ∈ {1, …, 5}, which proves the claim. It follows that G has no non-trivial blocks, and thus is primitive.

The above argument can be generalized to prove that the cyclic group ⟨(1, 2, …, n)⟩ is primitive if and only if n is prime. If g = (1, 2, …, n), then for each divisor d of n and each integer i, we see that {i, ig^d, ig^{2d}, …} is a block. In particular, when n is composite, there exists a divisor d with 1 < d < n, giving rise to a non-trivial block.

Example 3

Consider the group G = ⟨(1, 2), (3, 4)⟩, viewed as a permutation group on {1, 2, 3, 4}. This group is intransitive, since (for example) it does not contain a permutation mapping 1 to 3. Thus it is imprimitive by definition. Alternatively, observe that {1, 2} and {3, 4} are non-trivial blocks for G.

Generally an intransitive group will always have non-trivial blocks, but there is one exception: the trivial subgroup of S₂ (containing only the identity element). The natural action of this group is clearly not transitive on {1, 2}, but its only blocks are the trivial blocks {1}, {2} and {1, 2}. To avoid dealing with this exception, we require primitive groups to be transitive by definition.

The next example shows that we have the following hierarchy of permutation group properties:

(2-transitive) ⟹ (primitive) ⟹ (transitive).

These implications do not reverse. Cyclic groups of composite order give examples of transitive imprimitive groups, while cyclic groups of prime order give examples of primitive, non-2-transitive groups. (For example, the group ⟨(1, 2, 3)⟩ is not 2-transitive on {1, 2, 3}, since nothing maps the pair (1, 2) to the pair (2, 1).)

Example 4

The alternating group A_n is 2-transitive for n ≥ 4. Indeed, given pairs (x₁, x₂) and (y₁, y₂) of distinct elements, there is some permutation mapping x₁ to y₁ and x₂ to y₂; if it is odd, we may compose it with a 2-cycle (z, z′) where z, z′ ∉ {y₁, y₂} (possible since n ≥ 4), obtaining a product of an even number of 2-cycles that still maps the pair (x₁, x₂) to (y₁, y₂). We claim A_n is also primitive for n ≥ 3. To see this, first note that A₃ is a cyclic group of prime order. For n ≥ 4, suppose for a contradiction that B is a non-trivial block. Then B has at least two elements x and y, but B is not all of {1, …, n}. Choose z ∉ B. Since A_n is 2-transitive, there exists an element g which maps the pair (x, y) to (x, z). Then x ∈ B ∩ Bg (since B and Bg contain x), and thus Bg = B since B is a block. But Bg contains z = yg and B does not, which is a contradiction. Thus all blocks of A_n are trivial, and thus A_n is primitive. In fact, this argument shows that all 2-transitive groups are primitive.

The following fact is immediate from the definitions of transitivity and primitivity, and is frequently useful: if H is a subgroup of G and H is transitive (primitive), then G is also transitive (primitive). For example, the symmetric group S_n is primitive for n ≥ 3, since it contains the primitive group A_n.

So far, we have only looked at cyclic groups and the symmetric and alternating groups. For our last pair of examples, we consider two subgroups of S₆ that are a little more interesting.

Example 5

Define a = (1, 2, 3)(4, 5, 6) and b = (1, 4)(2, 5)(3, 6), and let G = ⟨a, b⟩. We claim this group is transitive on {1, …, 6}. For g ∈ G and x, y ∈ {1, …, 6}, we will write x → y (via g) to mean xg = y. Observe that

1 → 2 (via a), 2 → 3 (via a), 1 → 4 (via b), 4 → 5 (via a), 5 → 6 (via a).

Thus for each x, there is some group element that maps 1 to x. If g maps 1 to x, then g⁻¹ maps x to 1. It follows that for each pair x, y, there is some element that maps x to 1, and another element that maps 1 to y, giving x → 1 → y.

Thus G is transitive. It is also imprimitive, with non-trivial blocks {1, 2, 3} and {4, 5, 6}. Indeed, we see that

{1, 2, 3}a = {1, 2, 3}, {1, 2, 3}b = {4, 5, 6}, {4, 5, 6}a = {4, 5, 6}, {4, 5, 6}b = {1, 2, 3},

so each generator either fixes both sets or swaps them, and hence so does every element of G. Hence these sets are non-trivial blocks.

Example 6

Define a = (1, 2, 3, 4, 5) and b = (1, 6), and let G = ⟨a, b⟩. It is easy to see that this group is transitive on {1, …, 6}: just verify that 1 can be mapped to every other element and use the argument from the previous example. This group is also primitive. To see this, first note that the subgroup ⟨a⟩ acts primitively on {1, …, 5}, since it is a cyclic group of prime order. Hence a non-trivial block of G contained in {1, …, 5} could only be {1, …, 5} itself; but the blocks in a block system all have the same size, and 5 does not divide 6, so this is impossible. So a non-trivial block of G must contain 6. Suppose B is a non-trivial block that contains 6; then Ba also contains 6 (since a fixes 6) and hence Ba = B. Since B is non-trivial, it contains some element x with x ≠ 6, and since Ba = B we have xaᵏ ∈ B for all k ≥ 0. This implies {1, …, 5} ⊆ B, so B = {1, …, 6}, and so B is trivial, which is a contradiction. Thus all blocks of G are trivial, and thus G is primitive.

A congruence of a monoid action of M on X is an equivalence relation on X that is M-invariant in the following sense: if C is an equivalence class, then for all m ∈ M, there exists an equivalence class C′ such that Cm ⊆ C′. In other words, if x and y are equivalent, then xm and ym are equivalent for all m ∈ M. The equality congruence in which elements are equivalent only if they are equal, and the full congruence in which all elements are equivalent, are called trivial congruences. If M is a transformation monoid on X, a congruence of the natural action is called an M-congruence.

The notion of congruences leads to an important alternate characterization of primitivity. In the case of a permutation group G on X, notice that for all g ∈ G and S ⊆ X, the set Sg has the same size as S. Hence a G-congruence has the following property: if C is an equivalence class, then for all g ∈ G, the set Cg is also an equivalence class. In particular, we either have Cg = C or Cg ∩ C = ∅ for all g ∈ G; thus the classes of G-congruences are blocks.

In fact, if G is transitive, then every G-congruence arises from the blocks of G as follows. If B is a block for G, the block system corresponding to B is the set {Bg : g ∈ G}. As the name implies, each set in a block system is also a block for G. Indeed, for g, h ∈ G, we either have Bg ∩ Bh = ∅ or Bg ∩ Bh ≠ ∅, and in the latter case, Bgh⁻¹ ∩ B ≠ ∅. But B is a block, so this implies Bgh⁻¹ = B and thus Bg = Bh. Thus every set in a block system is a block, so in particular, all distinct sets in a block system are pairwise disjoint. Furthermore, since G is transitive, each element of X appears in at least one block of the system. It follows that block systems are partitions of X, and thus give equivalence relations on X. It is easy to see that block systems are G-invariant, and thus are G-congruences.

Thus every block gives rise to a block system that is a G-congruence, and every G-congruence consists of blocks; it follows that block systems and G-congruences are one and the same if G is a transitive group. If all G-congruences are trivial, then all block systems of G consist only of trivial blocks, and vice versa. Thus we obtain our alternate characterization of primitivity: a transitive permutation group G on X is primitive if and only if all G-congruences are trivial.

Let us revisit some of our earlier examples of primitive and imprimitive groups in the context of this new characterization.

Example 7

Consider the imprimitive cyclic group G = ⟨(1, 2, 3, 4, 5, 6)⟩ of Example 1. Put an equivalence relation ∼ on {1, …, 6} by letting x ∼ y if x and y have the same parity (odd or even). Notice that for g = (1, 2, 3, 4, 5, 6), the elements x and xg have opposite parity. Thus ∼ is a G-congruence, since if x ∼ y then xg ∼ yg, and so if C is the equivalence class of x then Cg is also an equivalence class. In fact, the classes of ∼ are just the blocks {1, 3, 5} and {2, 4, 6} we found in Example 1; thus the G-congruence ∼ corresponds to the block system {{1, 3, 5}, {2, 4, 6}}. If we define an equivalence relation ≈ by x ≈ y if x and y are equivalent modulo 3, we obtain a non-trivial G-congruence corresponding to the block system {{1, 4}, {2, 5}, {3, 6}}. As for the trivial G-congruences, the equality congruence corresponds to the block system containing the singletons, and the full congruence corresponds to the block system that just contains the full set {1, …, 6}.

Example 8

Consider the primitive cyclic group G = ⟨(1, 2, 3, 4, 5)⟩ of Example 2. With the notion of G-congruences, it is much easier to prove that this group is primitive. Indeed, fix a G-congruence on {1, …, 5}. By G-invariance and transitivity, all classes of the G-congruence must have the same size, say k. If the congruence has m classes, then we have km = 5. So k is either 1 or 5 since 5 is prime, which means the classes are either singletons (giving the equality congruence) or the full set (giving the full congruence). Thus all G-congruences are trivial, and thus G is primitive. Alternatively, we could make the same argument in terms of block systems, using the fact that all blocks in a system have the same size to show all blocks must be trivial. This argument actually shows that not only are cyclic groups of prime order primitive, but all transitive groups of prime degree are primitive (since the degree of a permutation group on X is |X|).

2.2 Languages, Automata and State Complexity

Let Σ be a finite set. The set of all finite-length sequences of elements of Σ is called the free monoid generated by Σ, and is denoted Σ*. In this context, elements of Σ are called letters, and elements of Σ* are called words over Σ. The operation of the free monoid is concatenation of words, and the identity element is the empty word ε of length zero. A set L ⊆ Σ* is called a language over Σ, and Σ is called the alphabet of L.

We use the convention that a language is implicitly a pair (L, Σ), so for example, the language {a} over alphabet {a} and the language {a} over alphabet {a, b} are distinct. In particular, two words over different alphabets are necessarily distinct. This is similar to the convention which views two functions with different codomains as necessarily distinct.

A deterministic finite automaton (DFA) is a tuple A = (Q, Σ, δ, q₀, F) where Q and Σ are finite sets, δ: Q × Σ* → Q is a monoid action of Σ* on Q, q₀ ∈ Q, and F ⊆ Q. The elements of Q are called states; the state q₀ is called the initial state and the states in F are called final states. The set Σ is the alphabet of the automaton. The monoid action δ is called the transition function.

Since Σ generates Σ*, we may completely specify the action δ by defining the function δ_a: q ↦ qa for each a ∈ Σ. If w = a₁a₂⋯aₙ for letters a₁, …, aₙ ∈ Σ, then δ_w is the composition δ_a₁ δ_a₂ ⋯ δ_aₙ. The function δ_ε is necessarily the identity map. The monoid M = {δ_w : w ∈ Σ*} is called the transition monoid of A; it is a submonoid of T_Q and thus has a natural action on Q. We call the function δ_w the action of w. Under our notational conventions, we may write qδ_w as q ∘ w or simply qw. We may also extend δ_w by union and apply it to subsets of the state set: for S ⊆ Q we have Sw = {qw : q ∈ S}. We also sometimes write w to mean δ_w.
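Computing the transition monoid of a small DFA is a straightforward closure computation: start from the letter actions δ_a and repeatedly compose until no new transformations appear. A sketch (our own code, practical only for small DFAs, since the monoid can have up to |Q|^|Q| elements):

```python
# Compute the transition monoid of a DFA by closing the letter actions
# under left-to-right composition. Transformations are dicts on the states.

def transition_monoid(states, delta, alphabet):
    letters = [{q: delta[q, c] for q in states} for c in alphabet]
    identity = {q: q for q in states}          # the action of the empty word
    monoid = {tuple(sorted(identity.items()))}
    frontier = [identity]
    while frontier:
        f = frontier.pop()
        for g in letters:
            h = {q: g[f[q]] for q in states}   # action of wa, given that of w
            key = tuple(sorted(h.items()))
            if key not in monoid:
                monoid.add(key)
                frontier.append(h)
    return monoid
```

When every letter acts as a permutation, every element of this set is a bijection and the result is the transition group of a permutation DFA.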

A state q is reachable from p if pw = q for some w ∈ Σ*. Two states p and q are distinguishable by S ⊆ Q if there exists w ∈ Σ* such that exactly one of pw and qw lies in S. We frequently use two special cases of these definitions: A state is reachable if it is reachable from the initial state q₀, and states are distinguishable if they are distinguishable by F. We say A is accessible if every state is reachable (from the initial state q₀), and strongly connected if every state is reachable from every other state. A state q is empty if qw ∉ F for all w ∈ Σ*. In a strongly connected DFA, there exists an empty state if and only if all states are empty.

Consider the following relation on Q: two states p and q are related if and only if they are indistinguishable by F, that is, for all w ∈ Σ* we have pw ∈ F if and only if qw ∈ F. This is an equivalence relation on Q, and in fact it is an M-congruence, where M is the transition monoid of A. Indeed, if p and q are equivalent, we have pw ∈ F ⟺ qw ∈ F for all w ∈ Σ*. So in particular, if we take w = vu for some fixed v ∈ Σ*, then pvu ∈ F ⟺ qvu ∈ F for all u ∈ Σ*, and thus pv and qv are equivalent for all v ∈ Σ*. This congruence is called the indistinguishability congruence of A.

The language recognized by A, or simply the language of A, is the language L(A) = {w ∈ Σ* : q₀w ∈ F} over Σ. A language which can be recognized by a DFA is called a regular language. Two DFAs are equivalent if they have the same language. Two DFAs A = (Q, Σ, δ, q₀, F) and A′ = (Q′, Σ, δ′, q₀′, F′) with a common alphabet are isomorphic if there is a bijection φ: Q → Q′ such that q₀φ = q₀′, Fφ = F′, and (qa)φ = (qφ)a for all q ∈ Q and a ∈ Σ; in other words, they are identical up to the naming of the states. In particular, isomorphic DFAs are equivalent.

We say A is minimal if the number of states |Q| is minimal among all DFAs equivalent to A. It is well-known that for each regular language L, all minimal DFAs recognizing L are isomorphic and hence have the same number of states. The number of states in a minimal DFA for L is called the state complexity of L, and is denoted sc(L). A DFA is minimal if and only if all states are reachable and all pairs of distinct states are distinguishable.
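The characterization of minimality as “all states reachable and all pairs distinguishable” translates directly into a naive algorithm. The sketch below (our own code, using fixed-point marking of distinguishable pairs rather than an optimized algorithm such as Hopcroft's) checks both conditions:

```python
# Naive minimality check: a DFA is minimal iff every state is reachable
# from the initial state and every pair of distinct states is
# distinguishable by the final state set.

def reachable_states(delta, init, alphabet):
    seen, frontier = {init}, [init]
    while frontier:
        q = frontier.pop()
        for c in alphabet:
            r = delta[q, c]
            if r not in seen:
                seen.add(r)
                frontier.append(r)
    return seen

def distinguishable_pairs(states, delta, final, alphabet):
    # Mark pairs split by finality, then propagate backwards to a fixed point.
    marked = {(p, q) for p in states for q in states
              if (p in final) != (q in final)}
    changed = True
    while changed:
        changed = False
        for p in states:
            for q in states:
                if (p, q) not in marked and any(
                        (delta[p, c], delta[q, c]) in marked for c in alphabet):
                    marked.add((p, q))
                    changed = True
    return marked

def is_minimal(states, delta, init, final, alphabet):
    if reachable_states(delta, init, alphabet) != set(states):
        return False
    marked = distinguishable_pairs(states, delta, final, alphabet)
    return all((p, q) in marked for p in states for q in states if p != q)
```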

Given a binary regular operation ∘, the state complexity of the operation is the following function:

(m, n) ↦ max{ sc(L ∘ L′) : sc(L) ≤ m, sc(L′) ≤ n }

That is, it is the maximal state complexity of the language resulting from the operation, expressed as a function of the state complexities of the operation’s arguments.

For functions f, g: ℕ × ℕ → ℕ, we say f ≤ g if f(m, n) ≤ g(m, n) for all m and n; we say that g is an upper bound for the state complexity of the operation if the state complexity of the operation is ≤ g, and a tight upper bound if it equals g.

In the definition of state complexity of operations, we assume that ∘ takes two languages over the same alphabet as arguments. This is justified by our view that words over different alphabets are necessarily distinct; hence, for example, the union of two languages over different alphabets would be a set containing a mixture of words over different alphabets, which is not a language. To perform such an operation, one must first convert the operands to languages over a common alphabet. This convention is very common in the literature; however, Brzozowski has recently argued this convention is unnecessary, and in fact leads to incorrect state complexity bounds for operations on languages over different alphabets, since converting the input languages to a common alphabet can change their state complexities [7]. Brzozowski introduces a distinction between restricted state complexity of operations, the traditional model in which operands must have the same alphabet, and unrestricted state complexity of operations, a new model which produces accurate state complexity bounds for operations on languages over different alphabets.

We use restricted state complexity in this paper for the following reasons. First, computing unrestricted state complexity requires using DFAs which have an empty state, and in particular are not strongly connected. In this paper, we mainly study DFAs whose transition monoids are permutation groups, which are always strongly connected. This group-theoretic focus is essential to most of our results. Working with DFAs that are not strongly connected would take us into the realm of semigroup theory, and we are unsure how much of our work would carry over. Second, restricted state complexity has been the dominant model of state complexity of operations for many years, while unrestricted state complexity is a recent generalization. Although restricted state complexity gives incorrect results when applied to languages over different alphabets, it is otherwise a correct model. We have chosen to study the simpler case of restricted state complexity in this paper and leave the more general unrestricted case for potential future work.

3 Primitive Groups and Uniform Minimality

A DFA A is called a permutation DFA if its transition monoid is a permutation group on Q. In this case we speak of the transition group rather than the transition monoid. The languages recognized by permutation DFAs are called group languages.

Proposition 1

For a permutation DFA A with transition group G, the following are equivalent:

  1. A is accessible.

  2. A is strongly connected.

  3. G is transitive.

Proof

(1) ⟹ (3): Since A is accessible, for each q ∈ Q there exists w ∈ Σ* such that q₀δ_w = q. Since G is a group, the element δ_w has an inverse in G, and thus for all q ∈ Q we have q(δ_w)⁻¹ = q₀. Hence every state can be mapped to q₀ and q₀ can be mapped to every state, so for all p, q ∈ Q some element of G maps p to q. It follows that G is transitive.

(3) ⟹ (2): Since G is transitive, for all p, q ∈ Q there exists w ∈ Σ* such that pδ_w = q. This is precisely saying that A is strongly connected.

The last implication, (2) ⟹ (1), is immediate. ∎

Note that the implication (2) ⟹ (1) holds for arbitrary DFAs, not only permutation DFAs.

Let A = (Q, Σ, δ, q₀, F) be a DFA and let L be its language. For S ⊆ Q, we write A_S for the DFA obtained by replacing the final state set of A with S, that is, A_S = (Q, Σ, δ, q₀, S). We say a regular language L′ is a cognate of L if L′ = L(A_S) for some S ⊆ Q. We say a DFA B is a cognate of A if B = A_S for some S ⊆ Q; so a language is a cognate of L if and only if it is recognized by a cognate of A. If S = Q or S = ∅, then A_S is called a trivial cognate of A, since L(A_S) is either Σ* or the empty language ∅.

We say A is uniformly minimal if all non-trivial cognates of A are minimal. That is, we can reassign the final state set of the DFA in any non-trivial way and the new DFA will always be minimal. Equivalently, all non-trivial cognates of A have the same state complexity |Q|. This definition is essentially restricted to accessible DFAs, since if A is not accessible, then not all states are reachable and hence no cognate of A can be minimal.
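Uniform minimality of a small DFA can be tested by brute force over all non-empty proper final state sets, directly following the definition. A self-contained sketch (our own code, exponential in the number of states):

```python
# Brute-force test of uniform minimality: every non-empty proper final
# state set must yield a minimal DFA.
from itertools import chain, combinations

def minimal_with_finals(states, delta, init, final, alphabet):
    # Reachability from the initial state.
    seen, frontier = {init}, [init]
    while frontier:
        q = frontier.pop()
        for c in alphabet:
            r = delta[q, c]
            if r not in seen:
                seen.add(r)
                frontier.append(r)
    if seen != set(states):
        return False
    # Pairwise distinguishability by fixed-point marking.
    marked = {(p, q) for p in states for q in states
              if (p in final) != (q in final)}
    changed = True
    while changed:
        changed = False
        for p in states:
            for q in states:
                if (p, q) not in marked and any(
                        (delta[p, c], delta[q, c]) in marked for c in alphabet):
                    marked.add((p, q))
                    changed = True
    return all((p, q) in marked for p in states for q in states if p != q)

def is_uniformly_minimal(states, delta, init, alphabet):
    states = list(states)
    proper = chain.from_iterable(
        combinations(states, k) for k in range(1, len(states)))
    return all(minimal_with_finals(states, delta, init, set(f), alphabet)
               for f in proper)
```

For instance, the permutation DFA whose single letter acts as a 5-cycle is uniformly minimal (its transition group is cyclic of prime order, hence primitive), while the 4-cycle version is not: the final state set consisting of two opposite states yields indistinguishable states.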

Restivo and Vaglica introduced and studied uniformly minimal DFAs in [17]. Their notion of uniform minimality is almost the same as ours, except it is restricted to strongly connected DFAs. Presumably, Restivo and Vaglica were interested in DFAs that are minimal for every reassignment of initial and final states; if a DFA is not strongly connected, we can reassign the initial state to obtain a new DFA which is not accessible and hence not minimal. However, for strongly connected DFAs, the choice of initial state has no effect on minimality since every state is reachable from each possible choice of initial state. Hence we lose nothing by fixing an initial state and generalizing to accessible DFAs.

Remark

Restivo and Vaglica also studied uniformly minimal DFAs in [16], but they used different terminology. They used the term “almost uniformly minimal” for the notion discussed above, and used “uniformly minimal” for a stronger condition that can only be met by incomplete DFAs (which we do not discuss in this paper).

A DFA $\mathcal{A}$ with transition monoid $M$ is called simple if all $M$-congruences are trivial. Ésik proved the following result for strongly connected DFAs [17, Proposition 1]. The same proof works for accessible DFAs.

Proposition 2

An accessible DFA is uniformly minimal if and only if it is simple.

Proof

Suppose $\mathcal{A}$ is simple, that is, all $M$-congruences are trivial. Then in particular, for every $S \subseteq Q$, the indistinguishability congruence of $\mathcal{A}_S$ is trivial. If the indistinguishability congruence of $\mathcal{A}_S$ is the equality relation, then each state lies in its own class, so all pairs of states are distinguishable. Since $\mathcal{A}$ is accessible, all states are reachable, and hence $\mathcal{A}_S$ is minimal. If the indistinguishability congruence of $\mathcal{A}_S$ is the full relation, then all states are indistinguishable. But final states are always distinguishable from non-final states, so this can only happen if all states are final ($S = Q$) or all states are non-final ($S = \emptyset$). Hence if $\emptyset \subsetneq S \subsetneq Q$, then $\mathcal{A}_S$ is minimal, and it follows that $\mathcal{A}$ is uniformly minimal.

Conversely, suppose $\mathcal{A}$ is not simple, so that there exists a non-trivial $M$-congruence. Then this congruence has a class $B$ which has at least two elements, but is not all of $Q$. Take $B$ as the new final state set, and let $p, q \in B$ be distinct states. For all $w \in \Sigma^*$, the states $pw$ and $qw$ both lie in the set $Bw$, which is contained in some congruence class $C$. If $C = B$, then we have $pw, qw \in B$. If $C \neq B$, then we have $pw, qw \notin B$. Thus for all $w \in \Sigma^*$, we have $pw \in B$ if and only if $qw \in B$, and so $p$ and $q$ are not distinguishable in $\mathcal{A}_B$. Hence $\mathcal{A}$ is not uniformly minimal, since the non-trivial cognate $\mathcal{A}_B$ is not minimal. ∎

In the special case of permutation DFAs, we have:

Corollary 1

An accessible permutation DFA $\mathcal{A}$ is uniformly minimal if and only if its transition group $G$ is primitive.

Proof

If $\mathcal{A}$ is uniformly minimal, then it is simple, so all $G$-congruences are trivial. Now, recall that a permutation group is primitive if and only if all of its congruences are trivial. Since the transition monoid of $\mathcal{A}$ is the group $G$, we see that $G$ is primitive.

Conversely, if $G$ is primitive, then all $G$-congruences are trivial. Hence $\mathcal{A}$ is simple, and thus uniformly minimal by Proposition 2. ∎

Note that both implications in Corollary 1 are vacuously true if $\mathcal{A}$ is not accessible: $\mathcal{A}$ cannot be uniformly minimal, and by Proposition 1, $G$ cannot be transitive and thus cannot be primitive. Thus one can technically omit the accessibility assumption.

It seems this relationship between primitivity and minimality has been overlooked until recently. Primitive groups have seen increasing application in automata theory over the past decade, particularly in connection with the classical synchronization problem for DFAs; for a survey of such work see [2]. The connection between simple DFAs and primitive groups was recently noted by Almeida and Rodaro [1]. However, primitive groups are not mentioned in Restivo and Vaglica’s work on uniformly minimal DFAs, nor in any other work on DFA minimality that we are aware of.

The wealth of results on primitive groups makes Corollary 1 quite useful for studying and constructing uniformly minimal DFAs. For example, we can use this corollary to easily prove that for each $n \geq 2$, there exists a uniformly minimal DFA with $n$ states. Restivo and Vaglica proved this using a rather complicated construction [16, Theorem 3].

Proposition 3

For each $n \geq 2$, there exists a uniformly minimal permutation DFA with $n$ states.

Proof

The symmetric group $S_n$ is primitive for all $n \geq 2$, and clearly for each $n$ there exists an $n$-state DFA with transition group $S_n$. For example, let $\{g_1, \dots, g_k\}$ be a generating set of $S_n$, and let $\mathcal{A}$ be a DFA with states $\{1, \dots, n\}$, alphabet $\{a_1, \dots, a_k\}$, and transition function $T$ given by $T(q, a_i) = qg_i$ for $1 \leq i \leq k$. In fact we can use a binary alphabet, since $S_n$ has a generating set of size two for all $n \geq 2$: for example, the transposition $(1\ 2)$ and the $n$-cycle $(1\ 2\ \cdots\ n)$ generate $S_n$. ∎

This proof illustrates a technique that is very useful for producing examples of DFAs. If we have a generating set for a transformation monoid, we can construct a DFA which has that monoid as its transition monoid.
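This construction is easy to carry out programmatically. The following sketch (ours, not from the paper; the function names and the choice of $S_4$ generators are illustrative) builds a transition table from permutation generators, one input letter per generator, and checks strong connectivity, which for a permutation DFA coincides with accessibility and with transitivity of the transition group (Proposition 1).

```python
# Sketch (ours): build a DFA transition table from permutation generators,
# one input letter per generator, and check strong connectivity.
# Permutations are 0-indexed tuples: perm[q] is the image of state q.

def dfa_from_generators(n, generators):
    """Transition table: delta[q][i] is the image of q under generator i."""
    return [[g[q] for g in generators] for q in range(n)]

def strongly_connected(delta):
    # For a permutation DFA, reachability of every state from state 0
    # already implies strong connectivity (Proposition 1).
    seen, stack = {0}, [0]
    while stack:
        for r in delta[stack.pop()]:
            if r not in seen:
                seen.add(r)
                stack.append(r)
    return len(seen) == len(delta)

# S_4 generated by the transposition (0 1) and the 4-cycle (0 1 2 3).
delta = dfa_from_generators(4, [(1, 0, 2, 3), (1, 2, 3, 0)])
print(strongly_connected(delta))  # True
```

Since the two generators produce $S_4$, which is transitive, the resulting 4-state DFA is strongly connected.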

Example 9

Let $\mathcal{A}$ be the DFA with alphabet $\{a, b\}$ defined as follows.

  • The states are , the initial state is , and the final states are .

  • The transformations induced by $a$ and $b$ are permutations of the state set (shown in Figure 1).

More formally, we mean that the transition function of $\mathcal{A}$ sends each state to its image under the permutation associated with the input letter. However, we will generally be brief when describing DFAs, as above.

The permutations associated with $a$ and $b$ generate an alternating group, which is therefore the transition group of $\mathcal{A}$. We saw in Example 4 that this group is transitive and primitive. Hence by Proposition 1, $\mathcal{A}$ is strongly connected, and by Corollary 1, $\mathcal{A}$ is uniformly minimal.

A state diagram of $\mathcal{A}$ is given in Figure 1. We can see from the diagram that $\mathcal{A}$ is indeed strongly connected. It is tedious, but possible, to verify directly that $\mathcal{A}$ is uniformly minimal by checking that it is minimal with respect to every non-empty, proper subset of its state set.
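This brute-force verification is easy to mechanize. Below is a sketch (ours, not from the paper) that checks uniform minimality of a small DFA, assumed accessible, by running Moore-style distinguishability refinement for every non-empty proper final state set. The example DFA is chosen by us: its letters act as a transposition and a 3-cycle, so its transition group is $S_3$, which is primitive.

```python
# Sketch (ours): brute-force uniform-minimality check for a small DFA
# that is assumed to be accessible. For each non-empty proper final state
# set S, run Moore-style refinement and check all state pairs are
# distinguishable.

from itertools import combinations

def distinguishable_pairs(delta, final):
    n, k = len(delta), len(delta[0])
    marked = {(p, q) for p in range(n) for q in range(n)
              if (p in final) != (q in final)}
    changed = True
    while changed:
        changed = False
        for p in range(n):
            for q in range(n):
                if (p, q) not in marked and any(
                        (delta[p][a], delta[q][a]) in marked for a in range(k)):
                    marked.add((p, q))
                    changed = True
    return marked

def uniformly_minimal(delta):
    states = range(len(delta))
    for size in range(1, len(delta)):
        for S in combinations(states, size):
            marked = distinguishable_pairs(delta, set(S))
            if any(pair not in marked for pair in combinations(states, 2)):
                return False
    return True

# Letters act as the transposition (0 1) and the 3-cycle (0 1 2); the
# transition group is S_3, which is primitive.
delta = [[1, 1], [0, 2], [2, 0]]
print(uniformly_minimal(delta))  # True
```

For contrast, a single letter acting as a 4-cycle gives the imprimitive group $C_4$, and the same checker reports that the resulting DFA is not uniformly minimal.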



Figure 1: Uniformly minimal DFA of Example 9.


Figure 2: Non-minimal DFA of Example 10 with an imprimitive transition group.
Example 10

Let $\mathcal{B}$ be the six-state DFA whose state diagram, with initial and final states indicated, is given in Figure 2.

The transition group of $\mathcal{B}$ is the cyclic group of order six, which is imprimitive. We saw in Example 1 that the final state set $F$ is a block for this group. Hence for all $w \in \Sigma^*$, we either have $Fw = F$ or $Fw \cap F = \emptyset$. Thus if $p, q \in F$, then for all $w \in \Sigma^*$ we have $pw \in F$ if and only if $qw \in F$. This means all pairs of states in $F$ are indistinguishable, and hence $\mathcal{B}$ is not minimal.

This argument actually shows that whenever the final state set $F$ is a non-trivial block of the transition group $G$, the DFA is not minimal. In fact, this also holds whenever $F$ is a union of non-trivial blocks of $G$ (see Lemma 1 below).
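The block argument can be checked by direct computation. The following sketch uses a hypothetical instance of the situation in Example 10 (the specific states and block are our choice): a single letter acts as the 6-cycle $q \mapsto q+1 \pmod 6$, so the transition group is cyclic of order six, and $\{0, 3\}$ is a block.

```python
# Sketch (ours; a hypothetical instance of the Example 10 situation): one
# letter acts as the 6-cycle q -> q+1 (mod 6), so the transition group is
# cyclic of order six, and {0, 3} is a block. With {0, 3} as final state
# set, states 0 and 3 agree on every word, so the DFA is not minimal.

def accepts_from(start, length, final):
    """Does the unique word of this length lead from `start` to a final state?"""
    return (start + length) % 6 in final

block = {0, 3}
same = all(accepts_from(0, k, block) == accepts_from(3, k, block)
           for k in range(36))
print(same)  # True
```

With a final state set that is not a union of blocks, such as $\{0, 1\}$, states 0 and 3 become distinguishable and the obstruction disappears.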

Note that if we construct a DFA from a cyclic group of prime order, we get a uniformly minimal DFA, since cyclic groups of prime order are primitive.

There exist many infinite families of primitive groups, and hence of uniformly minimal permutation DFAs. However, there are infinitely many positive integers $n$ for which the only primitive groups of degree $n$ are the symmetric group $S_n$ and the alternating group $A_n$ [12, pg. 66]. Hence other infinite families of primitive groups cannot be used to construct $n$-state uniformly minimal DFAs for every $n$, unless we “fill in the gaps” with symmetric or alternating groups.

Remark

Steinberg has extended the notion of primitivity to transformation monoids [20]. Steinberg defines a transformation monoid $M$ to be primitive if there are no non-trivial $M$-congruences. Under this definition, an accessible DFA is uniformly minimal if and only if its transition monoid is primitive. However, we have not investigated whether any of our other results that hold for primitive groups are also true for primitive monoids.

We close this section with a technical lemma that generalizes Proposition 2. If $M$ is a transformation monoid on $Q$ and $S \subseteq Q$, we say that $S$ is saturated by an $M$-congruence if it is a union of classes of the congruence.

Lemma 1

An accessible DFA $\mathcal{A}$ with final state set $F$, where $\emptyset \subsetneq F \subsetneq Q$, is minimal if and only if there is no non-trivial $M$-congruence that saturates $F$.

It follows that if all $M$-congruences are trivial, then $\mathcal{A}$ is uniformly minimal. Conversely, if there is a non-trivial $M$-congruence, then it saturates its own congruence classes, and at least one class is a proper non-empty subset of $Q$; taking that class as the final state set gives a non-minimal cognate, so $\mathcal{A}$ is not uniformly minimal. Hence this indeed generalizes Proposition 2.

Proof

We prove the contrapositive: $\mathcal{A}$ is not minimal if and only if there exists a non-trivial $M$-congruence that saturates $F$.

Suppose $\mathcal{A}$ is not minimal. Then the indistinguishability congruence of $\mathcal{A}$ is a non-trivial $M$-congruence: at least two states are indistinguishable since $\mathcal{A}$ is accessible but not minimal, and not all states are indistinguishable since $\emptyset \subsetneq F \subsetneq Q$. Suppose there is an indistinguishability class $C$ that is neither contained in $F$ nor disjoint from $F$. Then there exist $p, q \in C$ such that $p \in F$ and $q \notin F$. But then $p$ and $q$ are distinguishable by the empty word, which cannot happen since $C$ is an indistinguishability class. Thus for each indistinguishability class $C$, we have $C \subseteq F$ or $C \cap F = \emptyset$. Then $F$ is a union of indistinguishability classes, so $F$ is saturated by the indistinguishability congruence.

Conversely, let $C_1, \dots, C_k$ be the classes of a non-trivial $M$-congruence that saturates $F$. Choose a congruence class $C_i$ of size at least two. Then for all $w \in \Sigma^*$, we have $C_i w \subseteq C_j$ for some $j$. Since $F$ is a union of congruence classes, either $C_j \subseteq F$ or $C_j \cap F = \emptyset$. Hence for $p, q \in C_i$ and all $w \in \Sigma^*$, we have $pw \in F$ if and only if $qw \in F$. It follows that the states in $C_i$ are indistinguishable, and thus $\mathcal{A}$ is not minimal. ∎

In the special case of permutation DFAs, this has a useful consequence.

Corollary 2

Let $\mathcal{A}$ be an $n$-state permutation DFA with final state set $F$. If $|F| = 1$ or $|F| = n-1$, then $\mathcal{A}$ is minimal if and only if it is accessible.

Proof

Recall that if $G$ is a transitive permutation group and $B$ and $B'$ are classes of a $G$-congruence, then $|B| = |B'|$. It follows that if $|F| = 1$, then a non-trivial $G$-congruence cannot saturate $F$, since all of its classes have size at least two. Furthermore, a $G$-congruence saturates $F$ if and only if it saturates $Q \setminus F$, and if $|F| = n-1$ then a non-trivial $G$-congruence cannot saturate $Q \setminus F$, a set of size one. Hence if $\mathcal{A}$ is accessible, it is minimal by Lemma 1. On the other hand, if $\mathcal{A}$ is not accessible, it cannot be minimal. ∎

4 Main Results

Throughout this section, $\mathcal{A}$ and $\mathcal{B}$ are minimal DFAs with a common alphabet $\Sigma$ and state sets $Q_\mathcal{A}$ and $Q_\mathcal{B}$. The languages of $\mathcal{A}$ and $\mathcal{B}$ are $K$ and $L$, and the transition monoids are $M$ and $N$, respectively. For $w \in \Sigma^*$, we write $w_\mathcal{A}$ for the transformation of $Q_\mathcal{A}$ induced by $w$ in $\mathcal{A}$, and $w_\mathcal{B}$ for the transformation of $Q_\mathcal{B}$ induced by $w$ in $\mathcal{B}$. Sometimes we will assume $\mathcal{A}$ and $\mathcal{B}$ are permutation DFAs, and then we will use $G$ and $H$ for the transition groups rather than $M$ and $N$.

4.1 Direct Products and Boolean Operations

The direct product of $\mathcal{A}$ and $\mathcal{B}$ is the DFA $\mathcal{A} \times \mathcal{B}$ with state set $Q_\mathcal{A} \times Q_\mathcal{B}$, alphabet $\Sigma$, transitions $(p, q) \cdot a = (p \cdot a, q \cdot a)$ for each $(p, q) \in Q_\mathcal{A} \times Q_\mathcal{B}$ and $a \in \Sigma$, initial state given by the pair of initial states of $\mathcal{A}$ and $\mathcal{B}$, and an unspecified set of final states. By assigning particular sets of final states to $\mathcal{A} \times \mathcal{B}$ as described below, we can recognize the languages resulting from arbitrary binary boolean operations on $K$ and $L$.
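As an illustration, the direct product construction can be coded directly. This is our sketch with made-up transition tables, not an implementation from the paper; final states are deliberately left unassigned, since they are chosen per boolean operation.

```python
# Sketch (ours): the direct product of two DFAs over a common alphabet.
# Transition tables use delta[state][letter]; the product acts
# componentwise: (p, q) . a = (p . a, q . a).

def direct_product(delta_a, delta_b):
    """Transitions on pairs of states, one list entry per letter."""
    sigma = range(len(delta_a[0]))
    return {(p, q): [(delta_a[p][a], delta_b[q][a]) for a in sigma]
            for p in range(len(delta_a)) for q in range(len(delta_b))}

# A 2-cycle and a 3-cycle over a one-letter alphabet.
delta_a = [[1], [0]]
delta_b = [[1], [2], [0]]
prod = direct_product(delta_a, delta_b)
print(len(prod))        # 6
print(prod[(0, 0)][0])  # (1, 1)
```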

Fix a function $\circ \colon \{0,1\} \times \{0,1\} \to \{0,1\}$; such functions are called binary boolean functions. For a set $S$, let $\chi_S$ denote the characteristic function of $S$, defined by $\chi_S(x) = 1$ if $x \in S$ and $\chi_S(x) = 0$ otherwise. We can think of $\chi_S(x)$ as giving the “truth value” of the proposition “$x \in S$”, where 0 is false and 1 is true. Now for $S \subseteq Q_\mathcal{A}$ and $T \subseteq Q_\mathcal{B}$, define $S \circ T = \{(p, q) \in Q_\mathcal{A} \times Q_\mathcal{B} : \chi_S(p) \circ \chi_T(q) = 1\}$. Then $\mathcal{A} \times \mathcal{B}$ with final state set $F_\mathcal{A} \circ F_\mathcal{B}$ (where $F_\mathcal{A}$ and $F_\mathcal{B}$ are the final state sets of $\mathcal{A}$ and $\mathcal{B}$) recognizes the language $K \circ L = \{w \in \Sigma^* : \chi_K(w) \circ \chi_L(w) = 1\}$.

For example, if $\circ$ is the “logical or” function $\lor$, then $K \circ L = K \cup L$, since $\chi_K(w) \lor \chi_L(w) = 1$ if and only if $w \in K$ or $w \in L$. Similarly, the “logical and” function $\land$ gives the intersection $K \cap L$.
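The assignment of final state sets in the product is mechanical. The following sketch (ours, with small made-up state sets) computes the final set of the direct product for a given binary boolean function: a pair (p, q) is final exactly when the function applied to the membership bits of p and q returns 1.

```python
# Sketch (ours): final state set of the direct product for a binary
# boolean function op; (p, q) is final iff op(chi_F(p), chi_G(q)) = 1.

def product_final_states(states_a, states_b, final_a, final_b, op):
    return {(p, q) for p in states_a for q in states_b
            if op(int(p in final_a), int(q in final_b)) == 1}

A, B = {0, 1}, {0, 1, 2}
FA, FB = {1}, {2}
union_final = product_final_states(A, B, FA, FB, lambda x, y: x | y)
inter_final = product_final_states(A, B, FA, FB, lambda x, y: x & y)
print(sorted(union_final))  # [(0, 2), (1, 0), (1, 1), (1, 2)]
print(sorted(inter_final))  # [(1, 2)]
```

Other boolean functions, such as symmetric difference, are handled by passing a different two-argument lambda.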

We say that a boolean function $\circ$ (and the associated boolean operation on languages) is proper if its output depends on both of its arguments. For example, $x \circ y = x$ (giving $K \circ L = K$) depends only on the first argument, and $x \circ y = 1$ (giving $K \circ L = \Sigma^*$ for all $K$ and $L$) depends on neither argument, so these are not proper. If $K$ and $L$ have state complexity $m$ and $n$ respectively, then the improper binary boolean operations have state complexity $1$ (if they are constant), state complexity $m$ (if they depend only on the first operand), or state complexity $n$ (if they depend only on the second operand). There are 16 binary boolean functions in total, and one may easily verify that 10 of them are proper.
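The count of proper functions is easy to verify by enumeration. This is our sketch: each binary boolean function is represented by its truth table on the four input pairs.

```python
# Sketch (ours): enumerate all 16 binary boolean functions as truth
# tables and count those that are proper, i.e. depend on both arguments.

from itertools import product

def is_proper(f):
    depends_on_first = any(f[0, y] != f[1, y] for y in (0, 1))
    depends_on_second = any(f[x, 0] != f[x, 1] for x in (0, 1))
    return depends_on_first and depends_on_second

functions = [dict(zip([(0, 0), (0, 1), (1, 0), (1, 1)], outputs))
             for outputs in product((0, 1), repeat=4)]
print(len(functions))                        # 16
print(sum(is_proper(f) for f in functions))  # 10
```

The six improper functions are the two constants, the two functions depending only on the first argument, and the two depending only on the second.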

If $\mathcal{A}$ and $\mathcal{B}$ have $m$ and $n$ states respectively, then $\mathcal{A} \times \mathcal{B}$ has $mn$ states. Hence every proper binary boolean operation on $K$ and $L$ has state complexity bounded by $mn$. It is well-known that this bound is tight for general regular languages, and it remains tight for regular group languages. In fact, the original witnesses for union given by Maslov [15] and Yu et al. [21] are group languages, and we will demonstrate later (in Example 13) that these languages are also witnesses for all other proper binary boolean operations. We will say the pair $(K, L)$ has maximal boolean complexity if $K \circ L$ has state complexity $mn$ for all proper binary boolean operations $\circ$.

Suppose that $\emptyset \subsetneq S \subsetneq Q_\mathcal{A}$ and $\emptyset \subsetneq T \subsetneq Q_\mathcal{B}$. We say a subset of $Q_\mathcal{A} \times Q_\mathcal{B}$ is $(S, T)$-compatible if it is equal to $S \circ T$ for some proper binary boolean operation $\circ$. Notice that $(K, L)$ has maximal boolean complexity if and only if $(\mathcal{A} \times \mathcal{B})_X$ is minimal for every $(F_\mathcal{A}, F_\mathcal{B})$-compatible subset $X$ of $Q_\mathcal{A} \times Q_\mathcal{B}$, where $F_\mathcal{A}$ and $F_\mathcal{B}$ are the final state sets of $\mathcal{A}$ and $\mathcal{B}$. We disallow $S = \emptyset$ and $S = Q_\mathcal{A}$, since then $\chi_S$ is constant and $S \circ T$ depends on the second coordinate alone, so $(\mathcal{A} \times \mathcal{B})_{S \circ T}$ can be minimal only in degenerate cases; these cases are uninteresting. Similarly, we exclude $T = \emptyset$ and $T = Q_\mathcal{B}$. We say the pair $(\mathcal{A}, \mathcal{B})$ (or the direct product $\mathcal{A} \times \mathcal{B}$) is uniformly boolean minimal if for every pair of sets $(S, T)$ with $\emptyset \subsetneq S \subsetneq Q_\mathcal{A}$ and $\emptyset \subsetneq T \subsetneq Q_\mathcal{B}$, and every $(S, T)$-compatible set $X$, the DFA $(\mathcal{A} \times \mathcal{B})_X$ is minimal. In other words, the pair is uniformly boolean minimal if every pair of non-trivial cognates of $\mathcal{A}$ and $\mathcal{B}$ has maximal boolean complexity.

We give an example of a pair of DFAs that is not uniformly boolean minimal, as well as a pair that is.

Example 11

Define two DFAs over a common alphabet $\Sigma$ as follows:

  • has state set , initial state , final state set , and transformations