Homing Vector Automata
Abstract
We introduce homing vector automata, which are finite automata augmented by a vector that is multiplied at each step by a matrix determined by the current transition, and which must return the vector to its original setting in order to accept the input. The computational power and properties of deterministic, nondeterministic, blind, non-blind, realtime and one-way versions of these machines are examined and compared to various related types of automata. A generalized version of the Stern-Brocot encoding method, suitable for representing strings on arbitrary alphabets, is also developed.
Università di Roma “La Sapienza”, Dipartimento di Matematica, Piazzale Aldo Moro 2, 00185 Roma, Italy
MSC subject classifications: 68Q45, 68Q05
1 Introduction
The idea of augmenting the classical finite automaton model with an external storage unit that can hold unlimited amounts of information, yet can be accessed in a limited mode, is a celebrated topic of automata theory, with pushdown automata [2] and counter machines [5] as the most prominent examples.
Focusing on finite automata equipped with a register that holds a single object, one can list automata with multiplication [10], extended finite automata (EFA's) [15] (also known as "group automata"), and M-automata [11] among the many such proposed models. In these machines, the register can store a rational number, an element of a group, or an element of a monoid, respectively, and can be modified by multiplication. A computation is deemed successful if the register, which is initialized to the identity element, is equal to the identity element at the end.
Generalizing the idea of finite automata equipped with a register, we previously introduced vector automata in [18]. A vector automaton is a finite automaton which is endowed with a vector, and which can multiply this vector with an appropriate matrix at each step. One of the entries of this vector can be tested for equality to a rational number. The machine accepts an input string if the computation ends in an accept state and the test for equality succeeds.
Many important models of probabilistic and quantum computation [20, 13] can be viewed in terms of vectors being multiplied by matrices. Vector automata are useful for focusing on this matrix multiplication view of programming, abstracting the remaining features of such models away. In order to incorporate into this setup the aforementioned notion of the computation being successful if the register/counter returns to its initial value at the end, we propose the new homing vector automaton (HVA) model in this paper. A homing vector automaton can multiply its vector with an appropriate matrix at each step and can check the entire vector for equality to its initial value. The acceptance criterion is ending up in an accept state with the value of the vector equal to the initial vector.
We examine these machines under several different regimes, enabling us to determine the effect of various definitional parameters: whether the input is scanned in "real time" or the head may pause on an input symbol for several steps; whether the machine can read its register during computation or is "blind", with acceptance possible only if the register has returned to its initial value at the end; and whether nondeterminism confers any additional recognition power over deterministic programs. We demonstrate a close relationship between the nondeterministic one-way blind variant of the HVA model and the EFA's of [15], which we believe to be important for the following reasons.
The study of EFA's has until now essentially covered the cases of free (noncommutative) groups and free abelian groups, together with their algebraic extensions of finite index (virtually free groups), where some theorems of an algebraic nature characterize the power of such models and the properties of the languages recognized by these automata [10, 4, 3, 11]. There are no comparable general results for EFA's associated with groups other than the ones mentioned above. In this theoretical setting, a model that seems natural to investigate is the linear one, that is, the one defined by a group, or more generally, by a semigroup of matrices over the field of rational numbers.
Even in the cases of groups of matrices of low dimension (that are not of the types mentioned above), the study of HVA's and EFA's quickly becomes nontrivial, and there are remarkable classes of linear groups for which little is known about the EFA and HVA models that they define. The same consideration obviously holds for the more general case of machines defined by semigroups of matrices.
In this respect, the relationship between the two models exhibited here, and the fact that the new techniques in this paper (like the adaptation of the Stern-Brocot encoding method to HVA "programming" in Section 6) can be ported to proofs about EFA's, provide a new opening for EFA research.
The rest of this paper is structured as follows: Section 2 contains definitions of basic terminology and the machine models that will be compared to several restricted versions of our model. Section 3 defines the homing vector automaton in its most general (nondeterministic, one-way, non-blind) form, and introduces the various limited versions that we will use to examine the nature of the contribution of different aspects of the definition to the power of the machine. In Section 4, we discuss the relationship between the nondeterministic one-way blind version of the HVA model and the extended finite automata of [15], and use this link to prove that these machines can recognize any Turing recognizable language, even when the vector dimension is restricted to four. We then focus on HVA's with realtime access to their input in Section 5, providing an exact characterization of the class of languages recognized by these machines for the case where the alphabet is unary, and showing that the nondeterministic version is stronger than its deterministic counterpart, recognizing an NP-complete language. A method we use for encoding strings on an alphabet of arbitrary size in a blind homing vector automaton, based on Stern-Brocot trees [19, 1], may be of independent interest. Section 6 contains a hierarchy result based on the dimension of the vector when the matrix entries belong to a restricted set. Further results regarding the model's relation with counter automata and closure properties are presented in Sections 7 and 8. Section 9 lists some open questions.
2 Preliminaries
The following notation will be used throughout the paper: Q is the set of states, where q0 ∈ Q denotes the initial state, F ⊆ Q denotes the set of accepting states, and Σ is the input alphabet. Σ+ denotes the set of all nonempty words over Σ. An input string w is placed between two endmarker symbols on an infinite tape in the form ¢w$. By w^r, we represent the reverse of the string w. The i'th symbol of w is denoted by w_i. The length of w is denoted by |w|.
A machine can be realtime or one-way depending on the allowed tape head movements. If the tape head is allowed to stay put during some steps of its left-to-right traversal, then the machine is one-way, and can make ε (empty string) transitions without consuming any input symbol. A machine is realtime if the tape head can only move to the right at each step.
A machine M is said to recognize a language L if M accepts all and only the members of L. For a machine model A, L(A) denotes the class of languages recognized by machines of type A.
Let G be a group under the operation denoted by ·, with the neutral element denoted by 1. An extended finite automaton [4] over the group G (EFA(G)) is a 6-tuple
E = (Q, Σ, G, δ, q0, F),
where the transition function δ is defined as
δ : Q × (Σ ∪ {ε}) → P(Q × G).
An extended finite automaton can be viewed as a nondeterministic finite automaton equipped with a register in which any element of G can be written. (q', g) ∈ δ(q, σ) means that when E reads the symbol σ (or the empty string) in state q, it will move to state q', and write m · g in the register, where m is the old content of the register. The initial value of the register is the neutral element 1 of the group G. The string is accepted if, after it is completely read, E enters an accept state with the content of the register equal to the neutral element of G.
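As an illustration of this acceptance criterion, the following sketch (our own toy example, not taken from the paper) simulates an EFA over the additive group of integers, a simple commutative instance, recognizing {a^n b^n : n ≥ 0}:

```python
# Illustrative EFA over the additive group of integers (Z, +); the states,
# transitions and example language {a^n b^n} are our own choices.
# The register starts at the neutral element 0 and must return to 0.

def run_efa_z(word):
    state, register = "A", 0       # state "A": reading a's, "B": reading b's
    for sym in word:
        if state == "A" and sym == "a":
            register += 1          # apply the group element +1
        elif state == "A" and sym == "b":
            state, register = "B", register - 1
        elif state == "B" and sym == "b":
            register -= 1          # apply the group element -1
        else:
            return False           # no transition available: reject
    # Both states are accepting; accept iff the register is back at 0.
    return register == 0
```

Here every state is accepting, so membership hinges entirely on the register returning to the neutral element; this is the mechanism that the homing vector automaton generalizes from scalars to vectors.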
A realtime deterministic k-counter automaton (rtDkCA) [6] is a 5-tuple
M = (Q, Σ, δ, q0, F).
The transition function δ of M is specified so that δ(q, σ, θ) = (q', c) means that M moves the head to the next symbol, switches to state q', and updates its counters according to the list of increments represented by c ∈ {−1, 0, 1}^k, if it reads symbol σ when in state q, with θ ∈ {=0, ≠0}^k describing whether the respective counter values equal zero or not. At the beginning of the computation, the tape head is placed on the symbol ¢, and the counters are set to 0. At the end of the computation, that is, after the right endmarker has been scanned, the input is accepted if M is in an accept state.
A realtime deterministic blind k-counter automaton (rtDkBCA) [9] is a rtDkCA which can check the value of its counters only at the end of the computation. Formally, the transition function is now replaced by δ(q, σ) = (q', c). The input is accepted at the end of the computation if M enters an accept state, and all counter values are equal to 0.
3 Homing vector automata
A one-way nondeterministic homing vector automaton (1NHVA(k)) is a 6-tuple
V = (Q, Σ, δ, q0, F, v),
where v is a k-dimensional initial row vector, and the transition function δ is defined as
δ : Q × (Σ ∪ {ε}) × Ω → P(Q × S),
such that Ω = {=, ≠}, where = indicates equality to the initial vector v, and ≠ otherwise; P denotes the power set, and S is the set of k × k rational-valued matrices. The initial vector is freely chosen by the designer of the automaton.
Specifically, (q', M) ∈ δ(q, σ, ω) means that when V consumes σ in state q, with its current vector corresponding to ω (ω having the value = if and only if the current vector equals the initial vector), it switches to state q', multiplying its current vector with the matrix M on the right. Thus the vector u_t at step t is obtained by multiplying the vector u_{t−1} at step t − 1 by a specified matrix M so that u_t = u_{t−1} M. The string is accepted if V enters an accept state, with the vector being equal to the initial vector, as a result of arriving upon the right endmarker symbol $.
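The semantics just defined can be sketched in a few lines. The following minimal simulator is illustrative (the helper names and the example language are ours, not the paper's); it implements the nondeterministic, non-blind variant restricted to realtime head motion and without endmarker transitions, and runs it on {a^n b^n : n ≥ 0} with a 2-dimensional vector:

```python
# Minimal sketch of the HVA acceptance semantics (nondeterministic, nonblind,
# realtime, endmarkers omitted). Names and the example are illustrative only.

def mat_vec(v, m):
    # Row vector times matrix: (vM)_j = sum_i v_i * m[i][j].
    return tuple(sum(v[i] * m[i][j] for i in range(len(v)))
                 for j in range(len(m[0])))

def run_hva(delta, start, accept, v0, word):
    """delta maps (state, symbol, eq) -> iterable of (next_state, matrix),
    where eq is True iff the current vector equals the initial vector v0."""
    v0 = tuple(v0)
    configs = {(start, v0)}
    for sym in word:
        configs = {(q, mat_vec(v, m))
                   for state, v in configs
                   for q, m in delta.get((state, sym, v == v0), ())}
    # Accept iff some branch ends in an accept state with the vector back at v0.
    return any(q in accept and v == v0 for q, v in configs)

# Example: {a^n b^n} with a 2-dimensional vector acting as a counter.
INC = ((1, 0), (1, 1))    # (x, 1) -> (x + 1, 1)
DEC = ((1, 0), (-1, 1))   # (x, 1) -> (x - 1, 1)
delta = {}
for eq in (True, False):  # this particular example never consults the flag
    delta[("A", "a", eq)] = {("A", INC)}
    delta[("A", "b", eq)] = {("B", DEC)}
    delta[("B", "b", eq)] = {("B", DEC)}
```

Matrices are stored as tuples of tuples so that transition sets remain hashable.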
A one-way nondeterministic blind homing vector automaton (1NBHVA(k)) is a 1NHVA(k) which is not allowed to check the vector until the end of the computation. The transition function δ is defined as
δ : Q × (Σ ∪ {ε}) → P(Q × S),
where (q', M) ∈ δ(q, σ) means that when V consumes σ in state q, it switches to state q', multiplying its current vector with the matrix M on the right. The acceptance condition is the same as for 1NHVA(k)'s.
A realtime deterministic homing vector automaton (rtDHVA(k)) is a 1NHVA(k) which is not allowed to make any nondeterministic moves and operates in realtime. The transition function δ is defined as
δ : Q × Σ × Ω → Q × S.
A realtime deterministic blind homing vector automaton (rtDBHVA(k)) is just a rtDHVA(k) which is not allowed to check the vector until the end of the computation. The transition function is now replaced by
δ : Q × Σ → Q × S.
4 Relationship with extended finite automata
In this section, we will exploit a relationship between 1NBHVA's and the extended finite automata of [15] over free groups to demonstrate the power of homing vector automata.
The two models seem to be linked in the case of extended finite automata over matrix groups, as the register is multiplied with a matrix at each step of the computation. Let us emphasize that the two models are different in the following sense. In a homing vector automaton, there is an initial vector v, and the accepted strings are those which label a computation path along which the product of the sequence of matrices on the transitions is a matrix P such that vP = v. In the most general setting, the set of transition matrices belongs to the semigroup of rational matrices. In other words, in an accepting computation, the product of the multiplied matrices belongs to the stabilizer of v in the semigroup of rational matrices. In contrast, in an extended finite automaton over a matrix group, the accepting computations are those in which the product of the transition matrices equals the identity matrix. In that sense, one-way nondeterministic blind homing vector automata can be seen as akin to what someone who wanted to define a version of EFA's associated with general matrix semigroups, rather than groups, would come up with. Some open questions regarding the link between the two models are listed in Section 9.
We assume familiarity of the reader with some basic notions of free group theory (see [12, 14] for classical references on this topic). Let us denote by F_k the free noncommutative group over k generators. Let us first recall some known results on such groups. A well-known theorem by Nielsen and Schreier states that every subgroup of a free group is free (see [14], Proposition 2.11). In particular, for every k there is a set of k elements of F_2 such that the subgroup of F_2 generated by it is isomorphic to F_k.
We focus our attention on F_2. It is well known that F_2 admits a representation by matrices of GL(2, Z), the group of all invertible matrices of dimension 2 over the ring of integers. In the sequel, I stands for the 2 × 2 identity matrix. Let t be a positive integer and consider the group G_t of matrices generated by
T_t = [[1, t], [0, 1]] and U_t = [[1, 0], [t, 1]].
The following result holds (see [12], Theorem 14.2.1). {fact} For t ≥ 2, the group G_t is isomorphic to F_2. Moreover, if t ≥ 2, the bottom row of every matrix of G_t which is not a power of T_t is different from (0 1). As a straightforward consequence, there exists a subgroup H of G_t which is isomorphic to F_2 and such that:
(1) for every M ∈ H, (0 1) M = (0 1) if and only if M = I.
Indeed, let H be the subgroup of G_t generated by U_t and T_t U_t T_t^{−1}. By the theorem of Nielsen and Schreier mentioned above, H is freely generated by the latter two elements. In particular, no element of H other than I equals a power of T_t. This implies that (1) holds for H. Denote by
(2) φ : F_2 → H
the isomorphism from F_2 onto H.
Now we show that every extended finite automaton over a free group can be simulated by a suitably defined homing vector automaton that is of dimension 2, nondeterministic, one-way, and blind. Precisely, we prove the following result. {thrm} {proof} Let E be an extended finite automaton over F_2. Starting from E, we construct a 1NBHVA(2) V as follows. Let C be the finite set of elements of F_2 appearing in the transitions of E.
Fix an enumeration C = {g_1, ..., g_l}, and let M_i be the image of g_i under the isomorphism (2), for every i = 1, ..., l. The transition function δ_V of
V = (Q, Σ, δ_V, q0, F, v)
is defined as: for every q, q' ∈ Q and σ ∈ Σ ∪ {ε},
(q', M_i) ∈ δ_V(q, σ) whenever (q', g_i) ∈ δ_E(q, σ),
where δ_E is the transition function of E. Finally, we set v = (0 1).
Let L(E) and L(V) be the languages accepted by E and V, respectively. Let us show that the two languages are equal. If L(E) = ∅, the claim is trivial. Suppose then L(E) ≠ ∅. If w ∈ L(E), then there exists a computation of E on w
from q0 to a final state such that the element g of F_2 accumulated in the register satisfies g = 1. By the definition of V, there exists a computation of V on w
that follows the same sequence of states and applies, at every step, the matrix image under (2) of the group element applied by E. Set M = φ(g). Since g = 1 we get M = I, so vM = v and w ∈ L(V).
Suppose now that w ∈ L(V). Then there exists a computation of V on w
from q0 to a final state, where the product P of the matrices applied along the computation satisfies vP = v, with P ∈ H. Since w is accepted by V, then (0 1) P = (0 1). By (1), one then has P = I. On the other hand, consider the computation of E on w
in which, at every step, E applies the preimage under (2) of the matrix applied by V. Then the element g accumulated in the register satisfies φ(g) = P = I. Hence g = 1, and thus w ∈ L(E).
This allows us to draw the following conclusion about the class of languages recognized by 1NBHVA(2)'s.
The family of context-free languages is included in L(1NBHVA(2)). {proof} Dassow and Mitrana [4] provided (see [3] and [11] for alternative proofs that fix some details in the original proof) a characterization of context-free languages in terms of automata over a free group; namely, they stated that L(EFA(F_2)) is the family of context-free languages. The result then follows by Theorem 4.
Let F_2 × F_2 be the group given by the direct product of F_2 by F_2. The following theorem characterizes the family of recursively enumerable languages.
[16] L(EFA(F_2 × F_2)) is the family of recursively enumerable languages.
We can now demonstrate the huge power of 1NBHVA(4)'s.
The family of recursively enumerable languages is included in L(1NBHVA(4)). {proof} We will show how to simulate an EFA(F_2 × F_2) by a 1NBHVA(4). The result then follows from Theorem 4.
Let K be the group of matrices of dimension 4 consisting of the block-diagonal matrices
diag(A, B), with A, B ∈ H.
Since, by (2), φ is an isomorphism from F_2 onto the group of matrices H, the mapping ψ defined as
ψ(g_1, g_2) = diag(φ(g_1), φ(g_2))
is an isomorphism from F_2 × F_2 onto K.
Let E be an extended finite automaton over F_2 × F_2. Starting from E, we construct a 1NBHVA(4) V as follows. Let C be the finite set of elements of F_2 × F_2 appearing in the transitions of E.
Fix an enumeration C = {g_1, ..., g_l}, and let M_i be the image of g_i under the morphism ψ, for every i = 1, ..., l. The transition function δ_V of
V = (Q, Σ, δ_V, q0, F, v)
is defined as: for every q, q' ∈ Q and σ ∈ Σ ∪ {ε},
(q', M_i) ∈ δ_V(q, σ) whenever (q', g_i) ∈ δ_E(q, σ),
where δ_E is the transition function of E. Finally, we set v = (0 1 0 1).
Let L(E) and L(V) be the languages accepted by E and V, respectively. By using the very same argument as in the proof of Theorem 4, one verifies that L(E) = L(V).
5 Realtime homing vector automata
In the previous section, we have seen that allowing one-way access to the input tape raises nondeterministic blind homing vector automata of small vector dimension to Turing equivalence. For this reason, we will be focusing on realtime input in the rest of the paper.
Another way in which one can examine the nature of the computational power of homing vector automata is by examining models in which the matrices used at each step for transforming the vectors are restricted in some way. Although the definition given in Section 3 allows arbitrary rational matrices, we are going to constrain the matrix entries to belong to a particular set. In most automaton algorithms in this paper, the entries of the matrices belong to the set {−1, 0, 1}, as this basic set will be seen to already capture many capabilities of homing vector automata. Let us note that multiplications with matrices whose entries belong to this set can be used to perform additions, subtractions, resets, and swaps between the vector entries. It is possible to recognize some of the languages in the following discussion with homing vector automata of lower dimension when a larger set of matrix entries is allowed. Some related open questions can be found in Section 9.
We start by comparing the deterministic blind and nonblind versions of our model.
{proof} It is obvious that any rtDBHVA(k) can be simulated by a rtDHVA(k). We are going to prove that the inclusion is proper with the witness language L = {a^m b^n a^k : m = n or m = n + k}. Let us first construct a rtDHVA(2) V recognizing L. The idea is to simulate a counter with the help of the matrices. Starting with the initial vector (0 1), V multiplies the vector with the matrix [[1, 0], [1, 1]] for each a it reads before the b's, incrementing the first entry of the vector with each such multiplication. After finishing reading the first segment of a's, V multiplies the vector with the matrix [[1, 0], [−1, 1]], decrementing the first entry of the vector for each b.
At each step, V checks the current value of the vector for equality to (0 1). If the equality is detected right after finishing reading the b's, it is the case that m = n, and V multiplies the vector with the identity matrix at each step for the rest of the computation. If that is not the case, V continues to multiply the vector with the matrix [[1, 0], [−1, 1]] for each a after the b's. The value of the vector will be equal to (0 1) at the end of the computation if and only if m = n or m = n + k.
Note that L can also be recognized by a rtDHVA(1) by using the 1 × 1 matrices (2) and (1/2).
Now we are going to show that L cannot be recognized by any rtDBHVA(k). Suppose for a contradiction that L is recognized by some rtDBHVA(k) V'. After reading a prefix of a's, the computation of V' on a sufficiently long suffix of b's will go through a sequence of states, followed by a state loop. Suppose that V' is in the same state after reading two different strings a^n b^{k_1} and a^n b^{k_2}, with k_1 < k_2 and n chosen inside the loop so that k_2 = n. Now consider the strings a^n b^{k_1} a^{n−k_1} and a^n b^{k_2} a^{n−k_2}. After reading any one of these strings, V' should be in the same accept state, and the vector should be at its initial value. Assume that the strings in question are both extended with one more a. Since the same vector is being multiplied with the same matrix associated with the same state during the processing of that last a, it is not possible for V' to give different responses to the two extended strings. Noting that a^n b^n a ∈ L, whereas a^n b^{k_1} a^{n−k_1+1} ∉ L, we conclude that L cannot be recognized by any rtDBHVA(k).
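The freeze-on-equality strategy described above can be made concrete. The sketch below is our own illustration: it assumes the witness language {a^m b^n a^k : m = n or m = n + k} and shows how a transition function keyed on the equality flag implements the non-blind check:

```python
# Illustrative nonblind rtDHVA(2); the witness language is assumed to be
# L = {a^m b^n a^k : m = n or m = n + k} for this sketch.

def mat_vec(v, m):
    return tuple(sum(v[i] * m[i][j] for i in range(len(v)))
                 for j in range(len(m[0])))

I2  = ((1, 0), (0, 1))    # identity: freeze the vector once m = n is detected
INC = ((1, 0), (1, 1))    # (x, 1) -> (x + 1, 1)
DEC = ((1, 0), (-1, 1))   # (x, 1) -> (x - 1, 1)

# delta[(state, symbol, vector_equals_initial)] = (next_state, matrix)
delta = {}
for eq in (True, False):
    delta[("A", "a", eq)] = ("A", INC)   # first a-block: count up
    delta[("A", "b", eq)] = ("B", DEC)   # b-block: count down
    delta[("B", "b", eq)] = ("B", DEC)
delta[("B", "a", True)]  = ("EQ", I2)    # equality right after the b's: m = n
delta[("B", "a", False)] = ("C", DEC)    # otherwise keep decrementing
delta[("EQ", "a", True)] = ("EQ", I2)
delta[("C", "a", True)]  = ("C", DEC)    # the flag may flip mid-block; only
delta[("C", "a", False)] = ("C", DEC)    # the value at the very end matters

def accepts(word, v0=(0, 1)):
    state, v = "A", v0
    for sym in word:
        key = (state, sym, v == v0)
        if key not in delta:
            return False
        state, m = delta[key]
        v = mat_vec(v, m)
    return v == v0                       # every state is accepting here
```

The state EQ is reachable only via the equality flag, which is exactly the information a blind machine lacks.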
We can give the following characterization when the alphabet is unary.
For any k, all languages over a unary alphabet accepted by a rtDHVA(k) are regular.
Let L be a unary language accepted by a rtDHVA(k) V, and let v be the initial vector of V. We are going to construct a DFA recognizing L to prove that L is regular. We assume that L is infinite and make the following observation. Since V has finitely many states, at least one of the accept states of V will be accepting more than one string. Let a^{n_1} and a^{n_2} be the two shortest strings accepted by such an accept state q, with n_1 < n_2. When accepting a^{n_1} and a^{n_2}, V is in state q and the value of the vector is equal to v. After reading a^{n_2}, V is in the same configuration as it was after reading a^{n_1}, and this configuration will be repeated inside a loop of n_2 − n_1 steps. Therefore, we can conclude that all strings of the form a^{n_1 + i(n_2 − n_1)} for a positive integer i will be accepted by V.
Between consecutive times V accepts a string, some other strings may be accepted by some other accept states. Let a^{n_3} be a string accepted by V with n_1 < n_3 < n_2. Then all strings of the form a^{n_3 + i(n_2 − n_1)} for a positive integer i will be accepted by V, since every time V enters the accepting configuration at state q, it will enter the accepting configuration accepting a^{n_3} after n_3 − n_1 further steps. The same reasoning applies to any other accepting configuration inside the loop.
Now, let us construct a DFA D accepting L. D has n_2 states s_0, ..., s_{n_2 − 1}. The first n_1 states correspond to the strings of length less than n_1, and s_i is an accept state for each a^i ∈ L with i < n_1. The next n_2 − n_1 states stand for the configuration loop. States corresponding to accepting configurations inside the loop are labeled as accept states.
The transitions of the DFA are as follows: δ(s_i, a) = s_{i+1} for i < n_2 − 1, and δ(s_{n_2 − 1}, a) = s_{n_1}.
Since L can be recognized by a DFA, L is regular. We conclude that any unary language accepted by a rtDHVA(k) is regular.
In the following theorem, we show that nondeterministic realtime homing vector automata are more powerful than their deterministic versions, both in the blind and non-blind cases. {thrm}
i. L(rtDBHVA(k)) ⊊ L(rtNBHVA(k)).
ii. L(rtDHVA(k)) ⊊ L(rtNHVA(k)).
i. It is obvious that a rtDBHVA(k) can be simulated by a rtNBHVA(k). We are going to show that the inclusion is proper by constructing a rtNBHVA(3) N recognizing a unary nonregular language. Starting with the initial vector (1 1 1), N multiplies the vector, when reading each a, with a matrix which replaces both the first and the second entry by their sum. The idea is to add the first and second entries together repeatedly to obtain powers of 2, so that after reading i symbols the first entry of the vector is equal to 2^i. N nondeterministically guesses a point in the input and from that point on multiplies the vector, for each a, with a matrix which fixes the second entry to 1 immediately and decrements the first entry by 1. At the end of the computation, the value of the vector is equal to (1 1 1) if and only if the length of the input string is of the form 2^i + i − 1 for some i ≥ 1, and the set of such lengths is not ultimately periodic.
From Theorem 5, we know that every unary language recognized by a rtDHVA(k) is regular, and we conclude that L(rtDBHVA(k)) ⊊ L(rtNBHVA(k)).
ii. It is obvious that a rtDHVA(k) can be simulated by a rtNHVA(k). The inclusion is proper, as we have shown that the language above can be recognized by a rtNBHVA(3), a feat that is impossible for rtDHVA(k)'s for any k.
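Nondeterministic branching in a blind machine can be sketched as follows. This toy example (ours, deliberately simpler than the construction above) recognizes {a^i b^j : i = j or i = 2j} with a 2-dimensional vector by guessing, at the first b, which relation to verify:

```python
# Illustrative nondeterministic blind HVA for {a^i b^j : i = j or i = 2j}.
# Our own toy example, not the construction from the proof above.

def mat_vec(v, m):
    return tuple(sum(v[i] * m[i][j] for i in range(len(v)))
                 for j in range(len(m[0])))

INC  = ((1, 0), (1, 1))    # (x, 1) -> (x + 1, 1)
DEC1 = ((1, 0), (-1, 1))   # (x, 1) -> (x - 1, 1)
DEC2 = ((1, 0), (-2, 1))   # (x, 1) -> (x - 2, 1)

# Blind: delta is keyed on (state, symbol) only, with no equality flag.
delta = {
    ("A", "a"):  {("A", INC)},
    ("A", "b"):  {("B1", DEC1), ("B2", DEC2)},   # the nondeterministic guess
    ("B1", "b"): {("B1", DEC1)},
    ("B2", "b"): {("B2", DEC2)},
}

def accepts(word, v0=(0, 1)):
    configs = {("A", v0)}
    for sym in word:
        configs = {(q, mat_vec(v, m))
                   for state, v in configs
                   for q, m in delta.get((state, sym), ())}
    # Blind acceptance: some branch survives with the vector back at v0.
    return any(v == v0 for _, v in configs)
```

A wrong guess simply strands that branch away from the initial vector; acceptance needs only one surviving branch.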
In the following theorem, we show that by allowing nondeterminism it is possible to recognize an NP-complete language in realtime with matrices which are restricted to have integer entries. SUBSETSUM is the NP-complete language which is the collection of all strings of the form t#a_1#...#a_n#, such that t and the a_i's are numbers in binary notation (1 ≤ i ≤ n), and there exists a set I ⊆ {1, ..., n} satisfying Σ_{i∈I} a_i = t. We define SUBSETSUM^r, in which the binary numbers appear in reverse order. It is obvious that SUBSETSUM^r ∈ NP, since SUBSETSUM ∈ NP. It is possible to reduce SUBSETSUM to SUBSETSUM^r in polynomial time by reversing the binary numbers that appear in the input. Therefore, we can conclude that SUBSETSUM^r is NP-complete.
SUBSETSUM^r ∈ L(rtNBHVA(5)). {proof} We construct a rtNBHVA(5) N recognizing SUBSETSUM^r. The idea of this construction is to read the binary numbers in the string into entries of the vector, and to nondeterministically select the set of numbers that add up to t. We first encode t into the first entry of the vector as follows: while scanning the symbols of t, N multiplies the vector with one of two matrices, one for each scanned 0 and another for each scanned 1, the latter additionally adding the current power of 2 to the first entry. The powers of 2 required for the encoding are obtained by adding the third and fourth entries, which always contain identical numbers, to each other, creating the effect of multiplication by 2. When N reads a #, it multiplies the vector with a matrix which subtracts the second entry from the first entry, and resets the second entry back to 0 and the third and fourth entries back to 1.
In the rest of the computation, N nondeterministically decides which a_i's to subtract from the first entry. Each selected a_i is encoded using the same technique into the second entry of the vector, N multiplying the vector with the appropriate matrix for each scanned 0 or 1.
N chooses another a_i if it wishes, and the same procedure is applied. At the end of the input, N accepts if the vector is equal to the initial vector, which requires that the first entry of the vector is equal to 0. This is possible iff there exists a set of a_i's whose sum adds up to t.
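The binary-encoding gadget used in this construction can be distilled into a short sketch. For brevity, the version below (our own simplification) uses a 2-dimensional vector (value, power) and a matrix entry equal to 2 to double the tracked power, whereas the construction above stays within entries {−1, 0, 1} by keeping two always-equal entries and adding them; the reversed, least-significant-bit-first reading order is the same:

```python
# Sketch of the reversed-binary encoding gadget (simplified; a matrix entry
# of 2 replaces the paper's doubling-by-addition trick).

def mat_vec(v, m):
    return tuple(sum(v[i] * m[i][j] for i in range(len(v)))
                 for j in range(len(m[0])))

M0 = ((1, 0), (0, 2))   # digit 0: (value, power) -> (value,         2*power)
M1 = ((1, 0), (1, 2))   # digit 1: (value, power) -> (value + power, 2*power)

def encode_reversed_binary(bits, v=(0, 1)):
    # bits is a binary string given least significant bit first
    for d in bits:
        v = mat_vec(v, M1 if d == "1" else M0)
    return v
```

Reading "101" least significant bit first accumulates 1 + 4 = 5 in the first entry, matching the number whose reversed notation is "101".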
A language L is said to be in the class TISP(t(n), s(n)) if there is a deterministic Turing machine that decides L within t(n) time and s(n) space, where n is the length of the input. Since the numbers in the vector can grow by at most a fixed number of bits in each multiplication, a Turing machine simulating a rtDHVA(k) requires only linear space [18]. Since the numbers in the vector can have length O(n), whereas the matrix dimensions and entries are independent of the input length n, multiplication of the vector and a matrix requires O(n) time for each input symbol. We can conclude that L(rtDHVA(k)) ⊆ TISP(n^2, n).
6 Encoding strings with homing vector automata
6.1 SternBrocot encoding
The Stern-Brocot tree is an infinite complete binary tree whose nodes correspond one-to-one to the positive rational numbers [19, 1]. Crucially for our purposes, the Stern-Brocot tree provides a basis for representing strings as vectors of integers, as suggested for binary alphabets in [8]. The fractions in the Stern-Brocot tree can be stored as vectors of dimension 2, where the vector entries are the denominator and the numerator of the fraction. This representation allows us to perform the binary encoding easily in homing vector automata, as follows.
The empty string is represented by (1 1). Now suppose that we want to encode a binary string w of length n. For i = 1 to n, if w_i = 0, we add the value of the second entry to the first one, and if w_i = 1, we add the value of the first entry to the second one, multiplying the vector with the appropriate one of the following matrices:
M_0 = [[1, 0], [1, 1]] and M_1 = [[1, 1], [0, 1]].
For example, the strings 0, 1, 01 and 11 are encoded as (2 1), (1 2), (2 3) and (1 3), respectively. A proof of the uniqueness of the encoding can be found in [8].
Given the vector representation (x y) of a string w of length n, it is also possible to decode w with the following procedure: Set w_n = 0 if x > y, and w_n = 1 otherwise. Subtract the smaller entry from the larger one to obtain the encoding of the first n − 1 symbols, and repeat this routine until you obtain the vector (1 1). When the given vector is not a valid representation of a string, it is not possible to obtain (1 1). The matrices required for this procedure are M_0^{−1} = [[1, 0], [−1, 1]], which has the effect of subtracting the value of the second entry of the vector it is multiplied with from the first entry, and M_1^{−1} = [[1, −1], [0, 1]], for the symmetric action. Note that M_0 M_0^{−1} = I and M_1 M_1^{−1} = I.
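The encoding and decoding procedures above can be transcribed directly (an illustrative sketch; variable names are ours):

```python
# Transcription of the binary Stern-Brocot encoding and decoding.

M0 = ((1, 0), (1, 1))   # symbol 0: add the second entry to the first
M1 = ((1, 1), (0, 1))   # symbol 1: add the first entry to the second

def encode(w, v=(1, 1)):
    for c in w:
        x, y = v
        v = (x + y, y) if c == "0" else (x, x + y)   # v * M0 or v * M1
    return v

def decode(v):
    x, y = v
    out = []
    while (x, y) != (1, 1):
        if x > y:
            out.append("0"); x -= y   # undo M0 (multiply by its inverse)
        else:
            out.append("1"); y -= x   # undo M1
    return "".join(reversed(out))     # symbols are recovered right to left
```

Since every intermediate vector has coprime entries, the only tie (x = y) occurs at (1, 1), so the comparison in decode is always decisive.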
6.2 Generalized SternBrocot encoding
We generalize the scheme mentioned above to strings on alphabets of arbitrary size and present a new method for encoding strings. Let Σ = {σ_1, ..., σ_k} be an alphabet, and let w ∈ Σ* be a string of length n. With the generalized Stern-Brocot encoding method described below, it is possible to uniquely encode w using a vector of size k and matrices whose entries belong to the set {0, 1}. Let us note that one can use other methods to encode strings on alphabets of arbitrary size using a vector of a smaller dimension but matrices whose entries belong to a larger set.
We start with the k-dimensional vector (1 1 ... 1), which represents the empty string. Suppose that w = w_1 w_2 ... w_n. To encode w, for i = 1 to n, if w_i = σ_j, the vector is multiplied with the matrix A_j, the k-dimensional identity matrix whose j'th column is replaced with a column of 1's. Multiplication with A_j causes the j'th entry of the vector to be replaced by the sum of all the entries in the vector.
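The generalized encoding can be sketched as follows (our transcription, with symbols represented as the integers 0, ..., k−1):

```python
# Sketch of the generalized Stern-Brocot encoding for an alphabet of size k.

def gsb_encode(w, k):
    v = [1] * k     # the all-ones vector represents the empty string
    for sym in w:
        # Multiplying by A_sym replaces entry sym by the sum of all entries.
        v[sym] = sum(v)
    return tuple(v)
```

For k = 2 this coincides with the binary scheme above; for example, gsb_encode([0], 2) gives (2, 1).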
Among the different generalizations of the Stern-Brocot fractions, one that appears in [7] under the name of "Stern's triatomic sequence" is similar to the encoding we propose for the case k = 3. The similarity lies in the construction of the sequence, but that sequence is not used for the purpose of encoding. As far as we know, no such generalization exists for the case k > 3.
In the following lemma, we prove the uniqueness of this generalized encoding.
No two distinct strings on Σ (|Σ| = k) can be represented by the same vector of size k using the generalized Stern-Brocot encoding. {proof} We will prove by induction on n that if a k-dimensional vector v_w is the generalized Stern-Brocot encoding of a string w of length n, then v_w is not the encoding of any other string of length at most n.
The empty string is represented by the k-dimensional vector of 1's. The claim clearly holds for n = 0, since no other strings of at most this length exist. Now assume that the claim holds for all natural numbers up to n − 1. Let w be a string of length n. The vector v_w representing w is obtained by multiplying the vector v_{w'}, representing the prefix w' consisting of the first n − 1 symbols of w, with A_j if the last symbol of w is σ_j. We will examine the various possibilities regarding this final multiplication. Note that at a single step, it is possible to modify only a single entry of each vector. Now consider any string u ≠ w with 1 ≤ |u| ≤ n, and let v_{u'} be the vector representing the string u' obtained from u by deleting its last symbol. If w and u have the same first n − 1 symbols, then v_{w'} = v_{u'}, the last symbols of the two strings are unequal, and it is not possible to obtain v_w = v_u, since the same vector is multiplied by different matrices. In the remaining case, we know by the induction hypothesis that v_{w'} ≠ v_{u'}. If these vectors disagree in more than two entries, there is no way that one can obtain the same vector by multiplying them once with some matrices of the form A_j. So we consider the cases where the two vectors disagree in at most two entries.
Suppose that v_{w'} and v_{u'} differ only in the j'th entry. If the final multiplications both work on the j'th entries, they will be adding the same number to them, resulting again in vectors differing in their j'th entries. If one or both of the final multiplications deals with another entry, then the final vectors will surely disagree in that entry. It is not possible in any case to end up with equal vectors.
Now suppose that v_{w'} and v_{u'} differ in two entries. If the final multiplications work on the same entry, then the final vectors will disagree in at least one entry. In the only remaining case, each one of the vectors is multiplied by a matrix updating a different one of the disagreeing entries. Let us represent the disagreeing entries of the vectors v_{w'} and v_{u'} by the pairs (a, b) and (c, d), respectively. Let s be the sum of the remaining entries, in which the vectors agree. Without loss of generality, say that the pairs become (a + b + s, b) and (c, c + d + s) after the final multiplication. But if the final vectors are equal, these pairs should also be equal, implying a + d + 2s = 0, an impossibility, since all entries are positive.
We therefore conclude that it is not possible to have v_u = v_w for any string u ≠ w of length at most n.
As in the binary case, given the vector representation of a string, it is possible to reconstruct the string. The all-ones vector corresponds to the empty string. Any other vector encoding a nonempty string in this encoding has a unique maximum entry, say at the j'th position. Then the last symbol of the string is σ_j, and we obtain the previous vector by subtracting the sum of the other entries from the greatest entry. One repeats this procedure, reconstructing the string from right to left, until one ends up with the all-ones vector. In terms of matrices, multiplications with the inverses of the A_j's capture this process.
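The decoding procedure can be transcribed as follows (illustrative; it assumes its input is a valid encoding, since on an invalid vector the loop may never reach the all-ones vector):

```python
# Transcription of the generalized Stern-Brocot decoding procedure.
# Assumes the input is a valid encoding over symbols 0, ..., k-1.

def gsb_decode(v):
    v = list(v)
    out = []
    while any(x != 1 for x in v):
        j = max(range(len(v)), key=lambda i: v[i])   # unique maximum entry...
        out.append(j)                                # ...marks the last symbol
        v[j] -= sum(v) - v[j]                        # undo the final A_j
    return list(reversed(out))                       # recovered right to left
```

Each iteration strictly decreases the sum of the entries, so on valid encodings the loop terminates at the all-ones vector.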
6.3 A hierarchy result
We will now use the generalized Stern-Brocot encoding to show a hierarchy result based on the dimension of the vector when an additional restriction is imposed on the matrices.
Let S be the set of matrices whose entries belong to the set {−m, ..., m} for some positive integer m, and let a rtDHVA(k) that is restricted to using members of S in its matrices and initial vector be denoted a rtDHVA_S(k). Then L(rtDHVA_S(j)) ⊊ L(rtDHVA_S(k)) when k is sufficiently large with respect to j.
Using the generalized SternBrocot encoding, first we will show that it is possible to recognize by a rtDHVA() .
The input alphabet is , and the corresponding matrices are described in Section 6.2. Starting with the dimensional vector of 1’s, encodes the string by multiplying its vector with the matrix whenever it reads an until it encounters a . After reading the , starts decoding by multiplying the vector with matrix whenever it reads an .
If the string is of the form $wbw^r$, the vector will be multiplied with the inverse matrices in the correct order, and the resulting value of the vector will be the all-ones vector, so that the input is accepted.
We also need to show that the input string is not accepted when it is not of the form $wbw^r$. Consider an input string $ubv$ with $u,v \in \{a_1,\dots,a_t\}^*$, and suppose that it is accepted by $V$. Let $v_u$ denote the vector after reading $ub$, and let $M$ denote the product of the matrices by which the vector is multiplied while reading $v$. Since the string is accepted, $Mv_u = \mathbf{1}$ must hold, where $\mathbf{1}$ denotes the all-ones vector. Since the matrices are invertible, $M$ is also invertible, which implies that $v_u = M^{-1}\mathbf{1}$ must be unique. Since $M^{-1}$ is exactly the product of the encoding matrices applied while reading $v^r$, $M^{-1}\mathbf{1}$ must be the vector obtained after reading $v^r$. From Lemma 6.2, we know that every string has a unique representation, and we conclude that $u$ and $v^r$ are identical.
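The recognizer just described admits a minimal simulation (Python; the marker is written as the character `b` and the letters $a_1,\dots,a_t$ as digit characters, a representation we chose for the sketch):

```python
def accepts(s, k):
    """Simulate the homing vector automaton for { w b w^r }: multiply by A_i
    before the marker and by A_i^{-1} after it; accept iff exactly one marker
    was seen and the vector has returned to its initial all-ones value."""
    v = [1] * k
    after_marker = False
    for c in s:
        if c == 'b':
            if after_marker:
                return False            # a second marker is rejected via the states
            after_marker = True
        else:
            i = int(c)                  # digit c stands for the letter a_{c+1}
            if not after_marker:
                v[i] = sum(v)           # encoding step: multiply by A_i
            else:
                v[i] -= sum(v) - v[i]   # decoding step: multiply by A_i^{-1}
    return after_marker and all(x == 1 for x in v)

assert accepts("01b10", 2)              # w b w^r is accepted
assert not accepts("01b01", 2)          # second half not reversed: rejected
```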
We are now going to show that $\{wbw^r \mid w \in \{a_1,\dots,a_t\}^*\}$ cannot be recognized by any rtDHVA$_{\mathfrak{M}}$($k$) when $t > (mk)^k$. We first note that the absolute value of any entry of a vector of size $k$ can be at most $m(km)^n$ after reading $n$ symbols. This bound is attained by letting the initial vector have $m$ in all entries, and multiplying the vector with the matrix with all entries equal to $m$ at each step. Similarly, the smallest possible value of an entry is $-m(km)^n$, and so the number of possible different values for a single entry is $2m(km)^n+1$. If the machine has $s$ states, then $s\left(2m(km)^n+1\right)^k$ is an upper bound on the number of different reachable configurations after reading $n$ symbols. Since there are $t^n$ strings of length $n$ when the alphabet consists of $t$ symbols, and $t^n$ exceeds this configuration bound for $t > (mk)^k$ and sufficiently large $n$, the machine will end up in the same configuration after reading two different strings $u$ and $u'$ of length $n$. This will cause the string $u'bu^r$, which is not in the language, to be accepted by the machine alongside $ubu^r$. Therefore, we conclude that the language cannot be recognized by any rtDHVA$_{\mathfrak{M}}$($k$).
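The pigeonhole step can be illustrated numerically. The sketch below (Python; the parameter values are arbitrary, and the bounds are as reconstructed here) compares the configuration bound $s(2m(km)^n+1)^k$ against the $t^n$ candidate strings, and searches for the first length at which the strings outnumber the configurations when $t > (mk)^k$.

```python
def max_configs(s, m, k, n):
    """Upper bound on configurations of a hypothetical s-state rtDHVA_M(k)
    with entries in {-m,...,m}, after reading n symbols."""
    return s * (2 * m * (k * m) ** n + 1) ** k

s, m, k = 10, 2, 2                 # a hypothetical machine
t = (m * k) ** k + 1               # alphabet size just above (mk)^k = 16
n = 1
while t ** n <= max_configs(s, m, k, n):
    n += 1                         # t^n grows faster, so this terminates
print(n)                           # first length at which pigeonhole applies
```

The loop terminates because $t^n$ grows like $17^n$ while the configuration bound grows like a constant times $16^n$; the crossover length is large but finite.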
Since a homing vector automaton with a larger vector size can trivially simulate one with a smaller vector size, the result follows.
7 Relationship with realtime counter automata
A realtime deterministic homing vector automaton with a vector of dimension two can simulate a realtime deterministic one-counter automaton (rtD1CA) which accepts with the condition that the counter is empty (see the proof of Theorem 5). The fact that the individual entries of the vector cannot be checked prevents us from simulating a realtime deterministic multicounter automaton.
In the following theorem, we show that a rtDBHVA(2) can recognize a language which is not recognizable by any multicounter machine, and we conclude that the language recognition powers of homing vector automata and multicounter machines are incomparable. Note that the result also implies the incomparability of the nonblind versions of the two models. This is not the case for the blind versions, as we prove in the second part of the theorem.

i. The classes of languages recognized by realtime deterministic homing vector automata and realtime deterministic multicounter automata are incomparable.
ii. $\mathfrak{L}(\text{rtD}k\text{BCA}) \subsetneq \mathfrak{L}(\text{rtDBHVA}(k+1))$ for every $k \geq 1$.

i. We know that the language $\{wbw^r \mid w \in \{a_1,a_2\}^*\}$ can be recognized by a rtDBHVA(2) by Theorem 6.3. In [17], it is proven that no counter machine with $k$ counters operating in time $o(2^{n/k})$ can recognize this language. Since we are working with realtime machines, the result follows.
On the other hand, it is known that there exists a nonregular unary language that can be recognized by a rtD2CA [18]. By Theorem 5, we know that rtDHVA($k$)'s, and therefore rtDBHVA($k$)'s, can recognize only regular languages in the unary case. Hence, we conclude that the two models are incomparable.
ii. Let us simulate a given rtD$k$BCA $V$ by a rtDBHVA($k+1$) $V'$. Let $v = (0,\dots,0,1)$ be the initial vector of $V'$. The $(k+1)$'st entry of the vector will remain unchanged throughout the computation, which will allow the counter updates. At each step of the computation, $V'$ will multiply the vector with the appropriate matrix $M \in \mathcal{M}$, where $\mathcal{M}$ is the set of all matrices corresponding to possible counter updates. Since each counter can be decremented, incremented, or left unchanged, $|\mathcal{M}| = 3^k$. All matrices in $\mathcal{M}$ are equal to the identity matrix, except possibly for the entries $M[k+1][j]$ with $j \leq k$. When the $j$'th counter is incremented or decremented, then $M[k+1][j] = 1$ or $M[k+1][j] = -1$, respectively. At the end of the computation, the input will be accepted if the vector is equal to $(0,\dots,0,1)$, which happens if and only if all the counters have value 0.
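The update matrices of this simulation can be sketched as follows (Python; all names are ours, and we use the row-vector convention $v \mapsto vM$, so the counters live in the first $k$ entries and the constant 1 in the last entry feeds the increments).

```python
def update_matrix(d, k):
    """(k+1)x(k+1) identity matrix whose last row carries the counter updates d,
    with each d_j in {-1, 0, +1}."""
    M = [[int(i == j) for j in range(k + 1)] for i in range(k + 1)]
    for j, dj in enumerate(d):
        M[k][j] = dj                     # entry M[k+1][j] = +1 / -1 / 0
    return M

def mul(v, M):
    """Row vector times matrix: v -> vM."""
    return [sum(v[i] * M[i][j] for i in range(len(v))) for j in range(len(v))]

k = 2
v = [0, 0, 1]                            # initial vector: both counters zero
for d in [(1, 0), (1, 1), (-1, 0), (0, -1), (-1, 0)]:
    v = mul(v, update_matrix(d, k))      # one blind counter update per input symbol
assert v == [0, 0, 1]                    # all counters back to zero: accept
```

Because the machine is blind, no entry is ever inspected during the run; acceptance depends only on the vector returning to its initial value, exactly as the acceptance condition of a blind multicounter automaton depends on all counters returning to zero.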
The inclusion is proper, with the language $\{wbw^r \mid w \in \{a_1,a_2\}^*\}$ of part i as the witness.
We have mentioned that deterministic blind homing vector automata can recognize a language which is not recognizable by any counter machine. One can also exhibit a language whose Parikh image is not semilinear, which proves that the language is not contextfree, and which is nevertheless recognizable by a rtDBHVA(3) by using the same idea as in the proof of Theorem 5.
8 Closure properties
In this section, we examine the closure properties of the class of languages recognized by realtime homing vector automata. We start with a lemma which will be useful in our proofs. The languages mentioned below are from [10].

.

.

.

.
We can show all these languages to be unrecognizable by rtDHVA's by applying the following common reasoning. Assume that the language $L$ in question is recognized by some rtDHVA($k$) $V$. Since there are finitely many states, one of the states of $V$ will end up accepting more than one member of the language. For each language, we will focus on two such members $w_1$ and $w_2$. Note that $V$ is in the same configuration (since it has also returned to its initial vector) after reading both $w_1$ and $w_2$. We then append another string $u$ to both strings, selected so that $w_1u \in L$ and $w_2u \notin L$. The responses of $V$ to $w_1u$ and $w_2u$ have to be identical, since it will have returned to the same configuration after processing both strings. We conclude that $V$ cannot distinguish between these two strings, and therefore that $L$ is not recognized by $V$. All that remains is to provide the strings $w_1$, $w_2$, and $u$ for the languages in the statement of the lemma.

, , and .

, and .
