# Real-Time Vector Automata

Özlem Salehi, Boğaziçi University, Department of Computer Engineering, Bebek 34342 Istanbul, Turkey
Abuzer Yakaryılmaz, University of Latvia, Faculty of Computing, Raina bulv. 19, Rīga, LV-1586, Latvia (partially supported by FP7 FET-Open project QCS)
A. C. Cem Say, Boğaziçi University, Department of Computer Engineering, Bebek 34342 Istanbul, Turkey, email: say@boun.edu.tr
###### Abstract

We study the computational power of real-time finite automata that have been augmented with a vector of dimension k, and programmed to multiply this vector at each step by an appropriately selected matrix. Only one entry of the vector can be tested for equality to 1 at any time. Classes of languages recognized by deterministic, nondeterministic, and “blind” versions of these machines are studied and compared with each other, and with the associated classes for multicounter automata, automata with multiplication, and generalized finite automata.

###### Keywords:
vector automata, counter automata, automata with multiplication, generalized automata

## 1 Introduction

There have been numerous generalizations of the standard deterministic finite automaton model. In this paper, we introduce the vector automaton, which is linked to many such generalizations like counter automata, automata with multiplication, and generalized stochastic automata [FMR68, Gre78, ISK76, Tur69]. A vector automaton is a finite automaton endowed with a k-dimensional vector, and the capability of multiplying this vector with an appropriately selected matrix at every computational step. Only one of the entries of the vector can be tested for equality to 1 at any step. Since equipping these machines with a “one-way” input head, which is allowed to pause on some symbols during its left-to-right traversal of the input, would easily make them Turing-equivalent, we focus on the case of real-time input, looking at the deterministic and nondeterministic versions of the model. We make a distinction between general vector automata and “blind” ones, where the equality test can be performed only at the end of the computation. We examine the effects of restricting the dimension k to 1, and the input alphabet to be unary. The related language classes are compared with each other, and with classes associated with other models in the literature. The deterministic blind version of the model turns out to be equivalent to Turakainen’s generalized stochastic automata in one language recognition mode, whereas real-time nondeterministic blind vector automata are shown to recognize some NP-complete languages.

## 2 Background

### 2.1 Notation

Throughout the paper, the following notation will be used: Q is the set of states, where q0 ∈ Q denotes the initial state, Qa ⊆ Q denotes the set of accept states, and Σ is the input alphabet. An input string w is placed between two endmarker symbols on an infinite tape in the form ¢w$. The set ⋄ = {←, ↓, →} represents the possible head directions. The tape head can stay in the same position (↓), move one square to the right (→), or move one square to the left (←) in one step.

For a machine model A, 𝔏(A) denotes the class of languages recognized by automata of type A.

Let A(i, c) denote the matrix obtained by setting the i’th entry of the first column of the identity matrix to c. For a vector v, the product vA(i, c) is the vector obtained by adding c times the i’th entry of v to the first entry when i ≠ 1, and the vector obtained by multiplying the first entry of v by c when i = 1.
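As an illustration, the operation above can be sketched in a few lines of Python; the names `A` and `vecmat` are introduced here only for this example:

```python
from fractions import Fraction

def A(k, i, c):
    """The k x k identity matrix with the i'th entry (1-indexed)
    of its first column set to c."""
    m = [[Fraction(int(r == s)) for s in range(k)] for r in range(k)]
    m[i - 1][0] = Fraction(c)
    return m

def vecmat(v, m):
    """Row vector v times matrix m."""
    return [sum(v[i] * m[i][j] for i in range(len(v))) for j in range(len(m[0]))]

v = [Fraction(5), Fraction(2), Fraction(3)]
added      = vecmat(v, A(3, 2, 4))  # i != 1: adds 4 times the second entry to the first
multiplied = vecmat(v, A(3, 1, 4))  # i == 1: multiplies the first entry by 4
```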

### 2.2 Machine definitions

#### 2.2.1 Multicounter Automata.

A real-time deterministic multicounter automaton with k counters (rtDkCA) [FMR68] is a 5-tuple

 M=(Q,Σ,δ,q0,Qa).

The transition function δ of M is specified so that δ(q, σ, θ) = (q′, c) means that M moves the head to the next symbol, switches to state q′, and updates its counters according to the list of increments represented by c ∈ {−1, 0, +1}^k, if it reads symbol σ ∈ Σ ∪ {¢, $}, when in state q, and with the counter values having signs as described by θ ∈ {=0, ≠0}^k. At the beginning of the computation, the tape head is placed on the symbol ¢, and the counters are set to 0. At the end of the computation, that is, after the right endmarker $ has been scanned, the input is accepted if M is in an accept state.

A real-time deterministic blind multicounter automaton (rtDkBCA) [Gre78] is a rtDkCA which can check the value of its counters only at the end of the computation. Formally, the transition function is now replaced by δ(q, σ) = (q′, c). The input is accepted at the end of the computation if M enters an accept state, and all counter values are equal to 0.

#### 2.2.2 Finite Automata With Multiplication.

A one-way deterministic finite automaton with multiplication (1DFAM) [ISK76] is a 6-tuple

 M=(Q,Σ,δ,q0,Qa,Γ),

where Γ is a finite set of rational numbers (multipliers). The transition function δ is defined as δ: Q × Σ × Ω → Q × ⋄ × Γ, where Ω = {=1, ≠1}. M has a register which can store any rational number, and which is initially set to 1. Reading input symbol σ in state q, M compares the current value of the register with 1, thereby calculating the corresponding value ω ∈ Ω, and switches its state to q′, “moves” its head in “direction” d ∈ ⋄, and multiplies the register by γ ∈ Γ, in accordance with the transition function value δ(q, σ, ω) = (q′, d, γ). The input string is accepted if M enters an accept state with the register value equaling 1 after it scans the right endmarker symbol.

A 1DFAM without equality (1DFAMW) is a 1DFAM which cannot check whether or not the register has value 1 during the computation. The transition function is replaced by δ: Q × Σ → Q × ⋄ × Γ. The acceptance condition of the 1DFAMW is the same as that of the 1DFAM.

#### 2.2.3 Generalized Finite Automata.

A generalized finite automaton (GFA) [Tur69] is a 5-tuple

 G = (Q, Σ, {Aσ ∣ σ ∈ Σ}, v0, f),

where the Aσ’s are |Q| × |Q| real-valued transition matrices, and v0 and f are the real-valued initial row vector and final column vector, respectively. The acceptance value for an input string w = w1⋯wn ∈ Σ* is defined as fG(w) = v0 Aw1 ⋯ Awn f.

A GFA whose components are restricted to be rational numbers is called a Turakainen finite automaton (TuFA) in [Yak12].

Let G be a Turakainen finite automaton. Languages of the form

 L = {w ∈ Σ* ∣ fG(w) = λ}

for any λ ∈ Q constitute the class S^=_Q.

## 3 Vector Automata

A real-time deterministic vector automaton of dimension k (rtDVA(k)) is a 6-tuple

 V=(Q,Σ,δ,q0,Qa,v),

where v is a k-dimensional initial row vector, and the transition function δ is defined as

 δ:Q×Σ×Ω→Q×S,

where S is the set of k × k rational-valued matrices, and Ω = {=1, ≠1}, as in the definition of 1DFAM’s.

Specifically, δ(q, σ, ω) = (q′, M) means that when V is in state q reading symbol σ, and the first entry of its vector corresponds to ω (with ω having the value =1 if and only if this entry is equal to 1), V moves to state q′, multiplying its vector with the matrix M ∈ S. The string is accepted if V enters an accept state, and the first entry of the vector is 1, after processing the right endmarker symbol $.

###### Remark 1

The designer of the automaton is free to choose the initial setting of the vector.

In the definition, it is stated that the machine can only check the first entry of the vector for equality to 1. Sometimes we find it convenient to design programs that check for equality to some number other than 1. One may also wish that it were possible to check not the first, but some other entry of the vector. In the following theorem, we show that we can assume our rtDVA(k)’s have that flexibility. For the purposes of that theorem, let a rtDVAx,j(k) be a machine similar to a rtDVA(k), but with a generalized definition that enables it to check the j’th entry, rather than the first, for equality to the number x ∈ Q.

###### Theorem 3.1

i. Given a rtDVA1,j(k) recognizing a language L, one can construct a rtDVA(k) that recognizes L. ii. For any x ∈ Q, given a rtDVAx,1(k) recognizing a language L, one can construct a rtDVA(k+1) that recognizes L.

###### Proof

i. Suppose that we are given a rtDVA1,j(k) V1 recognizing L. We will construct an equivalent rtDVA(k) V2. Let P denote the matrix obtained from the identity matrix by interchanging the first and j’th rows. We will use multiplications with P repeatedly to swap the first and j’th entries of the vector when it is time for that value to be checked, and then to restore the vector back to its original order, so that the rest of the computation is not affected. The initial vector of V2 has to be a reordered version of the initial vector v of V1 to let the machine check the correct entry at the first step, so v′ = vP. We update the individual transitions so that if V1 has the move δ1(q, σ, ω) = (q′, M), then V2 has the move δ2(q, σ, ω) = (q′, PMP) for every q, q′ ∈ Q, σ ∈ Σ, and ω ∈ Ω. Since PP is the identity, the vector of V2 always equals the vector of V1 multiplied by P, and its first entry is the j’th entry of V1’s vector.

ii. Suppose that we are given a rtDVAx,1(k) V1 recognizing L. We construct an equivalent rtDVA(k+1) V2. The idea is to repeatedly subtract x − 1 from the first entry of the vector when it is time for that value to be checked, and then add x − 1 back to restore the original vector. We will use the additional entry (which will always equal 1 throughout the computation) in the vector of V2 to perform these additions and subtractions, as will be explained soon. Let u be a (k+1)-dimensional vector equaling (v, 1), where v is the initial vector of V1. The initial vector of V2 has to be a modified version of u to accommodate the check for equality to x in the first step, so u′ = (v1 − (x − 1), v2, …, vk, 1). For every individual transition δ1(q, σ, ω) = (q′, M) of V1, V2 has the move δ2(q, σ, ω) = (q′, M′), where the (k+1) × (k+1) matrix M′ has been obtained by adding a new row-column pair to M, i.e. m′ij = mij for i, j ≤ k, m′i(k+1) = 0 for i ≤ k, and m′(k+1)(k+1) = 1, with the new bottom row chosen so that the multiplication first restores the subtracted x − 1, then applies M, and finally subtracts x − 1 again: m′(k+1)1 = (x − 1)(m11 − 1), and m′(k+1)j = (x − 1)m1j for 2 ≤ j ≤ k.

Note that when x ≠ 0, there is an alternative method for constructing an equivalent rtDVA(k) which does not require an extra entry in the vector, where the first entry is modified simply by repeated multiplications with A(1, 1/x) and A(1, x) when necessary. ∎
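The conjugation used in part (i) can be checked concretely. The following Python sketch, with an arbitrarily chosen matrix M and vector v of our own, verifies that multiplying the reordered vector by PMP has the same effect as applying M and then reordering:

```python
def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def vecmat(v, m):
    n = len(v)
    return [sum(v[i] * m[i][j] for i in range(n)) for j in range(n)]

def P(n, j):
    """Identity matrix with rows 1 and j interchanged (j is 1-indexed)."""
    m = [[int(r == s) for s in range(n)] for r in range(n)]
    m[0][0] = m[j - 1][j - 1] = 0
    m[0][j - 1] = m[j - 1][0] = 1
    return m

M = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]   # an arbitrary transition matrix (assumption)
v = [5, 6, 7]
p = P(3, 3)

swapped = vecmat(vecmat(v, p), matmul(matmul(p, M), p))  # one step of the simulator
direct  = vecmat(vecmat(v, M), p)                        # apply M, then reorder
```

Since PP is the identity, the two computations agree, and the simulator's first entry is always the j'th entry of the simulated vector.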

We conclude this section with two examples that will familiarize us with the programming of rtDVA(k)’s.

###### Example 1

n is a Fibonacci number.

###### Proof

We construct a rtDVA(5) V recognizing UFIB as follows: We let the initial vector equal (0, 1, 0, 0, 1). Reading each a, we multiply the vector with the matrix M1 if the first entry of the vector is equal to 0, and with M2 otherwise.

 M1 = [[0,0,0,0,0], [1,1,1,0,0], [1,1,0,0,0], [-1,0,0,1,0], [-1,0,0,1,1]],  M2 = [[0,0,0,0,0], [1,1,0,0,0], [0,0,1,0,0], [-1,0,0,1,0], [-1,0,0,1,1]] (rows listed top to bottom).

After reading the t’th a, the fourth entry of the vector equals t. The second and third entries of the vector hold consecutive Fibonacci numbers. The first entry is equal to 0 whenever t equals the second entry, which triggers the next Fibonacci number to be computed and assigned to the second entry in the following step. Otherwise, the second and third entries remain unchanged until t reaches the second entry. V accepts if the computation ends with the first entry equaling 0, which occurs if and only if the input length is a Fibonacci number. ∎
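The construction can be replayed in a few lines of Python. The matrices and the initial vector below are the ones used in this proof sketch; they should be read as one concrete realization of the idea, not the only possible one:

```python
# 5x5 transition matrices (rows x columns); the vector is a row vector.
M1 = [[0,0,0,0,0], [1,1,1,0,0], [1,1,0,0,0], [-1,0,0,1,0], [-1,0,0,1,1]]
M2 = [[0,0,0,0,0], [1,1,0,0,0], [0,0,1,0,0], [-1,0,0,1,0], [-1,0,0,1,1]]

def vecmat(v, m):
    return [sum(v[i] * m[i][j] for i in range(5)) for j in range(5)]

def accepts(n):
    v = [0, 1, 0, 0, 1]                       # assumed initial vector
    for _ in range(n):                        # one multiplication per input a
        v = vecmat(v, M1 if v[0] == 0 else M2)
    return v[0] == 0

accepted = {n for n in range(1, 31) if accepts(n)}
```

Running the simulation over lengths 1..30 yields exactly the Fibonacci numbers in that range.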

###### Theorem 3.2

{a^{n^2+n} ∣ n ≥ 0} ∈ 𝔏(rtDVA(2)).

###### Proof

We construct a rtDVA(2) V with initial vector (1, 1). If the input is the empty string, V accepts. Otherwise, V doubles the first entry of the vector on reading the first a, which is performed by multiplying the vector with the matrix M1.

 M1 = [[2, 0], [0, 1]]

It then repeats the following procedure for the rest of the computation: Decrement the first entry of the vector by multiplying it by 1/2 until it reaches 1, while in parallel incrementing the second entry of the vector by multiplying it by 2, with the help of matrix M2. The second entry stops increasing exactly when the first entry reaches 1. Then the directions are swapped, with the second entry now being halved, and the first entry being doubled, by multiplying the vector with the matrix M3.

 M2 = [[1/2, 0], [0, 2]],  M3 = [[2, 0], [0, 1/2]]

When the second entry of the vector reaches 1, the first entry of the vector is multiplied by 2 one more time with the help of matrix M1. Throughout this loop, the accept state is entered only when the first entry of the vector is equal to 1.

Suppose that at some step, the value of the vector is (2^k, 1). If the input is sufficiently long, k more steps will pass before the first entry reaches 1 again, with the vector having the value (1, 2^k) at that point. On an infinite sequence of a’s, the accept state will be entered after reading the second a, and then again with intervals of 2k + 2 symbols between subsequent entrances, for k = 1, 2, 3, …. Doing the sum, we conclude that strings of the form a^{k^2+k}, k ≥ 0, are accepted. ∎
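A quick simulation confirms the accepted lengths. The phase bookkeeping below is our stand-in for the finite-state control described above, and the matrices are the ones used in the construction:

```python
from fractions import Fraction as F

M1 = [[F(2), 0], [0, F(1)]]       # double the first entry
M2 = [[F(1, 2), 0], [0, F(2)]]    # halve the first entry, double the second
M3 = [[F(2), 0], [0, F(1, 2)]]    # double the first entry, halve the second

def mul(v, m):
    return [v[0]*m[0][0] + v[1]*m[1][0], v[0]*m[0][1] + v[1]*m[1][1]]

def accepts(n):
    if n == 0:
        return True                       # the empty string is accepted
    v, phase = [F(1), F(1)], 'first'
    for _ in range(n):
        if phase == 'first':
            v, phase = mul(v, M1), 'down'
        elif phase == 'down':
            v = mul(v, M2)
            if v[0] == 1:                 # bottom reached: swap directions
                phase = 'up'
        else:                             # phase == 'up'
            v = mul(v, M3)
            if v[1] == 1:                 # next a doubles the first entry again
                phase = 'first'
    return v[0] == 1                      # accept state holds only here

accepted = {n for n in range(31) if accepts(n)}
```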

## 4 Deterministic vector automata

We start by specializing a fact stated by Ibarra et al. in [ISK76] in the context of 1DFAM’s to the case of rtDVA(1)’s. For this purpose, we will use the following well-known fact about counter machines.

###### Fact 1

[FMR68] Given any k-counter automaton with the ability to alter the contents of each counter independently by any integer between −m and m in a single step (for some fixed integer m), one can effectively construct a k-counter automaton which can modify each counter by at most one unit at every step, and which recognizes the same language as the original machine in precisely the same number of steps.

###### Fact 2

rtDVA(1)’s are equivalent in language recognition power to real-time deterministic multicounter automata which can only check if all counters are equal to 0 simultaneously.

###### Proof

Let us simulate a given rtDVA(1) V by a real-time deterministic multicounter automaton M. Let Γ be the set of numbers the single-entry “vector” of V can be multiplied with during the computation. Let {p1, …, pk} be the set of prime factors of the denominators and the numerators of the numbers in Γ. M will have k counters to represent the current value of the vector. When V multiplies the vector with the number p1^{x1} ⋯ pk^{xk}, the counters of M are updated by the values x1, …, xk, respectively. As stated in Fact 1, we can update the counter values by any integer between −m and m, where m here is equal to the largest exponent (in absolute value) appearing in the prime decompositions of the numbers in Γ. When V checks if the value of the vector is equal to 1, M checks if the current values of the counters are all 0, since the value of the vector is equal to 1 exactly when all the counters are equal to 0.

For the other direction, we should simulate a rtDkCA M that can only check if all counters are equal to 0 simultaneously with a rtDVA(1) V. For each counter ci of M, we assign a distinct prime number pi, for i ∈ {1, …, k}. We multiply the “vector” with pi and 1/pi when the i’th counter is incremented and decremented, respectively. Whenever M has all counters equal to 0, V’s vector has value 1, so it can mimic M as required. ∎
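The prime encoding used in both directions is easy to demonstrate. In the sketch below, the primes 2, 3, 5 stand for three counters, and a single rational value plays the role of the one-entry vector:

```python
from fractions import Fraction

primes = [2, 3, 5]   # one prime per counter

def apply_counters(updates):
    """Run a sequence of unit counter updates two ways: directly on the
    counters, and as a single rational value multiplied by p_i or 1/p_i."""
    counters = [0, 0, 0]
    value = Fraction(1)
    for i, d in updates:                  # counter i changes by d in {-1, +1}
        counters[i] += d
        value *= Fraction(primes[i]) if d == 1 else Fraction(1, primes[i])
    return counters, value

counters, value = apply_counters([(0, 1), (1, 1), (0, 1), (0, -1), (2, 1), (2, -1)])
# Invariant: value == 2**c0 * 3**c1 * 5**c2, so value == 1 exactly when
# all counters are 0.
```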

We now prove a fact about rtDkCA’s that will be helpful in the separation of the classes of languages associated with these machines and rtDVA(1)’s.

###### Theorem 4.1

{a^{n^2+n} ∣ n ≥ 0} ∈ 𝔏(rtD2CA).

###### Proof

We construct a real-time deterministic automaton M with two counters recognizing {a^{n^2+n} ∣ n ≥ 0}. The idea of the proof is the same as that of the proof of Theorem 3.2. If the input is the empty string, M accepts. Otherwise, M increments the first counter on reading the first a. It then repeats the following procedure for the rest of the computation: Decrement the first counter until it reaches zero, while in parallel incrementing the second counter. The second counter stops increasing exactly when the first counter reaches 0. The counters then swap directions, with the second counter now being decremented, and the first counter going up. When the second counter reaches 0, the first counter is incremented one more time.

Throughout this loop, the accept state is entered only when the first counter is zero.

Suppose that at some step, the value of the counter pair is (x, 0). If the input is sufficiently long, x steps will pass before the first counter reaches zero again, with the pair having the value (0, x) at that point. On an infinite sequence of a’s, the accept state will be entered after reading the second a, and then again with intervals of 2x + 2 symbols between subsequent entrances, for x = 1, 2, 3, …. Doing the sum, we conclude that strings of the form a^{x^2+x}, x ≥ 0, are accepted. ∎

For k ≥ 2, let Lk = {w ∈ {a1, …, ak}* ∣ |w|a1 = ⋯ = |w|ak}, where |w|a denotes the number of occurrences of symbol a in w.

###### Fact 3

[Lai67] Lk ∈ 𝔏(rtDkCA), and Lk ∉ 𝔏(rtD(k−1)CA), for every k ≥ 2.

###### Fact 4

[ISK76] 1DFAM’s can only recognize regular languages on unary alphabets.

We are now able to state several new facts about the computational power of rtDVA(k)’s:

###### Theorem 4.2

For any fixed k ≥ 2, 𝔏(rtDVA(1)) and 𝔏(rtDkCA) are incomparable.

###### Proof

From Fact 3, we know that Lk+1 can not be recognized by any rtDkCA. We can construct a rtDVA(1) V recognizing Lk+1 as follows: We choose k distinct prime numbers p1, …, pk, each corresponding to a different symbol in the input alphabet {a1, …, ak+1}. When it reads an ai with i ≤ k, V multiplies its single-entry vector with pi. When it reads an ak+1, V multiplies the vector with 1/(p1 p2 ⋯ pk). The input string w is accepted if the value of the vector is equal to 1 at the end of the computation, which is the case if and only if |w|a1 = ⋯ = |w|ak+1. We conclude that Lk+1 ∈ 𝔏(rtDVA(1)).

From Theorem 4.1, we know that rtDkCA’s can recognize some nonregular languages on a unary alphabet. By Fact 4, we know that rtDVA(1)’s, which are additionally restricted 1DFAM’s, can only recognize regular languages in that case. Hence, we conclude that the two models are incomparable. ∎
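As a concrete instance of the first half of this proof, take k = 3, with the alphabet {a, b, c, d} standing in for a1, …, a4, and the primes 2, 3, 5:

```python
from fractions import Fraction

# Hypothetical alphabet for the example: primes for a, b, c; reading d
# multiplies the single-entry vector by 1/(2*3*5).
PRIMES = {'a': 2, 'b': 3, 'c': 5}

def accepts(w):
    v = Fraction(1)
    for sym in w:
        if sym in PRIMES:
            v *= PRIMES[sym]
        else:                      # the role of a_{k+1}
            v *= Fraction(1, 30)
    return v == 1                  # 1 iff all four symbols occur equally often
```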

###### Theorem 4.3

𝔏(rtDVA(1)) ⊊ ⋃_k 𝔏(rtDkCA).

###### Proof

By the argument in the proof of Fact 2, any rtDVA(1) can be simulated by a rtDkCA for some k. The inclusion is proper, since we know that a rtD2CA can recognize a nonregular language on a unary alphabet (Theorem 4.1), a feat that is impossible for rtDVA(1)’s by Fact 4. ∎

###### Theorem 4.4

𝔏(rtDVA(2)) ⊈ ⋃_k 𝔏(rtDkCA).

###### Proof

Let L = {a^m b^n ∣ m ≥ n ≥ 1}, and let L* be the Kleene closure of L. It is known that no rtDkCA can recognize L* for any k, due to the inability of these machines to set a counter to 0 in a single step [FMR67].

We will construct a rtDVA(2) V that recognizes L*. The idea is to use the first entry of the vector as a counter, and to employ matrix multiplication to set this counter to 0 quickly when needed. V rejects strings that are not in the regular set (a⁺b⁺)* easily. The vector starts out as (0, 1). When it reads an a, V multiplies the vector with the “incrementation” matrix Ma to increment the counter. When reading a b, V rejects if the first entry is zero, since this indicates that there are more b’s than there were a’s in the preceding segment. Otherwise, it multiplies the vector with the “decrementation” matrix Mb.

 Ma = [[1, 0], [1, 1]],  Mb = [[1, 0], [-1, 1]]

When an a is encountered immediately after a b, the counter has to be reset to 0, so the multiplication with Ma in the processing of such a’s is preceded by one with the “reset” matrix M0 (the product M0Ma can be precomputed, so this still takes a single step).

 M0 = [[0, 0], [0, 1]]

V accepts if it reaches the end of the input without rejecting. ∎
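The whole machine fits in a short Python sketch. The regular-shape test and the reset bookkeeping follow the description above; the matrices are our reconstruction of the ones in the proof:

```python
import re

Ma = [[1, 0], [1, 1]]     # increment: (c, 1) -> (c + 1, 1)
Mb = [[1, 0], [-1, 1]]    # decrement: (c, 1) -> (c - 1, 1)
M0 = [[0, 0], [0, 1]]     # reset the first entry to 0

def mul(v, m):
    return [v[0]*m[0][0] + v[1]*m[1][0], v[0]*m[0][1] + v[1]*m[1][1]]

def accepts(w):
    if not re.fullmatch(r'(a+b+)*', w):   # the easy regular-shape check
        return False
    v, prev = [0, 1], None
    for s in w:
        if s == 'a':
            if prev == 'b':               # new block starts: reset first
                v = mul(v, M0)
            v = mul(v, Ma)
        else:
            if v[0] == 0:                 # more b's than a's in this block
                return False
            v = mul(v, Mb)
        prev = s
    return True
```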

We are now able to compare the power of rtDVA(1)’s with their one-way versions, namely, the 1DFAM’s of Ibarra et al. [ISK76]:

###### Theorem 4.5

𝔏(rtDVA(1)) ⊊ 𝔏(1DFAM).

###### Proof

We construct a 1DFAM M recognizing the language L* that we saw in the proof of Theorem 4.4. M uses its register to simulate the counter of a one-way single-counter automaton. When it reads an a, M multiplies the register by 2. When reading a new b, M rejects if the register has value 1, and multiplies it with 1/2 otherwise. When a new block of a’s is seen to start, M pauses its input head while repeatedly multiplying the register with 1/2 to set its value back to 1 before processing the new block. M accepts if it has processed the whole input without rejecting.

By the already mentioned fact that no rtDkCA for any k can recognize L*, and Theorem 4.3, we conclude that L* ∉ 𝔏(rtDVA(1)). Since every rtDVA(1) is also a 1DFAM, the inclusion is proper. ∎

The same reasoning also allows us to state

###### Corollary 1

𝔏(rtDVA(1)) ⊊ 𝔏(rtDVA(2)).

Note that Fact 4 and Theorem 3.2 let one conclude that rtDVA(2)’s outperform rtDVA(1)’s when the input alphabet is unary.

It is easy to state the following simultaneous Turing machine time-space upper bound on the power of deterministic real-time vector automata:

###### Theorem 4.6

Every language in ⋃_k 𝔏(rtDVA(k)) can be recognized by a deterministic Turing machine using simultaneously O(n^2) time and O(n) space.

###### Proof

A Turing machine that multiplies the vector with the matrices corresponding to the transitions of a given rtDVA(k) requires only linear space, since the numbers in the vector can grow by at most a fixed number of bits for each one of the O(n) multiplications in the process. Using the primary-school algorithm for multiplication, this takes O(n^2) overall time. ∎

If one gave the capability of one-way traversal of the input tape to vector automata of dimension larger than one, one would gain a huge amount of computational power. Even with vectors of dimension 2, such machines can simulate one-way 2-counter automata, and are therefore Turing equivalent [ISK76]. This is why we focus on real-time vector automata.

## 5 Blind vector automata

A real-time deterministic blind vector automaton (rtDBVA(k)) is a rtDVA(k) which is not allowed to check the entries of the vector until the end of the computation. Formally, a rtDBVA(k) is a 6-tuple

 V=(Q,Σ,δ,q0,Qa,v),

where the transition function δ is defined as δ: Q × Σ → Q × S, with S as defined earlier. δ(q, σ) = (q′, M) means that when V reads symbol σ in state q, it will move to state q′, multiplying the vector with the matrix M. The acceptance condition is the same as for rtDVA(k)’s.

###### Remark 2

Let us start by noting that 𝔏(rtDBVA(1)) = ⋃_k 𝔏(rtDkBCA), unlike the general case considered in Theorem 4.3: Since blind counter automata only check if all counters are zero at the end, the reasoning of Fact 2 is sufficient to conclude this.

###### Theorem 5.1

𝔏(rtDBVA(1)) = 𝔏(1DFAMW).

###### Proof

A rtDBVA(1) is clearly a 1DFAMW, so we look at the other direction of the equality. Given a 1DFAMW M, we wish to construct a rtDBVA(1) V which mimics M, but without spending more than one computational step on any symbol. When M scans a particular input symbol σ for the first time in a particular state q, whether it will ever leave this symbol, and if so, after which sequence of moves, are determined by its program. This information can be precomputed for every state/symbol pair by examining the transition function of M. We program V so that it rejects the input if it ever determines during computation that M would have entered an infinite loop. Otherwise, upon seeing the simulated M moving on a symbol σ while in state q, V simply retrieves the aforementioned information from a lookup table, moves the head to the right, entering the state that M would enter when it moves off that σ, and multiplies its single-entry vector with the product of the multipliers corresponding to the transitions M executes while the head is pausing on σ. ∎

We now give a full characterization of the class of languages recognized by real-time deterministic blind vector automata.

###### Theorem 5.2

⋃_k 𝔏(rtDBVA(k)) = S^=_Q.

###### Proof

For any language L ∈ S^=_Q, we can assume without loss of generality that L = {w ∈ Σ* ∣ fG(w) = 1} [Tur69] for some TuFA G with, say, k states. Let us construct a rtDBVA(k) V simulating G. We let the dimension of V’s vector be k, so that the vector is in Q^{1×k}. The initial vector values of V and G are identical. V has only one state, and the vector is multiplied with the corresponding transition matrix of G when an input symbol is read. When processing the right endmarker, V multiplies the vector with a matrix whose first column is the final vector of G. V accepts an input string w if the first entry of the vector is 1 at the end of the computation, which happens only if the acceptance value fG(w) = 1.

For the other direction, let us simulate a rtDBVA(k) V recognizing some language L by a TuFA G. If V has n states, then G will have nk states. For any symbol σ, the corresponding transition matrix Aσ is constructed as follows: View Aσ as being tiled to k × k submatrices called Aσ[i, j], for i, j ∈ {1, …, n}. If V moves from state qi to state qj by multiplying the vector with the matrix M when reading symbol σ, then Aσ[i, j] will be set to equal M. All remaining entries of Aσ are zeros. The initial vector v0 of G will be a row vector with nk entries, viewed as being segmented to n blocks of k entries. The block of v0 corresponding to the initial state of V will equal the initial vector of V, and the remaining entries of v0 will equal 0. The entries of the final column vector f of G will again consist of n segments corresponding to the states of V. The first entry of every such segment that corresponds to an accept state of V will equal 1, and all remaining entries will equal 0. G imitates the computation of V by keeping the current value of the vector of V at any step within the segment that corresponds to V’s current state in the vector representing the result of G’s own matrix multiplications up to that point. We therefore have that L ∈ S^=_Q. ∎

We can also give a characterization for the case where the alphabet is unary, thanks to the following fact, which is implicit in the proof of Theorem 7 in [Diê77]:

###### Fact 5

All languages on a unary alphabet in S^=_Q are regular.

We can say the following about the effect of increasing k on the power of rtDBVA(k)’s:

###### Theorem 5.3

𝔏(rtDBVA(1)) ⊊ 𝔏(rtDBVA(2)).

###### Proof

Let us construct a rtDBVA(2) V recognizing the marked palindrome language Lpal = {wcw^r ∣ w ∈ {a, b}*}, where w^r stands for the reverse of string w. We let the initial vector equal (0, 1). While reading the input string, V first encodes the string w in the first entry of the vector using the matrices Ma1 and Mb1.

 Ma1 = [[10, 0], [1, 1]],  Mb1 = [[10, 0], [2, 1]]

Each time it reads an a or a b, V multiplies the vector with Ma1 or Mb1, respectively. In the encoding, each a is represented by an occurrence of the digit 1, and each b is represented by a 2. Upon reading the symbol c, V finishes reading w and starts reading the rest of the string. V now makes a reverse encoding, and multiplies the vector with Ma2 and Mb2 each time it reads an a and a b, respectively.

 Ma2 = [[1/10, 0], [-1/10, 1]],  Mb2 = [[1/10, 0], [-2/10, 1]]

When the computation ends, the first entry of the vector is equal to 0 iff the string read after the symbol c is the reverse of the string w, so that the input string is in Lpal.
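The encoding can be tried out directly. The following Python sketch applies the four matrices, as reconstructed above with base-10 digits 1 and 2, and returns the final first entry:

```python
from fractions import Fraction as F

Ma1 = [[F(10), 0], [F(1), 1]]         # first phase:  e -> 10e + 1  (digit for a)
Mb1 = [[F(10), 0], [F(2), 1]]         #               e -> 10e + 2  (digit for b)
Ma2 = [[F(1, 10), 0], [F(-1, 10), 1]] # second phase: e -> (e - 1)/10
Mb2 = [[F(1, 10), 0], [F(-2, 10), 1]] #               e -> (e - 2)/10

def mul(v, m):
    return [v[0]*m[0][0] + v[1]*m[1][0], v[0]*m[0][1] + v[1]*m[1][1]]

def first_entry_after(w):
    v, seen_c = [F(0), F(1)], False
    for s in w:
        if s == 'c':
            seen_c = True
        elif not seen_c:
            v = mul(v, Ma1 if s == 'a' else Mb1)
        else:
            v = mul(v, Ma2 if s == 'a' else Mb2)
    return v[0]

# The first entry ends at 0 exactly for strings of the form w c w^r.
```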

Now, we are going to prove that Lpal ∉ 2PFA, that is, the class of languages recognized with bounded error by two-way probabilistic finite automata (2pfa’s). Suppose for a contradiction that there exists a 2pfa M recognizing Lpal with bounded error. Then it is not hard to show that the unmarked palindrome language PAL = {w ∈ {a, b}* ∣ w = w^r} can be recognized by a 2pfa M′ such that M′ sees the input, say w, as wcw and then executes M on it. Note that M accepts wcw if and only if w is a member of PAL. Since PAL ∉ 2PFA [DS92], we get a contradiction. Hence, we conclude that Lpal can not be in 2PFA.

It is known [Rav92] that 2PFA includes all languages recognized by one-way deterministic blind multicounter automata, and we already stated that rtDBVA(1)’s and rtDkBCA’s are equivalent models in Remark 2. Since Lpal ∉ 2PFA, Lpal cannot be in 𝔏(rtDBVA(1)). Having proven that Lpal ∈ 𝔏(rtDBVA(2)), we conclude that 𝔏(rtDBVA(1)) ⊊ 𝔏(rtDBVA(2)). ∎

For an n-state rtDBVA(k) V, we define the size of V to be the product nk. For all m > 0, let 𝔏(rtDBVASIZE(m)) denote the class of languages that are recognized by real-time deterministic blind vector automata whose size is at most m. We use the following fact to prove a language hierarchy on this metric.

###### Fact 6

[Diê71] (Recurrence Theorem) Let L be a language in S^=_Q over the alphabet Σ, recognized by a TuFA with n states. Then there exists a natural number n0 ≤ n such that for any words x, u, y, if xu^i y ∈ L, then xu^{i+n0} y ∈ L for any i ≥ 0.

###### Theorem 5.4

For every m > 0, 𝔏(rtDBVASIZE(m)) ⊊ 𝔏(rtDBVASIZE(m + 1)).

###### Proof

We first establish a hierarchy of complexity classes for TuFA’s based on the number of states, and use this fact to conclude the result.

It is obvious that the language MODm+1 = {a^{i(m+1)} ∣ i ≥ 0} is in S^=_Q. We claim that any TuFA recognizing MODm+1 should have at least m + 1 states. Let n be the number of states of such a TuFA, and let us suppose that n ≤ m. We are going to use Fact 6 as follows: Let u = a, and let x and y be the empty string. Since the strings a^{i(m+1)} are in MODm+1, we see that the strings of the form a^{i(m+1)+n0}, where 0 < n0 ≤ n ≤ m, are also in MODm+1, and we get a contradiction. Hence, we conclude that n ≥ m + 1 should hold, and that the TuFA should have at least m + 1 states.

Since MODm+1 is recognized by a TuFA with m + 1 states, by Theorem 5.2 there exists a real-time deterministic blind vector automaton with size m + 1 (a rtDBVA(m+1) with just one state) recognizing the same language. Suppose that there exists another real-time blind vector automaton with size at most m recognizing MODm+1. Then by Theorem 5.2, there exists a TuFA with at most m states recognizing MODm+1. Since we know that any TuFA recognizing MODm+1 should have at least m + 1 states, we get a contradiction. ∎

## 6 Nondeterministic vector automata

We now define the real-time nondeterministic vector automaton (rtNVA(k)) by adding the capability of making nondeterministic choices to the rtDVA(k). The transition function is now replaced by δ: Q × Σ × Ω → 2^{Q×S}, where 2^{Q×S} denotes the power set of the set Q × S. We will also study blind versions of these machines: A real-time nondeterministic blind vector automaton (rtNBVA(k)) is just a rtNVA(k) which does not check the vector entries until the end of the computation.

We start by showing that it is highly likely that rtNVA(k)’s are more powerful than their deterministic versions.

###### Theorem 6.1

If 𝔏(rtNVA(k)) = 𝔏(rtDVA(k)) for every k, then P = NP.

###### Proof

We construct a rtNBVA(3) V recognizing the NP-complete language SUBSETSUM, which is the collection of all strings of the form t#a1#a2#⋯#an#, such that t and the ai’s are numbers in binary notation, and there exists a set I ⊆ {1, …, n} satisfying Σ_{i∈I} ai = t. The main idea of this construction is that we can encode the numbers appearing in the input string to certain entries of the vector, and perform arithmetic on them, all in real time. We use an encoding similar to the one given in [Yakar]. V’s initial vector is (0, 0, 1). When scanning the symbols of t, V multiplies the vector with the matrix M1 (resp. M0) for each scanned 1 (resp. 0).

 M0 = [[2, 0, 0], [0, 1, 0], [0, 0, 1]],  M1 = [[2, 0, 0], [0, 1, 0], [1, 0, 1]].

When V finishes reading t, the vector equals (t, 0, 1). In the rest of the computation, V nondeterministically decides which ai’s to subtract from the first entry. Each selected ai is encoded in a similar fashion into the second entry of the vector, using the matrices

 N0 = [[1, 0, 0], [0, 2, 0], [0, 0, 1]],  N1 = [[1, 0, 0], [0, 2, 0], [0, 1, 1]].

After encoding the first selected ai, the vector equals (t, ai, 1). V subtracts the second entry from the first entry by multiplying the vector with the matrix [[1, 0, 0], [−1, 0, 0], [0, 0, 1]]. After this subtraction, the second entry is reinitialized to 0. V chooses another ai if it wishes, and the same procedure is applied. At the end of the input, V accepts if the first entry of the vector is equal to 0, and rejects otherwise.
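Since the machine is blind, its nondeterminism can be brute-forced by looping over all subsets. The sketch below uses the matrices of the construction; the subtraction matrix `SUB` is our reconstruction:

```python
from itertools import product

M0 = [[2, 0, 0], [0, 1, 0], [0, 0, 1]]
M1 = [[2, 0, 0], [0, 1, 0], [1, 0, 1]]
N0 = [[1, 0, 0], [0, 2, 0], [0, 0, 1]]
N1 = [[1, 0, 0], [0, 2, 0], [0, 1, 1]]
SUB = [[1, 0, 0], [-1, 0, 0], [0, 0, 1]]  # first -= second; second reset to 0

def mul(v, m):
    return [sum(v[i] * m[i][j] for i in range(3)) for j in range(3)]

def accepts(t_bits, numbers_bits):
    """Brute-force the nondeterministic branches: each branch picks a subset."""
    for pick in product([0, 1], repeat=len(numbers_bits)):
        v = [0, 0, 1]
        for b in t_bits:                       # encode t into the first entry
            v = mul(v, M1 if b == '1' else M0)
        for chosen, bits in zip(pick, numbers_bits):
            if not chosen:
                continue
            for b in bits:                     # encode a_i into the second entry
                v = mul(v, N1 if b == '1' else N0)
            v = mul(v, SUB)                    # subtract it from t
        if v[0] == 0:
            return True
    return False
```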

If 𝔏(rtNVA(k)) were equal to 𝔏(rtDVA(k)) for every k, then SUBSETSUM would be in P by Theorem 4.6, and we would have to conclude that P = NP. ∎

When we restrict consideration to blind automata, we can prove the following unconditional separation between the deterministic and nondeterministic versions.

###### Theorem 6.2

⋃_k 𝔏(rtDBVA(k)) ⊊ ⋃_k 𝔏(rtNBVA(k)).

###### Proof

Let us construct a rtNBVA(2) V recognizing the language L = {a^{2^n+n} ∣ n ≥ 1}. The initial value of V’s vector is (1, 1). V’s computation consists of two stages. In the first stage, V doubles the value of the first entry for each a that it scans, by multiplying the vector with the matrix M1. At any step, V may nondeterministically decide to enter the second stage. In the second stage, V decrements the first entry by 1 for each a that is scanned, using the matrix M2, and accepts if the first entry equals 0 at the end.

 M1 = [[2, 0], [0, 1]],  M2 = [[1, 0], [-1, 1]]

If the input length is n, and if V decides to enter the second stage right after the i’th a, the vector value at the end of the computation equals (2^i − (n − i), 1). We see that this first entry can equal 0 if and only if n = 2^i + i for some i ≥ 1.

Having proven that the nonregular language L ∈ 𝔏(rtNBVA(2)), we note that L can not be in 𝔏(rtDBVA(k)) for any k, by Theorem 5.2 and Fact 5. ∎

## 7 Open Questions

• Can we show a hierarchy result similar to Theorem 5.4 for general deterministic vector automata, or for nondeterministic vector automata?

• Are general nondeterministic real-time vector automata more powerful than rtNBVA(k)’s?

• Would properly defined bounded-error probabilistic versions of vector automata correspond to larger classes? Would quantum vector automata outperform the probabilistic ones?

## Acknowledgements

We thank Oscar Ibarra and Holger Petersen for their helpful answers to our questions.

## References

• [Diê71] Phan Dinh Diêu. On a class of stochastic languages. Mathematical Logic Quarterly, 17(1):421–425, 1971.
• [Diê77] Phan Dinh Diêu. Criteria of representability of languages in probabilistic automata. Cybernetics and Systems Analysis, 13(3):352–364, 1977. Translated from Kibernetika, No. 3, pp. 39–50, May–June 1977.
• [DS92] Cynthia Dwork and Larry Stockmeyer. Finite state verifiers I: The power of interaction. Journal of the ACM, 39(4):800–828, 1992.
• [FMR67] Patrick C. Fischer, Albert R. Meyer, and Arnold L. Rosenberg. Real time counter machines. In Proceedings of the 8th Annual Symposium on Switching and Automata Theory (SWAT 1967), FOCS ’67, pages 148–154, Washington, DC, USA, 1967. IEEE Computer Society.
• [FMR68] Patrick C. Fischer, Albert R. Meyer, and Arnold L. Rosenberg. Counter machines and counter languages. Mathematical Systems Theory, 2(3):265–283, 1968.
• [Gre78] S. A. Greibach. Remarks on blind and partially blind one-way multicounter machines. Theoretical Computer Science, 7:311–324, 1978.
• [ISK76] Oscar H. Ibarra, Sartaj K. Sahni, and Chul E. Kim. Finite automata with multiplication. Theoretical Computer Science, 2(3):271 – 294, 1976.
• [Lai67] R. Laing. Realization and complexity of commutative events. Technical report, University of Michigan, 1967.
• [Rav92] Bala Ravikumar. Some observations on 2-way probabilistic finite automata. In Proceedings of the 12th Conference on Foundations of Software Technology and Theoretical Computer Science, pages 392–403. Springer-Verlag, 1992.
• [Tur69] Paavo Turakainen. Generalized automata and stochastic languages. Proceedings of the American Mathematical Society, 21:303–309, 1969.
• [Yak12] Abuzer Yakaryılmaz. Superiority of one-way and realtime quantum machines. RAIRO - Theoretical Informatics and Applications., 46(4):615–641, 2012.
• [Yakar] Abuzer Yakaryılmaz. Quantum alternation. In Proceedings of the 8th International Computer Science Symposium in Russia, 2013 (to appear).