Max-C and Min-D Projection Autoassociative Fuzzy Morphological Memories: Theory and an Application for Face Recognition

Alex Santana dos Santos assantos@ufrb.edu.br Exact and Technological Science Center, Federal University of Recôncavo da Bahia, Rua Rui Barbosa, 710, Centro, Cruz das Almas-BA CEP 44380-000, Brazil Marcos Eduardo Valle valle@ime.unicamp.br Department of Applied Mathematics, University of Campinas, Rua Sérgio Buarque de Holanda, 651, Campinas-SP CEP 13083-859, Brazil
Abstract

Max-$C$ and min-$D$ projection autoassociative fuzzy morphological memories (max-$C$ and min-$D$ PAFMMs) are two-layer feedforward fuzzy morphological neural networks able to implement an associative memory designed for the storage and retrieval of finite fuzzy sets or vectors on a hypercube. In this paper we address the main features of these autoassociative memories, which include unlimited absolute storage capacity, fast retrieval of stored items, few spurious memories, and an excellent tolerance to either dilative noise or erosive noise. Particular attention is given to the so-called PAFMM of Zadeh which, besides performing no floating-point operations, exhibits the largest noise tolerance among max-$C$ and min-$D$ PAFMMs. Computational experiments reveal that Zadeh's max-$C$ PAFMM, combined with a noise masking strategy, yields a fast and robust classifier with strong potential for face recognition.

keywords:
Fuzzy associative memory, morphological neural network, lattice computing, face recognition.

1 Introduction

An associative memory (AM) is an input-output system inspired by the human brain's ability to store and recall information by association hassoun97 ; kohonen89 . Apart from a large storage capacity, an ideal AM model should exhibit a certain tolerance to noise. In other words, we expect to retrieve a stored item not only by presenting the original stimulus but also by presenting a similar input stimulus hassoun97 . We speak of an autoassociative memory if stimulus and response coincide. For instance, our memory acts as an autoassociative model when we recognize a friend wearing sunglasses or a scarf. In other words, we obtain the desired output (recognizing the friend) from a partial or noisy input (the occluded face). We say that an associative memory is a heteroassociative model if at least one stimulus differs from its corresponding response.

Several associative memory models have been introduced in the literature, and their applications range from optimization hopfield85 ; serpen08 and prediction valle11nn ; sussner18ins to image processing and analysis valle11nn ; grana16 ; sussner06fs ; sussner17msc ; valle15 and pattern classification esmi12hais ; esmi15fs ; esmi16fss ; sussner06nn ; valle16gcsr , including face recognition zhang04 .

An AM model designed for the storage and recall of fuzzy sets on a finite universe of discourse is called a fuzzy associative memory (FAM) kosko92 . Since fuzzy sets can be interpreted as elements from a complete lattice goguen67 and mathematical morphology can be viewed as a theory on mappings between complete lattices heijmans94 , many important FAM models from the literature belong to the broad class of fuzzy morphological associative memories (FMAMs) valle11nn ; valle08fss . Briefly, FMAMs are implemented by fuzzy morphological neural networks. A morphological neural network is equipped with neurons that perform an elementary operation from mathematical morphology, possibly followed by a non-linear activation function sussner11ins . The class of FMAMs includes, for example, the max-minimum and max-product FAMs of Kosko kosko92 , the max-min FAM of Junbo et al. junbo94 , the max-min FAM with threshold of Liu liu99 , the fuzzy logical bidirectional associative memories of Belohlavek belohlavek00a , and the implicative fuzzy associative memories (IFAMs) of Sussner and Valle sussner06fs . In this paper, we only consider the max-$C$ and min-$D$ autoassociative fuzzy morphological memories (AFMMs) synthesized using the fuzzy learning by adjunction (FLA) scheme valle11nn ; valle08fss . These autoassociative memories can be seen as fuzzy versions of the well-known matrix-based autoassociative morphological memories (AMMs) of Ritter et al. ritter98 .

The main features of the max-$C$ and min-$D$ AFMMs are unlimited absolute storage capacity, one-step convergence when employed with feedback, and an excellent tolerance to either erosive or dilative noise. On the downside, the matrix-based AFMMs with FLA have a large number of spurious memories valle11nn . A spurious memory is an item that has been unintentionally stored in the memory hassoun97 . Furthermore, the information stored on an AFMM with FLA is distributed on a synaptic weight matrix. As a consequence, these autoassociative fuzzy memories consume a large amount of computational resources when designed for the storage and recall of large items valle11nn ; vajg15 .

Many autoassociative fuzzy memory models have been proposed in the literature to improve the noise tolerance or to reduce the computational cost of AFMMs with FLA. In order to improve the tolerance with respect to mixed noise, Valle developed the permutation-based finite IFAMs by replacing the unit interval with a finite chain valle10ins . A certain permutation-based finite IFAM outperformed the original IFAMs in the reconstruction of gray-scale images corrupted by mixed salt-and-pepper noise. Similarly, to increase the noise tolerance of the IFAMs, Bui et al. introduced the so-called content-association associative memory (ACAM) bui15 . Using a fuzzy preorder relation, Perfilieva and Vajgl proposed a novel theoretical justification for IFAMs perfilieva15IFSA . They also introduced a fast algorithm for data retrieval that is based on an IFAM model with a binary fuzzy preorder perfilieva16 . Moreover, Vajgl reduced the computational cost of an IFAM by replacing its synaptic weight matrix with a sparse matrix vajgl17 . In a similar fashion, the class of sparsely connected autoassociative fuzzy implicative memories (SCAFIMs) is obtained by removing (possibly a significant amount of) synaptic weights from the original IFAMs valle10nc . More generally, the quantale-based associative memories (QAMs) generalize several lattice-based autoassociative memories and have been effectively applied for the storage and recall of large color images valle13prl . Using piecewise linear transformations in the input and output spaces, Li et al. increased the storage capacity of fuzzy associative memories li17asc . Recently, Sussner and Schuster proposed the interval-valued fuzzy morphological associative memories (IV-FMAMs), which are designed for the storage and retrieval of interval-valued fuzzy sets sussner18ins . The novel IV-FMAMs have been effectively applied to time-series prediction.

Apart from the distributed models, like the FMAMs with FLA and their variations, non-distributed associative memory models have received considerable attention in recent years, partially due to their low computational effort and extraordinary success in pattern recognition and image restoration tasks. Examples of non-distributed associative memories include models based on Hamming distance ikeda01 and kernels zhang04 ; souza18nafips , as well as on subsethood and similarity measures valle15 ; esmi15fs ; esmi16fss ; valle16gcsr ; souza18tema . In the context of non-distributed models, we recently introduced the max-plus and min-plus projection autoassociative morphological memories (max-plus and min-plus PAMMs), which can be viewed as non-distributed versions of the autoassociative morphological memories of Ritter et al. santos18nn ; valle14wcciB . Max-plus and min-plus PAMMs have fewer spurious memories than their corresponding distributed models and, thus, they are more robust to either dilative noise or erosive noise than the original autoassociative morphological memories. Computational experiments revealed that PAMMs and their compositions are competitive with other methods from the literature on classification tasks santos18nn .

In the light of the successful development of the max-plus and min-plus PAMMs, and in order to circumvent the aforementioned downsides of AFMMs, we introduced the class of max-$C$ projection autoassociative fuzzy morphological memories (max-$C$ PAFMMs) in the conference paper santos16cbsf . Max-$C$ PAFMMs have been further discussed in santos17msc , where some results concerning their implementation and storage capacity are given without proofs. In a few words, a max-$C$ PAFMM projects the input into the family of all max-$C$ combinations of the stored items. In this paper, we present the dual version of max-$C$ PAFMMs: the class of min-$D$ PAFMMs, which project the input into the set of all min-$D$ combinations of the stored items santos17cnmac . Furthermore, we address some theoretical results concerning both max-$C$ and min-$D$ PAFMM models. Thus, the theoretical part of this paper can be viewed as an extended version of the conference paper santos17cnmac . In particular, we conclude that max-$C$ and min-$D$ PAFMMs exhibit better tolerance with respect to either dilative noise or erosive noise than their corresponding matrix-based AFMMs with FLA. Additionally, we show in this paper that the max-$C$ PAFMM most robust with respect to dilative noise is based on Zadeh's inclusion measure and, thus, it is referred to as Zadeh's max-$C$ PAFMM. Accordingly, the dual of Zadeh's max-$C$ PAFMM is the min-$D$ PAFMM most robust with respect to erosive noise. Finally, inspired by the work of Urcid and Ritter urcid07LC , the frail tolerance of the max-$C$ and min-$D$ PAFMMs with respect to mixed noise can be improved significantly by masking the noise contained in the input santos17bracis . Although some preliminary experiments can be found in the conference paper santos17bracis , we provide in this paper conclusive computational experiments concerning the application of Zadeh's max-$C$ PAFMM to face recognition.

The paper is organized as follows. Some basic concepts on fuzzy logic and fuzzy sets are briefly presented in the next section. Section 3 briefly reviews the max-$C$ and min-$D$ AFMMs with FLA. The max-$C$ and min-$D$ PAFMMs are addressed subsequently in Section 4. Zadeh's PAFMMs and the noise masking strategy are discussed in Sections 5 and 6, respectively. The performance of Zadeh's max-$C$ PAFMM for face recognition is addressed in Section 7. We finish the paper with some concluding remarks and an appendix with the proofs of theorems.

2 Some Basic Concepts on Fuzzy Systems

The autoassociative fuzzy memories considered in this paper are based on fuzzy set theory and operations from fuzzy logic. In this section, we briefly review the most important concepts on fuzzy systems. The interested reader is invited to consult barros17livro ; klir95 ; nguyen00 ; gomide07 ; debaets97a for a detailed review on this topic.

The key concept for the development of fuzzy morphological associative memories is adjunction heijmans95 ; deng02 . We say that a fuzzy implication $I : [0,1]^2 \to [0,1]$ and a fuzzy conjunction $C : [0,1]^2 \to [0,1]$ form an adjunction if the following equivalence holds true for all $x, y, z \in [0,1]$:

$C(x, y) \le z \iff x \le I(y, z).$ (1)

Analogously, a fuzzy disjunction $D : [0,1]^2 \to [0,1]$ and a fuzzy co-implication $J : [0,1]^2 \to [0,1]$ form an adjunction if and only if

$D(x, y) \ge z \iff x \ge J(y, z), \quad \forall x, y, z \in [0,1].$ (2)

Examples of adjunctions include the following pairs (two of them are checked numerically in the sketch after this list):

  • Gödel's implication $I_M(x,y) = \begin{cases} 1, & x \le y \\ y, & x > y \end{cases}$ and the minimum fuzzy conjunction $C_M(x,y) = \min(x, y)$.

  • Goguen's implication $I_P(x,y) = \begin{cases} 1, & x \le y \\ y/x, & x > y \end{cases}$ and the product fuzzy conjunction $C_P(x,y) = x \cdot y$.

  • Łukasiewicz's fuzzy implication $I_L(x,y) = \min(1, 1 - x + y)$ and fuzzy conjunction $C_L(x,y) = \max(0, x + y - 1)$.

  • Gaines' fuzzy implication $I_G$ and fuzzy conjunction $C_G$ defined respectively by

    $I_G(x,y) = \begin{cases} 1, & x \le y \\ 0, & x > y \end{cases} \quad \text{and} \quad C_G(x,y) = \begin{cases} 0, & x = 0 \\ y, & x > 0. \end{cases}$ (3)

  • The maximum fuzzy disjunction $D_M(x,y) = \max(x, y)$ and Gödel's fuzzy co-implication

    $J_M(x,y) = \begin{cases} 0, & x \ge y \\ y, & x < y. \end{cases}$ (4)

  • The probabilistic sum disjunction $D_P(x,y) = x + y - x y$ and Goguen's fuzzy co-implication

    $J_P(x,y) = \begin{cases} 0, & x \ge y \\ (y - x)/(1 - x), & x < y. \end{cases}$ (5)

  • Łukasiewicz's disjunction $D_L(x,y) = \min(1, x + y)$ and co-implication $J_L(x,y) = \max(0, y - x)$.

  • Gaines' fuzzy disjunction $D_G$ and fuzzy co-implication $J_G$ defined as follows:

    $D_G(x,y) = \begin{cases} 1, & x = 1 \\ y, & x < 1 \end{cases} \quad \text{and} \quad J_G(x,y) = \begin{cases} 0, & x \ge y \\ 1, & x < y. \end{cases}$ (6)
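To make the adjunction condition (1) concrete, the following sketch (our own illustration, not part of the paper; all function names are ours) implements two of the pairs listed above and checks the equivalence $C(x,y) \le z \iff x \le I(y,z)$ exhaustively on a rational grid:

```python
from fractions import Fraction
from itertools import product

# Goedel's pair: minimum fuzzy conjunction and its residual implication.
def c_godel(x, y):
    return min(x, y)

def i_godel(x, y):
    return Fraction(1) if x <= y else y

# Lukasiewicz's pair.
def c_luk(x, y):
    return max(Fraction(0), x + y - 1)

def i_luk(x, y):
    return min(Fraction(1), 1 - x + y)

def forms_adjunction(c, i, steps=11):
    """Check C(x, y) <= z  <=>  x <= I(y, z) on a rational grid in [0,1]^3."""
    grid = [Fraction(k, steps - 1) for k in range(steps)]
    return all((c(x, y) <= z) == (x <= i(y, z))
               for x, y, z in product(grid, repeat=3))

print(forms_adjunction(c_godel, i_godel))  # True
print(forms_adjunction(c_luk, i_luk))      # True
```

Exact rational arithmetic is used so that boundary cases such as $C(x,y) = z$ are not spoiled by floating-point rounding.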

We would like to point out that adjunctions arise naturally on complete lattices and are closely related to Galois connections and residuation theory birkhoff93 ; davey02 ; blyth72 . Furthermore, adjunction is one of the most important concepts in mathematical morphology, a theory widely used for image processing and analysis heijmans95 . The elementary operations from mathematical morphology are erosions and dilations. Dilations and erosions are operators that commute with the supremum and infimum operations, respectively heijmans94 . Formally, $\delta : [0,1] \to [0,1]$ and $\varepsilon : [0,1] \to [0,1]$ are respectively a dilation and an erosion if

$\delta\Big(\bigvee X\Big) = \bigvee_{x \in X} \delta(x) \quad \text{and} \quad \varepsilon\Big(\bigwedge X\Big) = \bigwedge_{x \in X} \varepsilon(x), \quad \forall X \subseteq [0,1],$ (7)

where the symbols "$\bigvee$" and "$\bigwedge$" denote respectively the supremum and infimum. It turns out that, if the pair $(I, C)$ forms an adjunction, then $I(x, \cdot)$ is an erosion and $C(\cdot, x)$ is a dilation for all $x \in [0,1]$. Moreover, if $C(\cdot, x)$ is a dilation for all $x \in [0,1]$, then there exists a unique fuzzy implication $I$ such that $(I, C)$ forms an adjunction. Such unique fuzzy implication that forms an adjunction with the fuzzy conjunction $C$ is the residual implication (R-implication) given by

$I(x, y) = \sup\{z \in [0,1] : C(z, x) \le y\}.$ (8)

Similarly, if the pair $(D, J)$ forms an adjunction, then $D(\cdot, x)$ is an erosion and $J(x, \cdot)$ is a dilation. Also, if $D(\cdot, x)$ is an erosion for all $x \in [0,1]$, then its residual co-implication

$J(x, y) = \inf\{z \in [0,1] : D(z, x) \ge y\}$ (9)

is the unique fuzzy co-implication such that the pair $(D, J)$ forms an adjunction.
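Equation (8) also suggests a direct numerical procedure: given any fuzzy conjunction $C$ whose partial mappings $C(\cdot, x)$ are dilations, the residual implication can be approximated by taking the supremum in (8) over a discretization of $[0,1]$. The sketch below (our own code, not from the paper) recovers Goguen's implication from the product conjunction:

```python
def residual_implication(c, x, y, steps=1001):
    """Approximate I(x, y) = sup{ z in [0,1] : C(z, x) <= y } on a uniform grid."""
    zs = [k / (steps - 1) for k in range(steps)]
    feasible = [z for z in zs if c(z, x) <= y]
    return max(feasible) if feasible else 0.0

c_prod = lambda x, y: x * y  # product fuzzy conjunction

print(residual_implication(c_prod, 0.8, 0.4))  # ~0.5; Goguen gives y/x when x > y
print(residual_implication(c_prod, 0.2, 0.4))  # 1.0; Goguen gives 1 when x <= y
```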

Apart from adjunctions, fuzzy logical operators can be connected by means of a strong fuzzy negation. A strong fuzzy negation is a non-increasing mapping $\eta : [0,1] \to [0,1]$ such that $\eta(0) = 1$, $\eta(1) = 0$, and $\eta(\eta(x)) = x$ for all $x \in [0,1]$. The standard fuzzy negation $\eta_S(x) = 1 - x$ is a strong fuzzy negation. A fuzzy conjunction $C$ can be connected to a fuzzy disjunction $D$ by means of a strong fuzzy negation $\eta$ as follows:

$D(x, y) = \eta\big(C(\eta(x), \eta(y))\big), \quad \forall x, y \in [0,1].$ (10)

In this case, we say that $C$ and $D$ are dual operators with respect to $\eta$. In a similar manner, a fuzzy co-implication $J$ is the dual operator of a fuzzy implication $I$ with respect to a strong fuzzy negation $\eta$ if and only if

$J(x, y) = \eta\big(I(\eta(x), \eta(y))\big), \quad \forall x, y \in [0,1].$ (11)

The pairs of fuzzy conjunction and fuzzy disjunction $(C_M, D_M)$, $(C_P, D_P)$, $(C_L, D_L)$, and $(C_G, D_G)$ are dual with respect to the standard fuzzy negation $\eta_S$. The pairs $(I_M, J_M)$, $(I_P, J_P)$, $(I_L, J_L)$, and $(I_G, J_G)$ of fuzzy implication and fuzzy co-implication are also dual with respect to the standard fuzzy negation.

The fuzzy logical operators $C$, $D$, $I$, and $J$ can be combined with either the maximum or the minimum operation to yield four matrix products. For instance, the max-$C$ and the min-$D$ matrix products of $A \in [0,1]^{m \times p}$ by $B \in [0,1]^{p \times n}$, denoted respectively by $A \circ B$ and $A \bullet B$, are defined by the following equations for all $i = 1, \ldots, m$ and $j = 1, \ldots, n$:

$(A \circ B)_{ij} = \bigvee_{\xi=1}^{p} C(a_{i\xi}, b_{\xi j}) \quad \text{and} \quad (A \bullet B)_{ij} = \bigwedge_{\xi=1}^{p} D(a_{i\xi}, b_{\xi j}).$ (12)
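As an illustration of (12), the sketch below (all variable names are ours) computes the max-$C$ and min-$D$ products of matrices represented as lists of rows, with the fuzzy conjunction and disjunction passed as parameters:

```python
def max_c_product(A, B, c):
    """Max-C product: entry (i, j) is the maximum over t of C(A[i][t], B[t][j])."""
    return [[max(c(A[i][t], B[t][j]) for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def min_d_product(A, B, d):
    """Min-D product: entry (i, j) is the minimum over t of D(A[i][t], B[t][j])."""
    return [[min(d(A[i][t], B[t][j]) for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[0.5, 1.0], [0.3, 0.7]]
x = [[0.6], [0.2]]                   # a column vector as a 2 x 1 matrix
print(max_c_product(A, x, min))      # [[0.5], [0.3]] with the minimum conjunction
print(min_d_product(A, x, max))      # min-D product with the maximum disjunction
```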

In analogy to the concept of linear combination, we say that $x \in [0,1]^n$ is a max-$C$ combination of the vectors belonging to the finite set $A = \{a^1, \ldots, a^k\} \subseteq [0,1]^n$ if

$x = \bigvee_{\xi=1}^{k} C(\lambda_\xi, a^\xi),$ (13)

where $\lambda_\xi \in [0,1]$ for all $\xi = 1, \ldots, k$ and the fuzzy conjunction is applied in a component-wise manner. Similarly, a min-$D$ combination of the vectors of $A$ is given by

$x = \bigwedge_{\xi=1}^{k} D(\theta_\xi, a^\xi),$ (14)

where $\theta_\xi \in [0,1]$ for all $\xi = 1, \ldots, k$. The sets of all max-$C$ combinations and min-$D$ combinations of $A$ are denoted respectively by

$\mathcal{C}(A) = \Big\{ \bigvee_{\xi=1}^{k} C(\lambda_\xi, a^\xi) : \lambda_\xi \in [0,1] \Big\}$ (15)

and

$\mathcal{D}(A) = \Big\{ \bigwedge_{\xi=1}^{k} D(\theta_\xi, a^\xi) : \theta_\xi \in [0,1] \Big\}.$ (16)

The sets of max-$C$ and min-$D$ combinations play a major role in the projection autoassociative fuzzy morphological memories (PAFMMs) presented in Section 4. Before proceeding, however, let us review the class of autoassociative fuzzy morphological memories, which are defined using fuzzy logical connectives and adjunctions.

3 Autoassociative Fuzzy Morphological Memories

Let us briefly review the autoassociative fuzzy morphological memories (AFMMs). The reader interested in a detailed account of this subject is invited to consult valle11nn ; valle08fss .

Let $(I, C)$ and $(D, J)$ be adjunction pairs, where $C$ is a fuzzy conjunction and $D$ is a fuzzy disjunction. As far as we know, most AFMMs are implemented by a single-layer network defined in terms of either the max-$C$ or the min-$D$ matrix product established by (12) valle11nn . Formally, a max-$C$ and a min-$D$ autoassociative fuzzy morphological memory (AFMM) are mappings $\mathcal{W}, \mathcal{M} : [0,1]^n \to [0,1]^n$ defined respectively by the following equations:

$\mathcal{W}(x) = W \circ x \quad \text{and} \quad \mathcal{M}(x) = M \bullet x, \quad \forall x \in [0,1]^n,$ (17)

where $W, M \in [0,1]^{n \times n}$ are called the synaptic weight matrices. The AFMMs $\mathcal{W}$ and $\mathcal{M}$ given by (17) are called morphological because they perform respectively a dilation and an erosion from mathematical morphology. Examples of AFMMs include the autoassociative versions of the max-minimum and max-product fuzzy associative memories of Kosko kosko92 , the max-min fuzzy associative memories with threshold liu99 , and the implicative fuzzy associative memories sussner06fs .

Let us now turn our attention to a recording recipe, called fuzzy learning by adjunction (FLA), which can be effectively used for the storage of vectors on the AFMMs $\mathcal{W}$ and $\mathcal{M}$ defined by (17) valle08fss . Given a finite set $A = \{a^1, \ldots, a^k\} \subseteq [0,1]^n$, called the fundamental memory set, FLA determines the matrix $W$ of the max-$C$ AFMM and the matrix $M$ of the min-$D$ AFMM by means of the following equations for all $i, j = 1, \ldots, n$:

$w_{ij} = \bigwedge_{\xi=1}^{k} I(a_j^\xi, a_i^\xi) \quad \text{and} \quad m_{ij} = \bigvee_{\xi=1}^{k} J(a_j^\xi, a_i^\xi).$ (18)
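A minimal sketch of the FLA recording scheme (18), assuming the fundamental memories are stored as the rows of a list `A` (all names below are ours): the entry $w_{ij}$ is the infimum over $\xi$ of $I(a_j^\xi, a_i^\xi)$, and $m_{ij}$ is the supremum of $J(a_j^\xi, a_i^\xi)$.

```python
def fla_weights_max_c(A, impl):
    """Matrix W of the max-C AFMM: w_ij = min over all memories a of I(a[j], a[i])."""
    n = len(A[0])
    return [[min(impl(a[j], a[i]) for a in A) for j in range(n)] for i in range(n)]

def fla_weights_min_d(A, coimpl):
    """Matrix M of the min-D AFMM: m_ij = max over all memories a of J(a[j], a[i])."""
    n = len(A[0])
    return [[max(coimpl(a[j], a[i]) for a in A) for j in range(n)] for i in range(n)]

# Goedel's implication and co-implication, which form adjunctions with the
# minimum conjunction and the maximum disjunction, respectively.
i_godel = lambda x, y: 1.0 if x <= y else y
j_godel = lambda x, y: 0.0 if x >= y else y

A = [[0.2, 0.9, 0.6], [0.8, 0.4, 0.7]]
W = fla_weights_max_c(A, i_godel)
M = fla_weights_min_d(A, j_godel)
```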

We would like to point out that, although the matrices $W$ and $M$ given by (18) are well defined for any fuzzy implication $I$ and any fuzzy co-implication $J$, the properties discussed below hold true only if $I$ and $J$ form adjunctions with the fuzzy conjunction $C$ and the fuzzy disjunction $D$, respectively.

We would also like to point out that, using a strong fuzzy negation $\eta$, we can derive from a max-$C$ AFMM $\mathcal{W}$ another AFMM called the negation of $\mathcal{W}$ and denoted by $\mathcal{W}^*$. Formally, the negation of $\mathcal{W}$ is defined by the equation

$\mathcal{W}^*(x) = \eta\big(\mathcal{W}(\eta(x))\big), \quad \forall x \in [0,1]^n,$ (19)

where the strong fuzzy negation is applied in a component-wise manner. It is not hard to show that the negation of a max-$C$ AFMM is a min-$D$ AFMM, and vice versa, where the fuzzy conjunction $C$ and the fuzzy disjunction $D$ are dual with respect to the strong fuzzy negation $\eta$, i.e., they satisfy (10) valle08fss .

The following proposition reveals that a min-$D$ AFMM $\mathcal{M}$ and a max-$C$ AFMM $\mathcal{W}$, both synthesized using FLA, project the input into the set of their fixed points valle11nn . Specifically, Proposition 1 shows that the output of a min-$D$ AFMM with FLA is the largest fixed point which is less than or equal to the input $x$. Analogously, a max-$C$ AFMM with FLA yields the smallest fixed point which is greater than or equal to the input $x$ valle11nn .

Proposition 1 (Valle and Sussner valle11nn )

Let $(I, C)$ and $(D, J)$ be adjunction pairs, where $C$ and $D$ are respectively an associative fuzzy conjunction and an associative fuzzy disjunction, both with a left identity. In this case, the output of the min-$D$ AFMM $\mathcal{M}$ defined by (17) with FLA given by (18) satisfies

$\mathcal{M}(x) = \bigvee\{y \in \mathcal{F}(\mathcal{M}) : y \le x\}, \quad \forall x \in [0,1]^n,$ (20)

where $\mathcal{F}(\mathcal{M})$ denotes the set of all fixed points of $\mathcal{M}$, which depends on $A$ and includes the fundamental memory set $A$. Dually, the output of the max-$C$ AFMM $\mathcal{W}$ with FLA satisfies

$\mathcal{W}(x) = \bigwedge\{y \in \mathcal{F}(\mathcal{W}) : y \ge x\}, \quad \forall x \in [0,1]^n,$ (21)

where $\mathcal{F}(\mathcal{W})$ denotes the set of all fixed points of $\mathcal{W}$, which also depends on $A$ and contains the fundamental memory set $A$.

In the light of Proposition 1, besides the adjunction relationship, from now on we assume that the fuzzy disjunction $D$ and the fuzzy conjunction $C$ are associative and both have a left identity. As a consequence, AFMMs with FLA present the following properties: they can store as many vectors as desired; they have a large number of spurious memories; and they exhibit tolerance to either dilative noise or erosive noise, but are extremely sensitive to mixed (dilative + erosive) noise. Recall that a distorted version $\tilde{a}$ of a fundamental memory $a^\xi$ has undergone a dilative change if $\tilde{a} \ge a^\xi$. Dually, we say that $\tilde{a}$ has undergone an erosive change if $\tilde{a} \le a^\xi$ ritter98 .

Example 2

Consider the fundamental memory set

(22)

Using Gödel’s co-implication in (18), the synaptic weight matrix of the min- AFMM with FLA is:

(23)

Now, consider the input fuzzy set

(24)

Note that $x$ is a dilated version of the fundamental memory $a^1$ because $x \ge a^1$. The output of the min-$D$ AFMM $\mathcal{M}_M$ with FLA is

(25)

where "$\bullet$" denotes the min-$D$ product defined in terms of the maximum fuzzy disjunction $D_M$. According to Proposition 1, the output is a fixed point of $\mathcal{M}_M$ that does not belong to the fundamental memory set $A$. Thus, it is a spurious memory of $\mathcal{M}_M$. In a similar fashion, we can use FLA to store the fundamental set $A$ into the min-$D$ AFMMs $\mathcal{M}_P$, $\mathcal{M}_L$, and $\mathcal{M}_G$ obtained by considering respectively the probabilistic sum, the Łukasiewicz, and the Gaines fuzzy disjunctions. Upon presentation of the input vector $x$ given by (24), the min-$D$ AFMMs $\mathcal{M}_P$, $\mathcal{M}_L$, and $\mathcal{M}_G$ yield respectively

(26)
(27)

and

(28)

Like the min-$D$ AFMM $\mathcal{M}_M$, the autoassociative memories $\mathcal{M}_P$, $\mathcal{M}_L$, and $\mathcal{M}_G$ failed to produce the desired output $a^1$.

4 Max-C and Min-D Projection Autoassociative Fuzzy Morphological Memories

As distributed matrix-based autoassociative memories, the min-$D$ and max-$C$ AFMMs consume a great deal of computer memory if the length of the stored vectors is large. Furthermore, in view of Proposition 1, their tolerance with respect to either dilative or erosive noise degrades as the number of fixed points increases.

Inspired by the fact that min-$D$ and max-$C$ AFMMs with FLA project the input vector into the set of their fixed points, we can improve the noise tolerance of these memory models by reducing their sets of fixed points. Accordingly, we recently introduced the max-$C$ projection autoassociative fuzzy morphological memories (max-$C$ PAFMMs) by replacing in (20) the set $\mathcal{F}(\mathcal{M})$ by the set $\mathcal{C}(A)$ of all max-$C$ combinations of the vectors of $A$ santos16cbsf ; santos17msc . Formally, given a fundamental memory set $A = \{a^1, \ldots, a^k\} \subseteq [0,1]^n$, a max-$C$ PAFMM $\mathcal{V}$ is defined by

$\mathcal{V}(x) = \bigvee\{c \in \mathcal{C}(A) : c \le x\}, \quad \forall x \in [0,1]^n,$ (29)

where the set $\mathcal{C}(A)$ is defined in (15). A dual model, referred to as min-$D$ PAFMM, is obtained by replacing $\mathcal{F}(\mathcal{W})$ in (21) by the set $\mathcal{D}(A)$ of all min-$D$ combinations of the fundamental memories. Precisely, a min-$D$ PAFMM $\mathcal{S}$ satisfies

$\mathcal{S}(x) = \bigwedge\{d \in \mathcal{D}(A) : d \ge x\}, \quad \forall x \in [0,1]^n,$ (30)

where the set $\mathcal{D}(A)$ is given in (16). The following theorem is a straightforward consequence of these definitions.

Theorem 3

The max-$C$ and min-$D$ PAFMMs given respectively by (29) and (30) satisfy the inequalities $\mathcal{V}(x) \le x \le \mathcal{S}(x)$ for any input vector $x \in [0,1]^n$. Furthermore, $\mathcal{V}(\mathcal{V}(x)) = \mathcal{V}(x)$ and $\mathcal{S}(\mathcal{S}(x)) = \mathcal{S}(x)$ for all $x \in [0,1]^n$.

As a consequence of Theorem 3, a max-$C$ PAFMM and a min-$D$ PAFMM are respectively an opening and a closing from fuzzy mathematical morphology deng02 . Like the min-$D$ AFMM, a max-$C$ PAFMM exhibits tolerance only with respect to dilative noise; it is extremely sensitive to either erosive or mixed noise. In fact, a fundamental memory $a^\xi$ cannot be retrieved by a max-$C$ PAFMM from an input $x$ such that $a^\xi \not\le x$. In a similar manner, like the max-$C$ AFMM, a min-$D$ PAFMM exhibits tolerance with respect to erosive noise but is not robust to either dilative or mixed noise.

Let us now address the absolute storage capacity of max-$C$ and min-$D$ PAFMMs. Clearly, a max-$C$ PAFMM has optimal absolute storage capacity if every fundamental memory belongs to $\mathcal{C}(A)$. In other words, if $a^\gamma \in \mathcal{C}(A)$, then $\mathcal{V}(a^\gamma) = a^\gamma$. It turns out that $a^\gamma$ belongs to the set of all max-$C$ combinations of $A$ if the fuzzy conjunction $C$ has a left identity, i.e., if there exists $e \in [0,1]$ such that $C(e, y) = y$ for all $y \in [0,1]$. In fact, for any fuzzy conjunction $C$, we have $C(0, y) = 0$ for all $y \in [0,1]$ debaets97a . Thus, if the fuzzy conjunction $C$ has a left identity $e$, we can express a fundamental memory $a^\gamma$ by the following max-$C$ combination:

$a^\gamma = C(e, a^\gamma) \vee \bigvee_{\xi \neq \gamma} C(0, a^\xi).$ (31)

Dually, a min-$D$ PAFMM has optimal absolute storage capacity if the fuzzy disjunction $D$ has a left identity. Summarizing, we have the following theorem:

Theorem 4

Let $C$ and $D$ denote respectively a fuzzy conjunction and a fuzzy disjunction, and consider a fundamental memory set $A = \{a^1, \ldots, a^k\} \subseteq [0,1]^n$. The max-$C$ PAFMM $\mathcal{V}$ given by (29) satisfies $\mathcal{V}(a^\xi) = a^\xi$ for all $\xi = 1, \ldots, k$ if the fuzzy conjunction $C$ has a left identity. Dually, if the fuzzy disjunction $D$ has a left identity, then $\mathcal{S}(a^\xi) = a^\xi$ for all $\xi = 1, \ldots, k$, where $\mathcal{S}$ denotes the min-$D$ PAFMM given by (30).

The next theorem, which is based on adjunctions, provides effective formulas for the implementation of the max-$C$ and min-$D$ PAFMMs.

Theorem 5

Consider a fundamental memory set $A = \{a^1, \ldots, a^k\} \subseteq [0,1]^n$ and let the fuzzy implication $I$ and the fuzzy conjunction $C$ form an adjunction. For any input $x \in [0,1]^n$, the max-$C$ PAFMM $\mathcal{V}$ given by (29) satisfies

$\mathcal{V}(x) = \bigvee_{\xi=1}^{k} C(\lambda_\xi, a^\xi), \quad \text{where} \quad \lambda_\xi = \bigwedge_{j=1}^{n} I(a_j^\xi, x_j).$ (32)

Dually, let the fuzzy disjunction $D$ and the fuzzy co-implication $J$ form an adjunction. For any input $x \in [0,1]^n$, the output of the min-$D$ PAFMM $\mathcal{S}$ can be computed by

$\mathcal{S}(x) = \bigwedge_{\xi=1}^{k} D(\theta_\xi, a^\xi), \quad \text{where} \quad \theta_\xi = \bigvee_{j=1}^{n} J(a_j^\xi, x_j).$ (33)
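Following (32) and (33), recalling with a PAFMM amounts to computing one coefficient per stored item and then a single max-$C$ (or min-$D$) combination. Below is a sketch under our own naming conventions, using Gödel's adjunction pair as an example:

```python
def max_c_pafmm(A, x, conj, impl):
    """Max-C PAFMM recall: lambda_xi = min_j I(a_j, x_j); output_j = max_xi C(lambda_xi, a_j)."""
    lambdas = [min(impl(aj, xj) for aj, xj in zip(a, x)) for a in A]
    return [max(conj(lam, a[j]) for lam, a in zip(lambdas, A)) for j in range(len(x))]

def min_d_pafmm(A, x, disj, coimpl):
    """Min-D PAFMM recall: theta_xi = max_j J(a_j, x_j); output_j = min_xi D(theta_xi, a_j)."""
    thetas = [max(coimpl(aj, xj) for aj, xj in zip(a, x)) for a in A]
    return [min(disj(th, a[j]) for th, a in zip(thetas, A)) for j in range(len(x))]

i_godel = lambda x, y: 1.0 if x <= y else y
A = [[0.2, 0.9, 0.6], [0.8, 0.4, 0.7]]
# Input close to the first memory: here lambda_1 = 1 and lambda_2 = 0.25.
print(max_c_pafmm(A, [0.25, 0.9, 0.65], min, i_godel))  # [0.25, 0.9, 0.6]
```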

In the light of Proposition 1, we only consider PAFMMs based on fuzzy conjunctions and fuzzy disjunctions that form adjunction pairs with a fuzzy implication and a fuzzy co-implication, respectively.

Remark 1

Theorem 5 above gives a formula for the coefficients $\lambda_1, \ldots, \lambda_k$ used to define the output of a max-$C$ PAFMM. Note that the coefficient $\lambda_\xi$ corresponds to the degree of inclusion of the fundamental memory $a^\xi$ in the input fuzzy set $x$. In other words, we have

$\lambda_\xi = \operatorname{Inc}(a^\xi, x) = \bigwedge_{j=1}^{n} I(a_j^\xi, x_j),$ (34)

where $\operatorname{Inc}$ denotes the Bandler-Kohout fuzzy inclusion measure bandler80 .

As to the computational effort, max-$C$ and min-$D$ PAFMMs are usually cheaper than their corresponding min-$D$ and max-$C$ AFMMs. In fact, from (32) and (33), max-$C$ and min-$D$ PAFMMs are non-distributed memory models which can be implemented by fuzzy morphological neural networks with a single hidden layer valle08fss ; sussner08gr ; valle18wiley . They do not require the storage of a synaptic weight matrix of size $n \times n$. Also, they perform fewer floating-point operations than their corresponding min-$D$ and max-$C$ AFMMs if the number $k$ of stored items is sufficiently smaller than their length $n$. To illustrate this remark, consider a fundamental memory set $A = \{a^1, \ldots, a^k\}$, where $a^\xi \in [0,1]^n$ for all $\xi = 1, \ldots, k$, with $k < n$. On the one hand, to synthesize the synaptic weight matrix $M$ of a min-$D$ AFMM $\mathcal{M}$, we perform $k n^2$ evaluations of a fuzzy co-implication and $(k-1) n^2$ comparisons. Besides, the resulting synaptic weight matrix consumes $O(n^2)$ of memory space. In the recall phase, the min-$D$ AFMM requires $n^2$ evaluations of a fuzzy disjunction and $n(n-1)$ comparisons. On the other hand, to compute the parameters $\lambda_1, \ldots, \lambda_k$ of a max-$C$ PAFMM $\mathcal{V}$, we perform $k n$ evaluations of a fuzzy implication and $k(n-1)$ comparisons. The subsequent step of the max-$C$ PAFMM requires $k n$ evaluations of a fuzzy conjunction and $(k-1) n$ comparisons. Lastly, it consumes $O(k n)$ of memory space for the storage of the fundamental memories. Similar remarks hold for a max-$C$ AFMM and a min-$D$ PAFMM. Concluding, Table 1 summarizes the computational effort in the recall phase of the AFMMs and PAFMMs. Here, fuzzy operations refer to evaluations of fuzzy conjunctions, disjunctions, implications, or co-implications.

                                        Fuzzy Operations   Comparisons          Memory Space
AFMMs $\mathcal{W}$ and $\mathcal{M}$   $n^2$              $n(n-1)$             $n^2$
PAFMMs $\mathcal{V}$ and $\mathcal{S}$  $2kn$              $k(n-1) + (k-1)n$    $kn$
Table 1: Computational complexity in the recall phase of autoassociative memories

Finally, unlike the min-$D$ and max-$C$ AFMMs, the max-$C$ and min-$D$ PAFMMs are not dual models with respect to a strong fuzzy negation. Precisely, the next theorem shows that the negation of a min-$D$ PAFMM is a max-$C$ PAFMM designed for the storage of the negations of the fundamental memories, and vice versa.

Theorem 6

Let $(I, C)$ and $(D, J)$ be adjunction pairs where the fuzzy conjunction $C$ is connected to the fuzzy disjunction $D$ by means of a strong fuzzy negation $\eta$, that is, $C$, $D$, and $\eta$ satisfy (10). Given a fundamental memory set $A = \{a^1, \ldots, a^k\}$, define $\bar{A} = \{\bar{a}^1, \ldots, \bar{a}^k\}$ by setting $\bar{a}_j^\xi = \eta(a_j^\xi)$ for all $\xi = 1, \ldots, k$ and $j = 1, \ldots, n$. Also, let $\mathcal{V}$ and $\mathcal{S}$ be respectively the max-$C$ and the min-$D$ PAFMMs designed for the storage of $A$, and define their negations as follows for every $x \in [0,1]^n$:

$\mathcal{V}^*(x) = \eta\big(\mathcal{V}(\eta(x))\big) \quad \text{and} \quad \mathcal{S}^*(x) = \eta\big(\mathcal{S}(\eta(x))\big).$ (35)

The negation of $\mathcal{S}$ is the max-$C$ PAFMM designed for the storage of $\bar{A}$, that is,

$\mathcal{S}^*(x) = \bigvee\{c \in \mathcal{C}(\bar{A}) : c \le x\}.$ (36)

Analogously, the negation of $\mathcal{V}$ is the min-$D$ PAFMM given by

$\mathcal{V}^*(x) = \bigwedge\{d \in \mathcal{D}(\bar{A}) : d \ge x\}.$ (37)

It follows from Theorem 6 that the negations $\mathcal{V}^*$ and $\mathcal{S}^*$ fail to store the fundamental memory set $A$.

Example 7

Consider the fundamental memory set $A$ given by (22). Let $C_M$ and $I_M$ be the minimum fuzzy conjunction and Gödel's fuzzy implication, respectively. We synthesized the max-$C$ PAFMM $\mathcal{V}_M$ designed for the storage of $A$ using the adjunction pair $(I_M, C_M)$. Since $C_M$ is a fuzzy conjunction with 1 as identity, from Theorem 4, the equation $\mathcal{V}_M(a^\xi) = a^\xi$ holds for every fundamental memory. Given the input vector $x$ defined by (24), we obtain from (32) the coefficients

(38)

Thus, the output of the max-$C$ PAFMM $\mathcal{V}_M$ is

(39)

Note that $\mathcal{V}_M$ failed to retrieve the fundamental memory $a^1$.

Analogously, we can store the fundamental memory set $A$ into the max-$C$ PAFMMs $\mathcal{V}_P$ and $\mathcal{V}_L$ using respectively the adjunction pairs $(I_P, C_P)$ and $(I_L, C_L)$. Upon presentation of the vector $x$, the max-$C$ PAFMMs $\mathcal{V}_P$ and $\mathcal{V}_L$ produce

(40)

and

(41)

Like the min-$D$ AFMMs of Example 2, the memories $\mathcal{V}_P$ and $\mathcal{V}_L$ failed to recall the fundamental memory $a^1$. Nevertheless, the outputs yielded by the max-$C$ PAFMMs $\mathcal{V}_M$, $\mathcal{V}_P$, and $\mathcal{V}_L$ are more similar to the desired vector $a^1$ than those of the min-$D$ AFMMs of Example 2. Quantitatively, Table 2 shows the normalized mean squared error (NMSE) between the recalled vector and the desired output $a^1$. Recall that the NMSE between $y$ and $a^1$ is given by

$\operatorname{NMSE}(y, a^1) = \frac{\sum_{j=1}^{n} (y_j - a_j^1)^2}{\sum_{j=1}^{n} (a_j^1)^2}.$ (42)

This simple example confirms that a max-$C$ PAFMM can exhibit a better tolerance with respect to dilative noise than its corresponding min-$D$ AFMM.

0.33 0.32 0.14 0.05 0.33 0.01 0.02 0.05 0.00
Table 2: Normalized mean squared error.

Let us conclude the section by emphasizing that we cannot ensure optimal absolute storage capacity if the fuzzy conjunction $C$ does not have a left identity.

Example 8

Consider the “compensatory and” fuzzy conjunction defined by

(43)

Note that $C$ does not have a left identity. Moreover, the fuzzy implication $I$ that forms an adjunction with $C$ is

(44)

Now, let $\mathcal{V}$ be the max-$C$ PAFMM designed for the storage of the fundamental memory set $A$ given by (22). Upon the presentation of the fundamental memory $a^1$ as input, we obtain from (32) the coefficients

(45)

Thus, the output vector of the max-$C$ PAFMM $\mathcal{V}$ is

(46)

In a similar fashion, using the fundamental memories $a^2$ and $a^3$ as input, we obtain from (32) the outputs

(47)

In accordance with Theorem 3, the inequality $\mathcal{V}(a^\xi) \le a^\xi$ holds for $\xi = 1, 2, 3$. The fundamental memories $a^1$, $a^2$, and $a^3$, however, are not fixed points of the max-$C$ PAFMM $\mathcal{V}$.

5 Zadeh's Max-C PAFMM and Its Dual Model

According to Hassoun and Watta hassoun97 , one of the most common problems in the design of an associative memory is the creation of spurious or false memories. A spurious memory is a fixed point of an autoassociative memory that does not belong to the fundamental memory set. For instance, the fixed point given by (39) is a spurious memory of the max-$C$ PAFMM $\mathcal{V}_M$ in Example 7.

In general, the noise tolerance of an autoassociative memory decreases as the number of spurious memories increases. The set of fixed points of a max-$C$ PAFMM, however, corresponds to the set $\mathcal{C}(A)$ of all max-$C$ combinations of the fundamental memories. Hence, the smaller the family $\mathcal{C}(A)$, the higher the noise tolerance of a max-$C$ PAFMM.

Given a fundamental memory set $A = \{a^1, \ldots, a^k\}$, we can reduce $\mathcal{C}(A)$ significantly by considering in (15) the fuzzy conjunction of Gaines $C_G$. From Theorem 5, the output of the max-$C$ PAFMM based on Gaines' fuzzy conjunction is given by

$\mathcal{V}_Z(x) = \bigvee_{\xi=1}^{k} C_G(\lambda_\xi, a^\xi),$ (48)

where

$\lambda_\xi = \bigwedge_{j=1}^{n} I_G(a_j^\xi, x_j) = \operatorname{Inc}_Z(a^\xi, x),$ (49)

where $\operatorname{Inc}_Z$ is the fuzzy inclusion measure of Zadeh defined as follows for all $a, b \in [0,1]^n$:

$\operatorname{Inc}_Z(a, b) = \begin{cases} 1, & a_j \le b_j \text{ for all } j = 1, \ldots, n, \\ 0, & \text{otherwise.} \end{cases}$ (50)

In other words, instead of a general Bandler-Kohout fuzzy inclusion measure, the coefficients $\lambda_\xi$ are determined using Zadeh's fuzzy inclusion measure $\operatorname{Inc}_Z$. Hence, this max-$C$ PAFMM is referred to as Zadeh's max-$C$ PAFMM and denoted by $\mathcal{V}_Z$.

From (50), the coefficient $\lambda_\xi$ is either 0 or 1. Moreover, $\lambda_\xi = 1$ if and only if $a_j^\xi \le x_j$ for all $j = 1, \ldots, n$. Also, we have $C_G(0, a_j^\xi) = 0$ and $C_G(1, a_j^\xi) = a_j^\xi$ for all $j = 1, \ldots, n$. Therefore, for any input $x \in [0,1]^n$, the output of Zadeh's max-$C$ PAFMM is alternatively given by the equation

$\mathcal{V}_Z(x) = \bigvee_{\xi \in \mathcal{I}(x)} a^\xi,$ (51)

where

$\mathcal{I}(x) = \{\xi : a^\xi \le x\}$ (52)

is the set of the indexes $\xi$ such that $a^\xi$ is less than or equal to the input $x$, i.e., $a_j^\xi \le x_j$ for all $j = 1, \ldots, n$. Here, we have $\mathcal{V}_Z(x) = \mathbf{0}$ if $\mathcal{I}(x) = \emptyset$, where $\mathbf{0} \in [0,1]^n$ is a vector of zeros.
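Because the coefficients in (51) and (52) are binary, Zadeh's max-$C$ PAFMM and its dual can be implemented with comparisons alone. A minimal sketch (our own code, not the authors' implementation):

```python
def zadeh_max_c(A, x):
    """Join of all stored vectors a with a <= x component-wise; zeros if none."""
    included = [a for a in A if all(aj <= xj for aj, xj in zip(a, x))]
    if not included:
        return [0.0] * len(x)
    return [max(column) for column in zip(*included)]

def zadeh_min_d(A, x):
    """Meet of all stored vectors a with a >= x component-wise; ones if none."""
    covering = [a for a in A if all(aj >= xj for aj, xj in zip(a, x))]
    if not covering:
        return [1.0] * len(x)
    return [min(column) for column in zip(*covering)]

A = [[0.2, 0.9, 0.6], [0.8, 0.4, 0.7]]
# Only the first memory lies below the input, so it is perfectly recalled
# (in agreement with Theorem 9 below).
print(zadeh_max_c(A, [0.3, 0.95, 0.6]))  # [0.2, 0.9, 0.6]
```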

In a similar manner, from (33), the dual of Zadeh's max-$C$ PAFMM is the min-$D$ PAFMM $\mathcal{S}_Z$ defined by

$\mathcal{S}_Z(x) = \bigwedge_{\xi=1}^{k} D_G(\theta_\xi, a^\xi), \quad \text{where} \quad \theta_\xi = \bigvee_{j=1}^{n} J_G(a_j^\xi, x_j).$ (53)

Here, $D_G$ and $J_G$ denote respectively the fuzzy disjunction and the fuzzy co-implication of Gaines. Alternatively, the output of Zadeh's min-$D$ PAFMM is given by

$\mathcal{S}_Z(x) = \bigwedge_{\xi \in \mathcal{J}(x)} a^\xi,$ (54)

where

$\mathcal{J}(x) = \{\xi : x \le a^\xi\}$ (55)

is the set of indexes $\xi$ such that $a^\xi$ is greater than or equal to the input $x$. Here, we have $\mathcal{S}_Z(x) = \mathbf{1}$ if $\mathcal{J}(x) = \emptyset$, where $\mathbf{1} \in [0,1]^n$ denotes a vector of ones.

Note from (51) and (54) that no arithmetic operation is performed during the recall phase of Zadeh's max-$C$ PAFMM and its dual model; they only perform comparisons! Thus, both $\mathcal{V}_Z$ and $\mathcal{S}_Z$ are computationally cheap and fast associative memories. In addition, Zadeh's max-$C$ PAFMM is extremely robust to dilative noise, while its dual model exhibits an excellent tolerance with respect to erosive noise. The following theorem addresses the noise tolerance of these memory models.

Theorem 9

Consider the fundamental memory set $A = \{a^1, \ldots, a^k\} \subseteq [0,1]^n$ and an input $x \in [0,1]^n$. The identity $\mathcal{V}_Z(x) = a^\gamma$ holds true if there exists a unique index $\gamma$ such that $a^\gamma \le x$. Furthermore, if there exists a unique index $\gamma$ such that $x \le a^\gamma$, then $\mathcal{S}_Z(x) = a^\gamma$.

Example 10

Consider the fundamental memory set $A$ given by (22) and the input fuzzy set $x$ defined by (24). Clearly, $a^1 \le x$, $a^2 \not\le x$, and $a^3 \not\le x$. Thus, the set of indexes defined by (52) is $\mathcal{I}(x) = \{1\}$. From (51), the output of Zadeh's max-$C$ PAFMM is

$\mathcal{V}_Z(x) = a^1.$ (56)

Note that the max-$C$ PAFMM $\mathcal{V}_Z$ achieved perfect recall of the original fundamental memory. As a consequence, the NMSE is zero. From Table 2, the max-$C$ PAFMM of Zadeh yielded the best NMSE, followed by the other max-$C$ PAFMMs of Example 7.

Let us conclude this section by remarking that Zadeh's max-$C$ PAFMM also belongs to the class of $\Theta$-fuzzy associative memories ($\Theta$-FAMs) proposed by Esmi et al. esmi15fs .

Remark 2

An autoassociative $\Theta$-FAM is defined as follows: Consider a fundamental memory set $A = \{a^1, \ldots, a^k\} \subseteq [0,1]^n$ and let $\Theta_\xi : [0,1]^n \to [0,1]$, for $\xi = 1, \ldots, k$, be operators such that $\Theta_\xi(a^\xi) = 1$ for all $\xi = 1, \ldots, k$. Given an input $x \in [0,1]^n$ and a weight vector $w \in (0, \infty)^k$, a $\Theta$-FAM yields

$\mathcal{O}(x) = \bigvee_{\xi \in \mathcal{K}(x)} a^\xi,$

where $\mathcal{K}(x)$ is the following set of indexes:

$\mathcal{K}(x) = \Big\{\mu : w_\mu \Theta_\mu(x) = \max_{\xi = 1, \ldots, k} w_\xi \Theta_\xi(x)\Big\}.$

Now, the max-$C$ PAFMM of Zadeh is obtained by considering $\Theta_\xi = \operatorname{Inc}_Z(a^\xi, \cdot)$ and $w_\xi = 1$ for all $\xi = 1, \ldots, k$. Specifically, in this case $\mathcal{K}(x)$ coincides with the set of indexes $\mathcal{I}(x)$ defined by (52) whenever $\mathcal{I}(x)$ is non-empty.

6 Noise Masking Strategy for PAFMMs

A max-$C$ PAFMM cannot retrieve a fundamental memory $a^\xi$ from an input $x$ such that $a^\xi \not\le x$. This frail tolerance with respect to erosive or mixed noise may limit the application of max-$C$ PAFMMs to real-world problems. By the duality principle, similar remarks hold true for min-$D$ PAFMMs. It turns out that the noise tolerance of a PAFMM can be significantly improved by masking the noise contained in a corrupted input urcid07LC . In a few words, noise masking converts an input degraded by mixed noise into a vector corrupted by either dilative or erosive noise only. Inspired by the works of Urcid and Ritter urcid07LC , let us present a noise masking strategy for PAFMMs. In order to simplify the presentation, we shall focus on max-$C$ PAFMMs.

Let $\mathcal{V}$ denote a max-$C$ PAFMM which has been synthesized using a fundamental memory set $A = \{a^1, \ldots, a^k\}$. Also, assume that $x$ is a version of the fundamental memory $a^\gamma$ corrupted by mixed noise. Then, $\tilde{x} = x \vee a^\gamma$ is the masked input vector, which contains only dilative noise, i.e., the inequality $\tilde{x} \ge a^\gamma$ holds true. Since $\mathcal{V}$ is robust to dilative noise, we expect the max-$C$ PAFMM to be able to retrieve the original fuzzy set $a^\gamma$ upon presentation of the masked vector $\tilde{x}$.

The noise masking idea has a practical shortcoming: we do not know a priori which fundamental memory has been corrupted. Hence, Urcid and Ritter suggested to compare, for every $\xi = 1, \ldots, k$, the masked vector $x \vee a^\xi$ with both the input $x$ and the fundamental memory $a^\xi$ urcid07LC . The comparison is based on some meaningful measure such as the normalized mean squared error (NMSE). In this paper, we propose to use a fuzzy similarity measure to determine the masked vector. Briefly, a fuzzy similarity measure is a mapping $\sigma : [0,1]^n \times [0,1]^n \to [0,1]$ such that $\sigma(a, b)$ corresponds to the degree of similarity between $a$ and $b$ couso13 ; debaets05 ; baets09 ; fan99a ; xuecheng92 . Using a fuzzy similarity measure, the masked vector is obtained by computing the maximum between the input and the fundamental memory most similar to the input. In mathematical terms, we have $\tilde{x} = x \vee a^\gamma$, where $\gamma$ is an index such that

$\sigma(x, a^\gamma) = \max_{\xi = 1, \ldots, k} \sigma(x, a^\xi).$ (57)

Concluding, the technique of noise masking for the recall of vectors using a max-$C$ PAFMM $\mathcal{V}$ yields the autoassociative fuzzy morphological memory defined by

$x \mapsto \mathcal{V}(x \vee a^\gamma),$ (58)

where $\gamma$ is an index that satisfies (57).

In a similar manner, we can define the technique of noise masking for the recall of vectors using a min-$D$ PAFMM $\mathcal{S}$. Formally, noise masking for a min-$D$ PAFMM yields the autoassociative fuzzy morphological memory given by $x \mapsto \mathcal{S}(x \wedge a^\gamma)$, where $\gamma$ is an index that satisfies (57).
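Below is a sketch of masked recall (58), using $\sigma(a, b) = 1 - \frac{1}{n}\sum_{j=1}^{n} |a_j - b_j|$ merely as one admissible similarity measure (the choice of $\sigma$ is left open by the text) and Zadeh's max-$C$ PAFMM from Section 5 as the underlying memory; all names are ours:

```python
def similarity(a, b):
    """One admissible fuzzy similarity: 1 - normalized Hamming distance."""
    return 1.0 - sum(abs(ai - bi) for ai, bi in zip(a, b)) / len(a)

def zadeh_max_c(A, x):
    """Zadeh's max-C PAFMM: join of stored vectors below x; zeros if none."""
    inc = [a for a in A if all(aj <= xj for aj, xj in zip(a, x))]
    return [max(col) for col in zip(*inc)] if inc else [0.0] * len(x)

def masked_recall_max_c(A, x):
    """Noise masking (58): join the input with its most similar stored item,
    then present the masked vector to the max-C PAFMM."""
    gamma = max(range(len(A)), key=lambda idx: similarity(x, A[idx]))
    masked = [max(xj, aj) for xj, aj in zip(x, A[gamma])]   # x v a^gamma
    return zadeh_max_c(A, masked)

A = [[0.2, 0.9, 0.6], [0.8, 0.4, 0.7]]
x_noisy = [0.3, 0.7, 0.6]                 # first memory corrupted by mixed noise
print(masked_recall_max_c(A, x_noisy))    # recovers [0.2, 0.9, 0.6]
```

In this toy run the plain Zadeh memory would fail, since no fundamental memory lies below the mixed-noise input, whereas the masked vector dominates the correct memory and recall succeeds.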