Learning in Competitive Networks with Häussler's Equation Adapted Using the FIREFLY Algorithm

N. Joshi njoshi@fias.uni-frankfurt.de Institute for Applied Physics, Goethe University,
60438 Frankfurt, Germany
Abstract

Many competitive neural networks consist of spatially arranged neurons. The weight matrix that connects the cells implements local excitation and long-range inhibition. Such networks are known as soft-winner-take-all networks and have been shown to exhibit desirable information-processing properties. The local excitatory connections are often predefined and hand-wired, based on a spatial arrangement chosen using prior knowledge of the data. Here we present learning in a recurrent network through Häussler's equation, with a modified wiring scheme based on the biologically motivated firefly algorithm. The results below show that such a network learns from input patterns without hand-wiring a fixed topology.

Pattern Recognition, Computer vision, machine learning
pacs:
42.30.Sy, 42.30.Tz
preprint: APS/123-QED

I Introduction

The concept of competitive learning and self-organization in neural networks was first introduced by von der Malsburg (1973) and successfully demonstrated for retinotopy Mals (); Haeuss (). Many competitive neural networks developed later consist of spatially arranged neurons, with a weight matrix that implements local excitation and long-range inhibition. The "soft-winner-take-all" dynamics commonly used in such networks selects a local group of winners rather than the single winner of "hard" winner-take-all (WTA) dynamics WTA (); SWTA (). The self-organizing map is another example that uses a similar mechanism SOM (). In WTA networks the topology is mostly predefined and spatially fixed. The soft-WTA mechanism can be implemented by the Mexican-hat function, also called the Ricker function: the winning neuron and its immediate neighbours are connected with positive weights that decrease with distance, while far distant neurons receive negative weights. A minimal sketch of such a connectivity profile is given below.
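For illustration, the Mexican-hat connectivity mentioned above can be generated from the Ricker function; the width sigma and the ring topology in this sketch are illustrative assumptions, not taken from the text.

import numpy as np

def ricker_weights(N, sigma=2.0):
    """Mexican-hat (Ricker) connectivity on a ring of N neurons:
    positive weights to near neighbours, negative weights further away."""
    d = np.arange(N)
    d = np.minimum(d, N - d)                         # ring distance to neuron 0
    profile = (1 - (d / sigma) ** 2) * np.exp(-d ** 2 / (2 * sigma ** 2))
    # Each row is the profile rotated to be centred on that neuron.
    return np.stack([np.roll(profile, i) for i in range(N)])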

The FIREFLY algorithm was developed in order to simulate a soft-WTA network with an adaptive network topology. The dynamics of frequency synchronization in fireflies was described by Hanson (1978). Here we simulate the spatial development of neurons, modelled as fireflies, to form a population of inhibitory and excitatory neurons firefly (); firefly_2 (); firefly_3 (). The disadvantages of this scheme are one additional computational step and extra memory usage; on the other hand, the efficiency of recall and signal reconstruction with this algorithm is much better. In the following sections we describe the learning mechanism based on Häussler's equation and the improvement achieved using the FIREFLY algorithm.

II Häussler's Equation

Häussler's equation can be written in the following form:

\dot{w}_{ij} = \alpha\,(1 - w_{ij}) + w_{ij}\Big(C_{ij}(w) - \frac{1}{2}\big(\overline{C}_i(w) + \overline{C}_j(w)\big)\Big), \qquad \overline{C}_i(w) = \frac{1}{N}\sum_k w_{ik}\,C_{ik}(w), \qquad (1)

with the additional saturation condition

s(w_{ij}) = \min(w_{ij}, v). \qquad (2)

Here, $C_{ij}(w)$ is the cooperation between cells $i$ and $j$ (see the Appendix), and the function $s(\cdot)$ enforces saturation of the synaptic weights at the value $v$, such that $w_{ij} \le v$. The first term by itself is trying to pull the weights towards the value $1$, whereas the second term by itself would let the total weight $\sum_k w_{ik}$ eventually converge towards a constant, so that the sum in the second term can be interpreted as a weighted average of the cooperation terms, $\overline{C}_i = \frac{1}{N}\sum_k w_{ik} C_{ik}$. The second term, then, lets those weights that receive above-average cooperative help grow at the expense of the below-average cooperation weights. The weights will thus favor the higher cooperation terms, increasing $\overline{C}_i$, letting more weights fall below the average cooperation and thus start decaying, until finally only the weight(s) with maximal cooperation stabilize. If $\alpha$ is positive, the system is driven towards a compromise between the first and the second term of eq. (1). The initial state of $w$ should be random, with the probability or strength of a connection between cells $i$ and $j$ falling off monotonously with the distance between the two in a two-dimensional plane. For a detailed explanation of the equation see the Appendix.
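As a concrete illustration, eq. (1) with the saturation (2) can be integrated with a simple Euler step. The following Python sketch assumes illustrative values for the growth rate, saturation level, and step size, which the text does not specify.

import numpy as np

def haeussler_step(W, C, alpha=0.01, v=1.0, dt=0.1):
    """One Euler step of eq. (1) with the saturation condition (2).

    W : (N, N) weight matrix; C : (N, N) cooperation terms C_ij(w).
    Parameter values are illustrative assumptions.
    """
    N = W.shape[0]
    Cbar_i = (W * C).sum(axis=1, keepdims=True) / N   # average cooperation per row
    Cbar_j = (W * C).sum(axis=0, keepdims=True) / N   # average cooperation per column
    # Growth toward 1, plus competition favoring above-average cooperation.
    dW = alpha * (1.0 - W) + W * (C - 0.5 * (Cbar_i + Cbar_j))
    W = W + dt * dW
    np.fill_diagonal(W, 0.0)        # no self-connections (cf. Sec. III)
    return np.clip(W, 0.0, v)       # saturation at v, eq. (2)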

III Response of the Equation

Consider a number of neurons, each with two lateral connections on either side. Fig. 1 shows the evolution of the weight matrix and of the weight vector of a single neuron under the modified equation. The following conditions are assumed:

  • $w_{ii} = 0$ for all $i$, i.e., no self-connections

Thus, if each neuron has two neighbouring connections per side, the initial condition has nonzero weights $w_{ij} > 0$ only for $0 < |i - j| \le 2$ (modulo $N$).

In the example, two neighbours on each side are assumed. To avoid any boundary effects a periodic boundary condition is imposed; a minimal construction of this initial weight matrix is sketched after Fig. 1. Fig. 1 shows the weights to the immediate neighbours strengthening while the others decay.

Figure 1: Evolution of the weight matrix under eq. (1) with lateral connections. (a): Initial and final weight matrix. (b): Initial and final weight vector of a single neuron.
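The initial condition of this section can be constructed as follows; the base weight w0, the jitter amplitude, and the neighbourhood size k are illustrative assumptions.

import numpy as np

def initial_ring_weights(N, k=2, w0=0.5, jitter=0.05, seed=0):
    """Ring of N neurons: random weights to the k nearest neighbours on
    each side (periodic boundary), no self-connections."""
    rng = np.random.default_rng(seed)
    W = np.zeros((N, N))
    for i in range(N):
        for d in range(1, k + 1):
            for j in ((i + d) % N, (i - d) % N):    # wrap around the ring
                W[i, j] = w0 + jitter * rng.standard_normal()
    return np.clip(W, 0.0, None)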

IV Input Patterns

Two types of input patterns were used, one-dimensional (1D) and two-dimensional (2D), defined through a Gaussian function. Fig. 2 shows 1D and 2D inputs in square form.

In the one-dimensional case the input pattern is given as

I_i = \exp\!\big(-(i - c)^2 / 2\sigma^2\big), \qquad (3)

where $c$ is the central value, varied over the number of neurons. Similarly, in the 2D case the centres $c_x$ and $c_y$ are varied over $1, \dots, \sqrt{N}$, where $N$ is the total number of neurons:

I_{xy} = \exp\!\big(-\big((x - c_x)^2 + (y - c_y)^2\big) / 2\sigma^2\big). \qquad (4)
Figure 2: 1D and 2D input patterns reshaped into square form. Example with $N$ neurons.

The input vectors are normalized, generated at random for training, and fed to the network.
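The input generation of eqs. (3) and (4) can be sketched as below; the width sigma and the number of training patterns are illustrative choices.

import numpy as np

def gaussian_input_1d(N, c, sigma=1.5):
    """1D Gaussian input (eq. 3) centred on neuron c, normalized."""
    i = np.arange(N)
    x = np.exp(-(i - c) ** 2 / (2.0 * sigma ** 2))
    return x / np.linalg.norm(x)

def gaussian_input_2d(n, cx, cy, sigma=1.5):
    """2D Gaussian input (eq. 4) on an n-by-n grid, flattened and normalized."""
    gx, gy = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    x = np.exp(-((gx - cx) ** 2 + (gy - cy) ** 2) / (2.0 * sigma ** 2))
    x = x.ravel()                        # n*n neurons as one vector
    return x / np.linalg.norm(x)

# Random training inputs for an n*n network:
n = 8
patterns = [gaussian_input_2d(n, *np.random.randint(0, n, size=2)) for _ in range(20)]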

V FIREFLY Algorithm

In the cases discussed so far, a fixed 2D grid was assumed. To go beyond this scenario, a population-based firefly algorithm was developed. Female fireflies are attracted towards dominant male flies depending on their brightness; the frequency-driven dynamics was described by Hanson (1978). Here, excitatory and inhibitory neurons are attracted to an active cell depending on its activity level, or brightness, thus forming a population of neurons that is used to define the weight matrix. The movement of neurons is governed by the equations:

x_i \leftarrow x_i + \beta\, e^{-\gamma r_{ij}^2}\,(x_j - x_i) + \eta\,\epsilon_i, \qquad (5)

where $\beta$ is the brightness of a 'male firefly' to which the 'female' fireflies are attracted, $\gamma$ controls the decay of the attraction with distance, and $\eta\,\epsilon_i$ is a small random step. The frequency dependence is ignored in this case. The distance between fireflies $i$ and $j$ is

r_{ij} = \lVert x_i - x_j \rVert. \qquad (6)

After each pattern is fed to the network, the population of fireflies is redistributed, thus redefining the distribution of inhibitory and excitatory neurons. The resulting weight matrix is then subjected to Häussler's equation, which reduces the number of excitatory and inhibitory neurons that represent a particular pattern and remain active.
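The movement rule of eqs. (5) and (6) follows the standard firefly update used in the cited literature firefly_2 (); firefly_3 (); the sketch below assumes the usual parameters (base attractiveness beta0, decay gamma, random-step size eta), which are not specified in the text.

import numpy as np

def firefly_move(pos, brightness, beta0=1.0, gamma=0.1, eta=0.05, seed=None):
    """Move each firefly toward every brighter one (cf. eqs. 5-6).

    pos : (M, 2) float positions; brightness : (M,) activity levels.
    Attraction decays with squared distance; a small random step is added.
    """
    rng = np.random.default_rng(seed)
    new_pos = pos.copy()
    for i in range(len(pos)):
        for j in range(len(pos)):
            if brightness[j] > brightness[i]:
                r2 = np.sum((pos[j] - pos[i]) ** 2)           # squared distance
                beta = beta0 * np.exp(-gamma * r2)            # attractiveness
                new_pos[i] += beta * (pos[j] - pos[i])
        new_pos[i] += eta * rng.uniform(-0.5, 0.5, size=2)    # random-walk term
    return new_pos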

1:procedure FIREFLY
2:     Generate initial population
3:     Compute brightness using objective function
4:     Move fireflies and recalculate positions according to attractiveness
5:     Calculate minimum distance condition
Algorithm 1 Pseudo-code: FIREFLY
1:procedure Weights
2:     Generate initial weight matrix
3:     Generate initial population
4:     Run FIREFLY routine (Algorithm 1)
5:     Read pattern and set activity
6:     Calculate weight matrix
7:     Run weight learning algorithm
8:     Get new pattern
Algorithm 2 Pseudo-code: Weight Calculation
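A hypothetical glue routine for Algorithm 2, reusing the haeussler_step and firefly_move sketches given earlier. The mapping from firefly positions to weights (weight_from_positions), the one-firefly-per-neuron assignment, and the Hebbian estimate of the cooperation term are illustrative assumptions.

import numpy as np

def weight_from_positions(pos, sign, scale=1.0):
    """Weights from firefly positions: excitatory (+1) / inhibitory (-1)
    cells, strength falling off with squared distance (assumed form)."""
    d2 = ((pos[:, None, :] - pos[None, :, :]) ** 2).sum(-1)
    W = sign[None, :] * np.exp(-d2 / scale)
    np.fill_diagonal(W, 0.0)
    return W

def train(patterns, pos, sign, steps=50):
    """Algorithm 2: per pattern, redistribute fireflies, rebuild the
    weight matrix, then run the Haeussler weight learning."""
    for x in patterns:
        pos = firefly_move(pos, brightness=x)       # activity sets brightness
        W = weight_from_positions(pos, sign)
        W_exc, W_inh = np.maximum(W, 0.0), np.minimum(W, 0.0)
        C = np.outer(x, x)                          # Hebbian cooperation estimate
        for _ in range(steps):                      # learn excitatory weights only
            W_exc = haeussler_step(W_exc, C)
        W = W_exc + W_inh                           # inhibition kept fixed (assumption)
    return W, pos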

VI Results

For demonstration purposes a low number of neurons is used, and the previously described 2D input patterns are fed to the network. The learning algorithm can either run when a new input is first presented or only after the network has converged; running the learning algorithm at stimulus onset is the more biophysiologically plausible scenario. Fig. 3 shows the learnt weight matrix for the two-dimensional case.

Figure 3: Network of $N$ neurons, fed with random 2D patterns. (a): Input patterns. (b): Output without FIREFLY.

In these first examples the periodic boundary condition was not applied; hence one observes asymmetry at the grid boundary.

Figure 4: Network of $N$ neurons, fed with random 2D patterns. (a): Input patterns. (b): Output without FIREFLY. (c): Output with the FIREFLY routine. (d): Input and output signals, with and without FIREFLY.

Fig. 4 compares the pattern recalled from memory with and without the FIREFLY subroutine. One clearly observes that the recalled pattern matches the input pattern more closely when FIREFLY is applied.

This type of recurrent competitive network is also able to complete partial patterns. Fig. 5 shows the signal recall when a partially distorted pattern is presented. After training, the excitatory connections between units whose activity is highly correlated are strengthened; this provides the solution to the missing information.
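Pattern completion can be illustrated by relaxing the network activity under the partial input; the rate dynamics used below is an assumed, generic choice rather than the paper's exact recall procedure.

import numpy as np

def recall(W, x, iters=100, leak=1.0, dt=0.1):
    """Relax activity s under input x to complete a partial pattern.

    Simple rate dynamics  s' = -leak*s + W s + x  (illustrative choice);
    strengthened excitatory links between correlated units fill in the
    missing parts of the pattern."""
    s = x.copy()
    for _ in range(iters):
        s = s + dt * (-leak * s + W @ s + x)
        s = np.maximum(s, 0.0)       # rectification keeps rates non-negative
    return s / (np.linalg.norm(s) + 1e-12)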

Figure 5: Network of $N$ neurons, fed with random 2D patterns. (a): Input pattern corrupted with noise. (b): Signal recovered by the trained network with the FIREFLY subroutine. (c): Input and output signals.

When two fused inputs are presented, the network treats them as a single similar input, as shown in Fig. 6; if one of the two inputs is weaker than the other, the network captures the difference between them.

Figure 6: Network of $N$ neurons, fed with random 2D patterns. (a): Two input patterns fused, with noise. (b): Signal recovered by the trained network with the FIREFLY subroutine.

An Example with Handwritten Digits

As a "real-world" example we consider images of handwritten digits, a standard benchmark dataset freely available online. A model image of each digit is prepared and fed to the network for learning.

Figure 7: Stored and retrieved patterns using the FIREFLY algorithm, with two digits as examples. The bottom panel shows the input and output signals of the neurons.

A noisy image is then fed to the network, which outputs the corresponding stored image. Fig. 7 shows the stored and retrieved patterns using the FIREFLY algorithm.
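As the text does not specify which digit dataset was used, the following sketch prepares model images and a noisy probe from scikit-learn's bundled 8x8 digits as a stand-in.

import numpy as np
from sklearn.datasets import load_digits

digits = load_digits()                        # 8x8 grayscale digit images
templates = {}
for d in range(10):                           # one model image per digit
    imgs = digits.images[digits.target == d]
    t = imgs.mean(axis=0).ravel()
    templates[d] = t / np.linalg.norm(t)

rng = np.random.default_rng(0)
noisy = templates[3] + 0.3 * rng.standard_normal(64)   # corrupted probe
noisy = np.clip(noisy, 0.0, None)
noisy /= np.linalg.norm(noisy)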

VII Discussion

We have shown that the biologically motivated FIREFLY algorithm can be used in a memory-based neural network with competitive learning. The network has the ability to act as an associative memory: it retrieves stored information, handles noisy signals, and recalls partial information. One disadvantage is the additional computation time required to simulate the firefly algorithm, but the spontaneous, population-based algorithm makes the network independent of a fixed topology. Further efforts involve optimizing the numbers of excitatory and inhibitory cells, for which we plan to implement Lotka-Volterra equations.


Appendix A

The goal is to describe an associative memory for fragments of image content. Each fragment has the form of a 2D neural field with short-range connections. Fragments may tile a larger visual domain; this domain may, for instance, be an image, with the fragments corresponding to texture patches. After some training, the collection of all fragments should represent a codebook for all texture patches occurring in the input with some statistical significance.

The visual domain is composed of so-called columns. Each column stands for a small resolvable region of the image and contains a full set of texture neurons, e.g., neurons with receptive fields in the form of Gabor wavelets. A particular visual input selects a subset of all the neurons of the domain; the set of indices of those active neurons defines the state. A state is assumed to be active for a typical period of time. The goal of the associative memory is to extract a set of states from a long sequence of image inputs. We restrict consideration to image patches of a certain diameter, and designate by $N$ the number of neurons contained in a patch. The whole image domain is composed of a tiling of such patches, and an infinite number of possible images can be represented combinatorially by each patch selecting one of its states.

Whereas the general idea of associative memory is that states are stored by plastically strengthening all connections between the neurons that are simultaneously active in a state, here we want the connections between neurons active within a patch's activity state to form a network pattern (a "net fragment") that is an attractor under network self-organization. In network self-organization the activity of the neurons addressed by the input fluctuates spontaneously, and connections are changed by Hebbian plasticity. This is a positive feedback loop, as excitatory connections generate signal fluctuations, which in turn strengthen connections. The instability is checked by a constraint limiting the weight sum of connections converging on a neuron. Given enough time, this loop runs into an attractor state that is stabilized by the balance between Hebbian plasticity and the weight-sum constraint.

We thus assume that at a given time the input selectively activates neurons with the help of sources $\sigma_i > 0$ (whereas $\sigma_i = 0$ for the neurons not addressed). These source terms fluctuate with a finite time constant and have correlations $\langle \sigma_i \sigma_j \rangle = \delta_{ij}$, where the temporal average is taken over the time scale of a stimulus presentation. We assume further that the propagation of signal deviations from the average is very fast, at a much shorter time scale, so that we get equilibrium signal deviations $s_i$ changing at the same time scale as the sources $\sigma_i$. These equilibrium deviations then obey the equation

s_i = \sigma_i + \sum_j w_{ij}\, s_j, \qquad (7)

or in vector notation

\mathbf{s} = \boldsymbol{\sigma} + W \mathbf{s}, \qquad (8)

which is solved by

\mathbf{s} = (\mathbb{1} - W)^{-1}\, \boldsymbol{\sigma}. \qquad (9)

Expanding that matrix inverse as

(\mathbb{1} - W)^{-1} = \mathbb{1} + W + W^2 + W^3 + \dots \qquad (10)

and breaking this series off after the third power, we have an approximate inverse, which we call $A = \mathbb{1} + W + W^2 + W^3$, so that $\mathbf{s} \approx A\,\boldsymbol{\sigma}$.

With this in hand we can compute temporal correlations of signals

C_{ij} = \langle s_i s_j \rangle = \big(A\,A^{\mathsf T}\big)_{ij}, \qquad (11)

where we have used

\langle \sigma_i \sigma_j \rangle = \delta_{ij}. \qquad (12)

(If $W$ is symmetric, $C = A^2$.) This result is easily interpreted: fluctuations are communicated from the source cell to the two cells directly, or over one, two, or three intermediate connections.
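Equations (10) and (11) translate directly into a few lines of Python; the truncation order is kept as a parameter, with order=3 corresponding to the approximation $A$ used above.

import numpy as np

def correlations(W, order=3):
    """Approximate C = <s s^T> = A A^T with A = 1 + W + W^2 + ... (eqs. 10-11)."""
    A = np.eye(W.shape[0])
    P = np.eye(W.shape[0])
    for _ in range(order):
        P = P @ W                 # next power of W
        A = A + P                 # truncated Neumann series for (1 - W)^{-1}
    return A @ A.T                # equals A @ A if W (hence A) is symmetric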

These correlations are to be used to drive Hebbian plasticity. Using the Häussler equation as a model, we formulate it as

\dot{w}_{ij} = \alpha + w_{ij}\, C_{ij} - w_{ij}\, B_{ij}(w), \qquad (13)

which contains the cooperation term

w_{ij}\, C_{ij}, \qquad (14)

and the competition term

w_{ij}\, B_{ij}(w), \qquad B_{ij}(w) = \frac{1}{2N}\sum_k \big(\alpha + w_{ik} C_{ik}\big) + \frac{1}{2N}\sum_k \big(\alpha + w_{kj} C_{kj}\big). \qquad (15)

Here, $\alpha$ is an unspecific synaptic growth rate. Eq. (13) can be re-written as

\dot{w}_{ij} = \alpha\,(1 - w_{ij}) + w_{ij}\Big(C_{ij} - \frac{1}{2}\big(\overline{C}_i + \overline{C}_j\big)\Big), \qquad \overline{C}_i = \frac{1}{N}\sum_k w_{ik}\, C_{ik}, \qquad (16)

which has the form of eq. (1) of the main text.
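A quick numerical check that the rewriting (16) agrees with (13)-(15), using the forms as reconstructed above:

import numpy as np

rng = np.random.default_rng(1)
N = 6
W = rng.uniform(0.0, 1.0, (N, N)); np.fill_diagonal(W, 0.0)
C = rng.uniform(0.0, 1.0, (N, N)); C = (C + C.T) / 2
alpha = 0.1

B = 0.5 * ((alpha + W * C).mean(axis=0, keepdims=True)      # column average
           + (alpha + W * C).mean(axis=1, keepdims=True))   # row average
lhs = alpha + W * C - W * B                                  # eq. (13)

Cbar_i = (W * C).mean(axis=1, keepdims=True)
Cbar_j = (W * C).mean(axis=0, keepdims=True)
rhs = alpha * (1 - W) + W * (C - 0.5 * (Cbar_i + Cbar_j))    # eq. (16)

assert np.allclose(lhs, rhs)                                 # the two forms agree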

References

  • (1) C. von der Malsburg, "Self-organization of orientation sensitive cells in the striate cortex", Kybernetik, 14:85-100, 1973.
  • (2) A. F. Häussler and C. von der Malsburg, "Development of retinotopic projections: an analytical treatment", Journal of Theoretical Neurobiology, 2:47-73, 1983.
  • (3) R. Douglas and K. Martin, "Recurrent neuronal circuits in the neocortex", Current Biology, 17(13):496-500, 2007.
  • (4) S. Amari, "Dynamics of pattern formation in lateral-inhibition type neural fields", Biological Cybernetics, 27:77-87, 1977.
  • (5) T. Kohonen, "Self-organized formation of topologically correct feature maps", Biological Cybernetics, 43:59-69, 1982.
  • (6) J. Nayak et al., "A novel nature inspired firefly algorithm with higher order neural network: performance analysis", Engineering Science and Technology, an International Journal, 2015.
  • (7) S. L. Tilahun and H. C. Ong, "Modified firefly algorithm", Journal of Applied Mathematics, vol. 2012, Article ID 467631, 2012.
  • (8) I. Fister et al., "A comprehensive review of firefly algorithms", Swarm and Evolutionary Computation, 13:34-46, 2013.