Fish School Search Algorithm for Constrained Optimization

J.B. Monteiro-Filho*, I.M.C. Albuquerque, F.B. Lima Neto
Computational Intelligence Research Group, Polytechnic School of Pernambuco, Benfica 455, Recife-PE, Brazil
Abstract

In this work we investigate the effectiveness of niching-capable swarm metaheuristics for solving constrained optimization problems. Sub-swarms are used to reach many feasible regions, which are then exploited in terms of the fitness function. The niching approach employed was wFSS, a version of the Fish School Search algorithm devised specifically to deal with multi-modal search spaces. A base technique, referred to as wrFSS, was conceived, and three variations applying different constraint handling procedures were also proposed. Tests were performed on seven problems from CEC 2010 and a comparison with other approaches was carried out. Results show that the proposed search strategy is able to handle some heavily constrained problems and achieve results comparable to state-of-the-art algorithms. However, we also observed that the local search operator present in wFSS and inherited by wrFSS hinders fitness convergence when the feasible region presents certain geometrical features.

1 Introduction

According to Koziel and Michalewicz [14], the general nonlinear programming problem (NLP) consists in finding $\mathbf{x}$ such that:

$$\min f(\mathbf{x}), \quad \mathbf{x} = (x_1, \ldots, x_n) \in \mathbb{R}^n,$$

where $\mathbf{x} \in \mathcal{F} \subseteq \mathcal{S}$. The objective function $f$ is defined on the search space $\mathcal{S}$ and the set $\mathcal{F} \subseteq \mathcal{S}$ defines the feasible region.

The search space $\mathcal{S}$ is defined as a rectangle within $\mathbb{R}^n$, given by bounds $l_i \leq x_i \leq u_i$, $1 \leq i \leq n$, and $m$ constraints ($q$ inequality and $m - q$ equality constraints) define the feasible space $\mathcal{F}$:

$$g_j(\mathbf{x}) \leq 0, \quad j = 1, \ldots, q,$$

and

$$h_j(\mathbf{x}) = 0, \quad j = q + 1, \ldots, m.$$

Equality constraints are commonly relaxed and transformed into inequality constraints as [23]: $|h_j(\mathbf{x})| - \varepsilon \leq 0$, where $\varepsilon$ is a very small tolerance value.
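For concreteness, the snippet below is a minimal Python sketch of this relaxation; the helper name and default tolerance are illustrative, not taken from the paper.

# Minimal sketch: relaxing an equality constraint h(x) = 0 into the
# inequality |h(x)| - eps <= 0. Name and default tolerance are illustrative.
def relaxed_equality_violation(h_value, eps=1e-4):
    """Amount by which |h(x)| - eps <= 0 is violated (0 when satisfied)."""
    return max(0.0, abs(h_value) - eps)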

Almost all real-world optimization problems are constrained [13]. Hence, many metaheuristic search procedures were conceived for the general NLP. Recent approaches include: Genetic Algorithms [18, 9, 14], Differential Evolution [19, 11, 30, 5, 29, 28], Cultural Algorithm [15], Particle Swarm Optimization [13, 6, 3, 17, 27, 12] and Artificial Bee Colony Optimization [16, 4, 1, 23].

Regarding the approaches applied in order to tackle constrained search, Mezura-Montes and Coello Coello [22] present a simplified taxonomy of the common procedures in literature:

  1. Penalty functions - Include a penalization term in the objective function due to constraint violation. This is a popular and easy-to-implement approach, but it has the drawback of requiring the adjustment of the penalty weights;

  2. Decoders - Consists in mapping the feasible region on search spaces where an unconstrained problem will be solved. The high computational cost required is the main disadvantage in its use;

  3. Special operators - Mainly in evolutionary algorithms, operators can be designed in a way to prevent the creation of infeasible individuals;

  4. Separation of objective function and constraints - Unlike penalty functions, this approach treats the objective function and the constraint violations separately. Many procedures follow this approach, such as dividing the search process in phases or applying multi-objective techniques.

The Fish School Search (FSS) algorithm, originally presented in 2008 in the work of Bastos-Filho, Lima Neto et al. [10], is a population-based continuous optimization technique inspired by the behavior of fish schools while looking for food. Each fish in the school represents a solution for a given optimization problem, and the algorithm uses information from every fish to guide the search towards promising regions of the search space while avoiding early convergence to local optima.

Ever since the original version of FSS was developed, many modifications have been proposed in order to tackle different issues, such as multi-objective optimization [2], multi-solution optimization [20] and binary search spaces [26]. Among those, a novel niching, multi-solution version known as wFSS was recently proposed [8].

To the best of the authors' knowledge, the application of FSS to constrained optimization problems has never been reported before. Hence, in this work, a modification of the niching weight-based FSS (wFSS) was carried out. The separation of objective function and constraints was applied, and the niching feature was used so that the population finds different feasible regions within the search space, which are then exploited in terms of fitness value. Moreover, three different mechanisms were applied, generating four approaches in total.

This paper is organized as follows: Section 2 provides an overview of the Fish School Search algorithm and its niching version, wFSS. Section 3 introduces the modifications proposed to employ wFSS in constrained optimization problems, as well as the variations built on the main procedure. Section 4 presents the tests performed and the results achieved.

2 Fish Schooling Inspired Search Procedures

2.1 Fish School Search Algorithm

FSS is a population-based search algorithm inspired by the behavior of fish schools that expand and contract while looking for food. Each fish's $n$-dimensional location represents a possible solution of the optimization problem. The algorithm makes use of weights for all fishes, which represent a cumulative account of how successful the search has been for each fish in the school.

FSS is composed of feeding and movement operators, the latter being divided into three sub-components, which are:

  1. Individual component of the movement: Every fish in the school performs a local search looking for promising regions in the search space, as represented by (1):

    $$\mathbf{x}_i(t+1) = \mathbf{x}_i(t) + \text{rand}(-1,1)\, step_{ind}, \quad (1)$$

    where $\mathbf{x}_i(t)$ and $\mathbf{x}_i(t+1)$ represent the position of fish $i$ before and after the individual movement operator, respectively. $\text{rand}(-1,1)$ is an array of uniformly distributed random numbers with the same dimension as $\mathbf{x}_i$ and values varying from $-1$ to $1$. $step_{ind}$ is a parameter that defines the maximum displacement for this movement. The new position is only accepted if the fitness of fish $i$ improves with the position change. Otherwise, $\mathbf{x}_i$ remains the same and $\Delta \mathbf{x}_i = 0$.

  2. Collective-instinctive component of the movement: A weighted average of the displacements performed within the individual movements is calculated according to (2):

    $$\mathbf{I} = \frac{\sum_{i=1}^{N} \Delta \mathbf{x}_i\, \Delta f_i}{\sum_{i=1}^{N} \Delta f_i}, \quad (2)$$

    Vector $\mathbf{I}$ represents the weighted average of the displacements of each fish. It means that fishes which experienced a higher improvement attract the other fishes towards their current positions.

    After the computation of vector $\mathbf{I}$, every fish moves according to (3):

    $$\mathbf{x}_i(t+1) = \mathbf{x}_i(t) + \mathbf{I}. \quad (3)$$
  3. Collective-volitive component of the movement: This operator is used to regulate the exploration/exploitation abilities of the school during the search process. First of all, the barycenter $\mathbf{B}$ is calculated based on the position and weight of each fish:

    $$\mathbf{B}(t) = \frac{\sum_{i=1}^{N} \mathbf{x}_i(t)\, W_i(t)}{\sum_{i=1}^{N} W_i(t)}, \quad (4)$$

    Then, if the total school weight (the sum of the weights of all fishes) has increased from the last iteration to the current one, the fishes are attracted towards the barycenter according to (5). If the total school weight has not increased, the fishes are spread away from the barycenter according to (6):

    $$\mathbf{x}_i(t+1) = \mathbf{x}_i(t) - step_{vol}\, \text{rand}(0,1)\, \frac{\mathbf{x}_i(t) - \mathbf{B}(t)}{d(\mathbf{x}_i(t), \mathbf{B}(t))}, \quad (5)$$

    $$\mathbf{x}_i(t+1) = \mathbf{x}_i(t) + step_{vol}\, \text{rand}(0,1)\, \frac{\mathbf{x}_i(t) - \mathbf{B}(t)}{d(\mathbf{x}_i(t), \mathbf{B}(t))}, \quad (6)$$

    where $step_{vol}$ defines the maximum displacement performed with the use of this operator, $d(\mathbf{x}_i(t), \mathbf{B}(t))$ is the Euclidean distance between the position of fish $i$ and the school barycenter, and $\text{rand}(0,1)$ is an array of uniformly distributed random numbers with the same dimension as $\mathbf{B}$ and values varying from $0$ to $1$.

Besides the movement operators, a feeding operator was also defined, used to update the weight of every fish according to (7):

$$W_i(t+1) = W_i(t) + \frac{\Delta f_i}{\max_j |\Delta f_j|}, \quad (7)$$

where $W_i$ is the weight of fish $i$, $\Delta f_i$ is the fitness variation between the last and new positions, and $\max_j |\Delta f_j|$ is the maximum absolute value of fitness variation among all fishes in the school.

$W_i$ is only allowed to vary from $1$ up to $W_{scale}$, which is a user-defined attribute. The weights of all fishes are initialized with the value $W_{scale}/2$.

The parameters $step_{ind}$ and $step_{vol}$ decay linearly along the iterations.
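The condensed NumPy sketch below illustrates one FSS iteration for a minimization problem, combining Equations (1)-(7). It assumes improvement is measured as a fitness decrease and omits boundary handling and the linear step decay; all names are illustrative.

import numpy as np

def fss_iteration(X, W, f, step_ind, step_vol, w_scale, prev_total_w):
    """One FSS iteration over positions X (N x n) and weights W (N,),
    minimizing f. Returns updated X, W and the new total school weight."""
    N, n = X.shape
    fit = np.apply_along_axis(f, 1, X)

    # Individual movement (Eq. 1): random jump, kept only on improvement.
    cand = X + np.random.uniform(-1.0, 1.0, (N, n)) * step_ind
    cand_fit = np.apply_along_axis(f, 1, cand)
    improved = cand_fit < fit
    dx = np.where(improved[:, None], cand - X, 0.0)  # displacements
    df = np.where(improved, fit - cand_fit, 0.0)     # fitness improvements
    X = np.where(improved[:, None], cand, X)

    # Feeding (Eq. 7): weights grow with the normalized improvement.
    if df.max() > 0.0:
        W = np.clip(W + df / df.max(), 1.0, w_scale)

    # Collective-instinctive movement (Eqs. 2-3): improvement-weighted
    # mean displacement pulls the whole school.
    if df.sum() > 0.0:
        X = X + (dx * df[:, None]).sum(axis=0) / df.sum()

    # Collective-volitive movement (Eqs. 4-6): contract towards the
    # weighted barycenter when the school got heavier, expand otherwise.
    B = (X * W[:, None]).sum(axis=0) / W.sum()
    dist = np.linalg.norm(X - B, axis=1, keepdims=True) + 1e-12
    r = np.random.uniform(0.0, 1.0, (N, n))
    sign = -1.0 if W.sum() > prev_total_w else 1.0
    X = X + sign * step_vol * r * (X - B) / dist
    return X, W, W.sum()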

2.2 Weight-based Fish School Search Algorithm

Introduced in the work of Lima Neto and Lacerda [8], wFSS is a weight-based niching version of FSS intended to provide multiple solutions for multi-modal optimization problems. The niching strategy is based on a new operator called Link Formator. This operator is responsible for defining leaders for the fishes in order to form sub-schools, and works as follows: each fish $a$ randomly chooses another fish $b$ in the school. If $b$ is heavier than $a$, then $a$ now has a link with $b$ and follows it (i.e., $b$ leads $a$). Otherwise, nothing happens. However, if $a$ already has a leader $c$ and the sum of the weights of $a$'s followers is higher than $c$'s weight, then $a$ stops following $c$ and starts following $b$. In each iteration, if $a$ becomes heavier than its leader, the link is broken.

In addition to the inclusion of the Link Formator operator, some modifications were performed in the components of the movement operators in order to emphasize the leaders' influence on the sub-swarms. Thus, the displacement vector $\mathbf{I}$ of the collective-instinctive component becomes:

$$\mathbf{I}_i = \frac{\Delta \mathbf{x}_i\, \Delta f_i + \ell_i\, \Delta \mathbf{x}_{l_i}\, \Delta f_{l_i}}{\Delta f_i + \ell_i\, \Delta f_{l_i}}, \quad (8)$$

where $\ell_i$ is $1$ if fish $i$ has a leader and $0$ otherwise, and $\Delta \mathbf{x}_{l_i}$ and $\Delta f_{l_i}$ are the displacement and fitness variation of the leader of fish $i$. Furthermore, the influence of vector $\mathbf{I}$ on the fishes' movements is increased along the iterations through a multiplying factor that grows with the iteration number. The collective-volitive component of the movement is also modified in the sense that the barycenter is now calculated for each fish with relation to its leader. If the fish does not have a leader, its barycenter is its current position. This means:

$$\mathbf{B}_i(t) = \begin{cases} \dfrac{\mathbf{x}_i(t)\, W_i(t) + \mathbf{x}_{l_i}(t)\, W_{l_i}(t)}{W_i(t) + W_{l_i}(t)}, & \text{if fish } i \text{ has a leader}, \\ \mathbf{x}_i(t), & \text{otherwise}. \end{cases} \quad (9)$$
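A possible reading of the Link Formator rules is sketched below in Python; the exact tie-breaking order in [8] may differ, so this should be taken as an interpretation rather than the reference implementation.

import random

def link_formator(weights, leader):
    """One pass of the Link Formator (sketch). leader[i] holds the index
    of fish i's leader, or None. Mutates and returns `leader`."""
    n = len(weights)

    def followers_weight(i):
        # Total weight of the fishes currently following fish i.
        return sum(weights[j] for j in range(n) if leader[j] == i)

    for a in range(n):
        b = random.randrange(n)
        if b == a:
            continue
        if leader[a] is None:
            if weights[b] > weights[a]:
                leader[a] = b                # b now leads a
        else:
            c = leader[a]
            # a's followers together outweigh a's current leader:
            # a breaks the old link and may follow the newly drawn fish.
            if followers_weight(a) > weights[c] and weights[b] > weights[a]:
                leader[a] = b
        # A fish heavier than its leader breaks the link.
        if leader[a] is not None and weights[a] > weights[leader[a]]:
            leader[a] = None
    return leader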

3 wrFSS

Some modifications were proposed to wFSS in order to make the algorithm able to tackle constrained optimization problems. Basically, both the fitness value and the constraint violation are measured for every fish. At the beginning of each iteration, a decision is made on whether the fitness function or the constraint violation will be used as the objective function within the current iteration.

The decision on which value to use as objective function is made according to the proportion of feasible individuals in the whole population. This means that, if the current feasible proportion of the population is higher than a user-defined threshold $\tau$, the search is performed using fitness as the objective function. If that is not the case, constraint violation is optimized instead.

The described procedure divides the search process in two phases, allowing the algorithm to: phase 1 - find many feasible regions; phase 2 - optimize fitness within the feasible regions. The niching feature of wFSS is useful in phase 1, since it makes the school able to find many different feasible regions. Moreover, each time the search changes from phase 1 to phase 2, an increase factor is applied to the steps of both the Individual and the Collective-volitive movement operators in order to augment the school mobility in the new phase.

The algorithm described will be referred to as wrFSS, and its pseudocode is:

1:  Initialize user parameters
2:  Initialize fishes positions randomly
3:  while Stopping condition is not met do
4:     Calculate fitness for each fish
5:     Calculate constraint violation for each fish
6:     if feasible proportion > τ then
7:        Define fitness as objective function
8:     else
9:        Define constraint violation as objective function
10:     end if
11:     Run individual movement operator
12:     Run feeding operator
13:     Run collective-instinctive movement operator
14:     Run collective-volitive movement operator
15:  end while
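In Python, lines 4-10 of the pseudocode amount to the small helper below (a sketch; tau is the user-defined feasible-proportion threshold):

def choose_objective(fitness, violation, tau):
    """Return the values to optimize this iteration: fitness when the
    school is feasible enough (phase 2), constraint violation otherwise
    (phase 1)."""
    feasible_ratio = sum(1 for v in violation if v == 0.0) / len(violation)
    return fitness if feasible_ratio > tau else violation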

The constraint violation measure applied in wrFSS was the same as in the work of Takahama and Sakai [30], the sum of all constraint violations:

$$\phi(\mathbf{x}) = \sum_{j=1}^{q} \max(0,\, g_j(\mathbf{x})) + \sum_{j=q+1}^{m} \max(0,\, |h_j(\mathbf{x})| - \varepsilon). \quad (10)$$
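Assuming the sum-of-violations form above, a direct implementation could read as follows (the inequality constraints g_j(x) <= 0 and equality constraints h_j(x) = 0 are passed as callables; names are illustrative):

def constraint_violation(x, ineq_constraints, eq_constraints, eps=1e-4):
    """phi(x): total violation of g_j(x) <= 0 and |h_j(x)| - eps <= 0."""
    v = sum(max(0.0, g(x)) for g in ineq_constraints)
    v += sum(max(0.0, abs(h(x)) - eps) for h in eq_constraints)
    return v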

Best fish selection was done using Deb’s heuristic [9]:

  1. Any feasible solution is preferred to any infeasible solution;

  2. Among two feasible solutions, the one having the better fitness value is preferred;

  3. Among two infeasible solutions, the one having smaller constraint violation is preferred.
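These three rules translate directly into a comparator (a sketch; phi denotes the constraint violation of Equation (10)):

def deb_better(f_a, phi_a, f_b, phi_b):
    """True when solution a is preferred to solution b by Deb's rules [9]."""
    if (phi_a == 0.0) != (phi_b == 0.0):
        return phi_a == 0.0            # feasible beats infeasible
    if phi_a == 0.0 and phi_b == 0.0:
        return f_a < f_b               # both feasible: better fitness
    return phi_a < phi_b               # both infeasible: less violation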

Furthermore, the feeding operator version applied was the same as in the work of Monteiro et al. [24]. In this version, feeding becomes a normalization of both fitness and constraint violation values:

$$W_i = \frac{O_{max} - O_i}{O_{max} - O_{min}}, \quad (11)$$

where $O_i$ is the constraint violation within phase 1 and the fitness within phase 2, and $O_{min}$ and $O_{max}$ are the minimum and maximum values found during the whole search process.

It is important to highlight that the normalization applied in Equation (11) makes $W_{min} = 0$ and $W_{max} = 1$, since this equation is applied for the minimization of both the fitness function and the constraint violation.
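Under the normalization assumed in Equation (11), the feeding step reduces to the helper below (guarding the degenerate case where all values coincide):

def normalized_weight(o, o_min, o_max):
    """Map the minimized objective o (violation in phase 1, fitness in
    phase 2) to a weight in [0, 1]; lower objective -> heavier fish."""
    if o_max == o_min:
        return 1.0
    return (o_max - o) / (o_max - o_min)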

3.1 wrFSS Variations

In this section, some variations of the aforementioned algorithm are presented, applying state-of-the-art constrained optimization approaches within wrFSS. The wrFSS variations are:

  1. wrFSSe - Applies the ε-method [27] in the Individual component of the movement;

  2. wrFSSg - Includes a gradient-based local search in both phases of the search process;

  3. wrFSSp - Uses a penalized fitness function in phase 2.

The wrFSS variations are intended to increase the algorithm's performance when tackling challenging problems. The specific mechanism of each variation is described in the next sections.

3.1.1 wrFSSe

The ε-method [31, 29, 30, 5] defines a comparison procedure taking constraint violation and fitness value into account simultaneously. Let $f_i = f(\mathbf{x}_i)$ and $\phi_i = \phi(\mathbf{x}_i)$ be the fitness value and constraint violation evaluated at point $\mathbf{x}_i$. The comparisons $<_\varepsilon$ and $\leq_\varepsilon$, with $\varepsilon \geq 0$, are defined as:

$$(f_1, \phi_1) <_\varepsilon (f_2, \phi_2) \iff \begin{cases} f_1 < f_2, & \text{if } \phi_1, \phi_2 \leq \varepsilon \quad (12)\\ f_1 < f_2, & \text{if } \phi_1 = \phi_2 \quad (13)\\ \phi_1 < \phi_2, & \text{otherwise} \quad (14)\end{cases}$$

$$(f_1, \phi_1) \leq_\varepsilon (f_2, \phi_2) \iff \begin{cases} f_1 \leq f_2, & \text{if } \phi_1, \phi_2 \leq \varepsilon \quad (15)\\ f_1 \leq f_2, & \text{if } \phi_1 = \phi_2 \quad (16)\\ \phi_1 < \phi_2, & \text{otherwise} \quad (17)\end{cases}$$

When $\varepsilon = \infty$, the ε-comparison becomes a simple fitness comparison. Further, for $\varepsilon = 0$, Deb's heuristic is carried out.
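The strict comparison of Equations (12)-(14) can be written as (a sketch):

def eps_less(f1, phi1, f2, phi2, eps):
    """epsilon-comparison: fitness decides when both violations are within
    eps (or equal); otherwise the smaller violation wins."""
    if (phi1 <= eps and phi2 <= eps) or phi1 == phi2:
        return f1 < f2
    return phi1 < phi2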

In wrFSSe, $\varepsilon$ was chosen to decay along the iterations in the same way as in the work of Takahama and Sakai [30]:

$$\varepsilon(t) = \varepsilon(0)\left(1 - \frac{t}{T_c}\right)^{cp}, \quad 0 < t < T_c, \quad (18)$$
$$\varepsilon(t) = 0, \quad t \geq T_c, \quad (19)$$

where $t$ is the current iteration and $T_c$, given as a percentage of the maximum number of iterations, is a user-defined parameter. The exponent $cp$ controls the decay speed. $\varepsilon(0)$ depends on the initial school constraint violation [27]:

$$\varepsilon(0) = \phi(\mathbf{x}_\theta), \quad (20)$$

where $\mathbf{x}_\theta$ is the top $\theta$-th fish of the initial school ranked by constraint violation.

The Individual movement operator in wrFSSe applies the ε-comparison between a fish's current and candidate positions. If the candidate position is ε-better than the current position, the movement is allowed.
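The decay schedule of Equations (18)-(19) can be sketched as follows (eps0 is obtained from the initial school as in Equation (20); t_c and cp as described above):

def eps_level(t, eps0, t_c, cp):
    """epsilon(t): polynomial decay from eps0 down to 0 at iteration t_c."""
    if t >= t_c:
        return 0.0
    return eps0 * (1.0 - t / t_c) ** cp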

3.1.2 wrFSSg

Gradient-based individual movement operators were designed to guide the local search in both phases of the search process:

  • Phase 1 - the gradient-based individual movement is performed in order to allow the fishes to quickly reach feasible regions;

  • Phase 2 - the gradient-based individual movement is intended to prevent the fishes from escaping the feasible regions.

In order to do so, the following steps are employed:

  1. Calculate the gradient of the constraint violation (phase 1) or of the fitness function (phase 2) at the current position;

  2. Choose $k$ random directions and compute their director (unit) vectors $\mathbf{d}_1, \ldots, \mathbf{d}_k$;

  3. Calculate the directional derivatives, given by the inner product between the gradient and each director vector;

  4. Return the direction with the most negative directional derivative, so as to decrease constraint violation in phase 1 or fitness in phase 2.

With this procedure, we intended to provide the individual movement operator with directions having high probability of improving the constraint violation within phase 1, or of improving the fitness value while keeping fishes in the feasible regions within phase 2 of the search process.

The gradient is calculated by forward finite differences according to (21) [7]:

$$\nabla f(\mathbf{x}) \approx \left(\frac{f(\mathbf{x} + \eta \mathbf{e}_1) - f(\mathbf{x})}{\eta}, \ldots, \frac{f(\mathbf{x} + \eta \mathbf{e}_n) - f(\mathbf{x})}{\eta}\right), \quad (21)$$

where $n$ is the number of problem dimensions, $\mathbf{e}_i$ is the $i$-th canonical basis vector and $\eta$ is a perturbation constant.

The directional derivatives are computed by the inner product between the gradient and the director vectors.

In both phases, the candidate position for the Individual movement operator is given by the current position displaced by $step_{ind}\, \text{rand}(0,1)$ along the returned direction.

The gradient evaluation demands $n$ additional function evaluations. Therefore, a user-defined probability determines, for each fish, whether it runs the gradient-based or the original Individual movement operator.
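A sketch of the gradient-based direction choice is given below; the number of sampled directions k is an illustrative parameter, and the phase-2 feasibility filtering discussed above is omitted for brevity.

import numpy as np

def fd_gradient(func, x, eta=1e-6):
    """Forward finite-difference gradient of func at x (Eq. (21), after [7])."""
    fx = func(x)
    grad = np.empty_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += eta
        grad[i] = (func(xp) - fx) / eta
    return grad

def best_descent_direction(func, x, k=8):
    """Draw k random unit directions and return the one with the most
    negative directional derivative of func (constraint violation in
    phase 1, fitness in phase 2)."""
    grad = fd_gradient(func, x)
    dirs = np.random.randn(k, x.size)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    return dirs[np.argmin(dirs @ grad)]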

3.1.3 wrFSSp

A simple modification of wrFSS originated wrFSSp: a penalty approach applied specifically in phase 2, in order to prevent feasible fishes from moving to infeasible positions that improve their fitness. Thus, the objective function of phase 2 is defined as in (22):

$$F(\mathbf{x}) = f(\mathbf{x}) + k\, \phi(\mathbf{x}), \quad (22)$$

where $k$ is a penalty coefficient.
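In the penalty form assumed for Equation (22), the phase-2 objective is simply (k is an illustrative penalty coefficient):

def penalized_fitness(f_x, phi_x, k=1e3):
    """Phase-2 objective of wrFSSp: fitness plus weighted violation."""
    return f_x + k * phi_x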

4 Experiments

In order to evaluate the performance of the different wrFSS versions, a set of constrained optimization problems defined for CEC 2010 [21] was solved.

The chosen problems and their features are presented in Table 1. The selected problems present feasible regions of different sizes and shapes. The feasible region ratio is the estimated proportion of the search space that is feasible, here reported for 10 dimensions.

Problem | Search Space | Equality Constraints | Inequality Constraints | Feasible Region (10D)
C01 | $[0, 10]^D$ | 0 | 2 | 0.997689
C03 | $[-1000, 1000]^D$ | 1 | 0 | 0.000000
C04 | $[-50, 50]^D$ | 4 | 0 | 0.000000
C06 | $[-600, 600]^D$ | 2 | 0 | 0.000000
C07 | $[-140, 140]^D$ | 0 | 1 | 0.505123
C08 | $[-140, 140]^D$ | 0 | 1 | 0.379512
C09 | $[-500, 500]^D$ | 1 | 0 | 0.000000
Table 1: Chosen CEC 2010 Problems

Two levels of each parameter were chosen in order to evaluate which combination of them produced the best results for C01 and C03. Based on that, the best performing parameter sets on C01 were extended to C07 and C08, and the parameters which generated the best results for C03 were also applied in the tests with C04, C06 and C09, due to the similarity of these problems in terms of feasible region. Table 2 presents the parameter values used in the tests performed.

Problem wrFSS wrFSSe wrFSSg wrFSSp
C01
C03
C04
C06
C07
C08
C09
Table 2: Parameters Definition

Table 3 presents the results obtained in 30 runs of each wrFSS variation when solving each of the aforementioned CEC problems. The same maximum number of iterations and equality-constraint tolerance were used in all tests. All wrFSS variations include the Stagnation Avoidance Routine [25] within the Individual movement operator, with its control parameter set to decay exponentially with the current iteration.

               wrFSS                  wrFSSe                 wrFSSg                 wrFSSp
               Fitness    Feasibility Fitness    Feasibility Fitness    Feasibility Fitness    Feasibility
C01  Mean     -5.91E-01   0.00E+00   -4.03E-01   0.00E+00   -5.76E-01   0.00E+00   -6.93E-01   0.00E+00
     SD        4.83E-02   0.00E+00    1.17E-01   0.00E+00    3.16E-02   0.00E+00    1.64E-02   0.00E+00
     min      -7.06E-01   0.00E+00   -7.42E-01   0.00E+00   -6.42E-01   0.00E+00   -7.20E-01   0.00E+00
     max      -4.95E-01   0.00E+00   -2.67E-01   0.00E+00   -5.15E-01   0.00E+00   -6.63E-01   0.00E+00
C03  Mean      6.33E+12   4.45E-05    4.01E+09   1.55E-05    5.20E+13   5.11E-05    7.71E+12   3.54E-05
     SD        5.54E+12   6.55E-05    8.37E+09   4.02E-05    1.46E+14   7.14E-05    1.45E+13   6.03E-05
     min       6.82E+10   0.00E+00    1.40E+03   0.00E+00    1.35E+12   0.00E+00    1.56E+10   0.00E+00
     max       2.31E+13   1.77E-04    3.47E+10   1.24E-04    7.82E+14   2.25E-04    5.97E+13   1.50E-04
C04  Mean      2.23E+00   6.26E-04    5.60E+00   1.20E-03    1.88E+00   7.46E-04    1.55E+00   7.26E-04
     SD        5.37E+00   2.85E-04    7.16E+00   5.43E-04    4.64E+00   3.54E-04    4.24E+00   2.79E-04
     min       1.17E-02   2.03E-04    1.51E-02   2.32E-04    1.19E-02   0.00E+00    4.40E-03   1.02E-04
     max       1.62E+01   1.53E-03    1.62E+01   2.46E-03    1.62E+01   1.61E-03    1.40E+01   1.38E-03
C06  Mean      2.92E+02   0.00E+00   -5.65E+02   0.00E+00   -5.20E+00   0.00E+00    3.04E+02   0.00E+00
     SD        9.40E+01   0.00E+00    3.55E+00   0.00E+00    1.51E+02   0.00E+00    8.60E+01   0.00E+00
     min       4.86E+01   0.00E+00   -5.71E+02   0.00E+00   -4.52E+02   0.00E+00    1.12E+02   0.00E+00
     max       4.55E+02   0.00E+00   -5.56E+02   0.00E+00    1.64E+02   0.00E+00    4.45E+02   0.00E+00
C07  Mean      5.09E+05   0.00E+00    5.01E+00   0.00E+00    5.88E+09   0.00E+00    4.32E+05   0.00E+00
     SD        3.17E+05   0.00E+00    6.63E+00   0.00E+00    4.23E+09   0.00E+00    2.40E+05   0.00E+00
     min       9.29E+04   0.00E+00    2.44E-01   0.00E+00    1.83E+09   0.00E+00    7.82E+04   0.00E+00
     max       1.74E+06   0.00E+00    3.44E+01   0.00E+00    2.02E+10   0.00E+00    1.17E+06   0.00E+00
C08  Mean      4.16E+09   0.00E+00    6.04E+01   0.00E+00    7.34E+09   0.00E+00    4.19E+09   0.00E+00
     SD        2.13E+09   0.00E+00    1.60E+01   0.00E+00    3.76E+09   0.00E+00    2.25E+09   0.00E+00
     min       4.64E+08   0.00E+00    3.72E+01   0.00E+00    1.01E+09   0.00E+00    1.11E+09   0.00E+00
     max       8.65E+09   0.00E+00    1.14E+02   0.00E+00    1.47E+10   0.00E+00    8.83E+09   0.00E+00
C09  Mean      4.57E+12   0.00E+00    3.61E+06   0.00E+00    9.52E+12   0.00E+00    4.39E+12   0.00E+00
     SD        2.06E+12   0.00E+00    1.40E+07   0.00E+00    4.89E+12   0.00E+00    1.79E+12   0.00E+00
     min       3.16E+11   0.00E+00    1.74E+03   0.00E+00    1.76E+12   0.00E+00    1.40E+12   0.00E+00
     max       7.84E+12   0.00E+00    6.66E+07   0.00E+00    2.59E+13   0.00E+00    7.97E+12   0.00E+00
Table 3: Results on the chosen CEC 2010 problems

From Table 3, it is noticeable that all the proposed algorithms were able to reach feasible solutions in all runs for problems C01, C07 and C08, which are those containing relatively large feasible regions. The same did not happen uniformly for C03, C04, C06 and C09, which are heavily constrained problems due to the presence of equality constraints. Specifically in C04, only wrFSSg was able to find feasible individuals. On the other hand, in C06 and C09 all the approaches were able to find feasible individuals in all runs. In C03, all the variations of wrFSS were able to find feasible individuals, but not in all runs.

The difficulty of the wrFSS variations in tackling some heavily constrained problems is related to the search mechanisms employed. The Individual movement operator is based on a local search performed with a random jump. Therefore, in situations in which the feasible regions are very small, random jumps may neither guarantee that a fish reaches such a region in phase 1, nor that a fish that has already reached it stays there.

In the specific case of C03, for instance, the feasible region degenerates into a line defined by its single equality constraint. Thus, even when fishes are able to reach the line, becoming feasible and changing the search mode from phase 1 to phase 2, once they perform the Individual movement operator, the random jumps do not allow them to move strictly along the line. This drawback was tackled by the application of the ε-method and of the gradient-based individual movement operators. However, these two methods still apply random jumps and, depending on the topological features of the feasible regions, the algorithm can thus fail to exploit fitness in phase 2.

Figure 1 supports the aforementioned analysis, displaying the mean Best Fish Fitness and Feasibility (constraint violation measure) along the iterations. It is possible to notice that, in C03, most approaches are not able to improve fitness once feasibility improves, up to the end of the search process. This means that a long phase 1 takes place in all versions, trying to find and keep the feasible line. The fast-improving fitness in wrFSSe happens because of the characteristic relaxation of the constraint violation present in the ε-method. Moreover, in C09, it is possible to notice that all versions of wrFSS are able to reach feasible regions, but the fitness does not improve when that happens. This means that once fishes reach feasible regions and change the search mode to phase 2, within a few iterations the feasibility state degenerates due to random jumps in infeasible directions, and phase 1 takes place again, preventing fitness convergence.

(a) Best Fish Fitness Convergence in C03
(b) Best Fish Feasibility Convergence in C03
(c) Best Fish Fitness Convergence in C09
(d) Best Fish Feasibility Convergence in C09
Figure 1: Best Fish Fitness and Feasibility Convergence

A fitness comparison is provided in Table 4. Three approaches ranked in CEC 2010's top ten, namely εDEag [30], E-ABC [23] and Co-CLPSO [17], were selected to be compared with the wrFSS variations.

The best mean fitness values reached among all the approaches as well as the best performing wrFSS variation are highlighted.

One can notice that εDEag outperforms all the other approaches on all the selected problems, except for C08, where Co-CLPSO reaches better results. However, regarding specifically the wrFSS variations, wrFSSe is the best wrFSS variation in 5 out of 7 tests. Further, in 6 out of 7 tests performed, some wrFSS variation outperforms at least one of the three approaches selected for comparison.

           wrFSS      wrFSSe     wrFSSg     wrFSSp     εDEag      Co-CLPSO   E-ABC
C01  Mean  -5.91E-01  -4.03E-01  -5.76E-01  -6.93E-01  -7.47E-01  -7.34E-01  -7.16E-01
     SD     4.83E-02   1.17E-01   3.16E-02   1.64E-02   1.32E-03   1.78E-02   2.69E-02
C03  Mean   6.33E+12   4.01E+09   5.20E+13   7.71E+12   0.00E+00   3.55E-01   2.45E+12
     SD     5.54E+12   8.37E+09   1.46E+14   1.45E+13   0.00E+00   1.78E+00   1.01E+12
C04  Mean   2.23E+00   5.60E+00   1.88E+00   1.55E+00  -9.92E-06  -9.34E-06   8.56E-01
     SD     5.37E+00   7.16E+00   4.64E+00   4.24E+00   1.55E-07   1.07E-06   3.01E+00
C06  Mean   2.92E+02  -5.65E+02  -5.20E+00   3.04E+02  -5.79E+02  -5.79E+02   4.38E+02
     SD     9.40E+01   3.55E+00   1.51E+02   8.60E+01   3.63E-03   5.73E-04   8.60E+01
C07  Mean   5.09E+05   5.01E+00   5.88E+09   4.32E+05   0.00E+00   7.97E-01   7.16E+01
     SD     3.17E+05   6.63E+00   4.23E+09   2.40E+05   0.00E+00   1.63E+00   5.19E+01
C08  Mean   4.16E+09   6.04E+01   7.34E+09   4.19E+09   6.73E+00   6.09E-01   4.11E+02
     SD     2.13E+09   1.60E+01   3.76E+09   2.25E+09   5.56E+00   1.43E+00   9.36E+02
C09  Mean   4.57E+12   3.61E+06   9.52E+12   4.39E+12   0.00E+00   1.99E+10   2.02E+12
     SD     2.06E+12   1.40E+07   4.89E+12   1.79E+12   0.00E+00   9.97E+10   1.81E+12
Table 4: Fitness values comparison with other approaches

5 Conclusion

Several problems within industry and academia are constrained. Therefore, many approaches employ metaheuristic procedures in order to efficiently solve this class of problems. Different search strategies have been developed and applied in both Evolutionary Computation and Swarm Intelligence techniques.

The first contribution of this work is the proposal of a new approach to tackle constrained optimization tasks: the separation of objective function and constraint violation through the division of the search process in two phases. Phase 1 is intended to make the swarm find many different feasible regions; after that, phase 2 takes place in order to exploit the feasible regions in terms of fitness values.

This strategy, mainly in phase 1, requires a niching-capable algorithm. Thus, we selected wFSS, the multi-modal version of the Fish School Search algorithm, as the base algorithm, and conceived a variation of wFSS named wrFSS embedding the two-phase strategy. Moreover, we proposed three variations of wrFSS applying different strategies in order to improve its performance.

In order to evaluate the proposed techniques, seven problems from CEC 2010 were solved. Results show that wrFSS and its variations are able to solve many hard constrained optimization problems. However, in some cases, specifically in problems whose feasible regions are much wider in some directions than in others, the algorithm's local search procedure makes it difficult for wrFSS to keep solutions feasible once phase 1 finishes. Even so, a comparison with three approaches among CEC 2010's top-ten winners showed that some wrFSS variation outperforms at least one of these techniques in almost all problems solved in this work. wrFSSe was the best performing of the proposed versions.

It is important to highlight that the approaches used for comparison apply costly search mechanisms that were not employed in wrFSS, such as local optimization procedures and external archives.

Based on the above, the proposed strategy of dividing the search process in two phases and applying a niching swarm optimization technique to find many feasible regions in phase 1 is an interesting approach to be further explored. In future work, improvements to wrFSS could include a hybridization between wrFSSe and wrFSSg in order to improve the local search ability of wrFSSe, the best performing variation. Moreover, other resources, such as local optimization operators and external-archive-related operators, could be employed to solve the issues highlighted in this work.

References

  • [1] B. Akay and D. Karaboga. Artificial bee colony algorithm for large-scale problems and engineering design optimization. Journal of Intelligent Manufacturing, 23(4):1001–1014, 2012.
  • [2] C. J. A. Bastos-Filho and A. C. S. Guimarães. Multi-Objective Fish School Search. International Journal of Swarm Intelligence Research, 6(1):23–40, 2015.
  • [3] M. Bonyadi, X. Li, and Z. Michalewicz. A hybrid particle swarm with velocity mutation for constraint optimization problems. Proceeding of the fifteenth annual conference on Genetic and evolutionary computation conference - GECCO ’13, page 1, 2013.
  • [4] I. Brajevic and M. Tuba. An upgraded artificial bee colony (ABC) algorithm for constrained optimization problems. Journal of Intelligent Manufacturing, 24(4):729–740, 2013.
  • [5] J. Brest. Constrained real-parameter optimization with ε-self-adaptive differential evolution. Studies in Computational Intelligence, 198:73–93, 2009.
  • [6] M. Campos and R. A. Krohling. Hierarchical bare bones particle swarm for solving constrained optimization problems. 2013 IEEE Congress on Evolutionary Computation, CEC 2013, pages 805–812, 2013.
  • [7] P. Chootinan and A. Chen. Constraint handling in genetic algorithms using a gradient-based repair method. Computers & operations research, 33(8):2263–2281, 2006.
  • [8] F. B. De Lima Neto and M. G. P. De Lacerda. Multimodal fish school search algorithms based on local information for school splitting. Proceedings - 1st BRICS Countries Congress on Computational Intelligence, BRICS-CCI 2013, pages 158–165, 2013.
  • [9] K. Deb. An efficient constraint handling method for genetic algorithms. Computer Methods in Applied Mechanics and Engineering, 186(2-4):311–338, 2000.
  • [10] C. J. A. Bastos-Filho, F. B. de Lima Neto, A. J. C. C. Lins, A. I. S. Nascimento, and M. P. Lima. A novel search algorithm based on fish school behavior. Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics, pages 2646–2651, 2008.
  • [11] N. Hamza, D. Essam, and R. Sarker. Constraint Consensus Mutation based Differential Evolution for Constrained Optimization. IEEE Transactions on Evolutionary Computation, (c):1–1, 2015.
  • [12] X. Hu and R. Eberhart. Solving Constrained Nonlinear Optimization Problems with Particle Swarm Optimization. Optimization, 2(1):1677–1681, 2002.
  • [13] A. R. Jordehi. A review on constraint handling strategies in particle swarm optimisation. Neural Computing and Applications, 26(6):1265–1275, 2015.
  • [14] S. Koziel and Z. Michalewicz. Evolutionary algorithms, homomorphous mappings, and constrained parameter optimization. Evolutionary computation, 7(1):19–44, 1999.
  • [15] R. Landa Becerra and C. A. C. Coello. Cultured differential evolution for constrained optimization. Computer Methods in Applied Mechanics and Engineering, 195(33-36):4303–4322, 2006.
  • [16] X. Li and M. Yin. Self-adaptive constrained artificial bee colony for constrained numerical optimization. Neural Computing and Applications, 24(3-4):723–734, 2014.
  • [17] J. J. Liang, S. Zhigang, and L. Zhihui. Coevolutionary comprehensive learning particle swarm optimizer. 2010 IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 IEEE Congress on Evolutionary Computation, CEC 2010, 450001(2):1–8, 2010.
  • [18] C.-H. Lin. A rough penalty genetic algorithm for constrained optimization. Information Sciences, 241:119–137, 2013.
  • [19] J. Liu, K. L. Teo, X. Wang, and C. Wu. An exact penalty function-based differential search algorithm for constrained global optimization. Soft Computing, 20(4):1305–1313, 2016.
  • [20] S. S. Madeiro, F. B. De Lima-Neto, C. J. A. Bastos-Filho, and E. M. Do Nascimento Figueiredo. Density as the segregation mechanism in fish school search for multimodal optimization problems. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 6729 LNCS(PART 2):563–572, 2011.
  • [21] R. Mallipeddi and P. N. Suganthan. Ensemble of constraint handling techniques. IEEE Transactions on Evolutionary Computation, 14(4):561–579, 2010.
  • [22] E. Mezura-Montes and C. A. Coello Coello. Constraint-handling in nature-inspired numerical optimization: Past, present and future. Swarm and Evolutionary Computation, 1(4):173–194, 2011.
  • [23] E. Mezura-Montes and R. E. Velez-Koeppel. Elitist Artificial Bee Colony for constrained real-parameter optimization. 2010 IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 IEEE Congress on Evolutionary Computation, CEC 2010, 2010.
  • [24] J. B. Monteiro, I. M. C. Albuquerque, F. B. L. Neto, and F. V. S. Ferreira. Comparison on novel fish school search approaches. 16th International Conference on Intelligent Systems Design and Applications, 2016.
  • [25] J. B. Monteiro, I. M. C. Albuquerque, F. B. L. Neto, and F. V. S. Ferreira. Optimizing multi-plateau functions with FSS-SAR (Stagnation Avoidance Routine). IEEE Symposium Series on Computational Intelligence, 2016.
  • [26] J. A. G. Sargo, S. M. Vieira, J. M. C. Sousa, and C. J. A. B. Filho. Binary Fish School Search applied to feature selection: Application to ICU readmissions. IEEE International Conference on Fuzzy Systems, pages 1366–1373, 2014.
  • [27] T. Takahama and S. Sakai. Constrained Optimization by ε Constrained Particle Swarm Optimizer with ε-level Control. In 4th IEEE International Workshop on Soft Computing as Transdisciplinary Science and Technology, pages 1019–1029, 2005.
  • [28] T. Takahama and S. Sakai. Constrained Optimization by the Constrained Differential Evolution with Gradient-Based Mutation and Feasible Elites. IEEE Congress on Evolutionary Computation, pages 1–8, 2006.
  • [29] T. Takahama and S. Sakai. Solving difficult constrained optimization problems by the constrained differential evolution with gradient-based mutation. Studies in Computational Intelligence, 198:51–72, 2009.
  • [30] T. Takahama and S. Sakai. Constrained Optimization by the Constrained Differential Evolution with an Archive and Gradient-Based Mutation. IEEE Congress on Evolutionary Computation, (1):1–8, 2010.
  • [31] T. Takahama, S. Sakai, and N. Iwane. Constrained Optimization by the ε Constrained Hybrid Algorithm of Particle Swarm Optimization and Genetic Algorithm. Advances in Artificial Intelligence, 3809(1):389–400, 2005.