2014 - Discrete Cuckoo Search Algorithm For The Travelling Salesman Problem
Neural Comput & Applic (2014) 24:1659–1669
DOI 10.1007/s00521-013-1402-2
ORIGINAL ARTICLE
Received: 24 November 2012 / Accepted: 28 March 2013 / Published online: 11 April 2013
© Springer-Verlag London 2013
variety of optimisation problems and have many advantages over traditional algorithms. Two of these advantages are simplicity and flexibility. Metaheuristics are usually simple to implement, yet they can often solve complex problems and can thus be adapted to many real-world optimization problems, from the fields of operations research and engineering to artificial intelligence [10, 11, 12, 36]. In addition, these algorithms are very flexible: they can deal with problems whose objective functions are continuous, discrete, or mixed. Such flexibility also enables them to handle a large number of parameters simultaneously.

Metaheuristic algorithms use search strategies to explore the search space more effectively, often focusing on promising regions of that space. These methods begin with a set of initial solutions or an initial population, and then examine, step by step, a sequence of solutions in order to reach, or hope to approach, the optimal solution to the problem of interest. Among the most popular metaheuristics we can name a few: genetic algorithms (GA), Tabu search (TS), simulated annealing (SA), and ant colony optimisation (ACO), which are presented in Glover and Kochenberger [14], as well as particle swarm optimisation (PSO) [16]. Bee colony optimisation (BCO) [30], the monkey search algorithm (MS) [22], the harmony search algorithm (HS) [13], the firefly algorithm (FA) [32], intelligent water drops (IWD) [26], the bat-inspired algorithm (BA) [33], cuckoo search (CS) [34], and krill herd (KH) [9] are among the recently proposed metaheuristics. Most of these metaheuristics are nature-inspired, mimicking the successful features of the underlying biological, physical, or sociological systems. The success of these methods in solving various problems, such as combinatorial optimisation problems, comes from their relative ease of implementation, their good adaptation to practical applications with proper consideration of the constraints, and their ability to produce high-quality solutions [29]. However, some algorithms produce better solutions for some particular problems than others, and there is no single algorithm that solves all optimisation problems. The development of new metaheuristics therefore remains a great challenge, especially for tough NP-hard problems [31].

Many metaheuristic algorithms have been applied to the TSP by various researchers, such as SA [4], TS [20], GA [17], ACO [8], discrete particle swarm optimisation (DPSO) [27], the genetic simulated annealing ant colony system with particle swarm optimisation techniques (GSA-ACS-PSOT) [6], and fuzzy particle swarm optimisation with simulated annealing and neighbourhood information (PSO-NIC-SA) [1]. This paper introduces a new variant of cuckoo search (CS) so as to improve it and to efficiently solve the symmetric TSP. CS is a metaheuristic search algorithm recently developed by Yang and Deb in 2009 [34, 35]. This novel algorithm has been shown to be very effective in solving continuous optimisation problems. Inspired by the obligate brood parasitic behaviour of some cuckoo species, combined with Lévy flights, which describe the foraging patterns adopted by many animals and insects, CS is a good example of a nature-inspired metaheuristic. CS is also characterized by a reduced number of parameters and has provided effective results for multimodal functions in comparison with both genetic algorithms (GA) and particle swarm optimisation (PSO) [35].

This paper is organized as follows: Sect. 2 first briefly introduces the standard CS and then describes the improvement carried out on the source of inspiration of CS. Section 3 briefly introduces the TSP. Section 4 describes the discrete CS for solving the symmetric TSP. Section 5 presents in detail the results of numerical experiments on a set of benchmarks of the symmetric TSP from the TSPLIB library [24]. Finally, Sect. 6 concludes with some discussions.

2 Cuckoo search algorithm (CS)

2.1 Basic CS

Among the many interesting features of cuckoo species, a striking one is that some species engage in so-called brood parasitism: female cuckoos lay their eggs in the nests of other species, letting the host birds hatch and brood the young cuckoo chicks. To increase the probability of producing a new cuckoo and to reduce the probability of the eggs being abandoned by the host birds, cuckoos (female, male, and young) use several strategies [23].

In the standard cuckoo search algorithm (CS) [34], a cuckoo searches for a new nest via Lévy flights. Lévy flights, named after the French mathematician Paul Lévy, are a model of random walks whose step lengths obey a power-law distribution. Several scientific studies have shown that the search for prey by many hunters typically exhibits the characteristics of Lévy flights, and Lévy flights are widely used in optimisation and in many fields of science [5, 28, 34].
CS is a metaheuristic search algorithm which was developed by Xin-She Yang and Suash Deb in 2009, initially designed for solving multimodal functions. CS, as shown in Algorithm 1, is summarized by the following idealized rules [34]: (1) each cuckoo lays one egg at a time and selects a nest randomly; (2) the best nests with the highest-quality eggs pass on to the next generations; (3) the number of host nests is fixed, and the egg laid by a cuckoo can be discovered by the host bird with a probability pa ∈ [0, 1].

Equation (1) is used to produce a new solution x_i^(t+1), for a cuckoo i, by a Lévy flight:

x_i^(t+1) = x_i^(t) + α ⊕ Lévy(s, λ)    (1)

where α (α > 0) is the step size. In the general case, α should be associated with the scales of the problem of interest, though α = O(1) can be used in many cases. The step length follows the Lévy distribution

Lévy(s, λ) ~ s^(−λ), (1 < λ ≤ 3)    (2)

which has an infinite variance and an infinite mean [34]. Here, s is the step size drawn from the Lévy distribution.

2.2 Improved CS

CS has demonstrated superior performance compared with PSO and GA (in solving multimodal functions) through its better strategy of exploring the solution space. This strategy is enhanced by Lévy flights, which play an important role in controlling the balance between intensification and diversification, and the reduced number of parameters makes CS more versatile [34].

There is still some room for improvement in CS, both in terms of the inspiration source and of the algorithm itself. The strength of CS lies in the way a cuckoo exploits and explores the solution space. A cuckoo can have some 'intelligence' so as to find much better solutions, so we can control intensification and diversification through the cuckoo's mobility in the search space. The proposed improvement considers a cuckoo as a first level of control over intensification and diversification, and since such a cuckoo is an individual of a population, we can regard the population as a second level of control, which can be restructured by adding a new category of cuckoos that are smarter and more efficient in their search.

Studies show that cuckoos can also engage in a kind of surveillance of nests that are likely hosts [23]. This behaviour can serve as inspiration for a new category of cuckoos that have the ability to change the host nest during incubation to avoid abandonment of their eggs. These cuckoos use mechanisms before and after brooding, such as observing the host nest to decide whether it is the best choice or not (if not, they look for a new nest that is much better for the egg). In this case, we can talk about a kind of local search performed by a fraction of cuckoos around current solutions.

For simplicity, we can divide the mechanism adopted by this new fraction of cuckoos in our proposed approach into two main steps: (1) a cuckoo initially moves by Lévy flights towards a new solution (which represents a new area); (2) from this current solution, the cuckoo seeks a new, better solution within the same area. Following these two steps, the search mechanism with a new fraction pc can be introduced directly into the standard CS algorithm. The population of the improved CS algorithm (Algorithm 2) can thus be structured in terms of three types of cuckoos:

1. A cuckoo, seeking (from the best position) areas which may contain new solutions that are much better than the solution of an individual randomly selected in the population;
2. A fraction pa of cuckoos that seek new solutions far from the best solution;
3. A fraction pc of cuckoos that search for solutions from the current position and try to improve them. They move from one region to another via Lévy flights to locate the best solution in each region without being trapped in a local optimum.

The goal of this improvement is to strengthen intensive search around the best solutions of the population while, at the same time, using randomization properly to explore new areas via Lévy flights. Thus, the extension to the standard CS is the addition of a method that handles the fraction pc of smart cuckoos.

We can expect that this new category of cuckoo enables CS to perform more efficiently, with fewer iterations, and gives better resistance against potential traps and stagnation in local optima in the case of the TSP. It allows the adaptation of CS to the TSP with more control over intensification and diversification, using fewer parameters. The adaptation of CS for solving the symmetric TSP is described in Sect. 4.
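The Lévy-flight update of Eqs. (1)–(2) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the inverse-transform power-law sampler, the minimum step s_min, and the parameter values are assumptions chosen only to make the example runnable.

```python
import random

def levy_step(lam=1.5, s_min=0.1):
    """Draw a step length s with a power-law tail P(s) ~ s^(-lam), s >= s_min.

    Inverse-transform sampling of a Pareto-type distribution; lam should
    satisfy 1 < lam <= 3, as in Eq. (2)."""
    u = random.random()  # uniform in [0, 1)
    return s_min * (1.0 - u) ** (-1.0 / (lam - 1.0))

def cuckoo_update(x, alpha=0.01, lam=1.5):
    """Eq. (1): x_i^(t+1) = x_i^(t) + alpha (+) Levy(s, lam).

    Applied coordinate-wise with a random sign: the cuckoo mostly makes
    small local moves, with occasional long jumps."""
    return [xi + alpha * random.choice((-1.0, 1.0)) * levy_step(lam) for xi in x]
```

Because the tail exponent is below 2, the sampled steps have infinite variance: most moves stay close to the current solution (intensification), while rare large steps jump to new regions (diversification), which is exactly the balance discussed above.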
3 The traveling salesman problem

The traveling salesman problem (TSP) [15, 18] is defined by N cities and a distance matrix D = (d_ij) of size N × N which gives the distances between all pairs of cities. In the TSP, the objective is to find a tour (i.e., a closed path) which visits each city exactly once and has minimum length. A tour can be represented as a cyclic permutation π = (π(1), π(2), ..., π(N)) of the cities from 1 to N, where π(i) is interpreted as the city visited in step i, i = 1, ..., N. The cost of a permutation (tour) is defined as:

f(π) = Σ_{i=1}^{N−1} d_{π(i)π(i+1)} + d_{π(N)π(1)}    (3)

If the distances satisfy d_ij = d_ji for 1 ≤ i, j ≤ N, the problem is the symmetric TSP.

The TSP can be modelled as a weighted graph: the vertices of the graph correspond to cities, the graph's edges correspond to connections between cities, and the weight of an edge is the corresponding connection distance. A TSP tour then becomes a hamiltonian cycle, and an optimal TSP tour is the shortest hamiltonian cycle.

4 Adaptation of CS to the TSP

The main idea of our present work is that the improved CS seeks good solutions, using local search, in areas specified by Lévy flights. We can say that the two approaches, improved CS and local search, constitute a single entity for finding solutions of good quality. The weakness of local search is that it is likely to be trapped in a local optimum. This can easily be remedied by our improved CS, which moves by zones rather than by individual solutions, which should minimize the probability of falling into local optima.

One of the objectives in extending CS to solve the traveling salesman problem (TSP) is to keep its main advantages and to integrate them into the discrete version of the improved CS. The process of adapting CS to the TSP focuses mainly on a reinterpretation of the terminology used in the basic CS. CS and its sources of inspiration can be structured and explained through the following five main elements: egg, nest, objective function, search space, and Lévy flights. These key elements can take on important meanings for combinatorial problems.

4.1 The egg

If we assume that a cuckoo lays a single egg in one nest, we can give eggs the following properties:

– An egg in a nest is a solution represented by one individual in the population;
– An egg of a cuckoo is a new candidate solution for a place/location in the population.

We can say that an egg is the equivalent of a hamiltonian cycle. Here, we neglect the need for a fixed departure city for the circuits and also the direction of the tour taken by the salesman.

4.2 The nest

In CS, the following features can be imposed concerning a nest:

– The number of nests is fixed;
– A nest is an individual of the population, and the number of nests is equal to the size of the population;
– An abandoned nest involves the replacement of an individual of the population by a new one.

By projecting these features onto the TSP, we can say that a nest appears as an individual in the population with its own hamiltonian cycle. Obviously, a nest could hold multiple eggs in future extensions; in the present paper, each nest contains only one egg, for simplicity.

4.3 The objective function

Each solution in the search space is associated with a numeric objective value, so the quality of a solution is proportional to the value of the objective function. In CS, a nest egg of better quality will lead to new generations. This means that the quality of a cuckoo's egg is directly related to its ability to give rise to a new cuckoo. For the traveling salesman problem, the quality of a solution is related to the length of the hamiltonian cycle: the best solution is the one with the shortest hamiltonian cycle.

4.4 Search space

In the two-dimensional case, the search space represents the positions of potential nests. These positions are (x, y) ∈ R × R. To change the position of a nest, we only have to modify the actual values of its coordinates, so moving nests or nest locations imposes no real constraints. This is the case in most continuous optimisation problems, and it can be considered an advantage that avoids many technical obstacles, such as representing coordinates in the solution space of the TSP, especially in the mechanism for moving a solution from one neighbourhood to another. In the TSP, the coordinates of the cities are fixed; however, the order in which the cities are visited can be changed.
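As a concrete illustration of Eq. (3), the tour cost can be computed directly from the permutation and the distance matrix. The paper's implementation is in Java; the following is a minimal Python sketch with illustrative names:

```python
def tour_length(perm, d):
    # Eq. (3): f(pi) = sum_{i=1}^{N-1} d[pi(i)][pi(i+1)] + d[pi(N)][pi(1)].
    # perm is a 0-indexed permutation of range(N); d is the N x N distance matrix.
    n = len(perm)
    return sum(d[perm[i]][perm[i + 1]] for i in range(n - 1)) + d[perm[-1]][perm[0]]
```

For a symmetric instance, d[i][j] == d[j][i], so the length is independent of the tour's direction, matching the remark above that the departure city and the direction of the tour are neglected.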
4.4.1 Moving in the search space

Since the coordinates of the cities are fixed, movements are based on the order of the visited cities. There are several methods, operators, or perturbations that generate a new solution from an existing one by changing the order of the visited cities.

In the adaptation of CS to the TSP, we have a discrete CS in which the perturbations used to change the order of the visited cities are 2-opt moves [7] and double-bridge moves [21]. The 2-opt move is used for small perturbations, while large perturbations are made by the double-bridge move. A 2-opt move, as shown in Fig. 1, removes two edges from a tour (a solution, or hamiltonian cycle) and reconnects the two paths created. A double-bridge move cuts four edges and introduces four new ones, as shown in Fig. 2.

Fig. 1 2-opt move. a Initial tour. b The tour created by a 2-opt move [the edges (a, b) and (c, d) are removed, while the edges (a, c) and (b, d) are added]

Fig. 2 Double-bridge move. a Initial tour. b The tour created by a double-bridge move [the edges (a, b), (c, d), (e, f) and (g, h) are replaced by the edges (a, f), (c, h), (e, b) and (g, d), respectively]

4.4.2 The neighbourhood

In continuous problems, the meaning of neighbourhood is obvious. For combinatorial problems, however, the notion of neighbourhood requires that the neighbour of a given solution be generated by the smallest perturbation, one that makes the minimum change to the solution. This leads to the 2-opt move because, for a new solution, the minimum number of non-contiguous edges that can be deleted is two; the 2-opt move is therefore a good candidate for this type of perturbation.

4.4.3 The step

The step of a movement is the distance between two solutions. It is based on the topology of the space and the concept of neighbourhood. The step length is proportional to the number of successive 2-opt moves applied to a solution; a big step is represented by a double-bridge move.

4.5 Lévy flights

Lévy flights are characterized by an intensive search around a solution, followed by occasional big steps in the long run. According to Yang and Deb [34], in some optimisation problems the search for a new best solution is more efficient via Lévy flights. In order to improve the quality of the search, we associate the step length with the value generated by Lévy flights, as outlined in the standard CS.

5 Experimental results

The proposed basic and improved discrete cuckoo search (DCS) algorithms are tested on some instances (benchmarks) of the TSP taken from the publicly available electronic library TSPLIB of TSP problems [24]. Most of the instances included in TSPLIB have already been solved in the literature, and their optimality results can be used to compare algorithms. Forty-one instances are considered, with sizes ranging from 51 to 1379 cities. According to Reinelt [24], all these TSP instances belong to the Euclidean distance type. A TSP instance provides cities with their coordinates, and the numerical value in the name of an instance represents the number of cities, e.g., the instance named eil51 has 51 cities.

A comparison between the two algorithms, the basic DCS and the improved DCS, is first carried out. Then, the improved DCS algorithm is compared with some other recent methods: the genetic simulated annealing ant colony system with particle swarm optimisation techniques (GSA-ACS-PSOT) [6] and discrete particle swarm optimisation (DPSO) [27]. Notice that in Chen and Chien [6], the authors compared their proposed method with other metaheuristic algorithms for solving the TSP in the literature.

We have implemented the basic/standard and improved DCS algorithms in Java under a 32-bit Vista operating system. Experiments are conducted on a laptop with an Intel(R) Core 2 Duo 2.00 GHz CPU and 3 GB of RAM. The values of the parameters of the proposed algorithm are selected based on some preliminary trials. The selected parameters in both algorithms (basic and improved DCS)
are those values that gave the best results concerning both the solution quality and the computational time. The parameter settings used in the experiments are shown in Table 1. In each case study, 30 independent runs of the algorithms with these parameters are carried out. Figure 3 shows that the maximum number of iterations (MaxGeneration) can be set to 500 for both algorithms.

Table 1 Parameter settings for both algorithms, basic and improved DCS

Parameter      Value  Meaning
n              20     Population size
pa             0.2    Portion of bad solutions
pc             0.6    Portion of intelligent cuckoos (only for the improved DCS)
MaxGeneration  500    Maximum number of iterations

Tables 2 and 3 summarize the experimental results, where the first column shows the name of the instance, the column 'opt' shows the optimal solution length taken from TSPLIB, the column 'best' shows the length of the best solution found by each algorithm, the column 'average' gives the average solution length over the 30 independent runs of each algorithm, the column 'worst' shows the length of the worst solution found by each algorithm, the column 'PDav(%)' denotes the percentage deviation of the average solution length from the optimal solution length over 30 runs, the column 'PDbest(%)' gives the percentage deviation of the best solution length from the optimal solution length over 30 runs, and the column 'time' shows the average time in seconds for the 30 runs. The percentage deviation of a solution from the best known solution (or optimal solution, if known) is given by the formula:

PDsolution(%) = (solution length − best known solution length) / (best known solution length) × 100    (4)

In Table 2, the experimental results of the comparison between the basic DCS and improved DCS algorithms are given. It can be seen from this table, and from Figs. 3 and 4, that the improved DCS is superior to the basic DCS with regard to both PDav(%) and PDbest(%): the improved DCS obtains the smallest values for the fourteen TSP instances. The higher performance of the improved DCS relative to the basic DCS may be due to the improvement applied to the basic DCS through the new category of cuckoos, which has an efficient method of generating new solutions by moving from one area to another to find one of the best solutions of each area.

Table 3 presents the computational results of the improved DCS algorithm on 41 TSPLIB instances. The column 'SD' denotes the standard deviation, which takes the value 0.00 (shown in bold) when all solutions found over the 30 runs have the same length, while the column 'C1%/Copt' gives the number of solutions that are within 1 % of optimality (over 30 runs)/the number of optimal solutions. With respect to PDbest(%), 90.24 % of the values of PDbest(%) are less than 0.5 %, which means that the best solution found over the 30 trials is within 0.5 % of the best known solution, while the value 0.00 shown in bold in column PDav(%) indicates that all solutions found over the 30 trials have the same length as the best known solution. The numerical values presented in Table 3 show that the improved DCS can indeed provide good solutions in a reasonable time.

In Tables 4 and 5, the experimental results of the improved DCS algorithm are compared with the two methods GSA-ACS-PSOT and DPSO. The results of these two methods are summarized directly from the original papers [6, 27]. It can be seen clearly from Tables 4 and 5 that DCS outperforms the other two algorithms (GSA-ACS-PSOT and DPSO) in solving all of the eighteen/five tested TSP instances. The proposed DCS algorithm obtains fifteen/five best solutions, while GSA-ACS-PSOT/DPSO obtains only eleven/two best solutions among the eighteen/five TSP instances. Moreover, we find that the average of the SDs/PDav(%)s is equal to 161.55/3.54 for the GSA-ACS-PSOT/

Fig. 3 Average length of the best solutions of 10 runs for eil51 (opt = 426) and lin318 (opt = 42029)
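The two perturbations of Sect. 4.4.1 are easy to state on an array-based tour. A minimal sketch (the index conventions are ours, chosen to match the edge-replacement descriptions of Figs. 1 and 2):

```python
def two_opt(tour, i, k):
    """Small step: remove the edges (tour[i-1], tour[i]) and (tour[k], tour[k+1]),
    then reconnect by reversing the segment tour[i..k]."""
    return tour[:i] + tour[i:k + 1][::-1] + tour[k + 1:]

def double_bridge(tour, i, j, k):
    """Big step: cut the tour into four segments A = tour[:i], B = tour[i:j],
    C = tour[j:k], D = tour[k:] (with 0 < i < j < k < len(tour)) and reorder
    them as A C B D, replacing four edges with four new ones."""
    return tour[:i] + tour[j:k] + tour[i:j] + tour[k:]
```

A 2-opt move changes only two non-contiguous edges, so it plays the role of the smallest useful step; a double-bridge changes four edges at once and cannot be undone by a single 2-opt, which makes it a natural long jump.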
Table 2 Comparison of both algorithms, the basic DCS and the improved DCS, on 14 TSP benchmark instances from TSPLIB

Instance Opt | Basic DCS: Best Average Worst PDav(%) PDbest(%) Time(s) | Improved DCS: Best Average Worst PDav(%) PDbest(%) Time(s)
eil51 426 426 439 459 3.05 0.00 2.05 426 426 426 0.00 0.00 1.16
berlin52 7,542 7,542 7,836.4 8,356 3.90 0.00 2.24 7,542 7,542 7,542 0.00 0.00 0.09
st70 675 675 696.9 725 3.24 0.00 3.54 675 675 675 0.00 0.00 1.56
pr76 108,159 108,469 111,225.6 116,278 2.83 0.28 4.22 108,159 108,159 108,159 0.00 0.00 4.73
eil76 538 544 565.7 599 5.14 1.11 4.27 538 538.03 539 0.00 0.00 6.54
kroA100 21,282 21,515 22,419.96 23,444 5.34 1.09 6.53 21,282 21,282 21,282 0.00 0.00 2.70
kroB100 22,141 22,335 23,417.06 25,177 5.76 0.87 6.69 22,141 22,141.53 22,157 0.00 0.00 8.74
eil101 629 648 669.4 699 6.42 3.02 7.59 629 630.43 633 0.22 0.00 18.7
bier127 118,282 120,858 127,832.23 159,788 8.07 2.17 9.96 118,282 118,359.63 118,730 0.06 0.00 25.50
ch130 6,110 6,309 6,661.23 7,883 9.02 3.25 10.60 6,110 6,135.96 6,174 0.42 0.00 23.12
ch150 6,528 6,913 7,234.9 8,064 10.82 5.89 18.06 6,528 6,549.90 6,611 0.33 0.00 27.74
kroA150 26,524 27,328 28,928.83 32,786 9.06 3.03 14.24 26,524 26,569.26 26,767 0.17 0.00 31.23
kroA200 29,368 30,641 32,896.03 37,993 12.01 4.33 23.43 29,382 29,446.66 29,886 0.26 0.04 62.08
lin318 42,029 44,278 49,623.96 69,326 18.07 5.35 72.55 42,125 42,434.73 42,890 0.96 0.22 156.17
Table 3 Computational results of Improved DCS algorithm for 41 TSP benchmark instances for TSPLIB
Instance Opt Best Worst Average SD PDav(%) PDbest(%) C1%/Copt time
eil51 426 426 426 426 0.00 0.00 0.00 30/30 1.16
berlin52 7542 7,542 7,542 7,542 0.00 0.00 0.00 30/30 0.09
st70 675 675 675 675 0.00 0.00 0.00 30/30 1.56
pr76 108,159 108,159 108,159 108,159 0.00 0.00 0.00 30/30 4.73
eil76 538 538 539 538.03 0.17 0.00 0.00 30/29 6.54
kroA100 21,282 21,282 21,282 21,282 0.00 0.00 0.00 30/30 2.70
kroB100 22,141 22,141 22,157 22,141.53 2.87 0.00 0.00 30/29 8.74
kroC100 20,749 20,749 20,749 20,749 0.00 0.00 0.00 30/30 3.36
kroD100 21,294 21,294 21,389 21,304.33 21.79 0.04 0.00 30/19 8.35
kroE100 22,068 22,068 22,121 22,081.26 18.50 0.06 0.00 30/18 14.18
eil101 629 629 633 630.43 1.14 0.22 0.00 30/6 18.74
lin105 14,379 14,379 14,379 14,379 0.00 0.00 0.00 30/30 5.01
pr107 44,303 44,303 44,358 44,307.06 12.90 0.00 0.00 30/27 12.89
pr124 59,030 59,030 59,030 59,030 0.00 0.00 0.00 30/30 3.36
bier127 118,282 118,282 118,730 118,359.63 12.73 0.06 0.00 30/18 25.50
ch130 6,110 6,110 6,174 6,135.96 21.24 0.42 0.00 28/7 23.12
pr136 96,772 96,790 97,318 97,009.26 134.43 0.24 0.01 30/0 35.82
pr144 58,537 58,537 58,537 58,537 0.00 0.00 0.00 30/30 2.96
ch150 6,528 6,528 6,611 6,549.9 20.51 0.33 0.00 29/10 27.74
kroA150 26,524 26,524 26,767 26,569.26 56.26 0.17 0.00 30/7 31.23
kroB150 26,130 26,130 26,229 26,159.3 34.72 0.11 0.00 30/5 33.01
pr152 73,682 73,682 73,682 73,682 0.00 0.00 0.00 30/30 14.86
rat195 2,323 2,324 2,357 2,341.86 8.49 0.81 0.04 20/0 57.25
d198 15,780 15,781 15,852 15,807.66 17.02 0.17 0.00 30/0 59.95
kroA200 29,368 29,382 29,886 29,446.66 95.68 0.26 0.04 29/0 62.08
kroB200 29,437 29,448 29,819 29,542.49 92.17 0.29 0.03 28/0 64.06
ts225 126,643 126,643 126,810 126,659.23 44.59 0.01 0.00 30/26 47.51
tsp225 3,916 3,916 3,997 3,958.76 20.73 1.09 0.00 9/1 76.16
pr226 80,369 80,369 80,620 80,386.66 60.31 0.02 0.00 30/19 50.00
gil262 2,378 2,382 2,418 2,394.5 9.56 0.68 0.16 22/0 102.39
pr264 49,135 49,135 49,692 49,257.5 159.98 0.24 0.00 28/13 82.93
a280 2,579 2,579 2,623 2,592.33 11.86 0.51 0.00 25/4 115.57
pr299 48,191 48,207 48,753 48,470.53 131.79 0.58 0.03 27/0 138.20
lin318 42,029 42,125 42,890 42,434.73 185.43 0.96 0.22 15/0 156.17
rd400 15,281 15,447 15,704 15,533.73 60.56 1.65 1.08 0/0 264.94
fl417 11,861 11,873 11,975 11,910.53 20.45 0.41 0.10 30/0 274.59
pr439 107,217 107,447 109,013 107,960.5 438.15 0.69 0.21 22/0 308.75
rat575 6,773 6,896 7,039 6,956.73 35.74 2.71 1.81 0/0 506.67
rat783 8,806 9,043 9,171 9,109.26 38.09 3.44 2.69 0/0 968.66
pr1002 259,045 266,508 271,660 268,630.03 1,126.86 3.70 2.88 0/0 1662.61
nrw1379 56,638 58,951 59,837 59,349.53 213.89 4.78 4.08 0/0 3160.47
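Putting the pieces of Sects. 4.4.1–4.5 together, one discrete Lévy move maps a heavy-tailed step length to a number of successive 2-opt moves, falling back to a double-bridge move for big steps. The following self-contained sketch is our illustrative reading, not the authors' Java code; the step sampler and the big-step threshold are assumptions:

```python
import random

def discrete_levy_move(tour, lam=1.5, big_step_threshold=10):
    """One move of the discrete walk: draw a power-law step length,
    interpret it as a number of successive 2-opt moves, and use a
    double-bridge move instead when the step is large."""
    # Power-law sample in [1, inf): mostly small, occasionally very large.
    steps = int((1.0 - random.random()) ** (-1.0 / (lam - 1.0)))
    n = len(tour)
    if steps >= big_step_threshold:
        # Big step: double-bridge (cut four edges, reorder segments A C B D).
        i, j, k = sorted(random.sample(range(1, n), 3))
        return tour[:i] + tour[j:k] + tour[i:j] + tour[k:]
    for _ in range(max(1, steps)):
        # Small step: 2-opt (reverse a segment, replacing two edges).
        i, k = sorted(random.sample(range(n), 2))
        tour = tour[:i] + tour[i:k + 1][::-1] + tour[k + 1:]
    return tour
```

Both branches only rearrange the city order, so every move preserves the permutation; in the full algorithm each candidate produced this way would be kept or discarded according to its tour length, as in Sect. 4.3.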
DPSO algorithm in Tables 4 and 5, while the average of the SDs/PDav(%)s of our proposed DCS algorithm is equal to 31.28/0.00. Figure 5 shows the PDav(%) of both algorithms, improved DCS and GSA-ACS-PSOT, for the eighteen instances of different sizes. In Fig. 5, the lower curve, which is associated with the improved DCS algorithm, is the better one in terms of solution quality. This can be explained basically by the strengths of CS: a good balance between intensification and diversification, an intelligent use of Lévy flights, and the reduced number of parameters.
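The quality measures reported in the tables all derive from Eq. (4); a one-line helper reproduces them. For example, the basic DCS average of 439 on eil51 (opt = 426) gives PD = (439 − 426)/426 × 100 ≈ 3.05 %, matching the PDav(%) entry in Table 2.

```python
def percent_deviation(length, best_known):
    # Eq. (4): PD(%) = (solution length - best known length) / best known length * 100.
    return (length - best_known) / best_known * 100.0
```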
Table 4 Comparison of experimental results of the improved DCS with GSA-ACS-PSOT [6]
Instance  Opt  GSA-ACS-PSOT (Best / Average / SD)  Improved DCS (Best / Average / SD)
Table 5 Comparison of experimental results of the improved DCS with DPSO [27]
Instance  Opt  DPSO (Best / Worst / PDav(%))  Improved DCS (Best / Worst / PDav(%))
6 Conclusion

Future studies can be fruitful if we focus on parametric studies and on applications of DCS to other combinatorial problems such as scheduling and routing.

We want to mimic nature so that we can design new algorithms to solve very complex problems more efficiently, and to develop new algorithms that are truly intelligent. New algorithms should be more controllable and less complex compared with contemporary metaheuristics. It can be expected that an intelligent algorithm should have the ability to tune its algorithm-dependent parameters so as to optimize its performance automatically. In the end, some truly efficient and intelligent algorithms may emerge to solve NP-hard problems in ever more efficient ways. At the least, some seemingly intractable problems can be solved more efficiently by new metaheuristic algorithms.

References

1. Abdel-Kader RF (2011) Fuzzy particle swarm optimization with simulated annealing and neighbourhood information communication for solving TSP. Int J Adv Comput Sci Appl 2(5):15–21
2. Arora S (1998) Polynomial time approximation schemes for Euclidean traveling salesman and other geometric problems. J ACM 45(5):753–782
3. Blum C, Roli A (2003) Metaheuristics in combinatorial optimization: overview and conceptual comparison. ACM Comput Surv 35(3):268–308
4. Bonomi E, Lutton JL (1984) The n-city travelling salesman problem: statistical mechanics and the Metropolis algorithm. SIAM Rev 26(4):551–568
5. Brown CT, Liebovitch LS, Glendon R (2007) Lévy flights in Dobe Ju/'hoansi foraging patterns. Hum Ecol 35(1):129–138
6. Chen SM, Chien CY (2011) Solving the traveling salesman problem based on the genetic simulated annealing ant colony system with particle swarm optimization techniques. Expert Syst Appl 38(12):14439–14450
7. Croes GA (1958) A method for solving traveling-salesman problems. Oper Res 6(6):791–812
8. Dorigo M, Gambardella LM et al (1997) Ant colonies for the travelling salesman problem. BioSystems 43(2):73–82
9. Gandomi AH, Alavi AH (2012) Krill herd: a new bio-inspired optimization algorithm. Commun Nonlinear Sci Numer Simul 17(12):4831–4845
10. Gandomi AH, Yang XS, Alavi AH (2011) Mixed variable structural optimization using firefly algorithm. Comput Struct 89(23–24):2325–2336
11. Gandomi AH, Talatahari S, Yang XS, Deb S (2012) Design optimization of truss structures using cuckoo search algorithm. Struct Des Tall Special Build. doi:10.1002/tal.1033
12. Gandomi AH, Yang XS, Alavi AH (2013) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29(1):17–35
13. Geem ZW, Kim JH et al (2001) A new heuristic optimization algorithm: harmony search. Simulation 76(2):60–68
14. Glover F, Kochenberger GA (2003) Handbook of metaheuristics. Springer, New York
15. Gutin G, Punnen AP (2002) The traveling salesman problem and its variations, vol 12. Springer, New York
16. Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of the IEEE international conference on neural networks, vol 4. IEEE, pp 1942–1948
17. Larranaga P, Kuijpers CMH, Murga RH, Inza I, Dizdarevic S (1999) Genetic algorithms for the travelling salesman problem: a review of representations and operators. Artif Intell Rev 13(2):129–170
18. Lawler EL, Lenstra JK, Kan AR, Shmoys DB (1985) The traveling salesman problem: a guided tour of combinatorial optimization, vol 3. Wiley, New York
19. Lenstra JK, Rinnooy Kan AHG (1975) Some simple applications of the travelling salesman problem. Oper Res Q 26(5):717–733
20. Malek M, Guruswamy M, Pandya M, Owens H (1989) Serial and parallel simulated annealing and tabu search algorithms for the traveling salesman problem. Ann Oper Res 21(1):59–84
21. Martin O, Otto SW, Felten EW (1991) Large-step Markov chains for the traveling salesman problem. Complex Syst 5(3):299–326
22. Mucherino A, Seref O (2007) Monkey search: a novel metaheuristic search for global optimization. In: Data mining, systems analysis, and optimization in biomedicine. AIP conference proceedings, vol 953. American Institute of Physics, Melville, NY, pp 162–173
23. Payne RB, Sorenson MD (2005) The cuckoos, vol 15. Oxford University Press, USA
24. Reinelt G (1991) TSPLIB—a traveling salesman problem library. ORSA J Comput 3(4):376–384
25. Reinelt G (1994) The traveling salesman: computational solutions for TSP applications, vol 15. Springer, New York
26. Shah-Hosseini H (2009) The intelligent water drops algorithm: a nature-inspired swarm-based optimization algorithm. Int J Bio-Inspired Comput 1(1):71–79
27. Shi XH, Liang YC, Lee HP, Lu C, Wang QX (2007) Particle swarm optimization-based algorithms for TSP and generalized TSP. Inf Process Lett 103(5):169–176
28. Shlesinger MF, Zaslavsky GM, Frisch U (1995) Lévy flights and related topics in physics (Nice, 27–30 June 1994). Springer, New York
29. Taillard ED, Gambardella LM, Gendreau M, Potvin JY (2001) Adaptive memory programming: a unified view of metaheuristics. Eur J Oper Res 135(1):1–16
30. Teodorovic D, Lucic P, Markovic G, Orco MD (2006) Bee colony optimization: principles and applications. In: 8th Seminar on neural network applications in electrical engineering, NEUREL 2006. IEEE, pp 151–156
31. Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1(1):67–82
32. Yang XS (2009) Firefly algorithms for multimodal optimization. In: Stochastic algorithms: foundations and applications, SAGA 2009. Lecture notes in computer science, vol 5792, pp 169–178
33. Yang XS (2010) A new metaheuristic bat-inspired algorithm. In: Nature inspired cooperative strategies for optimization (NICSO 2010), pp 65–74
34. Yang XS, Deb S (2009) Cuckoo search via Lévy flights. In: World congress on nature & biologically inspired computing, NaBIC 2009. IEEE, pp 210–214
35. Yang XS, Deb S (2010) Engineering optimisation by cuckoo search. Int J Math Modell Numer Optim 1(4):330–343
36. Yang XS, Gandomi AH (2012) Bat algorithm: a novel approach for global engineering optimization. Eng Comput 29(5):464–483