Hybrid LS-SA-PS Methods
1. Introduction
This paper discusses a real-world industrial product-mix selection problem [1] involving eight variables and 21
constraints with fuzzy technological coefficients, and then formulates an optimization approach to solve it. The
problem arises in production planning, where a decision maker plays a pivotal role in making decisions
under a fuzzy environment. The decision maker should be aware of both the level of satisfaction and the degree of fuzziness
while making product-mix decisions. Thus, a thorough analysis was performed on a modified S-curve membership function
for the fuzziness patterns and the fuzzy sensitivity solutions obtained from the various optimization methodologies in [1–11].
In this paper a hybrid optimization method is proposed to capture multiple non-dominated solutions in a single run of
the algorithm. The obtained results are compared with well-known hybrid evolutionary algorithms [4].
Combining two or more methods to solve non-linear optimization problems has been shown to improve the
results and to overcome the drawbacks and weaknesses of each method taken separately.
In this paper, the line search method is applied to industrial production planning problems. The main advantage of
this method is its ability to locate near-global optimal solutions of the fitness function, owing to its strong criteria of
global convergence. The line search method uses the fmincon function from the MATLAB computational toolbox, a
gradient-based method built on sequential quadratic programming (SQP) principles. In
this method, the function solves a quadratic programming (QP) sub-problem at each iteration, and a line search uses
a merit function similar to that proposed by Powell [12]. The QP sub-problem uses an active set strategy as in [12]; a full
description of this algorithm can be found in [12].
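The merit-function line search at the core of such SQP iterations can be illustrated with a simple Armijo backtracking rule. This is a minimal Python sketch, not the fmincon implementation; the quadratic merit function, the search direction and all parameter values are stand-ins chosen for illustration:

```python
import numpy as np

def backtracking_line_search(merit, x, direction, alpha0=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink the step until the merit function
    decreases sufficiently along the given search direction."""
    eps = 1e-8
    # Forward-difference gradient of the merit function at x
    grad = np.array([(merit(x + eps * np.eye(len(x))[i]) - merit(x)) / eps
                     for i in range(len(x))])
    slope = grad @ direction          # directional derivative along the step
    alpha, f0 = alpha0, merit(x)
    while merit(x + alpha * direction) > f0 + c * alpha * slope:
        alpha *= rho                  # step too long: shrink it
    return alpha

# Toy convex merit function (a stand-in), minimized at (1, -2)
merit = lambda x: float((x[0] - 1.0)**2 + (x[1] + 2.0)**2)
x = np.array([0.0, 0.0])
d = np.array([1.0, -2.0])             # a descent direction toward the minimizer
step = backtracking_line_search(merit, x, d)
```

The returned step is guaranteed to reduce the merit function, which is what the SQP outer iteration needs from its line search.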
0895-7177/$ – see front matter © 2011 Elsevier Ltd. All rights reserved.
doi:10.1016/j.mcm.2011.08.002
P. Vasant / Mathematical and Computer Modelling 57 (2013) 180–188 181
Constrained nonlinear Optimization Problems (COPs) arise in many practical applications such as construction
planning, industrial process optimization, manufacturing optimization systems and so on [13]. These problems are
challenging in terms of identifying feasible solutions even when the constraints are linear and only the objective functions are nonlinear.
Locating the global optimum of a non-linear COP is therefore more difficult than solving linear bound-
constrained global optimization problems. This research proposes a Hybrid Simulated Annealing method (HSA) for solving
the general COP. HSA has features that address both feasibility and optimality issues, and here it is supported by a local
search (LS) procedure, a gradient-based method. In this research we developed a new hybrid method that combines
Simulated Annealing (SA), Pattern Search (PS) and Line Search (LS) – referred to as the hybrid LS–SA–PS method – in the
context of fuzzy optimization of industrial production planning problems with non-linear objective functions.
Pattern search methods are a class of direct search methods for nonlinear optimization. The original pattern search
methods were introduced in the late 1950s and early 1960s (see [14]) and have remained popular due to their simplicity and
the fact that they work well in practice on various problems. Recently the nonlinear programming community has renewed
its interest in these methods, in part because they are provably convergent [14].
The Pattern Search (PS) algorithm is a valuable one, but the application of non-smooth analysis techniques in [15]
showed its limitations, which stem from the finite choice of poll directions [16]. Mesh Adaptive Direct Search (MADS) removes
the PS restriction to finitely many poll directions. This restriction has long been viewed as the major impediment to stronger proofs
of optimality for PS limit points (and to better behavior); MADS thus offers a more satisfying optimality theory, in addition to
opening new possibilities for handling non-linear objective functions.
We list two attractive features of the pattern search algorithms:
• They can be extremely simple to specify and implement.
• Neither explicit calculations of derivatives nor anything like Taylor’s series appear in the algorithms.
This makes these algorithms useful in situations where derivatives are not available and finite-difference derivatives are
unreliable, such as when f (x) is noisy.
These qualities have made pattern search algorithms popular with users. Yet despite their seeming simplicity, their heuristic
nature, and the fact that they make no use of the derivatives of f (x), the pattern search algorithms possess global
convergence properties that are almost as strong as those of comparable line-search and trust-region algorithms. This
surprising fact was explained by Lewis et al. [17].
The following further features of pattern search, noted in [17], are used in this research.
• They require only a monotonically decreasing sequence of values of f (x). In fact, they do not require any numerical values of f (x), only
information about whether a trial value f (x+ ) improves on f (xk ) at the k-th step.
• In the best case, only one evaluation of f (x) is needed per iteration: once an x+ is found for which f (x+ ) < f (xk ),
it can be accepted and the iteration proceeds. In the worst case, quite a few directions (2n, for
example) are examined before shorter steps are tried.
• The allowed steps are restricted in direction and length: the step direction must be parallel to the coordinate axes,
and the length of any step has the form ∆0 /2^N for some integer N.
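These restrictions can be made concrete with a minimal sketch, assuming a simple 2n coordinate poll with step halving. This is illustrative Python, not the algorithm analyzed in [17]; the quadratic test function is a stand-in:

```python
import numpy as np

def coordinate_pattern_search(f, x0, delta0=1.0, tol=1e-6, max_iter=10_000):
    """Minimal pattern search: poll the 2n coordinate directions,
    accept any simple decrease f(x+) < f(x_k), and otherwise halve
    the step, so every step length has the form delta0 / 2**N."""
    x = np.asarray(x0, dtype=float)
    delta = delta0
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for sign in (+1.0, -1.0):          # the 2n poll directions
                trial = x.copy()
                trial[i] += sign * delta
                if f(trial) < f(x):            # simple decrease only
                    x, improved = trial, True
                    break
            if improved:
                break
        if not improved:
            delta /= 2.0                       # try shorter steps
            if delta < tol:
                break
    return x

# Toy smooth objective (a stand-in) with its minimum at (1, -2)
f = lambda x: (x[0] - 1.0)**2 + (x[1] + 2.0)**2
x_star = coordinate_pattern_search(f, [0.0, 0.0])
```

Note that the loop never evaluates a derivative; only comparisons of f-values drive the search, which is the point of the two features listed above.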
There is also a great deal of flexibility in the pattern search algorithm, depending on how one
specifies the pattern of points to be searched at the next iteration; see [17].
The pattern search algorithms are globally convergent (see [17]) since
(1) at each iteration, they look in enough directions to ensure that a suitably good descent direction will ultimately be
considered;
(2) they possess a reasonable back-tracking strategy that avoids unnecessarily short steps;
(3) they otherwise avoid unsuitable steps by restricting the nature of the step allowed between successive iterates, rather
than by placing requirements on the amount of decrease realized between successive iterates.
At the heart of the argument lies an unusual twist: Lewis et al. [17] relaxed the requirement of sufficient decrease,
requiring only simple decrease (f (xk+1 ) < f (xk )), but imposed stronger conditions on the form the step sk may
take. Furthermore, this trade-off is more than just a theoretical innovation: in practice, it permits useful search strategies
that are precluded by the condition of sufficient decrease.
The paper presents a new approach for obtaining the best optimal solution of the objective function and the values of the decision
variables with minimal computational CPU time. This approach combines a line search heuristic with simulated annealing
and pattern search. Computational experiments with these algorithms involve an industrial production planning problem, and
simulation results are shown. The remarkable performance of these techniques is revealed in this paper, and the improvement in the
computational efficiency of CPU timing is investigated in detail.
The LS is used in the early stage to produce an initial guess; the hybrid SA–PS is then applied in the
final stage to finely tune the optimal results for the objective function.
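As a rough sketch of this two-stage idea, substituting SciPy's L-BFGS-B and dual_annealing routines for the MATLAB toolbox functions used in the paper, and a toy multimodal objective for the production planning model:

```python
import numpy as np
from scipy.optimize import minimize, dual_annealing

# Toy multimodal objective (a stand-in for the planning model);
# minimizing this cost plays the role of maximizing profit.
def cost(x):
    return (x[0] - 3.0)**2 + (x[1] - 2.0)**2 + 0.5 * np.sin(5.0 * x[0])**2

bounds = [(-5.0, 5.0), (-5.0, 5.0)]

# Stage 1: gradient-based local solver (the LS stage) supplies an initial guess.
ls = minimize(cost, x0=np.zeros(2), bounds=bounds, method="L-BFGS-B")

# Stage 2: simulated annealing refines globally, started from the LS point.
sa = dual_annealing(cost, bounds, x0=ls.x, seed=0, maxiter=200)
```

Starting the stochastic stage from the deterministic stage's solution is what lets the hybrid avoid the usual sensitivity of local solvers to the initial point.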
The paper is organized as follows: Section 2 provides the problem statement; Section 3 presents a description of the
proposed hybrid algorithm; the analysis and simulation results are included in Section 4, followed by conclusions and future
research directions.
2. Problem statement
Optimization techniques are primarily used in production planning problems in order to achieve an optimal profit,
that is, to maximize a certain objective function while satisfying the constraints. The first step in a production planning problem is to
state the underlying nonlinear programming (NLP) problem by writing the objective function and constraints as mathematical
functions.
Given a degree of possibility value µ, the fuzzy constrained optimization problem has been formulated by Jimenez
et al. [18] and Vasant [19] as the nonlinear constrained optimization problem shown below. The fuzzy technological coefficients
ãij in Eq. (1) are represented by the modified S-curve membership function as in [20–29].
$$\text{Maximize}\quad \sum_{i=1}^{8}\left(c_i x_i - d_i x_i^{2} - e_i x_i^{3}\right)$$

subject to:

$$\sum_{i=1}^{8}\left[a_{ij}^{l} + \frac{a_{ij}^{h} - a_{ij}^{l}}{\alpha}\,\ln\!\left(\frac{1}{C}\left(\frac{B}{\mu}-1\right)\right)\right]x_i - b_j \le 0,\quad j = 1, 2, \ldots, 17$$

$$\sum_{i=7}^{8} r_i x_i - 0.15\sum_{i=1}^{6} r_i x_i \le 0$$

$$x_1 - 0.6x_2 \le 0,\qquad x_3 - 0.6x_4 \le 0,\qquad x_5 - 0.6x_6 \le 0$$

$$0 \le x_i \le u_i,\quad i = 1, 2, \ldots, 8. \tag{1}$$
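For concreteness, the structure of Eq. (1) can be evaluated programmatically. The sketch below uses randomly generated stand-in coefficients (the actual data are provided in [35]) and illustrative S-curve parameters B, C and α; only the shape of the objective and constraints is meant to match the formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 17                          # eight variables, 17 fuzzy constraints

# Hypothetical stand-in data; the actual values are provided in [35].
c = rng.uniform(5.0, 10.0, n)         # linear profit coefficients
d = rng.uniform(0.01, 0.1, n)         # quadratic coefficients
e = rng.uniform(1e-4, 1e-3, n)        # cubic coefficients
a_lo = rng.uniform(1.0, 2.0, (m, n))  # lower technological coefficients a^l
a_hi = a_lo + rng.uniform(0.5, 1.0, (m, n))  # upper coefficients a^h
b = rng.uniform(50.0, 100.0, m)
r = rng.uniform(1.0, 3.0, n)
B, C, alpha = 1.0, 0.001001, 13.813   # illustrative S-curve parameters

def profit(x):
    """Cubic objective of Eq. (1)."""
    return float(np.sum(c * x - d * x**2 - e * x**3))

def fuzzy_coeff(mu):
    """Defuzzified technological coefficients at satisfaction level mu."""
    return a_lo + (a_hi - a_lo) / alpha * np.log((B / mu - 1.0) / C)

def is_feasible(x, mu, u):
    """Check the 21 inequality constraints of Eq. (1)."""
    A = fuzzy_coeff(mu)
    ok = np.all(A @ x - b <= 0.0)                        # 17 fuzzy constraints
    ok &= r[6:] @ x[6:] - 0.15 * (r[:6] @ x[:6]) <= 0.0  # resource-ratio constraint
    ok &= x[0] - 0.6 * x[1] <= 0.0
    ok &= x[2] - 0.6 * x[3] <= 0.0
    ok &= x[4] - 0.6 * x[5] <= 0.0
    return bool(ok and np.all((0.0 <= x) & (x <= u)))
```

Here the 17 fuzzy constraints, the single resource-ratio constraint and the three coupling constraints together account for the 21 inequality constraints of the problem statement.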
In the above non-linear programming problem, the variable vector x represents the set of variables xi , i = 1, 2, . . . , 8. This
optimization problem contains eight continuous variables and 21 inequality constraints. A point x satisfying all constraints
is called feasible; otherwise it is infeasible, and the set of points satisfying the constraints is called the feasible domain. The aim of the optimization is
to maximize the total production profit for the industrial production planning problem. The definition of the new non-linear
cubic objective function for this particular problem follows Lin [30] and Chaudhuri [31]. The cubic objective function has
24 coefficients for the eight decision variables. This problem is considered one of the most challenging in the research
area of industrial production planning, as in [32–34].
In this non-linear fuzzy optimization problem, for any given degree of satisfaction µ, the uncertain
technological constrained optimization problem can be formulated as the nonlinear constrained optimization
problem in Eq. (1). The input data for Eq. (1), namely the values of ci , di , ei , aij , bj , ri and ui , are provided
in [35].
A new approach for obtaining the best optimal solution for the objective function and the decision variables with minimal
computational CPU time has been presented. The approach combines a line search heuristic with simulated annealing
and generalized pattern search. Computational experience with these algorithms on industrial production planning problems
is shown thoroughly via simulation results. The remarkable performance of these techniques is revealed in this
section, and the improvement in the computational efficiency of CPU timing is investigated in detail.
3. The proposed hybrid algorithm

The hybrid algorithm combining the LS, SA and PS techniques is as follows.
Step 1: Start
Line search
The line search method uses the fmincon function from the MATLAB computational toolbox.
Step 2: Simulated annealing
(i) Randomly generate a new point. The distance from the current point to the new point, or the extent of the search, is
determined by a probability distribution with a scale proportional to the current temperature.
(ii) Determine whether the new point is better or worse than the current point. If the new point is better, it becomes the
next point. If the new point is worse, the algorithm may still make it the next point: simulated annealing accepts a
worse point based on an acceptance probability. With threshold acceptance, a worse point is accepted only if its
objective value does not exceed the previous value by more than a fixed threshold.
(iii) Decrease the temperature and the threshold, storing the best point.
(iv) Perform re-annealing after a certain number of points (ReannealInterval) are accepted by the solver. Re-annealing raises
the temperature in each dimension, depending on sensitivity information, and the search is resumed with the new
temperature values.
(v) Stop the algorithm when the average change in the objective function is very small, or when any other stopping criteria
are met.
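Steps (i)–(v) can be condensed into a minimal Metropolis-style loop. This is an illustrative Python sketch; MATLAB's simulated annealing solver additionally performs the re-annealing of step (iv), which is omitted here:

```python
import math
import random

def simulated_annealing(f, x0, t0=100.0, cooling=0.95, n_iter=2000, seed=0):
    """Minimal SA: propose a step scaled by the temperature, always
    accept improvements, accept worse points with probability
    exp(-delta / T), and cool geometrically while storing the best."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(n_iter):
        trial = x + rng.gauss(0.0, 1.0) * math.sqrt(t)   # step ~ temperature
        ft = f(trial)
        delta = ft - fx
        if delta < 0 or rng.random() < math.exp(-delta / max(t, 1e-12)):
            x, fx = trial, ft                  # accept (possibly worse) point
            if fx < fbest:
                best, fbest = x, fx            # store the best point so far
        t *= cooling                           # decrease the temperature
    return best, fbest

# Toy 1-D objective (a stand-in) with its global minimum at x = 2
best, fbest = simulated_annealing(lambda x: (x - 2.0)**2, x0=10.0)
```

The occasional acceptance of worse points at high temperature is what lets SA escape local minima before the geometric cooling freezes the search.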
Stopping conditions for the algorithm
The simulated annealing algorithm uses the following conditions to determine when to stop:
• TolFun—the algorithm runs until the average change in the value of the objective function over StallIterLimit iterations is less
than TolFun. The default value is 1e−6.
• MaxIter—the algorithm stops if the number of iterations exceeds the maximum number of iterations. One can specify
the maximum number of iterations as a positive integer or Inf; Inf is the default.
• MaxFunEvals—specifies the maximum number of evaluations of the objective function. The allowed maximum is
3000 ∗ number of variables.
• TimeLimit—specifies the maximum time in seconds the algorithm runs before stopping.
• ObjectiveLimit—the algorithm stops if the best objective function value is less than or equal to the value of
ObjectiveLimit.
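The TolFun/StallIterLimit rule amounts to monitoring the recent history of objective values. Below is a sketch of the combined stopping test, with illustrative names rather than the toolbox API:

```python
def should_stop(history, stall_iter_lim=500, tol_fun=1e-6,
                max_iter=None, objective_limit=None):
    """Return True when a stopping rule fires: the average change of f
    over the last stall_iter_lim iterations falls below tol_fun, the
    iteration budget is exhausted, or f has reached objective_limit."""
    if max_iter is not None and len(history) >= max_iter:
        return True
    if objective_limit is not None and history and history[-1] <= objective_limit:
        return True
    if len(history) > stall_iter_lim:
        recent = history[-(stall_iter_lim + 1):]
        changes = [abs(b - a) for a, b in zip(recent, recent[1:])]
        if sum(changes) / len(changes) < tol_fun:
            return True
    return False
```

In practice such a test is evaluated once per iteration, with `history` holding the best objective value seen at each step.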
Parameter settings:
AnnealingFcn: @annealingfast
TemperatureFcn: @temperatureexp
AcceptanceFcn: @acceptancesa
TolFun: 1.0000e−006
StallIterLimit: ‘500 ∗ number of variables’
MaxFunEvals: ‘3000 ∗ number of variables’
TimeLimit: Inf
MaxIter: Inf
ObjectiveLimit: Inf
InitialTemperature: 100
ReannealInterval: 100
DataType: ‘double’
Step 3: Pattern search
The parameter settings in the program are as follows:
Poll method: GPS positive basis 2N
Complete poll: on
Polling order: consecutive
Search method: PS
Mesh: initial size 1 and maximum size Inf
Stopping criteria: mesh tolerance = 1.0000e−006; maximum iterations = 100 ∗ 8; maximum function evaluations = 2000 ∗ 8;
time limit = Inf; function tolerance = 1.0000e−006.
Step 4: End
The flowchart for the solution procedure is given in Fig. 1.
4. Numerical results
The computational experiment was executed on a desktop computer with an Intel Pentium® 2.8 GHz processor,
Windows XP Home, 256 MB of DDR RAM, and MATLAB software [36]. Fig. 2 exhibits the simulation result of the optimal value of the
objective function with respect to the eight decision variables for α = 13.813. The CPU time for this simulation is 1.40 s. The
best optimal value for the objective function is 200 116.4.
Table 1 indicates the optimal value of the objective function, the feasible solution for the decision variables and the CPU times
for running the LS, SA and PS techniques. The average CPU times for running LS, SA and GPS for α = 13.813 and γ = 0.001 to γ = 0.99
are 0.0478 s, 0.0481 s and 0.0484 s respectively. The individual CPU time of each technique is less than the CPU time
of the hybrid of LS with SA. This shows that the major contribution of the SA optimization technique in this case is its outstanding
computational (CPU) time performance. Fig. 3 shows the optimal value of the objective function versus the level of satisfaction
for α = 13.813 and γ = 0.001 to γ = 0.99. This figure clearly indicates the behavior of the objective function versus the
level of satisfaction, which supports making a satisfactory decision under an uncertain environment.
Fig. 4 exhibits the simulation results for the optimal value of the objective function with respect to α and γ. The CPU time for this
simulation is 7.92 s for α = 1 to α = 41 and γ = 0.001 to γ = 0.99. Figs. 5–7 provide a holistic contour view of the
2D solution. These 2D plots will be a useful tool for decision makers and implementers in making an appropriate
decision under an uncertain environment.
Further experiments were carried out to investigate the computational efficiency of the hybrid LS, SA and
GPS optimization techniques. Table 2 reveals the CPU times for running the LS, SA and GPS techniques in this optimization process,
with the simulation covering α = 1 to α = 41 at γ = 0.99.
From Table 2, it is observed that the average CPU times for running the LS, SA and PS techniques are 0.04525 s, 0.04550 s and
0.04573 s respectively. At α = 9, the best computational times are 0.02465 s for LS, 0.02489 s for SA and 0.02513 s for PS.
In this case, the CPU time is distributed among all three techniques in the simulation process. Moreover, this CPU
time is comparable to that of the hybrid method of LS and SA [37]. It is concluded that the novelty of the SA technique lies in
Table 1
Optimal value for objective function.
γ x1 x2 x3 x4 x5 x6 x7 x8 f LSt (s) SAt (s) PSt (s)
0.001 246.8 411.4 205.7 342.8 134.3 223.9 120.6 0.00 147 712.9 0.12597 0.12672 0.12735
0.1 284.8 474.7 239.2 398.7 148.8 248.0 150.4 0.00 163 731.6 0.06226 0.06251 0.06275
0.2 292.5 487.6 246.1 410.1 151.8 253.0 156.7 0.00 166 763.3 0.02821 0.02846 0.02869
0.3 297.9 496.5 250.8 418.0 153.8 256.4 161.2 0.00 168 830.5 0.06215 0.06241 0.06264
0.4 302.5 504.1 254.8 424.7 155.6 259.3 165.1 0.00 170 548.7 0.02887 0.02912 0.02935
0.5 306.8 511.7 258.6 431.0 157.2 262.1 168.8 0.00 172 146.4 0.06363 0.06389 0.06412
0.6 311.2 518.6 262.5 437.5 158.9 264.9 172.6 0.00 173 763.9 0.02461 0.02486 0.02509
0.7 316.1 526.9 266.9 444.8 160.8 268.1 176.9 0.00 175 548.3 0.02432 0.02456 0.02479
0.8 322.4 537.4 272.4 454.1 163.3 272.1 182.4 0.00 177 754.9 0.02742 0.02767 0.02790
0.9 332.3 553.9 281.2 468.6 167.1 278.5 191.3 0.00 181 132.7 0.04661 0.04687 0.04711
0.99 414.3 690.6 354.0 590.0 200.0 333.4 200.0 54.5 200 116.4 0.03191 0.03216 0.03239
Table 2
Best CPU time for objective function at γ = 0.99.
α f LS: CPU time (s) SA: CPU time (s) PS: CPU time (s)
the computational efficiency of CPU time. On the other hand, the LS and PS techniques showed their superiority in obtaining
the best optimal value of the objective function as well as a reasonable feasible solution for the decision variables.
5. Discussion
The novel hybrid optimization technique of LSSAPS manage to obtain a non-zero solution for all the eight decision
variables at γ = 0.99 in Table 1. This shows the best feasibility of decision variables. Moreover the best optimal solution for
the objective function is 200 116.44 which is the best convergence solution for the entire vagueness factor α . The strength of
this hybrid approach lies on the robustness of the optimal solution for the objective function as well as on the computational
Table 3
Comparative results for optimization techniques.
Method Obj. function CPU time (s)
efficiency. The main goal of this paper is to obtain a highly productive solution for all decision variables with less
computational time. The main advantages of the PS approach are that it is very simple in concept, easy to implement and, more
importantly, computationally efficient. Table 3 provides the comparative results for the best values of the objective function
and the best computational times obtained by various heuristic optimization techniques.
6. Conclusions
A novel hybrid algorithm, based on a combination of Line Search (LS), Simulated Annealing (SA) and Pattern Search
(PS) to solve a non-linear production planning problem, has been presented in this paper. A thorough comparative
study with other heuristic methods from the literature, in terms of the best optimal values for the objective function and
the computational time, was conducted and discussed in detail. The main advantage of the LSSAPS hybrid optimization
technique is that it obtains the global optimal solution for the objective function with a high level of satisfaction. The highly
satisfactory solution depends solely on the vagueness factor in the technological coefficients of the non-linear fuzzy
optimization problem. The LSSAPS hybrid method has outperformed the other heuristic methods in terms of the optimal
value of the objective function and the CPU computational time. Moreover, the hybrid algorithm overcomes the previous
drawback of needing a good initial point in order to reach a global or near-global optimal solution.
Acknowledgment
The author sincerely thanks the editorial board of MCM (Elsevier) for proofreading this paper.
References
[1] A. Bhattacharya, P. Vasant, Soft-sensing of level of satisfaction TOC product-mix decision heuristic using robust fuzzy-LP, European Journal of
Operational Research 177 (1) (2007) 55–70.
[2] A. Bhattacharya, P. Vasant, S. Susanto, Simulating theory of constraint problem with a novel fuzzy compromise linear programming model,
in: A. Elsheikh, A.T. Al Ajeeli, E.M. Abu-Taieh (Eds.), Simulation and Modeling: Current Technologies and Applications, IGI Publisher, 2007, pp. 307–336.
[3] P. Vasant, H. Kale, Introduction to fuzzy logic and fuzzy linear programming, in: A. Frederick, P. Humphreys (Eds.), Encyclopedia of Decision Making
and Decision Support Technologies, IGI Publisher, USA, 2007, pp. 1–15.
[4] P. Vasant, N. Barsoum, Fuzzy optimization of units products in mix-product selection problem using FLP approach, Soft Computing - A Fusion of
Foundations, Methodologies and Applications 10 (2006) 144–151.
[5] B.C.M. Reddy, K.H. Reddy, C.N.M. Reddy, K.V.K. Reddy, Quota allocation to distributors of the supply chain under distributors' uncertainty and demand
uncertainty by using fuzzy goal programming, Jordan Journal of Mechanical and Industrial Engineering 2 (4) (2008) 215–226.
[6] D. Peidro, J. Mula, M. Jimenez, M.D.M. Botella, A fuzzy linear programming based approach for tactical supply chain planning in an uncertain
environment, European Journal of Operational Research 205 (2010) 65–80.
[7] A. Baykasoglu, T. Gocken, A review and classification of fuzzy mathematical programs, Journal of Intelligent and Fuzzy Systems 19 (2008) 205–229.
[8] J.D. Zhang, G. Rong, Fuzzy possibilistic modeling and sensitivity analysis for optimal fuel gas scheduling in refinery, Engineering Applications of
Artificial Intelligence 23 (2010) 371–385.
[9] K.K.F. Yuen, H.C.W. Lau, A linguistic possibility–probability aggregation model for decision analysis with imperfect knowledge, Applied Soft Computing
9 (2009) 575–589.
[10] J. Mula, D. Peidro, R. Poler, The effectiveness of a fuzzy mathematical programming approach for supply chain production planning with fuzzy demand,
International Journal of Production Economics 128 (1) (2010) 136–143.
[11] Y. Zhang, Z.P. Fan, Y. Liu, A method based on stochastic dominance degrees for stochastic multiple criteria decision making, Computers and Industrial
Engineering 58 (14) (2010) 544–552.
[12] M.J.D. Powell, Variable metric methods for constrained optimization, in: A. Bachem, M. Grotschel, B. Korte (Eds.), Mathematical Programming: The
State of the Art, Springer Verlag, 1983, pp. 288–311.
[13] P. Vasant, N. Barsoum, Hybrid pattern search and simulated annealing for fuzzy production planning problem, Computers and Mathematics with
Applications 60 (2010) 1058–1067.
[14] M.A. Abramson, Pattern search algorithms for mixed variable general constrained optimization problems, Unpublished Doctoral Thesis, Rice
University, TX, 2002.
[15] C. Audet, J.E. Dennis, Analysis of generalized pattern searches, SIAM Journal on Optimization 13 (3) (2003) 889–903.
[16] C. Audet, Convergence results for pattern search algorithms are tight, Optimization and Engineering 5 (2) (2004) 101–122.
[17] R.M. Lewis, V. Torczon, M.W. Trosset, Why pattern search works, Optima (1998) 1–7.
[18] F. Jimenez, G. Sanchez, P. Vasant, J. Verdegay, A multi-objective evolutionary approach for fuzzy optimization in production planning, in: IEEE
International Conference on Systems, Man, and Cybernetics, IEEE Press, USA, 2006, pp. 3120–3125.
[19] P. Vasant, Fuzzy production planning and its application to decision making, Journal of Intelligent Manufacturing 17 (1) (2006) 5–12.
[20] M. Zamirian, A.V. Kamyad, M.H. Farahi, A novel algorithm for solving optimal path planning problems based on parametrization method and fuzzy
aggregation, Physics Letters A 373 (2009) 3439–3449.
[21] T.F. Liang, Interactive multi-objective transportation planning decisions using fuzzy linear programming, Asia Pacific Journal of Operational Research
25 (1) (2008) 11–31.
[22] D. Peidro, J. Mula, R. Poler, Fuzzy linear programming for supply chain planning under uncertainty, International Journal of Information Technology
and Decision Making 9 (3) (2010) 373–392.
[23] P. Vasant, Innovative hybrid genetic algorithms and line search method for industrial production management, in: M. Chis (Ed.), Evolutionary
Computation and Optimization Algorithms in Software Engineering: Application and Techniques, IGI Global, Hershey, PA, USA, 2010, pp. 142–160.
[24] P. Vasant, N. Barsoum, Hybrid simulated annealing and genetic algorithms for industrial production management problems, AIP Conference
Proceedings 1159 (2009) 254–261.
[25] P. Vasant, N. Barsoum, J.F. Webb, The optimization of a revenue function in a fuzzy linear programming model used for industrial production planning,
Journal of Applied Computer Science Methods 2 (1) (2009) 65–84.
[26] F. Jimenez, G. Sanchez, P. Vasant, Fuzzy optimization via multi-objective evolutionary computation for chocolate manufacturing, in: C. Kahraman
(Ed.), Fuzzy Multi-Criteria Decision Making and Application with Recent Developments, Springer, 2008, pp. 523–538.
[27] I. Elamvazuthi, P. Vasant, T. Ganesan, Fuzzy linear programming using modified logistic membership function, International Review of Automatic
Control 13 (4) (2010) 370–377.
[28] D. Peidro, P. Vasant, Transportation planning with modified s-curve membership functions using an interactive fuzzy multi-objective approach,
Applied Soft Computing 11 (2) (2011) 2656–2663.
[29] P.M. Vasant, Hybrid optimization techniques for industrial production planning, Journal of Computer Science and Technology 10 (3) (2010) 150–151.
[30] F.T. Lin, Personal Communication, March 2007.
[31] K. Chaudhuri, Personal Communication, March 2007.
[32] I. Elamvazuthi, T. Ganesan, P. Vasant, J.F. Webb, Application of a fuzzy programming technique to production planning in the textile industry,
International Journal of Computer Science and Information Security 6 (3) (2009) 238–243.
[33] P. Vasant, I. Elamvazuthi, J.F. Webb, Fuzzy technique for optimization of objective function with uncertain resource variables and technological
coefficients, International Journal of Modeling, Simulation, and Scientific Computing 1 (3) (2010) 349–367.
[34] P. Vasant, I. Elamvazuthi, T. Ganesan, J.F. Webb, Iterative fuzzy optimization approach for crude oil refinery industry, Scientific Annals of Computer
Science 8 (2) (2010) 261–280.
[35] P. Vasant, N. Barsoum, Hybrid simulated annealing and genetic algorithms for industrial production management problems, AIP Conference
Proceedings 1159 (2009) 254–261. USA.
[36] Genetic Algorithm and Direct Search Toolbox User's Guide, The MathWorks, MATLAB R2007a, 2008.
[37] P. Vasant, Hybrid optimization for decision making in an uncertain environment, LAP LAMBERT Academic Publishing, Germany, 2011, 244 pages.
[38] P. Vasant, Hybrid optimization techniques for industrial production planning, International Journal of Industrial Engineering-Theory, Applications
and Practice (2008).
[39] P. Vasant, Hybrid genetic algorithms and line search method for industrial production planning with non-linear fitness function, Engineering
Applications of Artificial Intelligence 22 (4–5) (2009) 767–777.
[40] P. Vasant, N. Barsoum, Hybrid general pattern search and simulated annealing for industrial production planning problems, in: Proceedings of 3rd
Global Conference on Power Control Optimization, PCO’2010, Gold Coast, Australia, 2–4 February 2010.