Intelligent Systems Reference Library 114

Anand Jayant Kulkarni
Ganesh Krishnasamy
Ajith Abraham

Cohort Intelligence: A Socio-inspired Optimization Method
Intelligent Systems Reference Library

Volume 114

Series editors
Janusz Kacprzyk, Polish Academy of Sciences, Warsaw, Poland
e-mail: [email protected]
Lakhmi C. Jain, University of Canberra, Canberra, Australia;
Bournemouth University, UK;
KES International, UK
e-mail: [email protected]; [email protected]
URL: http://www.kesinternational.org/organisation.php
About this Series

The aim of this series is to publish a Reference Library, including novel advances
and developments in all aspects of Intelligent Systems in an easily accessible and
well structured form. The series includes reference works, handbooks, compendia,
textbooks, well-structured monographs, dictionaries, and encyclopedias. It contains
well integrated knowledge and current information in the field of Intelligent
Systems. The series covers the theory, applications, and design methods of
Intelligent Systems. Virtually all disciplines such as engineering, computer science,
avionics, business, e-commerce, environment, healthcare, physics and life science
are included.

More information about this series at http://www.springer.com/series/8578


Anand Jayant Kulkarni
Odette School of Business, University of Windsor, Windsor, ON, Canada
and
Department of Mechanical Engineering, Symbiosis Institute of Technology,
Symbiosis International University, Pune, Maharashtra, India

Ganesh Krishnasamy
Department of Electrical Engineering, Faculty of Engineering,
Universiti Malaya, Kuala Lumpur, Malaysia

Ajith Abraham
Machine Intelligence Research Labs (MIR Labs),
Scientific Network for Innovation and Research Excellence, Auburn, WA, USA
ISSN 1868-4394    ISSN 1868-4408 (electronic)
Intelligent Systems Reference Library
ISBN 978-3-319-44253-2    ISBN 978-3-319-44254-9 (eBook)
DOI 10.1007/978-3-319-44254-9
Library of Congress Control Number: 2016949596

© Springer International Publishing Switzerland 2017


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part
of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations,
recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission
or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar
methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this
publication does not imply, even in the absence of a specific statement, that such names are exempt from
the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this
book are believed to be true and accurate at the date of publication. Neither the publisher nor the
authors or the editors give a warranty, express or implied, with respect to the material contained herein or
for any errors or omissions that may have been made.

Printed on acid-free paper

This Springer imprint is published by Springer Nature


The registered company is Springer International Publishing AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Anand Jayant Kulkarni would like to dedicate
this book to his
loving wife ‘Prajakta’
and
lovely son ‘Nityay’
Preface

This book is written for engineers, scientists, and students studying/working in the
optimization, artificial intelligence (AI), or computational intelligence arena. The
book discusses the core and underlying principles and analysis of the different
concepts associated with an emerging socio-inspired AI optimization tool referred
to as cohort intelligence (CI).
The book discusses in detail the CI methodology as well as several modifications
for solving a variety of problems. The validation of the methodology is provided
by solving several unconstrained test problems. Because real-world problems are
inherently constrained, the CI method with a penalty function approach is tested
on several constrained test problems, and a comparison of its performance is also
discussed. The book also demonstrates the ability of the CI methodology to solve
several cases of combinatorial problems such as the traveling salesman problem
(TSP) and the knapsack problem (KP). In addition, real-world applications of the
CI methodology to complex and large-sized combinatorial problems from the
healthcare, inventory, supply chain optimization, and cross-border transportation
domains are also discussed. The inherent ability to handle constraints based on
probability distributions is also revealed and demonstrated using these problems.
A detailed mathematical formulation, solutions, and comparisons are provided in
every chapter. Moreover, a detailed discussion of the CI methodology
modifications for solving several problems from the machine learning domain is
also provided.
The mathematical level in all the chapters is well within the grasp of scientists
as well as undergraduate and graduate students in engineering and computer
science. The reader is encouraged to have a basic knowledge of probability and
mathematical analysis. In presenting CI and the associated modifications and
contributions, the emphasis is placed on the development of the fundamental
results from basic concepts. Numerous examples/problems are worked out in the
text to illustrate the discussion; these illustrative examples may allow the
reader to gain further insight into the associated concepts. The various
algorithms have been coded in MATLAB. All the executable codes are
available online at www.sites.google.com/site/oatresearch/cohort-intelligence.


The book is an outgrowth of three years of work by the authors. In addition,
Fazle Baki and Ben Chaouch from the University of Windsor, ON, Canada, helped
with the complex combinatorial problem formulations. Over this period, the
algorithms have been tested extensively on various real-world problems, and the
results have been published in various prestigious journals and conferences. The
suggestions and criticism of various reviewers and colleagues had a significant
influence on the way the work is presented in this book. We are most grateful to
our colleagues for reviewing the different parts of the manuscript and for
providing us valuable feedback. The authors would like to thank Dr. Thomas
Ditzinger, Springer Engineering In-house Editor, Studies in Computational
Intelligence Series; Prof. Janusz Kacprzyk, Editor-in-Chief, Springer Intelligent
Systems Reference Library Series; and Mr. Holger Schäpe, Editorial Assistant,
Springer Verlag, Heidelberg, for the editorial assistance and excellent
cooperative collaboration to produce this important scientific work. We hope that
the reader will share our excitement about this volume on cohort intelligence and
will find it useful.

Windsor, ON, Canada Anand Jayant Kulkarni


Kuala Lumpur, Malaysia Ganesh Krishnasamy
Auburn, WA, USA Ajith Abraham
May 2016
Contents

1 Introduction to Optimization ... 1
  1.1 What Is Optimization? ... 1
    1.1.1 General Problem Statement ... 2
    1.1.2 Active/Inactive/Violated Constraints ... 3
    1.1.3 Global and Local Minimum Points ... 3
  1.2 Contemporary Optimization Approaches ... 4
  1.3 Socio-Inspired Optimization Domain ... 6
  References ... 7

2 Socio-Inspired Optimization Using Cohort Intelligence ... 9
  2.1 Framework of Cohort Intelligence ... 10
  2.2 Theoretical Comparison with Contemporary Techniques ... 13
  2.3 Validation of Cohort Intelligence ... 14
  References ... 23

3 Cohort Intelligence for Constrained Test Problems ... 25
  3.1 Constraint Handling Using Penalty Function Approach ... 25
  3.2 Numerical Experiments and Discussion ... 26
  3.3 Conclusions ... 34
  References ... 36

4 Modified Cohort Intelligence for Solving Machine Learning Problems ... 39
  4.1 Introduction ... 39
  4.2 The Clustering Problem and K-Means Algorithm ... 41
  4.3 Modified Cohort Intelligence ... 42
  4.4 Hybrid K-MCI and Its Application for Clustering ... 43
  4.5 Experiment Results ... 45
  4.6 Conclusion ... 52
  References ... 53

5 Solution to 0–1 Knapsack Problem Using Cohort Intelligence Algorithm ... 55
  5.1 Knapsack Problem Using CI Method ... 55
    5.1.1 Illustration of CI Solving 0–1 KP ... 56
  5.2 Results and Discussion ... 61
  5.3 Conclusions and Future Directions ... 70
  5.4 Test Cases ... 71
  References ... 74

6 Cohort Intelligence for Solving Travelling Salesman Problems ... 75
  6.1 Traveling Salesman Problem (TSP) ... 76
    6.1.1 Solution to TSP Using CI ... 76
  6.2 Results and Discussion ... 79
  6.3 Concluding Remarks and Future Directions ... 85
  References ... 86

7 Solution to a New Variant of the Assignment Problem Using Cohort Intelligence Algorithm ... 87
  7.1 New Variant of the Assignment Problem ... 87
  7.2 Probable Applications ... 90
    7.2.1 Application in Healthcare ... 90
    7.2.2 Application in Supply Chain Management ... 91
  7.3 Cohort Intelligence (CI) Algorithm for Solving the CBAP ... 91
    7.3.1 A Sample Illustration of the CI Algorithm for Solving the CBAP ... 92
    7.3.2 Numerical Experiments and Results ... 94
  7.4 Conclusions ... 98
  References ... 98

8 Solution to Sea Cargo Mix (SCM) Problem Using Cohort Intelligence Algorithm ... 101
  8.1 Sea Cargo Mix Problem ... 102
  8.2 Cohort Intelligence for Solving Sea Cargo Mix (SCM) Problem ... 103
  8.3 Numerical Experiments and Results ... 106
  8.4 Conclusions ... 114
  References ... 115

9 Solution to the Selection of Cross-Border Shippers (SCBS) Problem ... 117
  9.1 Selection of Cross-Border Shippers (SCBS) Problem ... 118
    9.1.1 Single Period Model ... 119
    9.1.2 Multi Period Model ... 120
  9.2 Numerical Experiments and Results ... 121
  9.3 Conclusions ... 127
  References ... 129

10 Conclusions and Future Directions ... 131
  References ... 134
Chapter 1
Introduction to Optimization

1.1 What Is Optimization?

For almost all human activities there is a desire to deliver the most with the
least. For example, from the business point of view, maximum profit is desired
from the least investment; maximum crop yield is desired with minimum
expenditure on fertilizers; and maximum strength, longevity, efficiency, and
utilization are desired with minimum initial investment and operational cost of
various household as well as industrial equipment and machinery. To set a record
in a race, for example, the aim is to run in the shortest time.
The concept of optimization has great significance in both human affairs and the
laws of nature: it is the inherent characteristic of achieving the best or most
favorable (minimum or maximum) outcome from a given situation [1]. In addition,
since the element of design is present in all fields of human activity, all
aspects of optimization can be viewed and studied as design optimization without
any loss of generality. This makes it clear that the study of design optimization
can help not only in the human activity of creating optimum designs of products,
processes, and systems, but also in the understanding and analysis of
mathematical/physical phenomena and in the solution of mathematical problems.
Constraints are an inherent part of real-world problems, and they have to be
satisfied to ensure the acceptability of the solution. There are always numerous
requirements and constraints imposed on the designs of components, products,
processes, or systems in real-life engineering practice, just as in all other
fields of design activity. Therefore, creating a feasible design under all these
diverse requirements/constraints is already a difficult task, and ensuring that
the feasible design created is also 'the best' is even more difficult.

© Springer International Publishing Switzerland 2017
A.J. Kulkarni et al., Cohort Intelligence: A Socio-inspired Optimization Method,
Intelligent Systems Reference Library 114, DOI 10.1007/978-3-319-44254-9_1

1.1.1 General Problem Statement

All the optimal design problems can be expressed in a standard general form
stated as follows:

Minimize the objective function f(X)   (1.1)

Subject to

s inequality constraints g_j(X) ≤ 0, j = 1, 2, ..., s   (1.2)

w equality constraints h_j(X) = 0, j = 1, 2, ..., w   (1.3)

where the design variables are given by x_i, i = 1, 2, ..., n, or by the design
variable vector X = (x_1, x_2, ..., x_n)^T.

• A problem where the objective function is to be maximized (instead of
minimized) can also be handled with this standard problem statement, since
maximizing a function f(X) is the same as minimizing -f(X).
• Similarly, '≥'-type inequality constraints can be treated by reversing the
sign of the constraint function to form '≤'-type inequalities.
• Sometimes there may be simple limits on the allowable range of values a design
variable can take; these are known as side constraints:

x_i^l ≤ x_i ≤ x_i^u

where x_i^l and x_i^u are the lower and upper limits of x_i, respectively.
However, these side constraints can easily be converted into normal inequality
constraints (by splitting each into two inequality constraints).
• Although all optimal design problems can be expressed in the above standard
form, some categories of problems may be expressed in alternative specialized
forms for greater convenience and efficiency.
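The transformations above (maximization to minimization, '≥' to '≤', side
constraints to inequality pairs) can be sketched in code. The book's own codes
are in MATLAB; this Python sketch and its helper names are purely illustrative:

```python
# Illustrative sketch (not from the book): casting a problem into the
# standard form  min f(X)  s.t.  g_j(X) <= 0,  h_j(X) = 0.

def to_minimization(f_max):
    """Maximizing f(X) is the same as minimizing -f(X)."""
    return lambda X: -f_max(X)

def ge_to_le(g_ge):
    """A constraint g(X) >= 0 becomes -g(X) <= 0."""
    return lambda X: -g_ge(X)

def side_constraints(i, lower, upper):
    """Split  lower <= x_i <= upper  into two '<= 0' inequalities."""
    return [lambda X: lower - X[i], lambda X: X[i] - upper]

# Example: maximize f(X) = -(x1 - 2)^2 subject to 0 <= x1 <= 5.
f = to_minimization(lambda X: -(X[0] - 2.0) ** 2)
gs = side_constraints(0, 0.0, 5.0)

X = [2.0]
print(f(X))                        # standard-form objective value at X
print(all(g(X) <= 0 for g in gs))  # X satisfies both side constraints
```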

1.1.2 Active/Inactive/Violated Constraints

The constraints in an optimal design problem restrict the design space to a
smaller subset known as the feasible region; i.e., not every point in the design
space is feasible. See Fig. 1.1.
• An inequality constraint g_j(X) is said to be violated at the point x if it is
not satisfied there (g_j(X) > 0).
• If g_j(X) is strictly satisfied (g_j(X) < 0), then it is said to be inactive
at x.
• If g_j(X) is satisfied at equality (g_j(X) = 0), then it is said to be active
at x.
• The set of points at which an inequality constraint is active forms a
constraint boundary which separates the feasible region of points from the
infeasible region.
• Based on the above definitions, equality constraints can only be either
violated (h_j(X) ≠ 0) or active (h_j(X) = 0) at any point x.
• The set of points where an equality constraint is active forms a sort of
boundary both sides of which are infeasible.

1.1.3 Global and Local Minimum Points

Let the set of design variables that give rise to a minimum of the objective
function f(X) be denoted by X* (the asterisk * is used to indicate quantities
and terms referring to an optimum point). The objective f(X) is at its global
(or absolute) minimum at the point X* if:

f(X*) ≤ f(X) for all X in the feasible region

Fig. 1.1 Active/Inactive/Violated constraints (schematic of the (x1, x2) design
space showing the constraint boundaries g1(x) = 0, g2(x) = 0, g3(x) = 0, and
h1(x) = 0, with sample points xa, xb, xc)



Fig. 1.2 Minimum and maximum points (a single-variable function f(x) over the
closed feasible region a ≤ x ≤ b, showing the global minimum, global maximum,
local minima and maxima, and the constraint boundaries at a and b)

The objective has a local (or relative) minimum at the point X* if:

f(X*) ≤ f(X) for all feasible X within a small neighborhood of X*

A graphical representation of these concepts is shown in Fig. 1.2 for the case
of a single variable x over a closed feasible region a ≤ x ≤ b.
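For a single-variable problem these definitions can be checked numerically. The
sketch below is illustrative only (the grid resolution `n` and the test function
are arbitrary choices); it scans a uniform grid over [a, b] and reports the
global minimum along with all interior local minima:

```python
# Illustrative sketch: locate the global minimum and interior local
# minima of f(x) on a uniform grid over the closed region [a, b].

def grid_minima(f, a, b, n=1001):
    xs = [a + (b - a) * i / (n - 1) for i in range(n)]
    ys = [f(x) for x in xs]
    global_min = min(zip(ys, xs))          # (f(x*), x*) over the grid
    local_mins = [xs[i] for i in range(1, n - 1)
                  if ys[i] <= ys[i - 1] and ys[i] <= ys[i + 1]]
    return global_min, local_mins

# Example: f(x) = x^4 - 3x^2 + x has two local minima; the left one is global.
f = lambda x: x**4 - 3 * x**2 + x
(gy, gx), locs = grid_minima(f, -2.5, 2.5)
print(round(gx, 2))   # grid point closest to the global minimizer
print(len(locs))      # number of interior local minima found
```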

1.2 Contemporary Optimization Approaches

There are several mathematical optimization techniques in practice, for example
gradient methods, integer programming, branch and bound, the Simplex algorithm,
dynamic programming, etc. These techniques can efficiently solve problems of
limited size, and they are most applicable to linear problems. In addition, as
the number of variables and constraints increases, the computational time to
solve the problem may increase exponentially, which may limit their
applicability. Furthermore, as the complexity of problem domains grows, solving
such complex problems using mathematical optimization techniques is becoming
more and more cumbersome. Certain heuristics have also been developed to solve
specific problems of certain sizes, but such heuristics have very limited
flexibility to solve different classes of problems.
In the past few years a number of nature-/bio-inspired optimization techniques
(also referred to as metaheuristics), such as Evolutionary Algorithms (EAs),
Swarm Intelligence (SI), etc., have been developed. An EA such as the Genetic
Algorithm (GA) works on the Darwinian principle of survival of the fittest
individual in the population. The population is evolved using operators such as
selection, crossover, mutation, etc. According to Deb [2] and Ray et al. [3],
GA can often reach very close to the global optimal solution but necessitates
the incorporation of local improvement techniques. Similar to GA, the
mutation-driven approach of Differential Evolution (DE) was proposed by Storn
and Price [4]; it helps explore and further locally exploit the solution space
to reach the global optimum. Although easy to implement, DE has several
problem-dependent parameters that must be tuned, which may also require several
associated trials to be performed.
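A minimal sketch of the DE idea described above (mutation by scaled difference
vectors, binomial crossover, greedy selection) is given below. The parameter
values F and CR are common textbook defaults, not taken from this book, and the
sketch is illustrative rather than a reference implementation:

```python
import random

# Illustrative Differential Evolution (DE/rand/1/bin) sketch for
# minimizing f over a box. F (mutation scale) and CR (crossover rate)
# are typical defaults; in practice they are problem-dependent.

def de_minimize(f, bounds, pop_size=20, F=0.5, CR=0.9, iters=200, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:             # binomial crossover
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])  # difference mutation
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)                      # clamp to the box
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            if ft <= fit[i]:                                     # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Example: sphere function, minimum 0 at the origin.
x_best, f_best = de_minimize(lambda x: sum(v * v for v in x),
                             bounds=[(-5.0, 5.0)] * 3)
print(f_best)  # should be close to 0
```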
Inspired by the social behavior of living organisms such as insects, fishes,
etc., which can communicate with one another either directly or indirectly, the
paradigm of SI is a decentralized self-organizing optimization approach. These
algorithms work on the cooperative behavior of the organisms rather than
competition amongst them. In SI, every individual evolves by sharing information
with others in the society. A technique such as Particle Swarm Optimization
(PSO) is inspired by the social behavior of birds flocking and schools of fish
searching for food [5]. The fishes or birds are considered as particles in the
solution space searching for local as well as global optimum points. The
directions of movement of these particles are decided by the best particle in
the neighborhood and the best particle in the entire swarm. Ant Colony
Optimization (ACO) works on ants' social behavior of foraging for food along a
shortest path [6]. An ant is considered an agent of the colony. It searches for
better solutions in its close neighborhood and iteratively updates its solution.
The ants also update their pheromone trails at the end of every iteration; this
helps every ant decide its direction, which may further self-organize the colony
to reach the global optimum.
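The PSO particle update described above (each particle pulled toward its own
best position and the swarm's best) can be sketched as follows. The inertia
weight w and acceleration coefficients c1, c2 are standard textbook values, not
taken from this book, and the sketch is illustrative only:

```python
import random

# Illustrative PSO sketch: each particle moves under the pull of its own
# best position (pbest) and the swarm's best position (gbest).

def pso_minimize(f, bounds, n_particles=20, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][j] = (w * vs[i][j]
                            + c1 * r1 * (pbest[i][j] - xs[i][j])  # cognitive pull
                            + c2 * r2 * (gbest[j] - xs[i][j]))    # social pull
                lo, hi = bounds[j]
                xs[i][j] = min(max(xs[i][j] + vs[i][j], lo), hi)
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f

# Example: sphere function, minimum 0 at the origin.
best, best_f = pso_minimize(lambda x: sum(v * v for v in x),
                            bounds=[(-5.0, 5.0)] * 2)
print(best_f)  # should be close to 0
```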
Similar to ACO, the Bees Algorithm (BA) also works on the social behavior of
honey bees finding food; however, the bee colony tends to optimize the number of
members involved in particular pre-decided tasks [7]. The Bees Algorithm is a
population-based search algorithm proposed by Pham et al. [7] in a technical
report presented at Cardiff University, UK. According to Pham and Castellani [8]
and Pham et al. [7], the algorithm mimics the food foraging strategy of honey
bees looking for the best solution. Each candidate solution is thought of as a
flower or a food source, and a population or colony of n bees is used to search
the problem solution space. Each time an artificial bee visits a solution, it
evaluates its objective function. Even though the algorithm has proven effective
for solving continuous as well as combinatorial problems [8, 9], some measure of
the topological distance between solutions is required. The Firefly Algorithm
(FA) is an emerging metaheuristic swarm optimization technique based on the
natural behavior of fireflies, which rests on the bioluminescence phenomenon
[10, 11]. Fireflies produce short and rhythmic flashes to communicate with other
fireflies and attract potential prey. The light intensity/brightness I of a
flash at a distance r obeys the inverse square law, i.e. I ∝ 1/r², in addition
to the light absorption by the surrounding air. This makes most fireflies
visible only up to a limited distance, usually several hundred meters at night,
which is enough to communicate. The flashing light of fireflies can be
formulated in such a way that it is associated with the objective function to be
optimized, which makes it possible to formulate optimization algorithms
[10, 11]. As with the other metaheuristic algorithms, constraint handling is one
of the crucial issues being addressed by researchers [12].
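The intensity relation above can be written down directly. The combination of
an exponential absorption term exp(-γr²) with a 1/(1 + r²) factor (which avoids
the singularity of 1/r² at r = 0) follows Yang's FA formulation [10]; this
sketch is for illustration only:

```python
import math

# Illustrative sketch: flash intensity perceived at distance r, combining
# the inverse square law I ∝ 1/r^2 with exponential light absorption by
# the surrounding air (absorption coefficient gamma).

def intensity(I0, r, gamma=1.0):
    """Perceived intensity at distance r from a flash of source intensity I0."""
    return I0 * math.exp(-gamma * r * r) / (1.0 + r * r)

print(intensity(1.0, 0.0))                        # full intensity at the source
print(intensity(1.0, 2.0) < intensity(1.0, 1.0))  # farther -> dimmer
```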

1.3 Socio-Inspired Optimization Domain

Every society is a collection of self-interested individuals. Every individual
has a desire to improve itself, and this improvement is possible through
learning from one another. Furthermore, the learning is achieved through
interaction as well as competition with other individuals. It is important to
mention here that this learning may lead to quick improvement in an individual's
behavior; however, it is also possible that for certain individuals the learning
and further improvement is slower. This is because the learning and associated
improvement depend upon the quality of the individual being followed. In the
context of optimization (minimization or maximization), if the individual
solution being followed is better, the chances of improving the follower's
solution increase. Due to uncertainty, it is also possible that the solution
being followed is of inferior quality compared to the follower's. This may make
the follower's solution reach a local optimum; however, due to the inherent
ability of societal individuals to keep improving themselves, other individuals
are also selected for learning. This may make the individuals jump out of the
possible local optimum and reach the global optimum solution. This common goal
of improvement in behavior/solution reveals the self-organizing behavior of the
entire society: an effective self-organizing system which may help in solving a
variety of complex optimization problems.
The following chapters discuss an emerging Artificial Intelligence (AI)
optimization technique referred to as Cohort Intelligence (CI). The framework of
CI, along with its validation by solving several unconstrained test problems, is
discussed in detail. In addition, numerous applications of the CI methodology
and its modified versions in the domain of machine learning are provided.
Moreover, the CI application for solving several test cases of combinatorial
problems such as the Traveling Salesman Problem (TSP) and the 0–1 Knapsack
Problem is discussed. Importantly, the CI methodology solving real-world
combinatorial problems from the healthcare and inventory problem domains, as
well as complex and large-sized cross-border transportation problems, is also
discussed. These applications underscore the importance of socio-inspired
optimization methods such as CI.
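The follow-the-better-candidate dynamic described above can be caricatured in a
few lines. To be clear, the toy loop below is NOT the CI algorithm formalized in
Chapter 2; the selection weighting, the elitist carry-over of the best
candidate, and the interval reduction factor r are all choices made here purely
to illustrate the learning mechanism:

```python
import random

# Toy sketch of the socio-inspired learning loop: each candidate picks
# another candidate to follow (better candidates are more likely to be
# followed) and samples its new solution in a shrinking neighborhood of
# the followed candidate. Illustration only; not the CI method of Chapter 2.

def cohort_minimize(f, bounds, n_candidates=5, iters=100, r=0.95, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)
    cohort = [[rng.uniform(lo, hi) for lo, hi in bounds]
              for _ in range(n_candidates)]
    widths = [hi - lo for lo, hi in bounds]
    for _ in range(iters):
        fs = [f(x) for x in cohort]
        # Better (smaller-f) candidates are more likely to be followed.
        weights = [1.0 / (1e-12 + abs(v)) for v in fs]
        best = cohort[min(range(n_candidates), key=fs.__getitem__)]
        new_cohort = [best[:]]              # keep the best candidate (elitism)
        for _ in range(n_candidates - 1):
            leader = rng.choices(cohort, weights=weights)[0]
            follower = [min(max(leader[j] + widths[j] * rng.uniform(-0.5, 0.5),
                                bounds[j][0]), bounds[j][1])
                        for j in range(dim)]
            new_cohort.append(follower)
        cohort = new_cohort
        widths = [wd * r for wd in widths]  # shrink the sampling neighborhood
    fs = [f(x) for x in cohort]
    i = min(range(n_candidates), key=fs.__getitem__)
    return cohort[i], fs[i]

# Example: sphere function, minimum 0 at the origin.
best, best_f = cohort_minimize(lambda x: sum(v * v for v in x),
                               bounds=[(-5.0, 5.0)] * 2)
print(best_f)  # should be small
```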

References

1. Kulkarni, A.J., Tai, K., Abraham, A.: Probability collectives: a distributed multi-agent system
approach for optimization. In: Intelligent Systems Reference Library, vol. 86. Springer, Berlin
(2015) (doi:10.1007/978-3-319-16000-9, ISBN: 978-3-319-15999-7)
2. Deb, K.: An efficient constraint handling method for genetic algorithms. Comput. Methods
Appl. Mech. Eng. 186, 311–338 (2000)
3. Ray, T., Tai, K., Seow, K.C.: Multiobjective design optimization by an evolutionary
algorithm. Eng. Optim. 33(4), 399–424 (2001)
4. Storn, R., Price, K.: Differential evolution—a simple and efficient heuristic for global
optimization over continuous spaces. J. Global Optim. 11, 341–359 (1997)
5. Kennedy, J., Eberhart, R.: Particle swarm optimization. In: Proceedings of IEEE International
Conference on Neural Networks, pp. 1942–1948 (1995)
6. Dorigo, M., Birattari, M., Stützle, T.: Ant colony optimization: artificial ants as a
computational intelligence technique. IEEE Comput. Intell. Mag., 28–39 (2006)
7. Pham, D.T., Ghanbarzadeh, A., Koc, E., Otri, S., Rahim, S., Zaidi, M.: The bees algorithm.
Technical Note, Manufacturing Engineering Centre, Cardiff University, UK (2005)
8. Pham, D.T., Castellani, M.: The bees algorithm—modelling foraging behaviour to solve
continuous optimisation problems. Proc. ImechE, Part C, 223(12), 2919–2938 (2009)
9. Pham, D.T., Castellani, M.: Benchmarking and comparison of nature-inspired
population-based continuous optimisation algorithms. Soft Comput. 1–33 (2013)
10. Yang, X.S.: Firefly algorithms for multimodal optimization. In: Stochastic Algorithms:
Foundations and Applications. Lecture Notes in Computer Sciences 5792, pp. 169–178.
Springer, Berlin (2009)
11. Yang, X.S., Hosseini, S.S.S., Gandomi, A.H.: Firefly algorithm for solving non-convex
economic dispatch problems with valve loading effect. Appl. Soft Comput. 12(3), 1180–1186
(2012)
12. Deshpande, A.M., Phatnani, G.M., Kulkarni, A.J.: Constraint handling in firefly algorithm. In:
Proceedings of IEEE International Conference on Cybernetics, pp. 186–190 (2013)
Chapter 2
Socio-Inspired Optimization Using Cohort
Intelligence

The nature-/bio-inspired optimization techniques such as the genetic algorithm (GA),
particle swarm optimization (PSO), ant colony optimization (ACO), simulated
annealing (SA), Tabu search, etc., have become popular due to their simplicity of
implementation and rule-based working. The GA is population based, and the
population is evolved using operators such as selection, crossover, and mutation.
According to Deb [1] and Ray et al. [2], the performance of GA is governed by the
quality of the population being evaluated; it may often reach very close to the global
optimal solution but necessitates the incorporation of local improvement techniques.
The paradigm of Swarm Intelligence (SI) is a decentralized, self-organizing optimization
approach inspired by the social behavior of living organisms such as insects, fish,
etc., which can communicate with one another either directly or indirectly.
Techniques such as Particle Swarm Optimization (PSO) are inspired by the social
behavior of bird flocking and fish schooling while searching for food [3]. Ant Colony
Optimization (ACO) is based on the ants' social behavior of foraging for food along
the shortest path [4]. Similar to ACO, the Bee Algorithm (BA) also works on the
social behavior of honey bees finding food; however, the bee colony tends to
optimize the number of members involved in particular pre-decided tasks [5].
Generally, the swarm techniques are computationally intensive.
Kulkarni et al. [6] proposed an emerging Artificial Intelligence (AI) technique
referred to as Cohort Intelligence (CI). It is inspired by the self-supervised
learning behavior of the candidates in a cohort. The cohort here refers to a group of
candidates interacting and competing with one another to achieve some individual
goal which is inherently common to all the candidates. When working in a cohort,
every candidate tries to improve its own behavior by observing the behavior of
every other candidate in that cohort. Every candidate may follow a certain behavior
in the cohort which, in its own assessment, may result in an improvement of its own
behavior. Since certain qualities constitute a particular behavior, a candidate that
follows a behavior actually tries to adapt the associated qualities. This makes every
candidate learn from one another and helps the overall cohort behavior to evolve.
The cohort behavior could be considered saturated if, for a considerable number of
© Springer International Publishing Switzerland 2017
A.J. Kulkarni et al., Cohort Intelligence: A Socio-inspired Optimization Method,
Intelligent Systems Reference Library 114, DOI 10.1007/978-3-319-44254-9_2

learning attempts, the individual behavior of all the candidates does not improve
considerably and the candidates' behaviors become hard to distinguish. The cohort
could be assumed to have become successful when the cohort behavior saturates to
the same behavior for a considerable number of times.
This chapter discusses the CI methodology framework in detail and further
validates its ability by solving a variety of unconstrained test problems. This
demonstrates its strong potential for solving unimodal as well as multimodal
problems.

2.1 Framework of Cohort Intelligence

Consider a general unconstrained problem (in the minimization sense) as follows:

$$\text{Minimize } f(\mathbf{x}) = f(x_1, \ldots, x_i, \ldots, x_N)$$

$$\text{Subject to } \Psi_i^{lower} \le x_i \le \Psi_i^{upper}, \quad i = 1, \ldots, N \qquad (2.1)$$

As a general case, assume the objective function $f(\mathbf{x})$ to be the behavior of an
individual candidate in the cohort, which it naturally tries to enrich by modifying the
associated set of characteristics/attributes/qualities $\mathbf{x} = (x_1, \ldots, x_i, \ldots, x_N)$.
Having considered a cohort with $C$ candidates, every individual candidate $c$
$(c = 1, \ldots, C)$ possesses a set of characteristics/attributes/qualities
$\mathbf{x}^c = (x_1^c, \ldots, x_i^c, \ldots, x_N^c)$ which makes up the overall quality of its
behavior $f(\mathbf{x}^c)$. The individual behavior of each candidate $c$ is generally
observed by itself and by every other candidate $(c)$ in the cohort. This naturally
urges every candidate $c$ to follow a behavior better than its current one. More
specifically, candidate $c$ may follow $f^*(\mathbf{x}^{(c)})$ if it is better than
$f^*(\mathbf{x}^c)$, i.e. $f^*(\mathbf{x}^{(c)}) < f^*(\mathbf{x}^c)$. Importantly, following a
behavior $f(\mathbf{x})$ refers to following the associated qualities
$\mathbf{x} = (x_1, \ldots, x_i, \ldots, x_N)$ with certain variations $t$ associated with
them. However, following a better behavior and its associated qualities is highly
uncertain, because there is a certain probability involved by which a candidate selects
the behavior to follow. In addition, a stage may come where the cohort behavior
becomes saturated; in other words, at a certain stage there could be no improvement
in the behavior of any individual candidate for a considerable number of learning
attempts. Such a situation is referred to as the saturation stage. It makes every
candidate expand its search around the qualities associated with the current behavior
being followed. The mathematical formulation of the CI methodology is explained
below in detail, with the algorithm flowchart in Fig. 2.1 [6, 7].
The procedure begins with the initialization of the number of candidates $C$, the
sampling interval $\Psi_i$ for each quality $x_i,\; i = 1, \ldots, N$, the learning attempt
counter $n = 1$, the sampling interval reduction factor $r \in [0, 1]$, the convergence
parameter $\epsilon = 0.0001$, and the number of variations $t$. The values of $C$, $t$
and $r$ are chosen based on preliminary trials of the algorithm.
[Fig. 2.1 depicts the CI flowchart: initialize the number of candidates C in the
cohort, the quality variations t, and the sampling interval reduction factor; calculate
the probability associated with the behavior being followed by every candidate in the
cohort; using the roulette wheel approach, every candidate selects a behavior to
follow from within the C available choices; every candidate shrinks/expands the
sampling interval of every quality i based on whether the saturation condition is
satisfied; every candidate forms t behaviors by sampling the qualities from within the
updated sampling intervals and follows the best behavior from within its t behaviors;
the loop repeats until the cohort behavior saturates and the convergence criterion is
met, at which point the current cohort behavior is accepted as the final solution.]

Fig. 2.1 Cohort intelligence (CI) flowchart


Step 1. The probability of selecting the behavior $f^*(\mathbf{x}^c)$ of every associated
candidate $c$ $(c = 1, \ldots, C)$ is calculated as follows:

$$p^c = \frac{1/f^*(\mathbf{x}^c)}{\sum_{c=1}^{C} 1/f^*(\mathbf{x}^c)}, \quad (c = 1, \ldots, C) \qquad (2.2)$$

Step 2. Every candidate $c$ $(c = 1, \ldots, C)$ generates a random number $r \in [0, 1]$
and, using the roulette wheel approach, decides to follow the corresponding
behavior $f^*(\mathbf{x}^{c[?]})$ and associated qualities
$\mathbf{x}^{c[?]} = (x_1^{c[?]}, \ldots, x_i^{c[?]}, \ldots, x_N^{c[?]})$.
The superscript $[?]$ indicates that the behavior is selected by candidate
$c$ and is not known in advance. The roulette wheel approach could be most
appropriate as it gives every behavior in the cohort a chance of being
selected purely based on its quality. In addition, it may also increase the
chances of any candidate selecting a better behavior, as the associated
probability stake $p^c$ $(c = 1, \ldots, C)$ presented in Eq. (2.2), lying in the
interval $[0, 1]$, is directly proportional to the quality of the behavior
$f^*(\mathbf{x}^c)$. In other words, the better the solution, the higher the
probability of it being followed by the candidates in the cohort.
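The selection mechanics of Steps 1 and 2 can be sketched as follows. This is a minimal illustration, assuming strictly positive behavior values as required by Eq. (2.2); the function names, variable names, and the example behavior values are ours, not taken from the authors' implementation.

```python
import random

def selection_probabilities(behaviors):
    # Eq. (2.2): p^c is proportional to 1/f*(x^c), so for minimization a
    # lower objective value yields a higher selection probability.
    inverses = [1.0 / f for f in behaviors]
    total = sum(inverses)
    return [v / total for v in inverses]

def roulette_wheel(probabilities, rng=random.random):
    # Spin the wheel: return the first index whose cumulative probability
    # reaches a uniform random number in [0, 1].
    r = rng()
    cumulative = 0.0
    for idx, p in enumerate(probabilities):
        cumulative += p
        if r <= cumulative:
            return idx
    return len(probabilities) - 1  # guard against floating-point rounding

# Hypothetical cohort of three behaviors f*(x^c):
p = selection_probabilities([4.0, 2.0, 1.0])
followed = roulette_wheel(p)
```

With the values above, the candidate whose behavior equals 1.0 receives probability 4/7 and is therefore the most likely, though not certain, to be followed.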
Step 3. Every candidate $c$ $(c = 1, \ldots, C)$ shrinks the sampling interval
$\Psi_i^{c[?]},\; i = 1, \ldots, N$ associated with every variable
$x_i^{c[?]},\; i = 1, \ldots, N$ to its local neighborhood. This is done as follows:

$$\Psi_i^{c[?]} \in \left[\, x_i^{c[?]} - (\|\Psi_i\|/2),\;\; x_i^{c[?]} + (\|\Psi_i\|/2) \,\right] \qquad (2.3)$$

where $\|\Psi_i\| = (\|\Psi_i\|) \times r$.
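Step 3 amounts to a one-line interval update; a minimal sketch where the argument names are ours, not the authors':

```python
def shrink_interval(followed_quality, interval_width, r):
    # Eq. (2.3): the interval width is reduced to ||Psi_i|| * r and the
    # shrunk interval is centred on the followed quality x_i^{c[?]}.
    new_width = interval_width * r
    return (followed_quality - new_width / 2.0,
            followed_quality + new_width / 2.0,
            new_width)

# e.g. an interval of width 4.0 around the followed quality 5.0, with r = 0.5
low, high, width = shrink_interval(5.0, 4.0, 0.5)
```

Repeated application shrinks the interval geometrically, which is what eventually concentrates the cohort's search; the expansion in Step 5 simply resets the width to the original sampling interval.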
Step 4. Each candidate $c$ $(c = 1, \ldots, C)$ samples $t$ qualities from within the
updated sampling interval $\Psi_i^{c[?]},\; i = 1, \ldots, N$ associated with every
variable $x_i^{c[?]},\; i = 1, \ldots, N$ and computes a set of $t$ associated behaviors,
i.e. $\mathbf{F}^{c,t} = \left\{ f(\mathbf{x}^c)^1, \ldots, f(\mathbf{x}^c)^j, \ldots, f(\mathbf{x}^c)^t \right\}$,
and selects the best function value $f^*(\mathbf{x}^c)$ from within it. This makes
available to the cohort $C$ updated behaviors, represented as
$\mathbf{F}^C = \left\{ f^*(\mathbf{x}^1), \ldots, f^*(\mathbf{x}^c), \ldots, f^*(\mathbf{x}^C) \right\}$.
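Step 4 then reduces to sampling $t$ candidate quality vectors from the shrunk intervals and keeping the best; a sketch under the same caveats (illustrative names, uniform sampling assumed):

```python
import random

def best_of_t_variations(objective, followed_x, widths, t, rng=None):
    # Sample t quality vectors uniformly from the shrunk intervals centred
    # on the followed qualities, and keep the behavior with minimum value.
    rng = rng or random.Random()
    best_x, best_f = None, float("inf")
    for _ in range(t):
        x = [rng.uniform(c - w / 2.0, c + w / 2.0)
             for c, w in zip(followed_x, widths)]
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

sphere = lambda x: sum(v * v for v in x)  # one of the test functions of Sect. 2.3
x_best, f_best = best_of_t_variations(sphere, [0.5, -0.5], [1.0, 1.0], t=7)
```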
Step 5. The cohort behavior could be considered saturated if there is no
significant improvement in the behavior $f^*(\mathbf{x}^c)$ of every candidate
$c$ $(c = 1, \ldots, C)$ in the cohort, and the difference between the individual
behaviors is not very significant for a considerable number of successive
learning attempts, i.e. if

1. $\left| \max\left(\mathbf{F}^C\right)^n - \max\left(\mathbf{F}^C\right)^{n-1} \right| \le \epsilon$, and
2. $\left| \min\left(\mathbf{F}^C\right)^n - \min\left(\mathbf{F}^C\right)^{n-1} \right| \le \epsilon$, and
3. $\left| \max\left(\mathbf{F}^C\right)^n - \min\left(\mathbf{F}^C\right)^n \right| \le \epsilon$,

then every candidate $c$ $(c = 1, \ldots, C)$ expands the sampling interval
$\Psi_i^{c[?]},\; i = 1, \ldots, N$ associated with every quality
$x_i^{c[?]},\; i = 1, \ldots, N$ to its original one,
$\Psi_i^{lower} \le x_i \le \Psi_i^{upper},\; i = 1, \ldots, N$.
Step 6. If either of the two criteria listed below is valid, accept any of the $C$
behaviors from the current set of behaviors in the cohort as the final objective
function value $f^*(\mathbf{x})$, i.e. the final solution, and stop; else continue to Step 1:
(a) if the maximum number of learning attempts is exceeded, or
(b) if the cohort saturates to the same behavior (satisfying the conditions in
Step 5) for $s_{max}$ times.
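Putting Steps 1–6 together, the whole loop can be sketched compactly. This is our minimal reading of the procedure, not the authors' MATLAB implementation: the saturation test is simplified to a single tolerance check, the interval expansion of Step 5 and the $s_{max}$ counter are omitted, and the Eq. (2.2) weights are guarded against near-zero behavior values (positive objectives are assumed).

```python
import random

def cohort_intelligence(objective, bounds, C=5, t=7, r=0.95,
                        epsilon=1e-4, max_attempts=500, seed=0):
    rng = random.Random(seed)
    N = len(bounds)
    widths = [hi - lo for lo, hi in bounds]
    # Initial cohort: one quality vector and one behavior per candidate.
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(C)]
    fs = [objective(x) for x in xs]
    prev_best = None
    for _ in range(max_attempts):
        inv = [1.0 / (f + 1e-12) for f in fs]      # Step 1: Eq. (2.2) weights
        widths = [w * r for w in widths]           # Step 3: shrink the intervals
        new_xs, new_fs = [], []
        for c in range(C):
            pick = rng.choices(range(C), weights=inv)[0]  # Step 2: roulette wheel
            best_x, best_f = xs[c], fs[c]
            for _ in range(t):                     # Step 4: t behavior variations
                cand = [rng.uniform(xs[pick][i] - widths[i] / 2.0,
                                    xs[pick][i] + widths[i] / 2.0)
                        for i in range(N)]
                f_cand = objective(cand)
                if f_cand < best_f:
                    best_x, best_f = cand, f_cand
            new_xs.append(best_x)
            new_fs.append(best_f)
        xs, fs = new_xs, new_fs
        # Steps 5-6 (simplified): stop once the behaviors are indistinguishable
        # and the best behavior has stopped improving.
        if prev_best is not None and abs(max(fs) - min(fs)) <= epsilon \
                and abs(prev_best - min(fs)) <= epsilon:
            break
        prev_best = min(fs)
    best = min(range(C), key=fs.__getitem__)
    return xs[best], fs[best]

x_best, f_best = cohort_intelligence(lambda x: sum(v * v for v in x),
                                     bounds=[(-5.0, 5.0)] * 2)
```

Candidates sample around the followed candidate's qualities, not around their own, which is precisely the "following" mechanism that distinguishes CI from a set of independent local searches.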

2.2 Theoretical Comparison with Contemporary Techniques

Particle swarm optimization (PSO) is a population-based stochastic search algorithm
developed by Kennedy and Eberhart [3]. Due to its simple concept, it has been
applied to many optimization problems. PSO by itself did not work well in solving
constrained problems [8]. To overcome this shortcoming, many modified PSO
techniques were used, such as Quantum-behaved PSO [9], Improved Vector PSO
(IVPSO) [10] and other techniques that controlled the velocity of the swarm. All
these techniques depend upon the control parameters in the velocity updating
model, namely the inertia weight and the acceleration coefficients. Another technique,
referred to as Barebones PSO (BPSO) [11], used a Gaussian normal distribution to
update the values of the particles in the solution, which removed the need for the
inertia weight and acceleration coefficients. In the PSO variations, the entire swarm
has a collective intelligence: while each individual particle keeps track of its own
best solution, every particle in the swarm is also aware of the best solution found by
the entire swarm [3]. The movement of each particle is some function of this
individual best and the group's best values. The Fully Informed PSO (FIPSO) is
one technique that does not merely depend on the best solution offered globally;
it samples the solutions offered by its entire neighborhood and follows
a point in space that is calculated using this complete information [12].
Another popular technique is the Genetic Algorithm (GA), which follows the
principle of survival of the fittest. The best solutions in a particular generation are
carried forward to the next one, and new solutions are generated by using crossover
as well as by applying mutations. Ant Colony Optimization (ACO) [4, 13] follows an
autocatalytic behavior characterized by a positive feedback, where the probability
with which an agent chooses a path increases with
the number of agents that previously chose the same path [14]. However, it is
difficult to solve continuous optimization problems using ACO directly, as there are
limitations in the number of choices available to the ants at each stage. Some recent
research has extended classical ACO to solve continuous optimization problems.
In CI, however, the collective effort of the swarm is replaced by the competitive
nature of a cohort. Every candidate tries to follow the behavior of a candidate that
has shown better results in that particular iteration. In following this behavior, it
tries to incorporate some of the qualities that made that behavior successful. This
competitive behavior motivates each candidate to perform better and leads to an
eventual improvement in the behaviors of all the candidates. The technique differs
from PSO and Barebones PSO in that it does not merely track the best solution, but
is fully informed of the activities of its fellow candidates and follows a candidate
selected using the roulette wheel approach. However, it also differs from FIPSO,
which keeps track of the entire swarm, in that it follows the behavior of only
one candidate, not a resultant of the solutions presented by the entire swarm. CI also
differs from GA, as there is no direct exchange of properties, nor any
mutation. Rather, candidates decide to follow a fellow candidate and try to imbibe
the qualities that led that candidate to its solution. The values of these
qualities are not replicated exactly; they are instead taken from a close neighborhood
of the values of the qualities of the candidate being followed. This gives a
variation in the solutions obtained, and this is how the cohort can avoid getting
trapped in local minima. CI differs from ACO in that the autocatalytic nature of the
ants is replaced by the competitive nature of the cohort: instead of having a tendency
to follow the most-followed behavior, candidates in CI try to incorporate the best
behavior in every iteration. This prevents the algorithm from getting caught in local
minima by not relying on behavior that is only locally optimal. The CI algorithm has
shown itself to produce results comparable to the best obtained from the various
techniques. The sharing of the best solution among candidates in CI gives all the
candidates a direction to move towards, while the independent search of each
candidate ensures that the candidates come out of local minima to reach the best solutions.

2.3 Validation of Cohort Intelligence

The performance of the proposed CI algorithm was tested by solving well-known
unconstrained test problems such as Ackley, Dixon and Price, Griewank, Hartmann,
Levy, Michalewicz, Perm, Powell, Powersum, Rastrigin, Schwefel, Sphere,
Sum Square, Trid, Zakharov, etc., with different problem sizes. The algorithm was
coded in MATLAB 7.8.0 (R2009a) on a Windows platform using an Intel Core 2 Duo
T6570 processor (2.10 GHz) and 2 GB RAM. Every test problem was solved
20 times, with the number of candidates C and the number of variations in the
behavior t chosen as 5 and 7, respectively. The values of the reduction factor r chosen for
Table 2.1 Summary of unconstrained test problem solutions with 5 variables

Problem | True optimum | Best | Mean | Worst | Function evaluations | Standard deviation | Time (s)
Ackley | 0.0000 | 2.04E−10 | 5.58E−09 | 2.70E−08 | 43,493 | 1.06E−08 | 1.42
Dixon and Price | 0.0000 | 9.86E−32 | 1.43E−09 | 4.28E−09 | 82,770 | 4.28E−09 | 1.83
Griewank | 0.0000 | 0.00E+00 | 1.80E−02 | 3.70E−02 | 163,485 | 9.73E−03 | 5.15
Levy | 0.0000 | 1.28E−21 | 6.22E−20 | 2.18E−19 | 38,993 | 7.70E−20 | 1.43
Michalewicz | −4.687658 | −3.96E+00 | −3.40E+00 | −2.80E+00 | 188,700 | 3.61E−01 | 5.61
Perm | 0.0000 | 4.20E−01 | 2.82E+00 | 9.83E+00 | 225,960 | 2.79E+00 | 8.07
Powell | 0.0000 | 8.73E−09 | 2.31E−06 | 1.88E−05 | 162,510 | 5.54E−06 | 5.65
Rastrigin | 0.0000 | 9.95E−01 | 1.50E+00 | 2.00E+00 | 277,440 | 4.91E−01 | 5.99
Schwefel | 0.0000 | 2.83E−06 | 6.03E−06 | 8.86E−06 | 1,570,718 | 1.70E−06 | 38.73
Sphere | 0.0000 | 2.69E−29 | 1.58E−28 | 2.55E−28 | 12,345 | 5.84E−29 | 0.35
Sum Square | 0.0000 | 1.56E−18 | 2.96E−18 | 6.70E−18 | 7470 | 1.37E−18 | 0.22
Zakharov | 0.0000 | 8.38E−19 | 1.95E−18 | 3.20E−18 | 7470 | 6.62E−19 | 0.23

every unconstrained test problem are listed in Table 2.6. These parameters were
derived empirically over numerous experiments.
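As a reference for reproducing such experiments, the two functions later used as representatives of the unimodal and multimodal classes can be written down directly. These are the standard textbook definitions of the Sphere and Ackley functions (with the usual constants a = 20, b = 0.2, c = 2π), not code from the authors' MATLAB setup:

```python
import math

def sphere(x):
    # Unimodal: f(x) = sum_i x_i^2, global minimum 0 at the origin.
    return sum(v * v for v in x)

def ackley(x, a=20.0, b=0.2, c=2.0 * math.pi):
    # Multimodal: many local minima, global minimum 0 at the origin.
    n = len(x)
    sq = sum(v * v for v in x) / n
    cs = sum(math.cos(c * v) for v in x) / n
    return -a * math.exp(-b * math.sqrt(sq)) - math.exp(cs) + a + math.e
```

Both return 0 at the origin, which matches the "True optimum" column reported for them in Tables 2.1, 2.2, 2.3 and 2.4.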
The CI performance in solving a variety of unconstrained test problems is
presented in Tables 2.1, 2.2, 2.3, 2.4 and 2.5, with an increase in the number of variables
Table 2.2 Summary of unconstrained test problem solutions with 10 variables

Problem | True optimum | Best | Mean | Worst | Function evaluations | Standard deviation | Time (s)
Ackley | 0.0000 | 8.97E−08 | 4.58E−07 | 9.30E−07 | 30,765 | 3.60E−07 | 1.39
Dixon and Price | 0.0000 | 6.67E−01 | 6.67E−01 | 6.67E−01 | 359,640 | 1.48E−14 | 12.11
Griewank | 0.0000 | 7.48E−03 | 2.50E−02 | 4.68E−02 | 432,368 | 1.18E−02 | 17.49
Levy | 0.0000 | 2.02E−06 | 7.34E−06 | 1.12E−05 | 44,798 | 2.78E−06 | 2.29
Powell | 0.0000 | 7.73E−06 | 6.28E−05 | 2.13E−04 | 186,570 | 7.29E−05 | 9.46
Rastrigin | 0.0000 | 6.96E+00 | 1.06E+01 | 1.49E+01 | 261,998 | 2.31E+00 | 10.04
Rosenbrock | 0.0000 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 13,605 | 0.00E+00 | 0.49
Schwefel | 0.0000 | 1.20E−06 | 1.52E−06 | 1.74E−06 | 2,023,103 | 1.66E−07 | 84.55
Sphere | 0.0000 | 7.17E−22 | 9.47E−22 | 1.62E−21 | 18,668 | 2.73E−22 | 0.76
Sum Square | 0.0000 | 1.32E−16 | 2.37E−15 | 2.21E−14 | 14,948 | 6.58E−15 | 0.61
Zakharov | 0.0000 | 3.22E−12 | 5.15E−12 | 6.68E−12 | 22,365 | 1.24E−12 | 0.90

associated with the individual problem. It is observed that, with an increase in the
number of variables, the computational cost, i.e. the number of function evaluations
and the computational time, increased. However, the small standard deviation values for all the
Table 2.3 Summary of unconstrained test problem solutions with 20 variables

Problem | True optimum | Best | Mean | Worst | Function evaluations | Standard deviation | Time (s)
Ackley | 0.0000 | 2.97E−11 | 6.04E−11 | 2.96E−10 | 329,745 | 7.88E−11 | 21.87
Dixon and Price | 0.0000 | 7.47E−01 | 7.77E−01 | 8.22E−01 | 358,800 | 2.31E−02 | 20.65
Griewank | 0.0000 | 0.00E+00 | 7.40E−04 | 7.40E−03 | 187,763 | 2.22E−03 | 14.00
Levy | 0.0000 | 7.31E−15 | 1.73E−13 | 8.79E−13 | 329,768 | 3.28E−13 | 27.01
Powell | 0.0000 | 8.46E−05 | 2.18E−04 | 3.60E−04 | 539,640 | 8.02E−05 | 44.86
Rastrigin | 0.0000 | 2.19E+01 | 3.88E+01 | 5.57E+01 | 408,758 | 7.87E+00 | 24.60
Rosenbrock | 0.0000 | 0.00E+00 | 3.96E−30 | 1.98E−29 | 17,288 | 7.92E−30 | 1.01
Schwefel | 0.0000 | 4.38E−06 | 5.56E−06 | 6.12E−06 | 2,023,950 | 5.03E−07 | 134.37
Sphere | 0.0000 | 6.22E−14 | 7.60E−14 | 9.40E−14 | 26,183 | 1.11E−14 | 1.78
Sum Square | 0.0000 | 8.88E−11 | 3.78E−10 | 1.88E−09 | 37,335 | 5.17E−10 | 2.50
Zakharov | 0.0000 | 1.00E−06 | 2.19E−06 | 4.97E−06 | 37,290 | 1.08E−06 | 2.53

functions, independent of the number of variables, highlighted its robustness. The
effect of CI parameters such as the number of candidates C, the reduction rate r and
the number of variations in behavior t was also analyzed on unimodal as well as multimodal
Table 2.4 Summary of unconstrained test problem solutions with 30 variables

Problem | True optimum | Best | Mean | Worst | Function evaluations | Standard deviation | Time (s)
Ackley | 0.0000 | 1.59E−07 | 5.91E−07 | 1.79E−06 | 299,835 | 6.41E−07 | 28.91
Dixon and Price | 0.0000 | 9.89E−01 | 1.10E+00 | 1.23E+00 | 357,413 | 7.25E−02 | 29.07
Griewank | 0.0000 | 2.90E−06 | 1.82E−03 | 7.57E−03 | 398,918 | 2.86E−03 | 39.91
Levy | 0.0000 | 4.89E−07 | 4.62E−05 | 2.47E−04 | 674,363 | 9.15E−05 | 80.24
Powell | 0.0000 | 7.23E−02 | 1.84E−01 | 4.23E−01 | 743,595 | 1.23E−01 | 89.86
Rastrigin | 0.0000 | 5.57E+01 | 8.77E+01 | 1.00E+02 | 296,745 | 1.41E+01 | 26.40
Rosenbrock | 0.0000 | 0.00E+00 | 7.22E−30 | 3.61E−29 | 19,470 | 1.44E−29 | 1.58
Schwefel | 0.0000 | 1.06E−05 | 1.20E−05 | 1.37E−05 | 2,023,290 | 8.74E−07 | 180.62
Sphere | 0.0000 | 1.91E−13 | 2.66E−13 | 4.51E−13 | 26,235 | 7.38E−14 | 2.51
Sum Square | 0.0000 | 1.28E−10 | 2.52E−04 | 9.75E−04 | 55,433 | 3.79E−04 | 5.23
Zakharov | 0.0000 | 1.63E−04 | 6.93E−04 | 1.15E−03 | 221,798 | 3.03E−04 | 20.98

functions. The effect is visible in Fig. 2.3, where the effect of these parameters on
the Sphere function and the Ackley function is presented, as representatives of
unimodal and multimodal functions, respectively. The visualization of the convergence of the
Table 2.5 Summary of solutions to the Powersum, Hartmann and Trid functions

Problem | No. of variables | True optimum | Best | Mean | Worst | Function evaluations | Standard deviation | Time (s)
Powersum | 4 | 0.0000 | 1.38E−06 | 6.74E−05 | 1.34E−04 | 619,125 | 4.44E−05 | 17.48
Hartmann | 6 | −3.86278 | −3.32E+00 | −3.32E+00 | −3.32E+00 | 710,483 | 1.67E−03 | 31.53
Trid | 6 | −50.0000 | −4.87E+01 | −4.87E+01 | −4.87E+01 | 177,090 | 4.92E−03 | 4.27

representative Ackley function is presented in Fig. 2.2 for learning attempts 1, 10,
15 and 30. For both types of functions, the computational cost, i.e. the number of
function evaluations and the computational time, was observed to increase linearly
with an increasing number of candidates C (refer to Fig. 2.3a, b) as well as with the
number of variations in behavior t (refer to Fig. 2.3e, f, k and l). This was because,
with an increase in the number of candidates, the number of behavior choices, i.e.
function evaluations, also increased. Moreover, with a small number of candidates C,
the quality of the solution at the end of the first learning attempt (referred to as the
initial solution) and the converged solution were quite close to each other, and
importantly, the converged solution was suboptimal. The quality of both solutions
improved with an increase in the number of candidates C. This was because fewer
behavior choices were available with a smaller number of candidates, whereas with
an increase in the number of candidates the total choice of behaviors also increased;
as a result, the initial solution worsened whereas the converged solution improved,
as sufficient time was provided for saturation. Due to this, a widening gap between
the initial solution and the converged solution was observed in the
Exploring the Variety of Random
Documents with Different Content
Hanger, Colonel George (afterwards Lord Coleraine), 192 and
note 141
Harcourt, Countess of, 50 note 39, 358
— Earl of, 67, 75
Hardwicke, Lord, 338
Hardy, Professor, 238
Harte, Rev. Walter, 32 and note 31, 33, 35-43, 49
Harvey, Fenton, 20
Hastings, Warren, 164
Hawke, Lord, 241
Hawkesbury, Lady, 267
Hawksworth, Dr., 343
Héritier, C. de l’, 367 and note
Hertford, Marquis of, 427
Hervey, General, 105, 106
— Lady Mary, 267 and note
Hesketh, Lady, 233
Hill, Rev. Rowland, 359
Hoar, Captain, 350, 352
Holdernesse, Lord, 56
Holroyd, John Baker (afterwards Lord Sheffield, q.v.), 58
Hoole, Mrs., 189, 198, 246, 250, 252
— John, 189
— Rev. Samuel, 189, 246, 251, 252, 361, 362
Horsley, Bishop, 359 and note 204
Howard, Sir Charles, 28
— John, 59, 60, 248
Howlett, Rev. —, 97, 285
Hunter, Dr. Alexander, 61 and note 53
Hutchinson, Rev. Mr., 336
Huthhausen, Baron, 173

Ingoldsby, Dr., 19
— General, 4, 5, 6, 7
— Mrs., 6, 10

Jarré, General, 123


Jarvis, Lord, 267
Jefferys, Mr. and Mrs., 73
Jenkinson, Mr., 256
Jenyns, Soame, 244, 245
Jermyn, Sir Thomas, 2
Jobson, Rev. Mr., 337
John of Austria, Archduke, 463, 464
Johnson, Dr., 26, 27, 32 note 31, 285 note 177, 353, 421, 422
Joy, Mrs., 19

Kalaskowski, Count, 144


Kames, Lord, 84 and note 63
Keene, Mr. and Mrs., 3
Kennon, Mrs. Sidney, 10, 11, 12, 13
Kenrick, Dr., 27 and note 14
Keppel, Lord, 108
Kingsborough, Lord, 76, 77-80
Kinsman, Mr. (master of Bury St. Edmunds School), 7
Knight, Cornelia, 368 and note 208

Lafayette, 191
Lamb, Mr. (King’s Messenger), 45
Lambert, Captain, 29
Langford, Dr., 142
Latrobe, B. H., 172
Lauderdale, Lord, 314, 397
Law, Thomas, 229
Lawrence, Dr., 345
Lazowski, M. de, 119 and note 91, 120-124, 154, 175
Leeds, Duke of, 51
— Sir George, 472
Leigh, Mr. (Clerk of the House of Commons), 261, 262
Liancourt, Duke of, 119 and note 90, 120-123, 154, 259, 382
Liverpool, Lord, 339
Llandaff, Bishop of. See Watson, Richard
Lofft, Capel, 101 and note 75, 102, 276, 291, 317, 318
Longford, Lord, 71
Loughborough, Lord, 86, 98, 207, 208, 219
Louisa, Princess (daughter of George II.), 16
Luther, Mr., 179

Macartney, Lord, 196


Macaulay, General, 460, 462
Macklin, Rev. Mr., 154, 267, 274
Macpherson, Sir John, 224, 225, 239
Macro, Mr., 170
Magellan, Mr., 150
Manchester, Duke of, 367, 395
March, Lord, 140, 141
Marlborough, Duke of, 26
Marshall, W., 427 and note 234, 429
Martin, Professor, 147
Massalski, Prince (Bishop of Wilna), 52
Mauduit, Israel, 255 and note
Medlicott, Mr., 71
Milbank, Lady, 350
Mildmay, Sir A. St. John, 348
Milner, Dean, 371 and note, 372
— Professor, 150
Miripoix, M. de (French Ambassador), 17
Moira, Lord, 243, 304
Moncrief, Sir Henry, 238
Montagu, Mrs., 233, 243, 244, 245, 312, 349, 358, 386
Montrose, Duke of, 245, 347
Mordaunt, Lady Mary, 52
More, Hannah, 233, 245, 246, 473
Mouron, M., 124
Murray, General, 171

Nepean, Mr., 341, 342


Neve, Miss, 467
Neville, Mr., 134
Newcastle, Duke of, 16
North, Lord, 60, 107, 201
— Rev. Mr., 344
Northey, Mr., 363

Oakes, Orbell Ray, 154, 266, 382, 388, 406


— Mrs. Orbell Ray, 266 and note, 270, 320, 321, 354, 359, 370
note 211, 382, 385, 388, 392, 438, 443, 444, 445, 446, 453
O’Connor, A., 317 and note
Oliver, Right Hon. Silver, 74
Onslow, Dr., 191
— General, 4, 15, 16, 191
— Lady, 20
— Mr. Speaker, 4, 20, 28
Orde, Mrs., 245
Orford, Earl of, 60, 206, 207, 233
Orwell, Lord, 16 note 9
Ossory, Lord, 331, 332, 395
Otto, L. W., Count of Morlay, 377 and note 215
Overton, J., 376 and note

Pakenham, Mr., 460, 461


Paley, Dr., 378
Parkyn, Mr., 367
Partridge, Rev. S., 280
Patulle, M., 36, 40, 42
Pearson, Dr., 376
Pelham, Lord, 381
Peterborough, Bishop of, 147
Peterson, Lady, 20
Petty, Lord Henry, 370
Phillips, Sir John, 17
Pigot, Admiral, 193
Pitt, William, 134, 137, 161, 166, 201, 203, 219, 221, 254, 255,
306, 314, 315, 327, 345, 346, 363, 371, 424, 427
Plampin, Betsy. See Oakes, Mrs.
— Captain John, 154, 423
Polignac, Prince and Princess de, 162 and note
Pope, Alexander (quoted), 136 note 102
Popple, Mr. (Governor of Bermuda), 13
Porteus, Bishop, 178, 179
Portland, Duke of, 51, 326
Potemkin, Prince, 102, 125
Poulett, Mr., 246
Preston, Lord, 367, 368
Priestley, Dr., 99, 150-153, 439

Queensberry, Duke of, 192

Radnor, Lord, 161


Richardson, Samuel, 192
Richmond, Legh, 460
Roberts, Dr., Provost of Eton, 142
— Lewis, 91 and note 67
Robertson, Messrs. (of Lynn), 22, 23
Rochefoucault, Counts de la, 119-121, 154
Rochester, Bishop of, 4, 28
Rockingham, Marquis of, 42, 50
Roper, Dr., 52
Rose, George (President of the Board of Trade), 132 and note
97, 137, 221, 241, 242
Ross, Bishop, 160
Rosslyn, Lord, 360
Rossmore, Lord, 347
Rostopchin, Count, 387, 401
Ruggles, Th., 194 and note 143
Rumford, Count, 323
Ryder, Lady Susan, 327, 328
— Mr., 327, 386

St. Vincent, Earl of, 418, 419


Sambosky, Rev. —, 124, 125
Saunderson, Dr. Nicholas, 14 note 8
Scott, Rev. Thomas, 349, 359, 391, 392
Seabright, Sir John, 463
Sheffield, Lord, 132, 220, 245, 258, 344 and note 199, 393, 395,
402, 407, 468, 469
Shelburne, Earl of (afterwards Marquis of Lansdowne), 67, 69,
102
Shelley, Mr., 20
Sheridan, R. B., 164 and note, 234
Shipley, Mr., 59 note 49
Sidmouth, Lord, 419, 434, 460
Simeon, Rev. Charles, 369, 395, 397, 398, 399, 400
Sinclair, Sir J., 159 and note 119, 160, 219, 220, 224, 241, 242,
243, 245, 247, 256, 299, 314, 315, 316, 413, 414, 437, 443,
464
Smirenove, Mr., 387, 400
Smith, Sydney (quoted), 445 note 239
Somers, Lord, 359
Somerville, Lord, 245, 315, 316, 318, 347, 361, 363, 384, 385,
404
Souga, Anthony (Austrian Consul), 169
Spencer, Lord, 313, 367
Stafford, Lord, 161
Stanhope, Lady, 313
— Philip, 33
Stanislas, King, 119
Stonehewer, Mr., 193, 346
Sturton, Sir Thomas, 344
Sutton, Dr. Robert, 8 and note 4
Symonds, Professor John, 103, 114 and note 84, 120-124, 129,
140, 144, 146, 154, 160, 184, 192, 201, 210, 236, 239, 253,
283, 295, 304, 344, 355, 362, 368, 400, 412, 419-421

Thornhill, Major, 78, 79


Thurlow, Lord, 161
Tillet, Mr. de, 170
Tomlinson, Mr., 22
— Mrs. (sister of Arthur Young), 20, 22, 126
Tour du Pin, Count de la, 257
Townsend, Rev. J., 407 and note 227
Townshend, Lord, 136
Trant, Mr. and Mrs., 73
Tuam, Archbishop of, 75
Turner, Miss, 15
Turton, Dr., 264, 273, 274

Valpy, Dr. Richard, 106 and note 81, 133, 297


Vancouver, 426 and note 232
Vansittart, Nicholas, (afterwards Lord Bexley), 426 and note 233,
435, 459
Vary, Mr., 140, 141, 193
Vassy, Governor, 4
Voltaire (quoted), 39

Wakefield, Edward, 75 and note


Washington, General, 189, 191, 360
Watson, Richard (afterwards Bishop of Llandaff), 97, 123, 124,
147, 150, 177-180, 236, 237, 254, 375
Way, Miss, 467
Wedderburn, Alexander (afterwards Earl of Rosslyn), 97 and
note 70
— Colonel, 16 note 9
Wellesley, Marquis of, 451
Wentworth, Lord, 245
Whitbread, Samuel, 52, 59, 161
Wight, Alexander, 84 and note
Wilberforce, William, 201, 287 and note 180, 288, 289, 297, 307,
325 and note, 326, 345, 348, 359, 371, 375, 407, 454 note,
457, 459
Wilkes, John, 10
Willes, Mr. Justice, 52
Willoughby, Sir C., 351, 413
Winchester, Earl of, 245
Winchilsea, Earl of, 244, 315, 351, 363, 388, 466, 470
Windham, William, 161
Wollaston, Dr., 277
Wurtemburg, Queen of, 425
Wyndham, M.P., Mr., 259

Yeldham, John, 47
York, Duke of, 16, 266, 327
York, Mrs., 245
Young, Rev. Dr. (father of Arthur Young, writer of the
Autobiography), 2-6, 8, 9, 10, 13, 14, 24
— Mrs. (mother of Arthur Young), 3, 22, 24, 28, 56, 57, 61, 77,
81, 126
— Arthur (son of Arthur Young), 51, 139, 143, 299, 317, 318,
323, 352, 363, 382, 402, 403, 406, 408, 415, 418, 428, 429,
432, 448, 456, 457, 464
— — (last descendant of Arthur Young), 127 note 94
— Mrs. Arthur (wife of Arthur Young), 32 and note 29, 46, 81,
142, 146, 319, 339, 389, 413, 424, 429, 438, 457, 460
— — — (daughter-in-law of Arthur Young), 323, 397, 398, 399,
404, 409, 415, 429, 446, 448, 457
— Bartholomew (grandfather of Arthur Young), 2
— Elizabeth (‘Bessy’) (second daughter of Arthur Young), 51,
146, 181, 189. See also Hoole, Mrs.
— Elizabeth Mary (‘Elisa Maria’) (sister of Arthur Young), 1, 15,
19, 20. See also Tomlinson, Mrs.
— Rev. Dr. John (brother of Arthur Young), 2, 9, 57, 107, 127,
132, 138-143
— Martha Ann (‘Bobbin’) (youngest daughter of Arthur Young),
110 and note 83, 158, 159 note 118, 184, 185, 263-284, 286,
287, 290, 294, 295, 298, 323, 382, 423
— Mary (eldest daughter of Arthur Young), 43, 184, 382, 425,
440, 454 note 244, 457, 472

PRINTED BY
SPOTTISWOODE AND CO., NEW-STREET SQUARE
LONDON

1. Lavenham is a very pretty village, with splendid church, lying


between Sudbury and Whelnethan, whilst Bury St. Edmunds is the
second town in Suffolk.
2. Memoirs of Richard Cumberland, 1806.
3. Elsewhere Arthur Young mentions a severe flogging ‘very
properly’ administered by his father for an act of cruelty, adding, ‘It
was the only time that I ever received any correction at his hands,
yet he was a remarkably passionate man.’
4. Robert Sutton, physician and inoculist, 1757, Dict. of Biography,
Sampson Low. Dr. Guy’s Public Health has the following: ‘The
Suttons were noted for their success in inoculation, but Dr. Gregory
gives more credit to diet and exposure to air than to the antimonial
and mercurial medicines they extolled.’
5. Hero of The Beaux’ Stratagem, G. Farquhar.
6. Till the last generation it was the fashion to brew one’s own beer
in Suffolk.
7. ‘The buildings and yards necessary for the business of a
farm.’—Webster.
8. Nicholas Saunderson, D.D., author of the Elements of Algebra,
in ten books, 1740.
9. The Mitchell Election, a petition brought by Lord Orwell and
Colonel Wedderburn against undue election and return for borough
of Mitchell, in Cornwall. See Commons’ Journals, xxii., xxiv. and
xxxii.
10. The Duke of Cumberland.
11. M. de Miripoix, then French Ambassador at St. James’s.
12. Dr. Charles Burney, author of the History of Music, father of
Madame d’Arblay.
13. The following notes are taken from a small memorandum-book
appended to memorials:—
‘1761. July 23.—Leak in full (meaning debts), 5l. 5s.
Sept. 22.—Coronation.
” 28.—To Court.
Oct. 9.—Blackheath; cards.
Dec. —To London with Ed. Allen.
” 31.—Debts 62l.
(My) History of the War published.’
14. Dr. Kenrick, critic of the Monthly Review, attacked Dr. Johnson,
who said, ‘I do not think myself bound by Kenrick’s rules.’
15. Joseph and Mary Collier; the first, author of a History of
England.
16. Gesner Solomon, born at Zurich, 1730.
17. The Jealous Wife.
18. The Alchemist.
19. The Provoked Wife.
20. The Man of Mode.
21. The Rehearsal.
22. Love Makes a Man.
23. The Wonder.
24. The Suspicious Husband.
25. The Beaux’ Stratagem.
26. She Stoops to Conquer.
This last play seems to have been first acted in 1773. See
Brewer’s Reader’s Handbook.
27. ‘The coarse pot-house valour of Sir John Brute, Garrick’s
famous part, is finely contrasted with the fine lady airs and
affectation of his wife.’—Chambers’s English Literature.
28. ‘All the domestic business will be taken from my wife’s hands, I
shall make the tea, comb the dogs, and dress the children
myself.’—Fribble, in Miss in her Teens (Garrick).
29. Mrs. Young was sister to Fanny Burney’s stepmother. The
marriage proved unhappy from the beginning.
30. See his work, Les Intérêts de la France mal entendus, Henri,
Comte de Boulainvilliers, voluminous author on French history, 1658-
1722.
31. Rev. W. Harte, poet, writer on rural affairs, historian, 1700-
1774. Dr. Johnson much commended Harte as a scholar and a man
of the most companionable talents he had ever known. He said the
defects in his history (Gustavus Adolphus) arose not from imbecility,
but from foppery. His Essays on Husbandry is an elegant, erudite,
and valuable work (Lowndes).
32. The accompanying letter is included in Arthur Young’s
correspondence of this year, and is given, although not addressed to
himself.
33. Duhamel du Monceau, botanist and agronome, contributor to
the Encyclopédie, 1700-1781.
34. Patulle. A French writer on agriculture.
35. Here is an illustration. The Suffolk husbandman’s afternoon
collation is invariably called ‘beaver.’ In Nares’ Glossary we find,
‘Bever, from the Sp. and It.: an intermediate refreshment between
breakfast and dinner.’ ‘Without any prejudice to their bevers,
drinkings, and suppers.’—B. and Fletcher, ‘The Woman Hater.’
36. ‘A child’s game, in which pins are pushed alternately.’—
Webster.
37. Included in A Farmer’s Letters.
38. ‘We were married more than two years.’ [Note by A. Y.]
39. ‘The Mashamshire Molly,’ afterwards Countess of Harcourt.
40. It must be remembered that turnips were a comparative
novelty at this date, not being cultivated as food for cattle till the
latter part of the last century.
41. Nicolo Piccini, 1728-1800, composer of the opera Zénobie, &c.
42. Samuel Whitbread, son of a great brewer, distinguished in
Parliamentary life as a vigorous assailant of Pitt; committed suicide
1815.
43. Entry in memorandum-book of this year: ‘The year’s receipts,
1,167l.’
44. Sic in author’s MS.; ‘translated’ would seem to be the word.
45. In a memorandum-book occurs the following entry: ‘1771.—
Receipts, 697l.; expenses, 360l.—I know not how.’
46. Estimated population of England and Wales in 1770,
7,428,000.—Haydn’s Dictionary of Dates.
47. The first census was taken in 1801.
48. 1740-1821. The friend and editor of Gibbon.
49. Founded 1754, mainly owing to the efforts of Mr. Shipley and
Lord Folkestone.
50. This evidently depended on the Society of Arts.
51. King’s waiter. I have not been able to discover the precise
nature of this sinecure.
52. ‘Mr. Young is not well, and appears almost overcome with the
horrors of his situation; in fact, he is almost destitute. This is a
dreadful trial for him, yet I am persuaded he will find some means of
extricating himself from his distress—at least, if genius, spirit, and
enterprise can prevail.’—Early Diaries of Fanny Burney.
53. Dr. Alexander Hunter, died 1809, editor of Evelyn’s Sylva, and
author of Georgical Essays, ‘an able and esteemed work’ (Lowndes).
54. Appears to have been brother to the Hon. Robert Arbuthnot,
third son of John, Viscount Arbuthnot, whose death is recorded in
the Annual Register of 1801.
55. First Marquis of Lansdowne; took part in Lord Chatham’s
Ministry.
56. This use of the word as respectworthy is noticeable.
57. Whisky: a light carriage built for rapid motion.—Webster.
58. Edward Wakefield, An Account of Ireland: Political and
Statistical. 1812.
59. Entries in memorandum-book ‘the year’s receipts, 1,145l.
Wrote Alcon and Flavia, a poem.’
60. This curious arrangement seems to have been faithfully kept,
as will be seen later on.
61. Wear: sea-term, to bring a ship on.—Bailey’s Dictionary.
62. In memorandum-book occurs this note: ‘Correspondence with
Wight printed in his reports.’ This seems to be Alexander Wight,
author of ‘An Enquiry into the Rise and Progress of Parliament,
chiefly in Scotland.’
63. Henry Home, a Scotch judge, better known by the title of Lord
Kames, author of several legal and other works, among them
‘Introduction to the Art of Thinking.’ Died 1782.
64. Corn bounty in Ireland, 1780. This was granted by the Irish
Parliament. The Lord Lieutenant, in his speech at the close of the
session, said: ‘Ample bounties on the export of your corn, your linen,
and your sail-cloth have been granted.’ See Annual Register, 1780,
p. 338.
65. Beat: participial adjective.—Webster.
66. Stetch: as much land as lies between one furrow and another.—
Prov. Eng., Halliwell.
67. Lewis Roberts, The Merchant’s Map of Commerce, London,
1638. ‘The first systematic writer upon trade in the English language’
(Lowndes).
68. Had this sentence appeared in print anterior to Macaulay’s
famous passage, the latter might have been deemed a plagiarism.
69. Hugh Boyd, a writer whose real name was Macaulay, author of
two political tracts now forgotten. Died at Madras in 1791, having
dissipated his wife’s fortune and his own.
70. Alex. Wedderburn, Earl of Rosslyn, Baron Loughborough. In
1778 Attorney-General; in 1793 succeeded Lord Thurlow to the
Chancellorship. Died 1805.
71. Richard Watson, a celebrated prelate. In 1796 he published an
answer to Paine’s Age of Reason. He was left an estate worth
24,000l. by a Mr. Luther, an entire stranger to him, author of many
theological works and memoirs of himself. Died 1816.
72. Died in 1804. There is a notice of this writer in Watts’
Bibliotheca Britannica.
73. Irish Linen Board, established 1711; the Board abolished 1828.
We do not learn upon what business Mr. Arbuthnot had gone to
France.
74. That Arthur Young’s society was equally agreeable to the other
sex Fanny Burney tells us. In the gossipy, ecstatic journal of her
girlhood she writes: ‘Last night, whilst Hetty, Susey, and myself were
at tea, that lively, charming, spirited Mr. Young entered the room. Oh,
how glad we were to see him!’
75. A Suffolk squire, ardent Whig, and of considerable literary
attainments. At his expense was published Bloomfield’s Farmer’s
Boy.
76. Frederick Hervey, Episcopal Earl of Bristol. The Annual
Register for 1803 has the following: ‘His love of art and science was
only surpassed by love of his country and generosity to the
unfortunate of every country. He was a great traveller, and there is
not a country of Europe in which the distressed have not obtained
his succour. He was among the leaders of Irish patriots during the
American War, and a member of the Convention of Volunteer
Delegates in 1782. He was on this occasion escorted from Derry to
Dublin by volunteer cavalry, receiving military honours at every town.
He died at Albano, Rome, surrounded by artists whose talents his
judgment had directed and whose wants his liberality had supplied.’
77. By an irony of fate, Arthur Young, who had found farm after
farm in his own hands a disaster, was now by general acceptance
the first European authority on agriculture.
78. The History and Antiquities of Hawstead and Hardwicke, in
Suffolk. The second edition appeared in 1813, with notes by Sir T.
Gery-Cullum.
79. Author of many antiquarian treatises.
80. Sold by auction in December 1896.
81. Richard Valpy, D.D., 1754-1836, distinguished scholar,
voluminous writer on educational works, and author of the famous
Greek and Latin grammars.
82. This Bill to disable Revenue officers from voting in
Parliamentary elections was introduced April 16, 1782, and read a
third time on the 25th; read a third time in the House of Lords by 34
Contents to 18 Non-contents. See Hansard.
83. ‘My lovely Bobbin’—christened Martha Ann—the adored child
whose loss at the age of fourteen was the great sorrow of Arthur
Young’s life. The pet name of ‘Bobbin’ originated in that of ‘Robin,’
which the child gave herself but could not pronounce.
84. Dr. J. Symonds, Professor of Modern History at Cambridge,
was LL.D., and wrote a book, Hints and Observations on Scripture.
85. The Bishop misquotes from memory. The quotation is from
Horace, Ep. Bk. I. iii. 21; agis should be audes.
86. Published 1703, giving an account of the trial of Charles I., of
Montrose, &c.
87. Died in great poverty, 1808, and was buried in St. Paul’s
Cathedral.
88. The Society of Arts, Adelphi.
89. This apparently refers to Barry’s report of the Royal Academy.
90. The friend of Louis XVI., who summoned courage to announce
the fall of the Bastille. ‘It is a revolt?’ said the King. ‘Sire,’ replied the
Duke, ‘it is a revolution.’ This amiable and well-intentioned man
leaned towards a constitutional monarchy; finding this hopeless, he
emigrated, returning after exile to Liancourt (Seine and Oise), ending
his days among a community he had raised morally and materially.
Died 1827.
91. His brother must not be wholly judged from Madame Roland’s
portrait, penned in prison. The ‘Queen of the Gironde’ no more than
her fellow-partisans was free from political animus. It is true that
Lazowski threw himself into the very heart of Sans-culottisme, and
that his funeral oration (1792) was pronounced by Robespierre. His
alleged share in the September massacres requires stronger
evidence than that of his bitterest enemies at bay.
92. ‘That part of a horse’s foot between the toe and heel, being the
side of the coffin.’—Farrier’s Dict.
93. This project developed into one much more formidable than
the writer at this period conceived, namely, that monumental history
—or, rather, encyclopædia—of agriculture never destined to see the
light. For three-quarters of a century the ten folio volumes of
manuscript garnished the library of Bradfield Hall, perhaps once in
twenty years to be taken down by some curious guest. What was to
have been Arthur Young’s crowning achievement and legacy to
future ages is, fortunately, not wholly lost to posterity. The ten
volumes are now housed in the MS. department of the British
Museum.
94. Bradfield Hall was sold on the death of Arthur Young’s last
descendant, the late Arthur Young, Esq., in 1896.
95. Proverb, ‘The pot calls the kettle black.’—Bailey’s Dict.
96. The Peace of Westphalia.
97. George Rose, President of the Board of Trade. Died 1818.
98. This measure is referred to on page 137.
99. Arthur Young’s fishing parties are described in Fanny Burney’s
Camilla.
100. Founded 1785.
101. Robert Bakewell, died 1795, a celebrated grazier. It was
wittily remarked that ‘his animals were too dear for anyone to buy,
and too fat for anyone to eat.’
102. ‘Turnip Townshend,’ ancestor of the Lord Townshend here
named, was celebrated in the famous lines—

‘Why of two brothers, rich and restless, one
Ploughs, burns, manures, and toils from sun to sun;
The other slights for women, sports, and wines,
All Townshend’s turnips and all Grosvenor’s mines.’
Pope’s 6th translation of Horace.
103. This seems to refer to Mr. Pitt’s resolutions upon the
commercial intercourse between England and Ireland. The debate
thereon began February 22, 1785. See Hansard.
104. How different would be the list of a labouring man’s
‘necessaries’ in these days!
105. Arthur Young’s only son, born 1769.
106. Express, n., a messenger sent on a special errand.—
Webster.
107. Published in the Annals.
108. Dr. Egan, Royal Park Academy.
109. The writer’s memory is at fault here. His correspondence with
Dr. Priestley is dated 1783. The letters, however, are given here, as
otherwise they would not be intelligible.
110. Mr. Magellan. This gentleman, often mentioned in A. Y.’s
correspondence as descendant of the great Portuguese discoverer,
seems to have attained some proficiency—even eminence—in
science.
111. Secretary to the Society of Arts.
112. An old Suffolk family. Captain Plampin, mentioned in the
French travels, is noticed in the new Dictionary of National
Biography.
113. M. Lazowski’s broken English is given as we find it.
114. Catalonia.
115. It has been found impossible to include this letter from
want of space.
116. A sort of plough for sowing grain in drills.
117. A chance or passing boat.
118. As Arthur Young’s letters, with trifling excisions, are
incorporated into the famous travels, I do not give them here. His
anxiety about Bobbin is ever apparent. ‘Give Bobbin a kiss for
me. God send her well,’ he writes to his eldest daughter Mary;
and, in another letter, ‘Remember me to your mother, and tell
Bobbin I never forget her.’ ‘The Robin,’ or Bobbin, was now five
years old.
119. Statist, political and agricultural writer; born 1754, died
1835. Sat in Parliament for several constituencies, and took an
active part in political and scientific movements; was also a
voluminous writer.
120. The Prince and Princess de Polignac, after receiving
countless honours, privileges, and substantial favours from Louis
XVI. and the Queen, were among the first to desert them. The
present head of this ancient house married a daughter of Mr.
Singer, inventor of the sewing-machine.
121. A Bill prohibiting the exportation of wool passed the
House of Commons, May 15, 1788.
122. President of the Royal Society, and supporter of the
cause of agriculture and science; died 1810.
123. ‘Then came the Oude case, that lasted no less than
twenty-one days, and ended by a speech from Sheridan on
which great labour and pains had been bestowed. This speech
had been looked forward to as rivalling the great Begum speech
of the same orator’ (Knight). Is not A. Y. here thinking of the great
Begum speech of an earlier session?
124. The author of Sandford and Merton died 1789 from the
kick of a colt, which he had refused to have broken in on account
of the cruelty usually involved in the process.
125. Died 1809. One of the most active members of the Royal
Humane Society; fell a victim to his devotion in attending the sick
and wounded Austrian soldiers on the field of Wagram.
126. See vol. x. of the Annals of Agriculture.
127. Sic.
128. Written about 1816.
129. Wild chicory or succory, used by the French as a winter
salad, and in the adulteration of coffee.
130. Refers (see below) to a work by Baron Huthhausen on
the servitude of the Silesian peasantry.
131. For this article see Annals of Agriculture, vol. ix. p. 479.
132. Son of a Protestant pastor of Nîmes, member of the
Constituent Assembly; guillotined 1793. See Letters of Helen
Maria Williams.
133. His country by adoption; Lazowski was a Pole.
134. Abbot.
135. ‘January 30, 1790. To Bradfield, and here terminate, I
hope, my travels.’—Travels in France, Bohn’s Library.
136. This letter is interesting as written by the last
representative of that unhappy country in England. We read in
Knight’s History of England, vol. v., that, on the reassembling of
Parliament after the partition of Poland no allusion whatever was
made in the House of Commons to that event. The final partition
treaty was signed in 1795 by Russia, Prussia, and Austria.
137. The passage occurs in the small memorandum-book from
which I have occasionally quoted particulars of yearly expenses,
&c.
138. Vol. xv. 1791, My Own Memoirs.
139. These letters were sold by Sotheby, Wilkinson & Co.,
London, December 1896.
140. Louis d’or at this time worth 24 francs.—Littré.
141. The well-known Colonel George Hanger, afterwards
fourth Lord Coleraine. ‘He served in the Army during the
American War, and was afterwards a distinguished character in
high society. Wrote his Life, Adventures, and Opinions.’—Annual
Register, 1824.
142. See on this subject Gibbon’s Rome, vol. xi. ch. lxi.