A Pattern Recognition Method For Electronic Noses
Title
A pattern recognition method for electronic noses based on an olfactory neural network
Permalink
https://escholarship.org/uc/item/9tr82425
Journal
Sensors and Actuators B, 125
Authors
Fu, Jun
Li, Guang
Qin, Yuqi
et al.
Publication Date
2007
Peer reviewed
Received 7 November 2006; received in revised form 27 February 2007; accepted 27 February 2007
Available online 12 March 2007
Abstract
Artificial neural networks (ANNs) are generally considered the most promising pattern recognition method for processing the signals from the chemical sensor array of an electronic nose, and they make the system more bionic. This paper presents a chaotic neural network named KIII, which models the olfactory system, applied to an electronic nose to discriminate six typical volatile organic compounds (VOCs) in Chinese rice wines. Thirty-two-dimensional feature vectors from a sensor array of eight sensors, with four features extracted from the transient response of each TGS sensor, were input into the KIII network to investigate its generalization capability for eliminating the influence of concentration and counteracting sensor drift. In comparison with a conventional back-propagation trained neural network (BP-NN), experimental results show that the KIII network performs well in classifying these VOCs at different concentrations, and even on data obtained 1 month later than the training set. Its robust generalization capability makes it suitable for electronic nose applications in which the influence of concentration and sensor drift must be reduced.
© 2007 Published by Elsevier B.V.
Keywords: Artificial neural networks; Electronic nose; Pattern recognition; Transient phase; Olfactory model; Sensor drift
Fig. 4. An example of time series from (a) the P2, G2, E1 and A1 nodes and (b) the M1 node of the 32-channel KIII network, with a constant stimulus injected via the receptor from step 50 to step 250. (c) The phase portrait of the attractor, plotting the M1 node against the G2 node in the OB layer (starting in red, passing through black and ending in blue).
central sources of noise in olfactory systems. It offers a convergence of statistical measures on the KIII output trajectories under perturbations of the initial conditions of variables and of parameter values [22].

3.2. Learning rule and classification algorithm

When a pattern to be learned, expressed by an n-dimensional vector, is input in parallel into an n-channel KIII network, the system, which presents an aperiodic oscillation in its basal state, soon moves to a specific local basin of an attractor wing, with a quasi-periodic burst in the gamma range, corresponding to this pattern, as shown in Fig. 4. The system memory is defined as the collection of basins and attractor wings of the KIII network, and a recall is the induction by a state transition of a spatiotemporal gamma oscillation [24]. When used for pattern recognition, the outputs of the KIII network are expressed in the form of a spatial amplitude-modulated (AM) pattern of a chaotic oscillation in the multi-channel OB layer. Many mathematical methods have been proposed to extract information from the outputs of the model, such as the standard deviation (SD) [24], singular-value decomposition (SVD) [34], root mean square (RMS), principal components analysis (PCA) and the fast Fourier transform (FFT) [35].

In this work, the SD method is adopted. The burst in each M1 node is partitioned into s equal segments, and the mean value of the individual SDs of these segments is calculated as SD(k), as in Eq. (4):

SD(k) = \frac{1}{s} \sum_{r=1}^{s} SD_r ,  k = 1, 2, ..., n   (4)

When a new sample is presented to the KIII network with n channels, the activity measure over the whole OB layer in this training can be expressed by a vector:

Φ = [SD(1), SD(2), ..., SD(n)]   (5)

In the training phase, the modified Hebbian learning rule and the habituation learning rule [24] are employed each time to modify the lateral weights W_{mml} between all M1 nodes in the OB layer (short for W_{ij} and W_{ji}), as shown in Eq. (6). If the activities of two nodes, M1(i) and M1(j) for a pair of i and j, are both larger than the mean activity of the OB layer, they are considered to be co-activated by the external stimulus and their connection weights are strengthened by the modified Hebbian learning rule. Otherwise their connection weights decrease at the habituation rate h_{hab} and eventually diminish asymptotically toward zero after several learning cycles.

IF  SD(i) > (1 + K) SD_m  AND  SD(j) > (1 + K) SD_m
THEN  W'_{ij} = h_{Heb} ,  W'_{ji} = h_{Heb}   (6)
ELSE  W'_{ij} = h_{hab} W_{ij} ,  W'_{ji} = h_{hab} W_{ji}

where SD_m = \frac{1}{n} \sum_{k=1}^{n} SD(k), i, j, k = 1, 2, ..., n and i ≠ j. W' stands for the weight after learning, while W is the original weight; h_{Heb} and h_{hab} are the learning constants of the Hebbian reinforcement and of the habituation, respectively. The bias coefficient K is defined to avoid saturation of the weight space.

The learning process continues until the weight changes of W_{mml} converge to a desired level. At the end of learning, the cluster centroid C_i of every pattern is determined and the connection weights are fixed in order to perform classification using the trained network. When an unknown sample t from the test set is input, the Euclidean distances from the corresponding activity vector Φ_t to the training pattern cluster centroids C_i are estimated, and the minimum distance determines the classification.
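As an illustration of Eqs. (4) and (5), the following is a minimal sketch in Python/NumPy (rather than the MATLAB used in the study); the function names burst_sd and activity_vector and the default segment count are ours, not the authors'.

```python
import numpy as np

def burst_sd(burst, s):
    """Eq. (4): split one M1 burst into s (roughly) equal segments and
    average the per-segment standard deviations to obtain SD(k)."""
    segments = np.array_split(np.asarray(burst, dtype=float), s)
    return float(np.mean([seg.std() for seg in segments]))

def activity_vector(m1_bursts, s=10):
    """Eq. (5): activity measure over the whole OB layer for one sample,
    Phi = [SD(1), SD(2), ..., SD(n)] for an n-channel KIII network.

    m1_bursts: n time series, one per M1 node, taken over the stimulus
    period (e.g. steps 50-250 in Fig. 4); the segment count s is a free
    parameter not specified here."""
    return np.array([burst_sd(b, s) for b in m1_bursts])
```

With a 32-channel network, activity_vector returns a 32-dimensional Φ for each presented sample.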
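Likewise, the weight update of Eq. (6) and the centroid-based classification described above can be sketched as follows; the constants K, h_Heb and h_hab are illustrative placeholders (the paper does not give their values in this section) and the helper names are ours.

```python
import numpy as np

def hebbian_habituation_update(W, sd, K=0.4, h_heb=5.0, h_hab=0.9995):
    """Eq. (6): modified Hebbian / habituation update of the lateral
    M1-M1 weights W (n x n) given the per-channel activities sd = SD(k)."""
    sd_m = sd.mean()                          # mean OB-layer activity SD_m
    active = sd > (1.0 + K) * sd_m            # channels above the biased mean
    W_new = W.copy()
    n = len(sd)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if active[i] and active[j]:       # co-activated pair: reinforce
                W_new[i, j] = h_heb
            else:                             # otherwise: habituate toward zero
                W_new[i, j] = h_hab * W[i, j]
    return W_new

def class_centroids(train_phis, labels):
    """Cluster centroid C_i of the activity vectors of each learned pattern."""
    return {c: np.mean([p for p, l in zip(train_phis, labels) if l == c], axis=0)
            for c in sorted(set(labels))}

def classify(phi_t, centroids):
    """Assign an unknown activity vector Phi_t to the class whose centroid
    is nearest in Euclidean distance."""
    return min(centroids, key=lambda c: np.linalg.norm(phi_t - centroids[c]))
```

In use, hebbian_habituation_update would be applied once per training presentation until the weight changes converge, after which the centroids are frozen and classify is applied to the test samples.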
All calculations and data processing in this study were implemented in MATLAB (version 7.1, Mathworks, USA) on a
Fig. 6. Euclidean distance from all samples in Dataset I to the different cluster centroids of: (a) lactic acid, (b) ethanol, (c) acetic acid, (d) ethyl acetate, (e) isoamyl alcohol and (f) acetaldehyde.
Table 1
Correction rate of classification (%) of the KIII trained by the 30 mL/3000 mL, 50 mL/3000 mL and 70 mL/3000 mL samples, respectively (columns: 30 mL/3000 mL, 50 mL/3000 mL, 70 mL/3000 mL).
Table 2
Classification correction rates (%) of Datasets I, II and III using the KIII, NPA and BP-NN

                    Dataset I (May)          Dataset II (June)        Dataset III (August)(a)
                    KIII    NPA    BP-NN     KIII    NPA    BP-NN     KIII    NPA    BP-NN
Lactic acid         85.0    100    87.7      98.0    100    78.3      50.0    83.3   48.6
Ethanol             100     100    100       100     100    87.6      100     100    62.0
Acetic acid         100     100    86.3      100     100    74.5      86.7    100    52.6
Ethyl acetate       100     100    100       100     100    99.3      0       0      44.6
Isoamyl alcohol     100     100    97.3      100     100    86.5      86.7    93.3   52.0
Acetaldehyde        100     100    86.0      65.0    0      18.3      46.7    0      28.7
Average             97.5    100    92.9      93.9    83.3   74.1      61.6    62.7   48.1

(a) Only the samples with a concentration of 30 mL/3000 mL in Dataset III are used, to make the results comparable with Datasets I and II.
of six trials are shown in Table 2. For the KIII network, the average correction rates for the samples of Datasets I, II and III are 97.5%, 93.9% and 61.6%, respectively. It is clear that the classification accuracy declined only a little after 1 month, though it dropped dramatically after 3 months. It can be seen that the KIII network has the capability to counteract sensor drift over 1 month. It is not surprising that the KIII network misrecognized all ethyl acetate as acetaldehyde in Dataset III, since the samples of ethyl acetate obtained in August moved into the region of acetaldehyde, as shown in the PCA plots in Fig. 7.

A simple nonparametric algorithm (NPA) based on the Euclidean distance metric was used for comparison. In our experiments, the training set was adopted as the pattern templates and the Euclidean distances between testing samples and templates were calculated; each testing sample was assigned to the class of its nearest template. The same classification criteria were employed as in the KIII application (a code sketch of this baseline is given at the end of Section 4.4). Comparing the classification accuracy of the KIII and the NPA (as shown in Table 2), we can see that the KIII network has better generalization capability.

4.4. Performance comparison with BP-NN

Usually, PARC selections are application-oriented and empirical. Several criteria, including high classification accuracy, fast and simple training, low memory requirements, robustness to outliers and the ability to produce a measure of uncertainty, have been proposed in attempts to determine the optimal classifier [38], and several researchers have compared the different PARCs employed by electronic noses [38,39]. To compare the performances, a conventional ANN, the back-propagation trained neural network (BP-NN), as well as the KIII network, was applied to the classification. The BP-NN algorithm was taken from the neural network toolbox in MATLAB.

Being one of the most popular ANNs in electronic noses, BP-NN has become the de facto standard for pattern recognition of signals from a chemical sensor array. BP is a supervised learning algorithm based on the generalized delta rule, usually using gradient descent to minimize the total squared output error between the desired and the actual net outputs. The performance of BP-NN depends on several factors, e.g., the number of hidden layers, the learning rate, the momentum and the training data. More details can be found in Ref. [40].

The BP-NN used in this paper is composed of 32 input nodes, 10 hidden nodes and six output nodes representing the clusters. The tan-sigmoid transfer function is selected for both the hidden and the output layers. Gradient descent with a learning rate of 0.05 is chosen. To make the comparison fair, both the BP-NN and the KIII were trained on the same training set until both mean-squared errors reached the same order of magnitude. The neuron in the BP-NN with the highest score in the output layer indicates which class the input sample belongs to; this tolerant classification criterion is similar to that used with the KIII. Five different runs were conducted for each trial to reduce the effect of the random initial weights in the training phase. The classification results of the BP-NN are presented in Table 2 along with those of the KIII. The performance of the BP-NN is not as good as that of the KIII under similar conditions, although optimization by trial and error may improve it considerably.
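For reference, the nonparametric algorithm (NPA) used for comparison above amounts to a nearest-template (1-NN) classifier on the same feature vectors; a minimal sketch in Python/NumPy, with names of our own choosing, is:

```python
import numpy as np

def npa_classify(x, templates, template_labels):
    """Nearest-template (1-NN) classification with the Euclidean metric:
    the test vector takes the label of the closest training template."""
    dists = [np.linalg.norm(np.asarray(x) - np.asarray(t)) for t in templates]
    return template_labels[int(np.argmin(dists))]
```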
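The BP-NN configuration described above (a 32-10-6 network, tan-sigmoid in both layers, plain gradient descent on the squared error with a learning rate of 0.05) can also be sketched in a few lines. The version below is a NumPy re-implementation under those stated settings, not the MATLAB Neural Network Toolbox model actually used; the weight initialization, target coding and stopping rule are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_bpnn(n_in=32, n_hid=10, n_out=6):
    """32-10-6 feed-forward network with tan-sigmoid units in both layers."""
    return {"W1": rng.normal(0.0, 0.1, (n_in, n_hid)), "b1": np.zeros(n_hid),
            "W2": rng.normal(0.0, 0.1, (n_hid, n_out)), "b2": np.zeros(n_out)}

def forward(net, X):
    H = np.tanh(X @ net["W1"] + net["b1"])      # hidden layer activations
    Y = np.tanh(H @ net["W2"] + net["b2"])      # output layer activations
    return H, Y

def train_step(net, X, T, lr=0.05):
    """One gradient-descent step of the generalized delta rule on the
    squared error between targets T (one-of-six coded) and outputs."""
    H, Y = forward(net, X)
    err = Y - T
    dY = err * (1.0 - Y ** 2)                   # delta through output tanh
    dH = (dY @ net["W2"].T) * (1.0 - H ** 2)    # delta back-propagated to hidden
    net["W2"] -= lr * H.T @ dY
    net["b2"] -= lr * dY.sum(axis=0)
    net["W1"] -= lr * X.T @ dH
    net["b1"] -= lr * dH.sum(axis=0)
    return float((err ** 2).mean())             # monitor mean-squared error

def predict(net, X):
    """Winner-take-all read-out: the highest-scoring output neuron gives the class."""
    return forward(net, X)[1].argmax(axis=1)
```

Training would stop once the monitored mean-squared error reaches the same order of magnitude as that of the KIII, as described above, and each trial would be repeated over several random initializations.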
5. Conclusions

In this paper, a biologically inspired neural network, based on anatomical and electroencephalographic studies of biological olfactory systems, is applied to pattern recognition in electronic noses. Classifying six VOCs commonly present in the headspace of Chinese rice wine, its ability to eliminate the influence of concentration and to counteract sensor drift is examined and compared with a simple nonparametric algorithm and the well-known BP-NN.

The KIII neural network performs well in the classification of six VOCs of different concentrations, even for patterns obtained 1 month later than those used for training. Its flexibility and robust fault tolerance make it quite suitable for electronic nose applications subject to the problems of susceptibility to concentration influence and sensor drift. Compared with the BP-NN, the application of the KIII neural network is time-consuming and requires a lot of memory to solve the many ODEs that constitute the KIII; e.g., a 32-channel KIII network consists of over 200 ODEs. Although one classification required about 1 min in our experiments, this is fast enough to satisfy the application requirements. Efficient numerical computation methods as well as DSP and VLSI hardware specially designed for parallel implementation are under research for other real-time applications.

The purpose of this paper is not to prove that the KIII network is superior to other signal-processing techniques in the electronic nose community. Instead, we would simply like to introduce a new method for processing sensor array signals and to attract more researchers to this biological model of olfactory systems. Future work will address improving the performance of the KIII network in electronic nose applications, especially fully utilizing the spatio-temporal dynamic properties of the model for time-series signals from chemical sensor arrays. More biologically oriented learning and classification rules are still under investigation. We are confident that the study of electronic noses will help in understanding signal processing in biological olfactory systems, and vice versa.

Acknowledgements

This research is supported by the National Basic Research Program of China (973 Program, project No. 2004CB720302), the National Natural Science Foundation of China (No. 60421002) and the Y.C. Tang Disciplinary Development Fund. The authors thank Mr. Jun Zhou and Miss Lehan He for their experimental assistance and fruitful discussions.

References

[1] L. Buck, R. Axel, A novel multigene family may encode odorant receptors: a molecular basis for odor recognition, Cell 65 (1991) 175–187.
[2] J.W. Gardner, P.N. Bartlett, A brief history of electronic noses, Sensor Actuators B Chem. 18 (1994) 211–220.
[3] J.W. Gardner, P.N. Bartlett, Electronic Noses: Principles and Applications, Oxford University Press, New York, 1999.
[4] K.C. Persaud, P. Wareham, A.M. Pisanelli, E. Scorsone, Electronic nose—a new monitoring device for environmental applications, Sens. Mater. 17 (2005) 355–364.
[5] R.E. Baby, M. Cabezas, E.N. Walsöe de Reca, Electronic nose: a useful tool for monitoring environmental contamination, Sensor Actuators B Chem. 69 (2000) 214–218.
[6] M.P. Marti, R. Boqué, O. Busto, J. Guasch, Electronic noses in the quality control of alcoholic beverages, Trac-Trends Anal. Chem. 24 (2005) 57–66.
[7] S. Ampuero, J.O. Bosset, The electronic nose applied to dairy products: a review, Sensor Actuators B Chem. 94 (2003) 1–12.
[8] E. Schaller, J.O. Bosset, F. Escher, Electronic noses and their application to food, LWT-Food Sci. Technol. 31 (1998) 305–316.
[9] J.W. Gardner, H.W. Shin, E.L. Hines, An electronic nose system to diagnose illness, Sensor Actuators B Chem. 70 (2000) 19–24.
[10] J. Yinon, Detection of explosives by electronic noses, Anal. Chem. 75 (2003) 98A–105A.
[11] K.J. Albert, N.S. Lewis, C.L. Schauer, G.A. Sotzing, S.E. Stitzel, T.P. Vaid, D.R. Walt, Cross-reactive chemical sensor arrays, Chem. Rev. 100 (2000) 2595–2626.
[12] D. James, S.M. Scott, Z. Ali, W.T. O'Hare, Chemical sensors for electronic nose systems, Microchim. Acta 149 (2005) 1–17.
[13] E.L. Hines, E. Llobet, J.W. Gardner, Electronic noses: a review of signal processing techniques, IEE Proc. Circuit Device Syst. 146 (1999) 297–310.
[14] R. Gutierrez-Osuna, Pattern analysis for machine olfaction: a review, IEEE Sens. J. 2 (2002) 189–202.
[15] A.K. Srivastava, Detection of volatile organic compounds (VOCs) using SnO2 gas-sensor array and artificial neural network, Sensor Actuators B Chem. 96 (2003) 24–37.
[16] M. Sriyudthsak, A. Teeramongkolrasasmee, T. Moriizumi, Radial basis neural networks for identification of volatile organic compounds, Sensor Actuators B Chem. 65 (2000) 358–360.
[17] M. Garcia, M. Aleixandre, J. Gutierrez, M.C. Horrillo, Electronic nose for wine discrimination, Sensor Actuators B Chem. 113 (2006) 911–916.
[18] C. Di Natale, A. Macagnano, A. D'Amico, F. Davide, Electronic-nose modeling and data analysis using a self-organizing map, Meas. Sci. Technol. 8 (1997) 1236–1243.
[19] Y. Yao, W.J. Freeman, Model of biological pattern recognition with spatially chaotic dynamics, Neural Netw. 3 (1990) 153–170.
[20] W.J. Freeman, Neurodynamics: An Exploration of Mesoscopic Brain Dynamics, Springer-Verlag, London, UK, 2000.
[21] H.J. Chang, W.J. Freeman, Parameter optimization in models of the olfactory neural system, Neural Netw. 9 (1996) 1–14.
[22] H.J. Chang, W.J. Freeman, B.C. Burke, Biologically modeled noise stabilizing neurodynamics for pattern recognition, Int. J. Bifurc. Chaos 8 (1998) 321–345.
[23] H.J. Chang, W.J. Freeman, B.C. Burke, Optimization of olfactory model in software to give 1/f power spectra reveals numerical instabilities in solutions governed by aperiodic (chaotic) attractors, Neural Netw. 11 (1998) 449–466.
[24] R. Kozma, W.J. Freeman, Chaotic resonance—methods and applications for robust classification of noise and variable patterns, Int. J. Bifurc. Chaos 11 (2001) 1607–1629.
[25] R. Kozma, W.J. Freeman, Classification of EEG patterns using nonlinear dynamics and identifying chaotic phase transitions, Neurocomputing 44–46 (2002) 1107–1112.
[26] X. Li, G. Li, L. Wang, W.J. Freeman, A study on a bionic pattern classifier based on olfactory neural system, Int. J. Bifurc. Chaos 16 (2006) 2425–2434.
[27] R. Gutierrez-Osuna, A. Gutierrez-Galvez, Habituation in the KIII olfactory model with chemical sensor arrays, IEEE Trans. Neural Netw. 14 (2003) 1565–1568.
[28] A. Gutierrez-Galvez, R. Gutierrez-Osuna, Increasing the separability of chemosensor array patterns with Hebbian/anti-Hebbian learning, Sensor Actuators B Chem. 116 (2006) 29–35.
[29] Z.D. Bao, R.N. Xu, Analysis of the flavor components in Yellow rice wine, Liquor Mak. 5 (1999) 65–67 (in Chinese).
[30] E. Llobet, J. Brezmes, X. Vilanova, J.E. Sueiras, X. Correig, Qualitative and quantitative analysis of volatile organic compounds using transient and steady-state responses of a thick-film tin oxide gas sensor array, Sensor Actuators B Chem. 41 (1997) 13–21.
[31] W.J. Freeman, Mass Action in the Nervous System, Academic Press, New York, 1975.
[32] W.J. Freeman, Simulation of chaotic EEG patterns with a dynamic model of the olfactory system, Biol. Cybern. 56 (1987) 139–150.
[33] W.J. Freeman, Nonlinear gain mediating cortical stimulus-response relations, Biol. Cybern. 33 (1979) 237–247.
[34] S. Quarder, U. Claussnitzer, M. Otto, Using singular-value decompositions to classify spatial patterns generated by a nonlinear dynamic model of the olfactory system, Chemometr. Intell. Lab. Syst. 59 (2001) 45–51.
[35] K. Shimoide, W.J. Freeman, Dynamic neural network derived from the olfactory system with examples of applications, IEICE Trans. Fundam. Electron. Commun. Comput. Sci. E78-A (1995) 869–884.
[36] M. Holmberg, F. Winquist, I. Lundström, F. Davide, C. Di Natale, A. D'Amico, Drift counteraction for an electronic nose, Sensor Actuators B Chem. 36 (1996) 528–535.
[37] T. Artursson, T. Eklöv, I. Lundström, P. Mårtensson, M. Sjöström, M. Holmberg, Drift correction for gas sensors using multivariate methods, J. Chemometr. 14 (2000) 711–723.
[38] R.E. Shaffer, S.L. Rose-Pehrsson, R.A. McGill, A comparison study of chemical sensor array pattern recognition algorithms, Anal. Chim. Acta 384 (1999) 305–317.
[39] M. Bicego, G. Tessari, G. Tecchiolli, M. Bettinelli, A comparative analysis of basic pattern recognition techniques for the development of small size electronic nose, Sensor Actuators B Chem. 85 (2002) 137–144.
[40] S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed., Prentice-Hall, Englewood Cliffs, NJ, 1999.