
Constructing Classification Rules Based on SVR and Its Derivative Characteristics, 1st Edition, by Dexian Zhang, Zhixiao Yang, Yanfeng Fan, Ziqiang Wang (ISBN 9783540738701)

The document discusses a new approach for constructing classification rules based on Support Vector Regression (SVR) and its derivative characteristics, aimed at improving data mining tasks. It introduces a measure for determining the importance of attributes using trained SVR, which enhances the validity of extracted classification rules, especially for complex problems. The paper outlines the methodology, representation of classification rules, and presents experimental results demonstrating the effectiveness of the proposed approach.



Constructing Classification Rules Based on SVR
and Its Derivative Characteristics

Dexian Zhang¹, Zhixiao Yang¹, Yanfeng Fan², and Ziqiang Wang¹

¹ College of Information Science and Engineering, Henan University of Technology,
Zhengzhou 450052, China
[email protected]
² Computer College, Northwestern Polytechnical University, Xi'an 710072, China

Abstract. Support vector regression (SVR) is a new technique for pattern classification, function approximation, and so on. In this paper we propose a new approach for constructing classification rules based on support vector regression and its derivative characteristics, for the classification task of data mining. A new measure for determining the importance level of attributes based on the trained SVR is proposed. Based on this new measure, a new approach for classification rule construction using the trained SVR is presented. The performance of the new approach is demonstrated by several computing cases. The experimental results prove that the proposed approach can remarkably improve the validity of the extracted classification rules compared with other rule constructing approaches, especially for complicated classification problems.

1 Introduction

The goal of data mining is to extract knowledge from data. Data mining is an interdisciplinary field whose core is at the intersection of machine learning, statistics, and databases. There are several data mining tasks, including classification, regression, clustering, dependence modeling, etc. Each of these tasks can be regarded as a kind of problem to be solved by a data mining approach. In the classification task, the goal is to assign each case (object, record, or instance) to one class, out of a set of predefined classes, based on the values of some attributes (called predictor attributes) of the case. In this paper we propose a new approach for constructing classification rules based on support vector regression and its derivative characteristics, for the classification task of data mining. Classification rule extraction has become an important aspect of data mining, since human experts and corporate managers can make better use of classification rules for decision making, and can discover unknown relationships and patterns from a large data set more easily than with other forms of knowledge representation.
The existing approaches for constructing classification rules can be roughly classified into two categories: data-driven approaches and model-driven approaches. The main characteristic of the data-driven approaches is to extract symbolic rules entirely from processing of the sample data.

R. Alhajj et al. (Eds.): ADMA 2007, LNAI 4632, pp. 300–309, 2007.

© Springer-Verlag Berlin Heidelberg 2007

The most representative approach is the ID3 algorithm and the corresponding C4.5 system proposed by J. R. Quinlan. This approach has a clear and simple theory and a good ability for rule extraction, which is appropriate for problems with a large number of samples. But it still has many problems, such as too much dependence on the number and distribution of samples, excessive sensitivity to noise, and difficulty in dealing with continuous attributes effectively. The main characteristic of the model-driven approaches is to first establish a model from the sample set, and then extract rules based on the relation between inputs and outputs represented by the model. Theoretically, these rule extraction approaches can overcome the shortcomings of the data-driven approaches mentioned above. Therefore, the model-driven approaches are the promising ones for rule extraction. The representative approaches are rule extraction approaches based on neural networks [1-8]. Though these methods have a certain effectiveness for rule extraction, there still exist some problems, such as low efficiency and validity, and difficulty in dealing with continuous attributes.
There are two key problems required to be solved in classification rule extraction, i.e. attribute selection and the discretization of continuous attributes. Attribute selection is to select the best subset of attributes out of the original set. The attributes that are important to maintain the concepts in the original data are selected from the entire attribute set. How to determine the importance level of attributes is the key to attribute selection. Mutual-information-based attribute selection [9-10] is a common method of attribute selection, in which the information content of each attribute is evaluated with regard to class labels and other attributes. By calculating mutual information, the importance levels of attributes are ranked based on their ability to maximize the evaluation formula. Another attribute selection method uses an entropy measure to evaluate the relative importance of attributes [11]. The entropy measure is based on the similarities of different instances without considering the class labels. In paper [12], the separability-correlation measure is proposed for determining the importance of the original attributes. The measure includes two parts, the intra-class distance to inter-class distance ratio and an attributes-class correlation measure. Through the attributes-class correlation measure, the correlation between the changes in attributes and the corresponding changes in class labels is taken into account when ranking the importance of attributes. The attribute selection methods mentioned above can be classified as sample-driven methods. Their performance depends heavily on the number and distribution of samples. It is also difficult to use them to deal with continuous attributes. Therefore, it is still required to find more effective heuristic information for attribute selection and the discretization of continuous attributes in classification rule extraction.
In this paper, we use a trained SVR to obtain the position and shape characteristics of the classification hypersurface. Based on the analysis of the relations among the position and shape characteristics of the classification hypersurface, the partial derivative distribution of the outputs of the trained SVR with respect to its inputs, and the importance level of attributes for classification, this paper studies a method for measuring the classification power of attributes on the basis of the differential information of the trained SVR, and develops a new approach for rule extraction.
The rest of our paper is organized as follows. Section 2 discusses the representation of classification rules. Section 3 describes the classifier construction based on SVR. Section 4 presents the measure method for attribute importance ranking. The rule extraction method is presented in Section 5. Experimental results and analysis are reported in Section 6. Finally, we give the conclusion in Section 7.

2 Representation of the Classification Rules

Classification rules should be not only accurate but also comprehensible for the user. Comprehensibility is important whenever classification rules will be used for supporting a decision made by a human user. After all, if the classification rules are not comprehensible for the user, the user will not be able to interpret and validate them. In this case, the user will probably not trust the classification rules enough to use them for decision making. This can lead to wrong decisions.
In this paper, the classification rule is expressed in the commonly used form of IF-THEN rules, as follows: IF <conditions> THEN <class>. The rule antecedent (IF part) contains a set of conditions connected by a logical conjunction operator (AND). In this paper we will refer to each rule condition as a term, so that the rule antecedent is a logical conjunction of terms of the form: IF term 1 AND term 2 AND ... Each term has one of two kinds of forms. One form is a triple <attribute, operator, value>, where the operator can be <, ≥, or =. The other form is a triple <attribute, ∈, value range>. The rule consequent (THEN part) specifies the class label predicted for cases whose attributes satisfy all the terms specified in the rule antecedent.
This kind of classification rule representation has the advantage of being intuitively comprehensible for the user, as long as the number of discovered rules and the number of terms in rule antecedents are not large.
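As an illustration of this representation (a minimal sketch; the class and method names are our own, not from the paper), the two kinds of terms and the conjunction of terms in a rule antecedent can be modeled directly:

```python
from dataclasses import dataclass
from typing import List, Tuple, Union

@dataclass
class Term:
    """One rule condition: either <attribute, operator, value> with
    operator in {'<', '>=', '='}, or <attribute, 'in', (low, high)>."""
    attribute: str
    operator: str                      # '<', '>=', '=', or 'in'
    value: Union[float, Tuple[float, float]]

    def matches(self, case: dict) -> bool:
        v = case[self.attribute]
        if self.operator == '<':
            return v < self.value
        if self.operator == '>=':
            return v >= self.value
        if self.operator == '=':
            return v == self.value
        low, high = self.value         # 'in': value range [low, high)
        return low <= v < high

@dataclass
class Rule:
    """IF term_1 AND term_2 AND ... THEN class_label."""
    terms: List[Term]
    class_label: str

    def covers(self, case: dict) -> bool:
        # The antecedent is a logical conjunction of terms.
        return all(t.matches(case) for t in self.terms)
```

For example, a rule such as "IF petalwidth < 0.72 THEN Iris-setosa" becomes `Rule([Term('petalwidth', '<', 0.72)], 'Iris-setosa')`.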

3 Constructing the Classifier Based on SVR

SVR is a new technique for pattern classification, function approximation, and so on. SVR classifiers have the advantage that we can use them for classification problems with more than two class labels. In this paper, we use an SVR classifier to determine the importance level of attributes and construct classification rules.
Given a set of training sample points {(x_i, z_i), i = 1, ..., l}, such that x_i ∈ R^n is an input and z_i ∈ R^1 is a target output, the primal form of support vector regression is

$$\min_{w,b,\xi,\xi^*,\varepsilon}\ \frac{1}{2}w^T w + C\Big(\nu\varepsilon + \frac{1}{l}\sum_{i=1}^{l}(\xi_i + \xi_i^*)\Big) \tag{1}$$
subject to

$$(w^T\phi(x_i) + b) - z_i \le \varepsilon + \xi_i \tag{2}$$

$$z_i - (w^T\phi(x_i) + b) \le \varepsilon + \xi_i^* \tag{3}$$

$$\xi_i, \xi_i^* \ge 0,\quad i = 1,\ldots,l,\quad \varepsilon > 0 \tag{4}$$
Here the sample vectors x are mapped into a higher-dimensional space by the function φ. C > 0 is the penalty parameter of the error term; in this paper we usually let C = 10 ~ 10^5. ν and ε are two parameters; the parameter ν, ν ∈ (0, 1], controls the number of support vectors. In this paper we usually set ν = 0.5 and ε = 0.1. Furthermore, K(x_i, x_j) ≡ φ(x_i)^T φ(x_j) is the kernel function. In this paper, we use the following radial basis function (RBF) as the kernel function.

$$K(x_i, x_j) = \exp\!\left(-\gamma \|x_i - x_j\|^2\right),\quad \gamma > 0 \tag{5}$$

Here γ is the kernel parameter; in this paper we usually let γ = (0.1 ~ 1)/n, where n is the number of attributes in the training sample set. The dual of formula (1) is

$$\min_{\alpha,\alpha^*}\ \frac{1}{2}(\alpha - \alpha^*)^T Q (\alpha - \alpha^*) - z^T(\alpha - \alpha^*) \tag{6}$$

subject to

$$e^T(\alpha - \alpha^*) = 0,\qquad e^T(\alpha + \alpha^*) \le C\nu \tag{7}$$

$$0 \le \alpha_i, \alpha_i^* \le C,\quad i = 1,\ldots,l \tag{8}$$

where e is the vector of all ones and Q is an l × l positive semidefinite matrix with Q_{ij} = K(x_i, x_j). The output function of the SVR classifier is

$$Z(x) = \sum_{i=1}^{l}(-\alpha_i + \alpha_i^*)\,K(x_i, x) + b \tag{9}$$
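As a hedged sketch, formulas (1)-(9) correspond to the ν-SVR implemented by scikit-learn's `NuSVR` (an assumption about tooling; the paper does not name a solver). The toy data, the parameter values, and the helper `Z`, which re-evaluates formula (9) from the stored support vectors and dual coefficients, are illustrative:

```python
import numpy as np
from sklearn.svm import NuSVR

# Toy training set: 40 points, 2 attributes, quantified labels z in {0, 1}.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 2))
z = (X[:, 0] + X[:, 1] > 0).astype(float)

n = X.shape[1]
gamma = 0.5 / n                       # gamma = (0.1~1)/n, as suggested above
svr = NuSVR(kernel='rbf', C=100.0, nu=0.5, gamma=gamma)  # C in 10~1e5, nu = 0.5
svr.fit(X, z)

def Z(x):
    """Formula (9): sum_i coef_i * K(x_i, x) + b, where coef_i is the
    combined dual coefficient (-alpha_i + alpha_i^*) stored by the solver."""
    sv = svr.support_vectors_
    coef = svr.dual_coef_.ravel()
    K = np.exp(-gamma * np.sum((sv - x) ** 2, axis=1))   # RBF kernel, formula (5)
    return float(coef @ K + svr.intercept_[0])
```

Re-evaluating `Z` this way agrees with `svr.predict`, confirming that the expansion of formula (9) is exactly what the fitted model computes.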

Assuming the attribute vector of the classification problem is x = [X_1, X_2, ..., X_n], where n is the number of attributes in the sample set, and the corresponding classification label is z with z ∈ R^1, then a sample of the classification problem can be represented as <X, z>. In order to construct SVR classifiers of the form shown by formula (9), quantification and normalization of the attribute values and classification labels are carried out as follows.
The quantification is performed on the values of discrete attributes and classification labels. In this paper, values of discrete attributes and classification labels are quantified as integer numbers in some order, for example 0, 1, 2, 3, ...
The normalization is performed to adjust the SVR input ranges. For a given attribute value space Ω, the following linear transformation maps the attribute value X of a sample to the SVR input x, making every element of x lie in the same range [−Δ, Δ]:

$$x = bX + b_0 \tag{10}$$

where b = (b_{ij}) is a transformation coefficient matrix with

$$b_{ij} = \begin{cases} \dfrac{2\Delta}{\mathrm{Max}X_i - \mathrm{Min}X_i}, & j = i \\[1ex] 0, & \text{otherwise} \end{cases} \tag{11}$$

and b_0 = (b_{0i}) is a transformation vector with

$$b_{0i} = \Delta - b_{ii}\,\mathrm{Max}X_i \tag{12}$$

The parameter Δ affects the generalization of the trained SVR; in this paper we usually set Δ = 0.5 ~ 2.
During the construction of classification rules, only the attribute space covered by the sample set should be taken into account. Obviously, according to formula (9), when the kernel function of the SVR is the radial basis function shown by formula (5), derivatives of any order of the output Z(x) with respect to each SVR input x_k exist.
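The normalization of formulas (10)-(12) can be sketched as follows (a minimal illustration; the function names are ours). At MaxX_i the transform yields Δ and at MinX_i it yields −Δ, so each attribute spans exactly [−Δ, Δ] over the training set:

```python
import numpy as np

def fit_scaler(X, delta=1.0):
    """Formulas (11)-(12): diagonal coefficients b_ii and offsets b_0i
    that map each attribute linearly onto [-delta, delta]."""
    xmax = X.max(axis=0)
    xmin = X.min(axis=0)
    bii = 2.0 * delta / (xmax - xmin)   # formula (11), diagonal entries
    b0 = delta - bii * xmax             # formula (12)
    return bii, b0

def scale(X, bii, b0):
    """Formula (10): x = bX + b_0 (b is diagonal, so elementwise)."""
    return X * bii + b0

# Example: two attributes with the ranges used in Section 4's golf problem.
X = np.array([[65.0, 0.0], [96.0, 2.0], [80.0, 1.0]])
bii, b0 = fit_scaler(X, delta=0.5)      # the paper suggests delta in 0.5~2
Xs = scale(X, bii, b0)
# Each column of Xs now spans exactly [-0.5, 0.5].
```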

4 Measure for Attribute Importance Ranking


Without loss of generality, we next discuss classification problems with two attributes and two class labels.
For a 2-dimensional classification problem, assume the shape of the classification hypersurface in the given area Ω is as shown in Fig. 1, in which the perpendicular axis Z(x) is the class label, and areas A and B are the distribution areas of the different classes. In cases (a) and (b), the importance level of x1 for classification is obviously higher, so the area should be divided via attribute x1. In case (c), attributes x1 and x2 have equal classification power. Therefore, for a given attribute value space Ω, the importance level of each attribute depends on the mean perpendicular degree between that attribute's axis and the classification hypersurface in the space Ω or its adjacent space. The higher the mean perpendicular degree, the higher the importance level.
For a given sample set, the attribute value space Ω is defined as follows.

$$\Omega = \{x \mid \mathrm{Min}x_k \le x_k \le \mathrm{Max}x_k,\ k = 1,\ldots,n\} \tag{13}$$

where Minx_k and Maxx_k are the minimal and maximal values of the k-th attribute in the given sample set, respectively.
For a given trained SVR and an attribute value space Γ, Γ ⊂ Ω, the perpendicular level between the classification hypersurface and the attribute axis x_k is defined as follows.

$$P_{x_k}(x) = \frac{\left|\frac{\partial Z(x)}{\partial x_k}\right|}{\sqrt{\sum_{k}\left[\left(\frac{\partial Z(x)}{\partial x_k}\right)^2 + 1\right]}} \tag{14}$$

According to formula (14), the value of the perpendicular level P_{x_k}(x) mainly depends on the value of ∂Z(x)/∂x_k. Therefore, for convenience of computation, we can use the following formula in place of formula (14).

$$P_k(x) = \left|\frac{\partial Z(x)}{\partial x_k}\right| \tag{15}$$

[Figure residue omitted: three panels (a), (b), (c), each plotting Z(x) over attributes x1 and x2 with class regions A and B.]

Fig. 1. Typical shapes of classification hypersurface

From formulas (9) and (5), we can get

$$\frac{\partial Z(x)}{\partial x_k} = \sum_{i=1}^{l} 2(-\alpha_i + \alpha_i^*)\,\gamma\,(x_{ik} - x_k)\,K(x_i, x) \tag{16}$$

Here x_{ik} and x_k are the k-th attribute values of the i-th support vector and the sample point x, respectively.
For a given attribute value space Γ, Γ ⊂ Ω, the measure of the classification power of attribute x_k is defined as follows.

$$J_P(x_k) = \begin{cases} \dfrac{\int_{V_\Gamma} P_k(x)\,dx}{\int_{V_\Gamma} dx}, & \int_{V_\Gamma} dx \ne 0 \\[2ex] 0, & \text{otherwise} \end{cases} \tag{17}$$

The importance level measure J_P(x_k) of attribute x_k represents the degree of influence of attribute x_k on classification. So in the process of rule extraction, the value J_P(x_k) is important guiding information for selecting attributes and dividing the attribute value space.
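As a sketch (assuming a scikit-learn `NuSVR` fitted as in Section 3; the Monte Carlo estimator and all names here are our own, not the paper's numerical scheme), formula (16) gives the gradient of Z(x) in closed form, and J_P of formula (17) can be approximated by averaging P_k(x) = |∂Z/∂x_k| over points drawn uniformly from a box-shaped Γ:

```python
import numpy as np
from sklearn.svm import NuSVR

def dZ_dx(svr, x, gamma):
    """Formula (16): partial derivatives of the SVR output Z(x) with
    respect to each input, for the RBF kernel of formula (5)."""
    sv = svr.support_vectors_                    # the x_i
    coef = svr.dual_coef_.ravel()                # combined (-alpha_i + alpha_i*)
    K = np.exp(-gamma * np.sum((sv - x) ** 2, axis=1))
    # d/dx_k exp(-gamma ||x_i - x||^2) = 2 * gamma * (x_ik - x_k) * K(x_i, x)
    return (coef * K) @ (2.0 * gamma * (sv - x))

def importance_JP(svr, gamma, low, high, n_samples=2000, seed=0):
    """Monte Carlo estimate of formula (17): mean of P_k(x) = |dZ/dx_k|
    over a box-shaped attribute value space Gamma = [low, high]."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(low, high, size=(n_samples, len(low)))
    grads = np.array([dZ_dx(svr, p, gamma) for p in pts])
    return np.abs(grads).mean(axis=0)            # one J_P value per attribute

# Toy problem: only attribute 0 determines the class, so J_P(x_0) should dominate.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(60, 2))
z = (X[:, 0] > 0).astype(float)
gamma = 0.5 / X.shape[1]
svr = NuSVR(kernel='rbf', C=100.0, nu=0.5, gamma=gamma).fit(X, z)
jp = importance_JP(svr, gamma, low=X.min(axis=0), high=X.max(axis=0))
```

The closed-form gradient can be cross-checked against a numerical derivative of the fitted model, which is a useful sanity test for the sign conventions of the stored dual coefficients.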
The typical classification problem of weather type for playing golf is employed to demonstrate the performance of the new measure method. The computing results are shown in Table 1. The attributes and their values are as follows: Outlook has the values sunny, overcast, and rain, quantified as 0, 1, 2. Temperature has values in 65 ~ 96. Humidity has values in 65 ~ 96. Windy has the values true and false, quantified as 0, 1. The size of the training sample set is 14.

Table 1. Computing Results of Measurement Value J_P(x_k)

Attributes    Whole Area  Outlook=sunny  Outlook=rain
Outlook       0.0547      –              –
Temperature   0.0292      0.0551         0.0334
Humidity      0.0353      0.0861         0.0552
Windy         0.0362      0.0457         0.0929

From Table 1, in the whole space the measure value of the importance level of attribute Outlook is the biggest; therefore attribute Outlook should be selected as the root node of the decision tree, and the attribute value space should be divided by its values. In the subspace Outlook = rain, the measure value of attribute Windy is the biggest. Thus, according to this information, the optimal decision tree and corresponding classification rules can be generated.

5 Rules Extraction Method


The algorithm for classification rule construction from trained SVR proposed in
this paper is described as follows.
Step 1: Initializing.
a) Divide the given sample set into two parts, the training sample set and the
test set. According to the training sample set, generate the attribute value space
Ω by formula (13).
b) Set the interval number of attributes and the predefined value of error rate.
Step 2: Rule generating.
a) Generate a queue R for finished rules and a queue U for unfinished rules.
b) Select the attribute xk with the biggest value of JP(xk) computed by formula (17) as the extending attribute out of the present attributes. Divide attribute xk into intervals according to the chosen interval number. Then for each interval, pick the attribute xj with the biggest JP(xj) as the extending attribute for that interval. Merge pairs of adjacent intervals that have the same extending attribute and the same majority class label (the class label with the largest proportion among all class labels). A rule is generated for each merged interval. If the class error of the generated rule is less than the predefined value, put it into queue R; otherwise put it into queue U.
c) If U is empty, the extraction process terminates; otherwise go to d).
d) Pick an unfinished rule from queue U in a certain order, and perform division and merging. A rule is generated for each merged interval. If the class error of the generated rule is less than the predefined value, put it into queue R; otherwise put it into queue U. Go to c).

Step 3: Rule processing.
Check the number of rules for each class label. Let the rules of the class label with the largest number of rules be default rules.
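The queue-based loop of Step 2 can be sketched as follows. This is a greatly simplified illustration, not the paper's algorithm in full: interval merging and Step 3's default-rule selection are omitted, a depth guard is added (our assumption, to guarantee termination), and `jp_of` is an assumed callback that returns one J_P estimate per attribute for a region, e.g. a Monte Carlo estimate of formula (17):

```python
from collections import Counter, deque
import numpy as np

def extract_rules(X, y, jp_of, n_intervals=3, max_err=0.1, max_depth=4):
    """Simplified Step 2: repeatedly split the region with the largest
    class error on the attribute with the biggest J_P, moving a region
    to the finished queue R once its error rate is below max_err."""
    n = X.shape[1]
    root = [(X[:, k].min(), X[:, k].max()) for k in range(n)]
    R = []                                   # finished rules
    U = deque([(root, 0)])                   # unfinished rules (queue U)
    while U:
        bounds, depth = U.popleft()
        inside = np.all([(X[:, k] >= lo) & (X[:, k] <= hi)
                         for k, (lo, hi) in enumerate(bounds)], axis=0)
        if not inside.any():
            continue
        label, hits = Counter(y[inside]).most_common(1)[0]
        err = 1.0 - hits / inside.sum()
        if err <= max_err or depth >= max_depth:
            R.append((bounds, label, err))   # rule: attribute ranges -> label
            continue
        k = int(np.argmax(jp_of(bounds)))    # extend on the most important attribute
        lo, hi = bounds[k]
        edges = np.linspace(lo, hi, n_intervals + 1)
        for a, b in zip(edges[:-1], edges[1:]):
            child = list(bounds)
            child[k] = (a, b)
            U.append((child, depth + 1))
    return R

def predict(rules, x):
    """Return the label of the first finished rule whose region contains x."""
    for bounds, label, _ in rules:
        if all(lo <= x[k] <= hi for k, (lo, hi) in enumerate(bounds)):
            return label
    return None
```

On a toy problem whose class depends only on the first attribute, passing a `jp_of` that ranks that attribute highest yields interval rules that separate the two classes.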

6 Experiment and Analysis


The spiral problem [13] and the congressional voting records (voting for short), hepatitis, iris plant (iris for short), and Statlog Australian credit approval (credit-a for short) data sets from the UCI repository [14] are employed as computing cases, shown in Table 2. The attribute value distribution of the spiral problem is shown in Fig. 2, in which solid points belong to class C0 and empty points to class C1.

Table 2. Computing Cases

                        Spiral  Voting  Hepatitis  Iris  Credit-A
Total Samples           168     232     80         150   690
Training Samples        84      78      53         50    173
Testing Samples         84      154     27         100   517
Classification Numbers  2       2       2          3     2
Total Attributes        2       16      19         4     15
Discrete Attributes     0       16      13         0     9
Continuous Attributes   2       0       6          4     6

Table 3. Experimental Results Comparison between New Approach (NA) and C4.5R

            #Rules(NA: C4.5R)  Err.Train(NA: C4.5R)  Err.Test(NA: C4.5R)
Spiral      8: 3               0%: 38.1%             1.1%: 40.5%
Voting      3: 4               2.5%: 2.6%            2.5%: 3.2%
Hepatitis   3: 5               7.5%: 3.8%            11.1%: 29.6%
Iris        4: 4               0%: 0%                4%: 10%
Credit-A    5: 3               12.1%: 13.9%          14.5%: 14.9%

Since no other approaches for extracting rules from SVR are available, we include a popular rule learning approach, C4.5R, for comparison. The experimental results are tabulated in Table 3. For the spiral problem and the iris plant problem, the rule sets extracted by the new approach are shown in Table 4 and Table 5, respectively. Table 3 shows that the rule extraction results of the new approach are obviously better than those of C4.5R, especially for the spiral problem. For the spiral problem, it is difficult for C4.5R to extract effective rules, but the new approach has impressive results that are beyond our anticipation. This means that the proposed approach can remarkably improve the validity of the extracted rules for complicated classification problems. Moreover, the generalization ability of the rules extracted by the new approach is also better than that of the rules extracted by C4.5R.

Table 4. Rules Set of Spiral Problem Generated by the Algorithm Proposed

R1  x0 ≥ 2.22 → C1
R2  x0 ∈ [−2.22, −1.33) ∧ x1 < 1.68 → C1
R3  x0 ∈ [−1.33, 0) ∧ x1 < −1.75 → C1
R4  x0 ∈ [−1.33, 0) ∧ x1 ∈ [0.76, 1.2) → C1
R5  x0 ∈ [0, 1.33) ∧ x1 < −2.2 → C1
R6  x0 ∈ [0, 1.33) ∧ x1 ∈ [−0.76, 1.75] → C1
R7  x0 ∈ [1.33, 2.22) ∧ x1 < −1.68 → C1
R8  Default → C0

Table 5. Rules Set for the Iris Plant Problem Generated by the Algorithm Proposed

R1  petalwidth < 0.72 → Iris-setosa
R2  petalwidth ≥ 1.66 → Iris-virginica
R3  petalwidth ∈ [1.34, 1.66] ∧ petallength ≥ 4.91 → Iris-virginica
R4  Default → Iris-versicolor

7 Conclusions

In this paper, based on the analysis of the relations among the position and shape characteristics of the classification hypersurface, the partial derivative distribution of the trained SVR output with respect to the corresponding inputs, and the importance level of attributes, a new measure for determining the importance level of attributes based on the differential information of the trained SVR is proposed. It is suitable for both continuous and discrete attributes, and can overcome the shortcomings of measure methods based on information entropy. On the basis of this new measure, a new approach for rule extraction from trained SVR is presented, which is also suitable for classification problems with continuous attributes. The performance of the new approach is demonstrated by several typical examples; the computing results prove that the new approach can remarkably improve the validity of the extracted rules compared with other rule extraction approaches, especially for complicated classification problems.

References
1. Fu, L.M.: Rule generation from neural networks. IEEE Trans. on Systems, Man and Cybernetics 8, 1114–1124 (1994)
2. Towell, G., Shavlik, J.W.: The extraction of refined rules from knowledge-based neural networks. Machine Learning 1, 71–101 (1993)
3. Lu, H.J., Setiono, R., Liu, H.: NeuroRule: a connectionist approach to data mining. In: Proceedings of the 21st International Conference on Very Large Data Bases, Zurich, Switzerland, pp. 81–106 (1995)
4. Zhou, Z.H., Jiang, Y., Chen, S.F.: Extracting symbolic rules from trained neural network ensembles. AI Communications 6, 3–15 (2003)
5. Sestito, S., Dillon, T.: Knowledge acquisition of conjunctive rules using multilayered neural networks. International Journal of Intelligent Systems 7, 779–805 (1993)
6. Craven, M.W., Shavlik, J.W.: Using sampling and queries to extract rules from trained neural networks. In: Proceedings of the 11th International Conference on Machine Learning, New Brunswick, NJ, pp. 37–45 (1994)
7. Maire, F.: Rule-extraction by backpropagation of polyhedra. Neural Networks 12, 717–725 (1999)
8. Setiono, R., Leow, W.K.: On mapping decision trees and neural networks. Knowledge-Based Systems 12, 95–99 (1999)
9. Battiti, R.: Using mutual information for selecting features in supervised neural net learning. IEEE Trans. on Neural Networks 5, 537–550 (1994)
10. Bollacker, K.D., Ghosh, J.: Mutual information feature extractors for neural classifiers. In: Proceedings of the 1996 IEEE International Conference on Neural Networks, Washington, pp. 1528–1533 (1996)
11. Dash, M., Liu, H., Yao, J.: Dimensionality reduction of unsupervised data. In: Proceedings of the 9th International Conference on Tools with Artificial Intelligence, Newport Beach, pp. 532–539 (1997)
12. Fu, X.J., Wang, L.P.: Data dimensionality reduction with application to simplifying RBF network structure and improving classification performance. IEEE Transactions on Systems, Man and Cybernetics, Part B: Cybernetics 33, 399–409 (2003)
13. Kamarthi, S.V., Pittner, S.: Accelerating neural network training using weight extrapolation. Neural Networks 12, 1285–1299 (1999)
14. Blake, C., Keogh, E., Merz, C.J.: UCI repository of machine learning databases, Department of Information and Computer Science, University of California, Irvine, CA (1998), http://www.ics.uci.edu/~mlearn/MLRepository.html
Dühring, Eugen 416 ff., 463, 481.
Duns Scotus 298.

E
Eckehart, Meister 303.
Edda 93 f.
Egoismus 435.
Ei 450 ff.
εἴδωλον 37, 193.
Einheitlichkeit der Welt 16 ff.
Eklektiker 261.
Ektropie 480 ff.
Eleaten 351 u. a. a. O.
Elemente 423.
Elysium 204.
Emanationslehre 253 ff., 263 f., 278 u. a. a. O.
Empedokles 220, 423.
Empirismus 376, 401 ff., 454 ff., 483.
Endlichkeit 363 f., 464 f.
Energetiker 454, 483.
Energie 421 ff., 454 ff. u. a. a. O.
— psychische 455 ff.
Energismus 420 ff.
Engel 261 u. a. a. O.
Entelechie 250 u. a. a. O.
Entropie 440 u. a. a. O.
Entwicklungslehre 443 ff.
Enzyklopädismus 435.
Epigenesis 448.
Epikuros 425.
Eranier 110, 131, 139, 148, 159, 162, 166, 174, 202, 216.
Erhaltungsprinzipe 440.
Eristik 354.
Eschatologie 71 ff., 192 ff.
Eschmunazar, König 108.
Essäer 276.
Ethische Gottheiten 138 f.
Etrusker 205.
Eucken 390.
Euemeros 52, 97.
Eukleides aus Megara 354.
Euripides 96 f., 183, 244.
Evolution 258, 269, 282, 295, 420, 447 f., 452.

F
Fakire 261.
Fananybrauch 42.
Fatalismus 138, 234, 437.
Faust 54, 266.
Fechner 398 f.
Fegefeuer 192 ff.
Ferment 329.
Fervers 111, 208.
Fetische 39 ff., 90, 104 u. a. a. O.
Fetischismus 36 ff.
Feuerbach, Ludwig 438.
Feuth, Ludwig 113.
Fichte 358, 373 f.
Flammarion 399.
Florenz, Karl 124.
Fludd 329.
Flutsagen 161 ff.
Fohi 121.
Foismus 122.
Formen 249, 383 u. a. a. O.
Frank, Sebastian 316.
Franziskus der Heilige 300.
Freiheit 182 ff., 367, 393, 479 u. a. a. O.
Fresnel 441.
Frobenius 5, 42, 57 ff., 74, 79.
Frohars 111, 208.
Funktionsübertragung 453.

G
Ganglien 455, 473.
Gassendi 431.
Gastrulation 450.
Gegengottheiten 148 f.
Gegenstandsgottheiten 127 ff.
Geheimbünde 57, 254.
Geheimlehre 189.
Geister 55.
Geisterglaube 48 ff., 53.
Geistige Vorgänge 455 ff.
Generatio equivoca 384, 446.
Germanen 88 ff., 136 f., 140, 156, 158, 167, 173, 206.
— Nord- 92 ff.
Gerson, Johann 303.
Gespenster 53.
Geulincx 336.
Gilgamesepos 109, 188, 196.
Gleichartigkeit der Menschheit 11 f.
Gnostiker 267 ff., 277 u. a. a. O.
Goethe 266, 275, 277, 307, 396, 445.
Gorgias 355.
Götterglaube 56.
Götterneid 185.
Götzenbilder 48.
Götzendienst 43 ff.
Graßmann 103 u. a. a. O.
Grey 4, 53.
Griechen 50, 95 ff., 130, 137 f., 143, 162, 167, 171, 203, 230 u.
a. a. O.
Grimm, Jakob 37, 45, 73, 75, 90 ff.
Gruppe 255 f.
Gubernatis 114.

H
Häckel 401, 449, 461 ff.
Hades 204.
Hafis 261.
Halbkulturvölker 4.
Haller, Albrecht v. 447.
Hammurabi 110.
Hanusch 86.
Harmonie 241.
— Gesetz der 453.
— Prästabilierte 338 ff.
Harnack, Adolf 267 ff., 274.
Harpyen 73.
Hartmann, Eduard v. 386 ff.
— Franz 330.
Hebräer 49, 106 ff., 144, 156, 177, 192.
Hegel 376.
Heimarmene, εἱμαρμένη, 239, 422.
Hel 206 f.
Helmholtz 448, 471.
Helmont, Baptist van 328.
— Franz van 338.
Hemsterhuis 413.
Henotheismus 144, 147 ff.
Herakleitos 238, 353.
Herbart 347.
Herder 435.
Hermogenes 269.
Herodot 45 f., 50, 89, 106.
Hesiod 158, 172.
Hexen 53 ff.
Hierarchie 280.
Himmel der Erde gleich 17 ff.
Himmelskörper 466 u. a. a. O.
Hinduismus 119.
Hippias 355.
Hobbes 432 f.
Hoffmann, E. T. A. 479.
Holbach 435 ff.
Hölle 78, 192 ff., 208.
Homa 96, 155 f., 171.
Homer 112.
Homoiomerien 423.
Horatius 188.
Hugo von St. Victor 300.
Hume 358, 407 ff.
Hylodeismus 236 ff.
Hylopsychismus 235 ff.
Hylozoismus 235 ff.
Hypostasie 366, 437.

I
Ideale 365.
Idealismus 349 ff., 360 ff. u. a. a. O.
Idealphilosophie 253, 349 u. a. a. O.
Ideen 244 ff., 264, 335, 364, 405 u. a. a. O.
Ideenassoziation 432 u. a. a. O.
Identitätsphilosophie 385 ff. u. a. a. O.
Idololatrie 50 f.
Illusionslehre 350.
Indier 113 ff., 132, 135, 139, 156 f., 161, 162, 166, 176, 187,
200, 211, 228, 258, 350 f., 424.
Indifferentismus 234.
Individuation 295.
Induktion 402.
Inkarnationen 169.
Intelligenzen 288.
Intelligible Dinge 282 u. a. a. O.
Intensität 457.
Intuition 263 u. a. a. O.
Irreversibilität 440 f.
Isaak 302.
Istars Höllenfahrt 197.
Italiker 158.

J
Jamblichos 277.
Japaner 123 f., 133, 164, 210, 235.
Jehuda Halevi 293.
Jenseits der Kulturvölker 192 ff.
— — Naturvölker 71 ff.
Jeremias, Alfred 110 u. a. a. O.
Jezira 291.
Johannes, Apostel 265.

K
Kabbala 290 ff.
Kalpa 201.
Kant, Immanuel 359 ff., 484 u. a. a. O.
Kapazität 457.
Karäer 293.
Karlstadt 316.
Karma 215, 330.
Karneades 355.
Katasterismen 34.
Kategorien 222 f., 360 ff. u. a. a. O.
Kategorischer Imperativ 367.
Kausalität 409, 414 f., 474 ff. u. a. a. O.; s. auch Kategorien,
Regulative.
Keimteilchen 423, 447.
Kelten 94 ff., 208, 216.
Kepler 345.
Kiesewetter 293.
Kismet 138, 288.
Klopstock 150.
Kohärente Systeme 441, 467.
Konstitutionsprinzip 366 ff.
Kopernikus 321, 428, 432.
Kosmos, κόσμος 238, 240, 481 u. a. a. O.
Kräfte als Wille 382 ff.
Krause 328.
Kritizismus 355 u. a. a. O.
Krönig 441.
Kultur 80 ff.
Kulturvölker 4, 6, 80 ff.
Kyrenaiker 355 f.

L
Labilität 439 ff.
Lactantius 270.
Lamaismus 122.
Lamarck 445.
Lamettrie 435 ff.
Lange, F. A. 422, 438.
Lao-tsse 210, 220, 235, 350.
Lasson, Adolf 319, 401.
Lavater 396.
Leben 16 ff., 482 u. a. a. O.
Lebenanschauung, Erklärung 1 ff.
Leben-Reihe 220 ff.
Lebensschicksal 182 ff.
Leere 240, 424.
Leibniz 340 ff. u. a. a. O.
Lessing 346, 434.
Leukippos 424.
Liber scriptus 299.
Liber vivus 299.
Lichtreich 272.
Lilith 150.
Lippert 40 ff., 84 ff., 90 ff., 102 f., 112, 124 ff., 131.
Littauer 85, 87, 164.
Livingstone 74.
Locke, John 402 ff.
Logos 142, 262, 265 ff., 422 u. a. a. O.
Longfellow 21.
Longinus 277.
Lotze 396 f.
Lucanus 95.
Lucretius Carus 426 ff., 441.
Lullus, Raimundus 305.
Luther 265, 314.

M
Mach, Ernst 414, 417 ff.
Magier 112.
Maimonides 293.
Malebranche 336.
Mandäer 276.
Manichäer 270 ff.
Manu 213, 217.
Märchen 22.
Marsilius Ficinus 309.
Maschinen 369, 457.
Materialismus 420 ff., 467 f., 477 ff. u. a. a. O.
Materie 246, 279 u. a. a. O.
Maui 4 ff., 21.
Maxwell 441, 466.
Maya 350.
Mayavölker 125.
Mechanismus 420 ff. u. a. a. O.
„Medizin“ 42 ff.
Melanchthon 314.
Melissos aus Samos 354.
Mendelssohn, Moses 345.
Menschenopfer 46 u. a. a. O.
Messiasidee 169.
Metamorphosen 37.
Metempsychose 214 f.
Metensomatose 214 ff.
Mexikaner 124.
Meyer, Ludwig 391.
Milton 150.
Mirabaud 435.
Modus 335, 392.
Mohammedanismus 208.
Moleschott 13 u. a. a. O.
Monaden 339, 340 ff., 397 u. a. a. O.
Monismus 13 u. a. a. O.
Monistenbund 484.
Monolatrie 70.
Monotheismus 151 ff.
Montaigne 330.
Montesquieu 413.
Morphologisch-Biologisches Ordnungsgesetz 453.
Morphologisches Gesetz 451.
Mose de Leon 291.
Motakhallim 287.
Muatazile 287.
Müller, Johannes 470.
— K. O. 205.
— Max 20, 24 ff., 33 f., 38, 81, 115, 211, 217, 265.
Multismus 13.
Mysterien 57.
Mystiker 290 f., 293 ff., 300 ff. u. a. a. O.
Mystizismus 254 u. a. a. O.
Mythologie 56.

N
Natur 251.
Naturalismus 401 ff.
Naturgesetze 439 ff.
Naturphilosophen 231 u. a. a. O.
Naturphilosophie 376.
Naturreligion 56 f.
Naturvölker 4, 16 ff.
Naturzweck 369 ff.
Nekromantie 254.
Nephesch 292.
Neschamah 292.
Neuplatonismus 277 ff.
Neupythagoräer 255 ff.
Neuspinozismus 396.
Neuvitalismus 483.
Newton 428, 434.
Nichtordnung, elementare 441, 482.
Nichtumkehrbarkeit 440 f.
Nietzsche 385.
Nikolaus Cusanus 306.
Nirvana 211 ff., 226, 480 u. a. a. O.
Nominalisten 293 f. u. a. a. O.
Noumena 363.
Nukleus 450.
Nus, νοῦς 243 u. a. a. O.

O
Objektivation 379 ff.
Occasionalismus 336, 413 u. a. a. O.
Occultismus 254.
Offenbarung 30, 294.
Ontogenie 445 ff., 449.
Ophir 61.
Ophiten 276.
Orakel 189.
Orcus 205.
Organisierte Dinge 369.
— Wesen 369 ff.
Orphiker 255 f.
Ossian 95.
Ostwald, Wilhelm 454 ff., 462 u. a. a. O.
Otto III., Kaiser 300.
Ovid 100.
Ozeanier 17 ff.

P
Palingenesie 449.
πᾶν, τό 231, 255.
Pan 255.
Pandämonismus 56.
Pandeismus 227 ff., 254 u. a. a. O.
Panpsychismus 235 ff. u. a. a. O.
Panspermie 317, 447.
Pantheismus 227, 390 ff.
Pantheos 234.
Paracelsus 316.
Paradies 78, 159, 192 ff., 208.
Paraklet 270.
Parallelen, anthropologische 10 ff.
Parallelismus, psychophysischer 393, 397 ff. u. a. a. O.
Paralogismen 363 ff.
Parmenides 352.
Pascal, Blaise 329.
Patristische Philosophie 281.
Patritius 317.
Paulus Diaconus 45.
Pausanias 97.
Peraten 276.
Peripatetiker 253.
Peruaner 125.
Perzeption 341.
Pessimismus 384 ff.
Peters, Carl 61.
Phänomenalismus 356 ff. u. a. a. O.
Pherekydes 256.
Philolaos 242.
Philon 261 ff.
Philostratos 257.
Phönizier 108.
Phylogenetische Evolution 452.
Phylogenie 445 ff., 449.
Physische Energien 468 ff., 473 u. a. a. O.
Physizismus 420 ff.
Pico Giovanni 310 f.
Pistis Sophia 274.
Planck, Max 418, 440, 442.
Platon 217 f., 221, 244 ff. u. a. a. O.
Pleroma 272.
Plethon Gemistos 309.
Plotinos 277 ff.
Pluralismus 13.
Plutarchos 257.
Poincaré 417.
Polarität 324, 450.
Polygnotos 204.
Polylatrie 70 f.
Polynesier 66 u. a. a. O.
Polytheismus 127 ff.
Porphyrios 277.
Positivismus 394, 401 ff.
Potentialität 250, 414.
Potenz, morphologische 451.
Prana 223, 230.
Prädestination 285 u. a. a. O.
Prädetermination 414 u. a. a. O.
Prediger 186.
Preußen 87.
Primalitäten 322.
Prinzipe 7 ff., 367 ff.
Prodikos 355.
Propheten 107.
Protagoras 355.
Protoplasma 450 ff., 468.
ψυχή 36 ff. u. a. a. O.
Psychische Energien 454 ff., 468.
Psychologie, assoziative 359, 397, 400, 411, 415, 418 u. a. a. O.
Psychom 461.
Psychophysik 398.
Psychophysiologie 399.
Psychoplasma 477.
Pyramidentexte 198.
Pyrrhon 355.
Pythagoräer 217, 239.
Pythagoras 239.

Q
Qualitates occultae 313.
Quellgeister 325 ff.
Quincke 468.
Quinta essentia 313, 316.

R
Rabbaniten 293.
Ramanuga 258.
Rationalismus 333.
Ratzel 61.
Raum 246, 360, 433 u. a. a. O.
Realen 347.
Realismus 349 ff., 401 ff. u. a. a. O.
Realisten 293 f. u. a. a. O.
Regeneration 453.
Regulation 454, 481.
— biologische 454.
— morphologische 454.
Regulative 360 ff., 365 ff., 439 ff., 471.
Reinkarnation 211 ff.
Reinke 483.
Religionsursprung 23 ff., 434 ff.
Rephaim 193.
Restitution 452 f.
Resurrektion 79.
Reuchlin, Johann 311.
Reversibilität 440 f.
Richard von St. Victor 302.
Riehl 401.
Rigveda 20 u. a. a. O.
Ritter, Heinrich 466.
Römer 50, 98 ff., 129, 143, 172, 203.
Rousseau 413.
Ruach 292.
Rudimentäre Organe 450.
Runen 94.
Ruysbroek 304.
S
Salomon ben Gabirol 291.
Samenteilchen, Keimteilchen 423 ff.
Sankara 260, 351.
Sankhya 259.
Satan 149 ff.
Schamanismus 39, 44 ff., 122.
Scheible 293.
Schelling 327, 375.
Schicksalsgottheiten 136 ff.
Schiller 367.
Schleiermacher 378.
Schmitt, Heinrich 267, 271, 274.
Scholastiker 293 f., 305 u. a. a. O.
Schopenhauer 379 ff.
Schöpfer 131 u. a. a. O.
Schöpfung 464 u. a. a. O.
Schwartz, W. 132.
Schwenkfeld 316.
Scotus Erigena 283 f.
Seele 36 ff., 45, 71 ff., 343 u. a. a. O.
Seelenarten 221 ff.
Seelenkult 43 ff. u. a. a. O.
Seelentätigkeiten 221 ff., 473 ff. u. a. a. O.
Seelenwanderung 211 ff. u. a. a. O.
Selbsterhaltung 347, 436, 444.
Sensualismus 356, 394, 401 ff., 425 u. a. a. O.
Seraphim 280.
Shintoismus 123.
Sigê 271.
Simon Magus 276.
Sirenen 73 f.
Skeptizismus 354 ff. u. a. a. O.
Slawen 85 ff., 208 u. a. a. O.
Sohar 294.
Sokrates 219, 248.
Soma 117.
Sophia 267 ff., 272.
Sophisten 354 f.
Speiseverbote 191.
Spektralanalyse 442.
Spencer, Herbert 420.
Speusippos 249.
Sphärenharmonie 242.
Spieß 77.
Spinoza 337, 390 ff., 484 u. a. a. O.
Spinozismus 390 ff. u. a. a. O.
Spiritismus 254.
Stabilität 439 ff.
Stammannahmen 7 ff.