Random Field Models in Earth Sciences
About this ebook
This graduate-level text surveys the problems of earth and environmental sciences by means of theoretical models that have as an essential basis a purely random element. In addition to introducing students to spatiotemporal modeling as a fundamental methodology in the earth sciences, this volume illustrates the role of spatiotemporal modeling in the general framework of the scientific method.
Starting with discussions of the science of the probable, the various theories of probability, and the physical significance of the random field model, the text explores a variety of problems in earth sciences in which the random field model constitutes an effective approach. A critical and concise summary of the fundamental concepts and results of the general random field theory is followed by considerations of the intrinsic spatial random field model, the factorable random field model, the spatiotemporal random field model, and space transformations of random fields.
Additional topics include random field modeling of natural processes, the simulation of natural processes, estimation in space and time, and sampling design.
George Christakos
George Christakos is a Professor in the Department of Geography at San Diego State University (USA) and in the Institute of Island & Coastal Ecosystems, Ocean College at Zhejiang University (China). He is an expert in spatiotemporal random field modeling of natural systems, and his teaching and research focus on the integrative analysis of natural phenomena; spatiotemporal random field theory; uncertainty assessment; pollution monitoring and control; human exposure risk and environmental health; space-time statistics and geostatistics.
Random Field Models in Earth Sciences - George Christakos
Copyright
Copyright © 1992 by George Christakos
All rights reserved.
Bibliographical Note
This Dover edition, first published in 2005, is a corrected republication of the work originally published by Academic Press, Inc., San Diego, in 1992.
ISBN 9780486160917
Manufactured in the United States of America
Dover Publications, Inc., 31 East 2nd Street, Mineola, N.Y. 11501
For Lana and Maria
Table of Contents
Title Page
Copyright Page
Dedication
Foreword
Preface
1 - Prolegomena
2 - The Spatial Random Field Model
3 - The Intrinsic Spatial Random Field Model
4 - The Factorable Random Field Model
5 - The Spatiotemporal Random Field Model
6 - Space Transformations of Random Fields
7 - Random Field Modeling of Natural Processes
8 - Simulation of Natural Processes
9 - Estimation in Space and Time
10 - Sampling Design
References
Index
A CATALOG OF SELECTED DOVER BOOKS IN SCIENCE AND MATHEMATICS
Foreword
The emphasis on modern research in the earth sciences has shifted from the explanatory to the predictive. In the past, geologists, hydrologists, oceanographers, and other students of the earth were content to observe and describe natural phenomena and to expostulate on how things came to be the way they are. Now these scientists not only theorize about the operation of natural processes, they must devise ways to test their theories and predict the possible consequences implicit in the theories. The human race is no longer restricted to a minor role as a gatherer of the wealth created by nature. Instead, modern society exerts an increasingly important influence on the physical and chemical processes that operate on and near the surface of the earth and within the atmosphere and oceans of our planet. To understand these processes requires that we devise models that describe the dynamics of their behavior and manipulate these models in order to estimate the consequences of our activities.
Geological models have evolved from purely deterministic to probabilistic to geostatistical models incorporating the hybrid concept of regionalized variables. Geostatistical methods are among the most powerful techniques presently available for constructing models of the spatial variation in natural properties. However, these geostatistical models are only static representations of the state of nature at an instant in time; they must be combined with other models (in the current state of the art, usually deterministic) that simulate dynamic behavior. Spatiotemporal models, as described in this seminal volume by George Christakos, represent the next evolutionary step in geological modeling, because such models combine dynamic processes with spatial variability and incorporate the inevitable uncertainties that result from incomplete knowledge of both spatial patterns and dynamic behavior.
This book will place serious demands on the reader because it is both heavily mathematical and very philosophical. George perhaps could have written a more approachable treatment of spatiotemporal models by concentrating on methodological nuts and bolts, emphasizing anecdotes, examples, and case studies. However, such an approach would not have served the dual purpose of this volume: (a) to lay a foundation for spatiotemporal modeling as a fundamental methodology in the earth sciences and (b) to fit spatiotemporal modeling into the general framework of the scientific method. George’s objective has necessitated the careful exposition of assumptions and the painstaking development of the philosophical justifications behind them, as well as the detailed derivations of consequences that spring from the fundamental assumptions. Of course, it may also be true that his concern for philosophical underpinnings reflects in part the Christakos heritage, rooted as it is in the land of Greece, the birthplace of western philosophy.
After completing undergraduate work at the University of Athens, George Christakos continued engineering studies at the University of Birmingham (U.K.), receiving a master’s degree in soil mechanics, and at M.I.T., where he obtained a master’s degree in civil engineering. This was followed by a one-year stay at the Centre de Géostatistique in Fontainebleau, France, where he acquired an extensive background in geostatistics. He then returned to Athens, where he incorporated his knowledge of both deterministic and geostatistical modeling into his doctoral research in mining engineering. Noting the criticisms that have been directed at early geostatisticians for their failure to demonstrate the many connections between geostatistics and other forms of mathematical modeling, George, in this book, has taken great pains to demonstrate how the random field model relates to the various aspects of geostatistics, to the classical models of time series analysis and stochastic processes, and to the other variant models of physical phenomena.
Following his graduate studies, George came to the United States as a visiting research scientist at the Kansas Geological Survey. For two years, the Kansas Survey provided an environment where he could organize his thoughts and consolidate them into the manuscript that eventually resulted in this book. With the cooperation of the Survey, preparation of the text continued over the following two years while George pursued a second Ph.D. and conducted research in the Division of Applied Sciences at Harvard University. The Mathematical Geology Section of the Kansas Geological Survey takes special pride in George’s contribution and is pleased to have been instrumental in the preparation and publication of this volume. This book is not only a valuable contribution to science but also a testimony to the benefits that come from our support and encouragement of the exchange of scientists between nations. George Christakos is one in a succession of international scholars who have worked and studied at the Kansas Geological Survey; all have brought new ideas and viewpoints and have made lasting contributions both to our organization and to science in general. It is our hope that George’s book will direct researchers into new lines of investigation and increase the interaction between geoscientists and environmental scientists in all parts of the world. After all, the problems that must be addressed in energy, natural resources, and environment are global in nature. Fortunately, George Christakos has assembled a powerful collection of tools with which to address them.
John C. Davis
Kansas Geological Survey
Lawrence, Kansas
Preface
If you do not fix your foot outside the earth, you will never make it to stay on her.
O. Elytis
This book is about modeling as a principal component of scientific investigations. In general terms, modeling is the fundamental process of combining intellectual creativity with physical knowledge and mathematical techniques in order to learn the properties of the mechanisms underlying a physical phenomenon and make predictions. The book focuses on a specific class of models, namely, random field models and certain of their physical applications in the context of a stochastic data analysis and processing research program. The term application is considered here in the sense wherein the mathematical random field model is shaping, but is also being shaped by, its objects.
Since the times of Bacon, Mill, Whewell, Peirce, and other great methodologists of science, it has been recognized that in scientific reasoning it is as important to operate with the right concepts and models as it is to perform the right experiments. Conceptual innovation and model building have always been central to any major advance in the physical sciences. Scientific reasoning employs to a large extent probabilistic concepts and stochastic notions. Indeed, scientific induction (in the Baconian tradition) is concerned with hypotheses about physical situations as well as with the gradation of the inductive support that experimental results give to these hypotheses. Such a gradation is needed, for these hypotheses are expected to generate predictions that extrapolate beyond the existing experimental data. The gradation is also necessary because all models possess some evidential support and counterexamples. To choose between them, the degree of support must be addressed. In scientific hypothetico-deduction (in the sense of Peirce and Popper), on the other hand, one first formulates a hypothesis and then exposes it and its logical consequences to criticism, usually by experimentation. In both cases (induction and hypothetico-deduction), the fact that one makes hypotheses implies that one is not dealing with certain knowledge but rather with probable knowledge where the gradation of the support that the experimental results give to hypotheses is achieved by means of probabilistic (stochastic) terms. In Poincaré’s words: Predictable facts can only be probable.
Lastly, modeling is an important component of sophisticated instrumentation, which forms the conditions for and is the mediator of much of modern scientific knowledge (as is emphasized by instrumental realists, like Ihde, Hacking, and others).
Unfortunately, it seems that these fundamental truths are not always well appreciated nowadays. In particular, unlike physics where conceptualization-modeling and observation-experimentation are closely linked to each other following parallel paths, in some geological and environmental fields measurement is heavily overemphasized, while little attention is given to important modeling issues; even less attention is given to the problem of the rationality of model testing. Undoubtedly, such an approach, besides being very unpleasant to one’s sense of symmetry, violates the most central concepts of scientific reasoning, the latter being considered hypothetico-deductive, neo-inductive, or instrumental-realistic. As a consequence, it may be a particularly inefficient and costly approach, which provides poor representations of the actual physical situations and leads to serious misinterpretations of experimental findings.
Furthermore, it is sometimes argued that conceptual innovation and advanced modeling are not likely to be practical and, hence, one should restrict oneself to classical methods and techniques that have been in use for long periods of time. Besides being distinctively opposed to the very essence of scientific progress, this view grossly misinterprets the real meaning of both terms, practical and classical.
Regarding the former term, it suffices to state Whittle’s own words: The word ‘practical’ is one which is grossly and habitually misused and the common antithesis between it and the word ‘theoretical’ is largely false. A practical solution is surely one which, for all it may be approximate, is approximate in an enlightened sense, shows insight and gets to the bottom of things. However the term is much more often used for a solution which is quick and provisional—quick and dirty might be nearer the mark. The world being what it is, we may need quick, provisional solutions, but to call these ‘practical’ is surely degradation of an honourable term.
In fact, before any meaningful practical solution to a physical problem is obtained, the fundamental conceptual and physical aspects of the problem must be first completely understood and a powerful theory must be developed. A good example is the problem of fluid flow turbulence. Despite its great practical importance and intensive applied research over several decades, a completely satisfactory practical solution to the problem is still not available. And this is largely due to the fact that the fundamental theoretical aspects of turbulence are still unresolved.
As regards the term classical, the most influential writers in the area of modern scientific methodology (such as Kuhn, Lakatos, and Nagel) have repeatedly emphasized the fact that scientific achievement is but an endless series of historical data where problems that could not be handled by the then classical approaches were solved in terms of novel concepts and mathematically more advanced methods; which then became classical themselves, only to be replaced in turn by more powerful new models and techniques. This has always been the way that science progresses. In fact, according to Lakatos, all great scientific achievements had one characteristic in common: They were all based on new concepts and models, and they all predicted novel facts, facts that had been either undreamt of or had indeed been contradicted by previous classical theories. Scientific progress is a revolutionary process in the sense of Kuhn. Moreover, as Rescher stated: Progress in basic natural science is a matter of constantly rebuilding from the very foundations. Significant progress is generally a matter, not of adding further facts, but of changing the framework itself. Science in the main does not develop by sequentially filling-in of certain basically fixed positions in greater and greater detail.
Modern quantum mechanics, for example, has predicted and explained an enormous number of effects in physics and chemistry that could not be predicted or explained in terms of classical mechanics. Quantum mechanics, however, is not a refined or extended version of classical mechanics; it is rather a revolutionary step toward changing the classical framework itself (e.g., the renowned von Neumann’s world, which is entirely quantum, contains no classical physics at all).
In earth sciences and environmental engineering, important problems nowadays include the assessment of the space-time variability of hydrogeologic magnitudes for use in analytical and numerical models; the elucidation of the spatiotemporal evolution characteristics of the earth’s surface temperature and the prediction of extreme conditions; the estimation of atmospheric pollutants at unmeasured points in space and time; the study of transport models that are the backbone of equations governing atmospheric and groundwater flow as well as pollutant fate in all media; the quantitative modeling and simulation of rainfall for satellite remote-sensing studies; the design of optimal sampling networks for meteorological observations; and the simulation of oil reservoir characteristics as a function of the spatial position and the production time.
These are all problems where the development and implementation of the appropriate model is of great significance. The importance of the modeling aspect becomes even more profound in physical situations at large space-time scales where controlled experimentation is very difficult or even impossible. Furthermore, all the above problems are characterized by the significant amount of uncertainties in the behavior of the natural processes involved. Such uncertainties constitute an essential part of many controversial scientific investigations and policy responses. A good, timely example is the global warming problem. Global warming from the increase in greenhouse gases has become a major scientific and political issue during the past decade. This is due mainly to the huge uncertainty involved in all global warming studies. For example, forecasts of the space-time variability of natural processes, such as soil moisture or precipitation, have large uncertainties. Also, uncertainties in the future changes in greenhouse gas concentrations and feedback processes that are not properly accounted for in the models could produce greater or smaller increases in the surface temperature. Policy responses are delayed because scientists are not able to properly quantify these uncertainties and use them in the context of climate models. Therefore, modeling tools and approaches leading to satisfactory solutions to these difficult problems are extremely important. And this is why the problem of uncertainty, probability, and probable knowledge is not a problem of armchair philosophers. It has grave scientific, ethical, and political implications and is of vital social and economic relevance.
Of course, the fact that research in earth sciences and environmental engineering faces a series of difficult problems nowadays should by no means dishearten us. On the contrary, it should encourage us to reconsider the usefulness of many of our traditional approaches and techniques and develop novel, more sophisticated models. At the same time, we should increase our confidence in pursuing difficult problems. This last issue is very important, for it provides the surest guarantee for the continuing vitality and rapid growth of any scientific discipline. The great significance of the confidence issue in scientific research underlies Einstein’s aphorism: I have little patience with scientists who take a board of wood, look for the thinnest part and drill a great number of holes where drilling is easy.
In the light of the above considerations, this book is concerned with the study of problems of earth and environmental sciences by means of theoretical models that have as an essential basis a purely random (stochastic) element. In particular, the term stochastic data analysis and processing refers here to the study of spatial and spatiotemporal natural processes in terms of the random field model. As we saw above, spatial and spatiotemporal natural processes occur in nearly all the areas of earth sciences and environmental engineering, such as hydrogeology, environmental engineering, climate predictions and meteorology, and oil reservoir engineering. In such a framework, geostatistics, stochastic hydrology, and environmetrics are all considered as subdomains of the general stochastic research program.
From a mathematical viewpoint, random fields (spatial or spatiotemporal) constitute an area that studies random (nondeterministic) functions. This is an area of mathematics that is usually called stochastic functional analysis and deals with any topic covered by the ordinary (deterministic) theory of functions. In addition, the existence of the random component makes stochastic functional analysis a much larger, considerably more complex, and also more challenging subject than the ordinary theory of functions. The mathematical theory of random fields works in all these physical situations, where traditional (deterministic) models do not, because (a) it has the clearest theoretical justification and captures important characteristics of the underlying natural processes that traditional methods do not; and (b) it has a superb analytical apparatus and is able to solve complex physical problems on which the traditional methods fail.
In fact, as one probes more deeply into the origin of this highly mathematical discipline, it becomes quite clear that the pioneers of the random field theory are in fact hardheaded realists, driven to develop the new approach only because of the failure of the esteemed traditional approaches to provide an accurate description of nature in problems such as those described above. By now, the stochastic approach has been applied successfully to several engineering applications. As Medawar could have stated it, engineering is complex, richly various, and challenging—just like real life. Perhaps it travels slower nowadays than quantum physics or nuclear chemistry, but it travels nearer to the ground. Hence, it should give us a specially direct and immediate insight into science in the making.
Certainly the choice between the various possible models depends partly on one’s guess about the outcome of future experiments in earth sciences and environmental engineering and partly on one’s philosophical view about the world. For the mathematical methods, though, to become operational and to obtain an objective meaning in the sense of positive sciences, it is necessary to be associated with the empirical theses and computational notions of the stochastic data analysis and processing research program. Certainly the utilization of the stochastic research program in practice raises technical questions that can be answered by way of a framework weighting all sorts of data and knowledge available, describing the specific objectives, and choosing the appropriate technique. The computer as a research instrument provides the powerful means of implementing stochastic data analysis and processing in complex physical situations. The technology that emerges from the use of computers has exciting implications as regards the traditional relationship between theory-modeling and observation-experimentation.
More specifically, the book is organized as follows: We start with discussions of the science of the probable, the various theories of probability, and the physical significance of the random field model. Random field representations of unique natural processes are rich in physical content and can account for phenomena possessing complex macroevolution and microevolution features. In this context, various problems in earth sciences, where the use of the random field approach is completely justified, are reviewed throughout the book. The subject of random fields is vast. Inevitably we have to restrict our interests to reasonably specific areas. The choice of these areas is directly dependent on their importance in the context of applied environmental sciences. The entropy-related sysketogram function is introduced and its advantages over the traditional correlation functions are discussed. Following a critical and concise summary of the fundamental concepts and results of the general random field theory, the intrinsic spatial random field model, which describes generally nonhomogeneous distributions in space, is established in terms of generalized functions. The latter involve more complex mathematical concepts and tools. However, it pays here to use more sophisticated mathematics, since this provides us with a more complete description of random fields, which strengthens the theoretical support of the intrinsic model and leads to novel results. The power of the underlying mathematical structure lies in its capacity to capture essential features of complex physical processes, to replace assumptions regarding physical processes by more powerful and realistic ones, and to pave the way for establishing important connections between the intrinsic spatial random field model and stochastic differential equations. The study of natural processes in space-time is achieved by introducing the spatiotemporal random field concept.
More precisely, a theory of ordinary as well as generalized random fields is built on the appropriate space-time structure. The results obtained act then as the theoretical support to practical space-time variability models and optimal space-time estimation methods. The concept of factorable random fields provides the means for studying an important set of problems of nonlinear systems analysis and estimation in several dimensions. Space transformation is an operation that can solve multidimensional problems by transferring them to a suitable unidimensional setting. The underlying concept has both substance and depth, possessing elegant and comprehensive representations in both the physical and frequency domains. It can be used as a valuable tool in testing the permissibility of spatial and spatiotemporal correlation functions, in the study of differential equation models governing subsurface processes, as well as in the simulation of environmental properties. The spatial and spatiotemporal estimation problems are solved in all generality. A heuristic adapted to the stochastic research program yields a Bayesian-maximum-entropy approach to the spatial estimation problem, which incorporates into the analysis prior information and knowledge that are highly relevant to the spatial continuity of the natural process under estimation. The Bayesian-maximum-entropy concept may have significant applications in multiobjective decision analysis and in artificial intelligence studies. Interesting solutions can be obtained concerning certain important time-series-related problems, such as system nonlinearity. These time series are involved in a variety of water resources and environmental problems, including streamflow forecasting, flood estimation, and environmental pollution monitoring and control. Multidimensional simulation is a valuable tool in applied sciences. In the book various random field simulation techniques are reviewed and their relative advantages are discussed.
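The simulation techniques the book reviews are developed in full in later chapters; purely as an illustrative sketch of the basic object involved, the following Python fragment draws one realization of a zero-mean Gaussian random field on a one-dimensional grid by factoring an exponential covariance matrix. The grid, the covariance parameters, and all function names here are hypothetical choices made for the illustration, not taken from the text.

```python
import numpy as np

# Hypothetical exponential covariance C(h) = sill * exp(-|h| / rng);
# the sill and range values are illustrative, not from the text.
def exponential_cov(x, sill=1.0, rng=10.0):
    h = np.abs(x[:, None] - x[None, :])   # matrix of pairwise lags |x_i - x_j|
    return sill * np.exp(-h / rng)

def simulate_field(x, cov_fn, seed=0):
    """Draw one realization of a zero-mean Gaussian random field
    by factoring its covariance matrix (Cholesky method)."""
    C = cov_fn(x)
    # Small jitter on the diagonal keeps the factorization numerically stable.
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))
    z = np.random.default_rng(seed).standard_normal(len(x))
    return L @ z                          # correlated sample: Cov(Lz) = C

x = np.linspace(0.0, 50.0, 101)
field = simulate_field(x, exponential_cov)
print(field.shape)  # (101,)
```

The Cholesky factor maps uncorrelated white noise into a sample with exactly the prescribed covariance, which is the common starting point for the more elaborate multidimensional simulation methods the book discusses.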
Lastly the sampling design problem is discussed. An estimation variance factorization scheme with attractive properties is studied, which leads to an efficient and quick multiobjective sampling design method. Several other sampling methods of considerable importance in earth sciences and environmental engineering are reviewed too.
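As a hedged illustration of the estimation-variance idea that underlies such sampling design criteria, here is a minimal simple-kriging sketch in Python. A useful property visible in the code is that the estimation variance depends only on the data locations and the covariance, not on the data values, which is what makes it usable for designing a network before sampling. All locations, values, and parameter choices below are illustrative assumptions, not taken from the book.

```python
import numpy as np

# Illustrative exponential covariance; sill and range are assumed values.
def cov(h, sill=1.0, rng=10.0):
    return sill * np.exp(-np.abs(h) / rng)

def simple_kriging(xs, zs, x0, sill=1.0, rng=10.0):
    """Simple kriging with known zero mean: returns the estimate at x0
    and the estimation (kriging) variance."""
    K = cov(xs[:, None] - xs[None, :], sill, rng)  # data-to-data covariances
    k = cov(xs - x0, sill, rng)                    # data-to-target covariances
    w = np.linalg.solve(K, k)                      # kriging weights
    estimate = w @ zs
    # Note: the variance involves only locations (through K and k), not zs.
    est_variance = sill - w @ k
    return estimate, est_variance

xs = np.array([0.0, 4.0, 12.0])    # hypothetical sample locations
zs = np.array([1.2, 0.7, -0.3])    # hypothetical observations
est, var = simple_kriging(xs, zs, x0=6.0)
print(est, var)
```

Minimizing this location-dependent variance over candidate sampling points is one simple way to pose the sampling design problem in computational terms.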
This work has been influenced by discussions with many friends, colleagues, and even certain theoretical opponents. My sincere thanks are due to Drs. M. B. Fiering, J. J. Harrington, and P. P. Rogers of Harvard University; A. G. Journel of Stanford University; J. C. Davis, R. A. Olea, and M. Sophocleous of Kansas Geological Survey; C. Panagopoulos, I. Ikonomopoulos, K. Mastoris, N. Apostolidis, and P. Paraskevopoulos of National Technical University of Athens; P. Whittle and J. Skilling of University of Cambridge, England; G. B. Baecher and D. Veneziano of Massachusetts Institute of Technology; G. Baloglou of State University of New York at Oswego; M. David of Montreal University, Canada; R. Dimitrakopoulos of McGill University, Canada; and V. Papanicolaou of Duke University. The author is grateful to all these friends, as well as to Mrs. C. Cowan, who typed the original manuscript, and Mrs. R. Hensiek, who drafted several of the illustrations.
George Christakos
1
Prolegomena
Common sense is the layer of prejudice laid down in the mind prior to the age of eighteen.
A. Einstein
1. The Science of the Probable and the Random Field Model
There are numerous phenomena in the physical world a direct (deterministic) study of which is not possible. In fact, physics, geology, meteorology, hydrology, and environmental engineering have introduced us to a realm of phenomena that cannot give rise to certainty in our knowledge. However, a scientific knowledge of these phenomena is possible by replacing the study of individual natural processes by the study of statistical aggregates to which these processes may give rise. A statistical aggregate is a configuration of possibilities relative to a certain natural process. The properties of such aggregates are expressed in terms of the concept of probability, more specifically, under the form of a probability law. It is important to recognize that the probability law is a perfectly determined concept. The difference between a probabilistic and a deterministic law is that, while in the deterministic law the states of the system under consideration directly characterize an individual natural process, in a probabilistic law these states characterize a set of possibilities regarding the process.
The application of the mathematical theory of probability to the study of real phenomena is made through statistical concepts. Therefore, it is essentially in the form of statistical knowledge that a science of the probable is constituted. This implies that the science of the probable replaces a direct study of natural processes by the study of the set of possibilities to which these processes may give rise.
Since knowledge regarding these processes is achieved only indirectly by way of statistical concepts, it will be characterized as probable or stochastic knowledge. Modern science provides convincing evidence that such probable knowledge is no less exact than certain knowledge. Naturally, the constitution of a science of the probable raises two fundamental philosophical problems:
(a) The elucidation of the content of the concept of probability.
(b) The foundation of probable or stochastic knowledge as a concept directly related to the application of the concept of probability to the study of real phenomena.
Clearly, these two problems are closely related to each other. For example, the concept of probability must be such that it can be used in the empirical world. Each one of these problems, however, possesses certain distinct aspects that deserve to be studied separately.
Problem (a) can be considered under the light of either a subjectivist explanation of the notion of probability, taken in itself, or an objectivist explanation. More specifically, according to the subjectivist explanation, probability is a measure attached to the particular state of knowledge of the subject. It may correspond to a degree of certitude or to a degree of belief (say, about where the actual state of nature lies); or to the attitude with which a rational person will approach a given situation that is open to chance (say, the attitude with which one is willing to place a bet on an event whose outcome is not definitely known in advance). According to explanations of the objectivist type, probability is a measure attached to certain objective aspects of reality. Such a measure may be regarded as the ratio of the number of particular outcomes, in a specific type of experiment, over the total number of possible outcomes; or as the limit of the relative frequency of a certain event in an infinite sequence of repeated trials; or as a characteristic property (particularly, the propensity) of a certain experimental arrangement. The main point of the last view is that it takes as fundamental the probability of the outcome of a single experiment with respect to its conditions, rather than the frequency of outcomes in a sequence of experiments. (See, e.g., Keynes, 1921; von Mises, 1928; Popper, 1934, 1972; Jeffreys, 1939; Savage, 1954; Byrne, 1968.)
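The frequency-based definition mentioned above (probability as the limit of the relative frequency of an event in a long sequence of repeated trials) can be illustrated with a short simulation. This is a minimal sketch of our own; the function name, the seed, and the choice p = 0.3 are illustrative and do not come from the text:

```python
import random

def relative_frequency(p_true, n_trials, seed=0):
    """Simulate n_trials Bernoulli experiments with success probability
    p_true and return the relative frequency of successes."""
    rng = random.Random(seed)
    successes = sum(1 for _ in range(n_trials) if rng.random() < p_true)
    return successes / n_trials

# The frequentist view: as the number of trials grows, the relative
# frequency stabilizes near the underlying probability p_true = 0.3.
for n in (10, 1_000, 100_000):
    print(n, relative_frequency(0.3, n))
```

For small n the observed frequency may deviate noticeably from 0.3; the objectivist claim is precisely that the deviation shrinks as the sequence of trials is extended.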
In conclusion, there exists more than one meaning of probability. Figure 1.1 merely represents a skeleton outline of the various complex and diversified analyses of the notion of probability considered over the years by several eminent mathematicians, scientists, and philosophers (see Poincaré, 1912, 1929; Borel, 1925, 1950; Kolmogorov, 1933; Reichenbach, 1935; Nagel, 1939; Boll, 1941; Gendre, 1947; Servien, 1949; Carnap, 1950; Polya, 1954; Polanyi, 1958; Fisher, 1959; Russell, 1962; Jaynes, 1968; de Finetti, 1974).
Figure 1.1 The various meanings of the concept of probability P
A subjective (sociological, Psc, or psychological, Py) concept of probability is of importance in the social and psychological sciences, but it cannot serve as a basis for inductive logic or a calculus of probability applicable as a general tool of science. An objective (logical relationship, Pr, or propensity, Pp) notion of probability is based on the assumption that objectivity is equivalent to formalization. It is, however, open to question whether formal logic can achieve the goals of the Pr concept, as has been demonstrated by Craig’s theorem regarding empirical logic and by Gödel’s theorem on the limitations of formalization. Moreover, modern approaches to logic argue that the world obeys a nonhuman kind of reasoning and, hence, to cope with natural processes we must scrap our mode of human reasoning in favor of a new, so-called quantum logic. The Pp concept, on the other hand, has been seriously criticized on the basis of the argument that what people understand by probability is broader than mathematical formalization. Lastly, to restrict probability to a mathematical meaning (Pm) is for many philosophers an ineffective approach, because the notion of probability transcends the bounds of mathematics (e.g., Byrne, 1968; Benenson, 1984).
Naturally, this variability of theoretical viewpoints is reflected in an analogous variability in the practical implementation of the theory of probability. It seems that in the various fields of science and engineering, people do not stick to a unique meaning of probability. On each occasion, they prefer to choose what they consider to be the most appropriate meaning for the specific problem at hand.
The problem of probable knowledge [problem (b) above] is closely related to important modern scientific areas such as, for example, artificial intelligence and expert systems. With regard to this problem, two types of answers have been given: One is related to a subjectivist interpretation of probable knowledge, and the other is related to an objectivist interpretation. According to the former, the phenomena we are studying with the aid of probability theory are in themselves entirely determined and, therefore, they could be, ideally, the object of certain knowledge. And, if we are obliged to restrict ourselves to a probable knowledge of these phenomena, it is merely because we have at our disposal only incomplete information. The limitation of our information can be conceived either as purely contingent (due to insufficient sources of knowledge, inadequate measuring instruments and computers, etc.), or as a limitation in principle (because our capacities are inherently limited). The former point of view is used by classical statistical mechanics, while the latter is used in the context of the so-called orthodox theory of quantum mechanics. On the other hand, the objective interpretation of probable knowledge assumes that the incompleteness of our information is due to the object itself. Probable knowledge is then the expression of an objective contingency of the real phenomena. This contingency reflects either a principle of chance that exists in the very elementary components of the physical phenomena, or the lack of access to the various processes that determine these phenomena.
Evidently, there is a mutual relationship between the two aforementioned sets of problems: the subjective (objective) explanation of probability is well suited to the subjective (objective) explanation of the foundation of probable knowledge. But this is not always the case. For example, a subjective explanation of probability can well be used in the context of an objective interpretation of probable knowledge. Therefore, it is necessary that these two sets of problems be distinguished one from the other.
In any case, satisfactory answers to problems (a) and (b) above clearly belong to the field of epistemology and, thus, they require access to multidimensional philosophical considerations. In particular, any argument concerning the objectiveness or the subjectiveness of the concept of probability demands a deeper understanding of human nature and knowledge. To adopt the subjectivist or the objectivist interpretation of probable knowledge is a decision closely related to understanding of the nature of the world. In this regard, all the aforementioned attempts to answer the fundamental philosophical problems (a) and (b) are without doubt inadequate. It is far from being evident that we have at our disposal today the philosophical tools necessary to obtain a true understanding of the concepts of probability and probable knowledge. There is, perhaps, in the probabilistic concept the emergence of a type of knowledge very different from that considered by traditional schools of philosophy. In fact, people begin to realize that full consciousness of what is involved in knowledge of this sort is going to oblige us to modify fundamental concepts such as truth, knowledge, and experience.
In view of the above considerations, in this book we will not define probability as a concept in itself and will not indulge in the epistemological problematics of probable knowledge. Had we decided to do so, we would have faced the extremely difficult task of providing sound justification for a number of issues: If we had adopted the subjectivist explanation, we should explain why and how we have the right to suppose that natural phenomena are entirely determined in themselves, and also why and how our knowledge, supposedly inadequate (be it in principle or merely in fact), turns out nevertheless to be quite adequate at the level of statistical aggregates. If we had chosen the objectivist interpretation, we should justify why and how contingency appears in physical phenomena, and why and how phenomena supposedly undetermined in themselves can give rise to statistical aggregates that are, for their part, entirely determined.
This book will focus attention on the language of probability, which is not at all constituted from some given epistemology, but from certain concrete problems that the traditional methods were not able to solve. Our concern will be on issues of application of the science of the probable in the context of the so-called random field (RF) model. In particular, the RF model will be considered a statistical aggregate about which we will make two a priori assumptions:
(i) Randomness is not a property of reality itself but, instead, a property of the RF model used to describe reality.
(ii) Probable knowledge cannot be considered as an image of reality. Through it we aim at reality and we learn something about it, but the relationship between our knowledge and its object becomes indirect and remote.
Under the light of assumptions (i) and (ii), the concept of probability in all its richness of content is of far greater importance for real world applications than the words and terms used to express it. By using probable knowledge the real is approached only through an abstract construction that involves the possible and is rather like a detecting device through which one grasps certain aspects of reality. Thanks to the detecting device, one can register certain reactions of reality and thus know it, not through an image of it, but through the answers it gives to one’s questions.
The RF formalism does not restrict our concept of probable science to a physical theory of natural phenomena governed by "randomness" or "chance," or to a logical theory of plausible reasoning in the presence of incomplete knowledge. These theories, as well as several others, are viewed as potential modeling tools and detecting devices, rather than as unique realities. Within the RF context, we are looking at real problems that are in principle very subtle and complex. Therefore, like many other human enterprises, the practice of the science of the probable requires a constantly shifting balance between a variety of theories and methods, such as stochastic calculus, probability and statistics, logic, and information theory.
As a matter of fact, RF methods have proven themselves very useful, although the notion of probability itself has not been philosophically well defined. This is true for almost all scientific theories. For example, despite the fact that terms such as mass, energy, and atom are philosophically undefined or ill-defined, theories based on these terms have led to extremely valuable applications in science and engineering.
In this book we study the use of RF models in the context of stochastic data analysis and processing. More precisely, the term stochastic data analysis and processing refers to the study of a natural process on the basis of a series of observations measured over a sample region of space (spatial series), over a sample period of time (time series), or over a spatial region for a sample time period (space-time series). In general, the aim of such a study is to evaluate and reconstruct the properties of the underlying unique physical process from fragmentary sampling data, by accommodating the probabilistic notion of RF. Hence, before proceeding with the description of the stochastic data analysis and processing research program, it is appropriate to discuss the physical content implicit in the RF representation of a natural process.
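As a hedged illustration of what evaluating the properties of a process from a spatial series can mean in practice, the sketch below computes the classical sample semivariogram of a regularly spaced one-dimensional spatial series, a standard geostatistical summary of spatial continuity. The function name and the porosity-like data are our own illustrative choices, not examples from the book:

```python
def sample_semivariogram(values, max_lag):
    """Classical estimator gamma(h) = (1 / 2N(h)) * sum (x_{i+h} - x_i)^2,
    computed for integer lags h = 1, ..., max_lag on a regularly
    spaced spatial series of observations `values`."""
    gammas = {}
    for h in range(1, max_lag + 1):
        diffs = [(values[i + h] - values[i]) ** 2
                 for i in range(len(values) - h)]
        gammas[h] = sum(diffs) / (2 * len(diffs))
    return gammas

# Illustrative porosity-like series (% values): nearby points are more
# alike than distant ones, so gamma(h) tends to grow with the lag h.
series = [21.0, 22.5, 23.1, 22.8, 24.0, 25.2, 24.7, 26.1, 27.0, 26.4]
print(sample_semivariogram(series, 3))
```

A growing gamma(h) quantifies the intuition that fragmentary sampling data carry recoverable information about the spatial structure of the underlying process.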
2. The Physical Significance of the Random Field Model
In this section our efforts will be focused on an exposition of theses and arguments that justify the use of RF models to represent physical processes that vary in space and/or time. Let χi, i = 1, 2, ..., m be a spatial series of values of a physical variate χ. For illustration, a porosity profile (%; Christakos, 1987b) is depicted in Fig. 1.2; also, a lead concentration surface (Pb in ppb; Journel, 1984) around a Dallas smelter site is shown in Fig. 1.3.
Figure 1.2 A typical soil porosity profile
The pattern of change of these series in space constitutes an evolution process. In particular, by careful examination of these figures one notices two important descriptive features of the evolution process:
(i) A well-defined spatial structure at the macroscopic level (i.e., well-defined trends in the spatial variability of the porosity; also, a high dome centered at the smelter site, an NE trend of high lead values corresponding to the direction of prevailing winds, areas where changes in lead are rapid, areas with less rapid change, etc.).
(ii) A very irregular character at the microscopic level (that is, complex variations of the porosity within short distances; erratic local fluctuations in the lead surface, etc.).
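The two features above, a smooth macroscopic structure overlaid by erratic microscopic fluctuations, can be separated as a first rough diagnostic by smoothing the series. The sketch below uses a simple centered moving average for this purpose; this is our own illustrative device under stated assumptions, not a method prescribed by the text:

```python
def moving_average(values, window):
    """Smooth a spatial series with a centered moving average of odd
    width `window` (truncated at the edges). The smooth component
    approximates the macroscopic structure; the residuals capture
    the microscopic fluctuations."""
    half = window // 2
    trend = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        trend.append(sum(values[lo:hi]) / (hi - lo))
    residuals = [v - t for v, t in zip(values, trend)]
    return trend, residuals

# Illustrative porosity-like series: a rising trend plus local scatter.
series = [10.0, 10.8, 10.2, 11.5, 12.1, 11.7, 13.0, 12.4, 13.6, 14.1]
trend, resid = moving_average(series, 3)
```

The trend sequence exhibits feature (i), the residuals feature (ii); the RF model is precisely the device that treats both levels of variability within a single probabilistic structure.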
These