On the principle of maximum entropy and the earthquake frequency-magnitude relation


Geophys. J. R. astr. Soc. (1983) 74, 777-785

On the principle of maximum entropy and the earthquake frequency-magnitude relation

P. Y. Shen and L. Mansinha, Department of Geophysics, University of Western Ontario, London, Ontario N6A 5B7, Canada

Received 1982 December 28; in original form 1982 July 3

Summary. The entropy S for a continuous distribution p(x) is defined by

S = -\int p(x) \ln [p(x)/m(x)] \, dx,

where m(x) is the prior distribution representing our complete ignorance about x. The difficulty in using this definition arises from the problem of arbitrariness or subjectiveness in assigning the prior distribution m(x). Thus in maximum entropy inference it is customary to adopt, arbitrarily, a uniform prior distribution and write the entropy as

S = -\int p(x) \ln p(x) \, dx.

This expression, however, is a measure of uncertainty relative to the coordinate x, so that the probability distribution p(x) generated from the principle of maximum entropy depends on the choice of x. Only when the chosen parameter actually has a uniform prior distribution can we expect the generated distribution to conform with the empirical data. For a physical system in which the independent variable x is measured to only limited accuracy, the prior distribution m(x) can be shown to be inversely proportional to the measurement error of x. A parameter with uniform prior distribution, then, is one that can be measured with equal accuracy throughout its range. In this context, the magnitude of an earthquake is such a parameter, because using this parameter in the principle of maximum entropy leads to the empirically determined Gutenberg-Richter frequency-magnitude relation. Other proposed frequency-magnitude relations can also be generated from the principle of maximum entropy by imposing appropriate constraints. However, it is emphasized that such relations are generated from the principle as null hypotheses, to be tested against empirical regional or global seismicity data.

Introduction

The principle of maximum entropy can be formalized as follows. Let x be a discrete random variable whose values (x_i, i = 1, 2, ..., N) form a mutually exclusive and exhaustive set and let p_i = p_i(x_i) be the associated probability distribution. Define the entropy S as (Shannon 1948)

S = -\sum_{i=1}^{N} p_i \ln p_i    (1)

and let the distribution p_i be subject to the following constraints:

\sum_{i=1}^{N} p_i = 1    (2)

and

\sum_{i=1}^{N} g_j(x_i) p_i = \bar{g}_j,  j = 1, 2, ..., J    (3)

where g_j is any function of x and \bar{g}_j its expected value. We seek the distribution p_i which maximizes the entropy S, subject to these constraints. If x is a continuous random variable, equations (1), (2) and (3) are usually replaced by

S = -\int_{x_1}^{x_2} p(x) \ln p(x) \, dx,    (4)

\int_{x_1}^{x_2} p(x) \, dx = 1    (5)

and

\int_{x_1}^{x_2} g_j(x) p(x) \, dx = \bar{g}_j,  x_1 \le x \le x_2,  j = 1, 2, ..., J.    (6)

Berrill & Davis (1980) recently applied the principle of maximum entropy to the determination of the earthquake recurrence relationship. The well-known frequency-magnitude relation of Gutenberg & Richter (1944) is

\ln N(M) = a_G - b_G M    (7)

where N(M) is the number of earthquakes with magnitude greater than or equal to M, and a_G and b_G are parameters describing regional seismicity. Berrill & Davis showed that (7) is a consequence of the maximum entropy principle, provided that there is no upper bound to M and a mean magnitude exists. It was also shown that under the physically reasonable assumption that the magnitude is bounded (Knopoff & Kagan 1977), the principle of maximum entropy generates the truncated exponential distribution (Riznichenko 1964; Cornell & Vanmarcke 1969):

N(M) = T [\exp(-b_R M) - \exp(-b_R M_1)] / [1 - \exp(-b_R M_1)],  0 \le M \le M_1    (8)

where T is the total number of earthquakes with 0 \le M \le M_1. Berrill & Davis interpreted these findings as an independent confirmation of the frequency-magnitude relationships. A

closer examination, however, indicates that the application of the maximum entropy principle is not without ambiguity, and care must be exercised in drawing such conclusions. To begin with, a probability distribution can be viewed either objectively (Feller 1950) or subjectively (Jeffreys 1939). Only with a subjective view can we regard the probability distribution generated from the maximum entropy principle as justified independently of empirical verification. Secondly, the entropy defined by (4) for a continuous distribution is not a formal extension of (1). It is not invariant under coordinate transformations, and the probability distribution p(x) generated from the principle depends on the choice of the independent variable x. The correct entropy is given by

S = -\int_{x_1}^{x_2} p(x) \ln [p(x)/m(x)] \, dx    (9)

where m(x) is the prior probability distribution (Jaynes 1968). When we compare (9) with (4), we see that adopting definition (4) as entropy is equivalent to arbitrarily assigning a uniform prior distribution m(x). Thus only when the chosen independent variable actually has a uniform prior distribution can we expect to generate the correct probability distribution. For physical systems not free from measurement errors, such a variable is one that can be measured with equal accuracy throughout its range. Other choices of the independent parameter would likely lead to absurd probability distributions.

The principle of maximum entropy

In probability theory, there are two fundamentally different views toward the interpretation of a probability distribution. For an objectivist, the probability of an event is considered an objective property of that event, which can always be measured as a frequency ratio in a random experiment. A subjectivist, on the other hand, regards the probability of an event as a formal expression of our expectation that the event will, or did, occur, based on the information that is available.
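The contrast between (4) and (9) can be illustrated numerically. The following sketch (the densities and the change of variables are invented for illustration, not taken from the paper) evaluates both definitions for a distribution expressed first in a coordinate x and then in the transformed coordinate y = exp(x): Shannon's form (4) changes under the reparameterization, while the relative form (9) does not.

```python
import numpy as np

def trapezoid(f, t):
    """Composite trapezoidal rule (avoids version-specific numpy helpers)."""
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * (t[1:] - t[:-1])))

# Illustrative densities on 0 < x < 10: a truncated exponential p and a prior m.
x = np.linspace(1e-6, 10.0, 200001)
p = np.exp(-x)
p /= trapezoid(p, x)
m = 1.0 + x
m /= trapezoid(m, x)

# Reparameterize with y = exp(x); densities pick up the Jacobian dy/dx = exp(x).
y = np.exp(x)
q = p / y
n = m / y

shannon_x = -trapezoid(p * np.log(p), x)      # definition (4) in the x coordinate
shannon_y = -trapezoid(q * np.log(q), y)      # definition (4) in the y coordinate
jaynes_x = -trapezoid(p * np.log(p / m), x)   # definition (9) in the x coordinate
jaynes_y = -trapezoid(q * np.log(q / n), y)   # definition (9) in the y coordinate

print(shannon_x, shannon_y)   # differ: (4) depends on the chosen coordinate
print(jaynes_x, jaynes_y)     # agree: (9) is invariant under the transformation
```

The two Shannon values differ by the expectation of ln(dy/dx), which is why a "maximum entropy" distribution derived from (4) depends on which coordinate was maximized over.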
The mathematical formulations of the maximum entropy principle under these two views are identical. However, in the objective school, physical probability distributions are generated from the maximum entropy principle simply as null hypotheses, to be subjected to experimental verification (Good 1963). A subjective (or non-physical) probability distribution, on the other hand, is considered justified independently of any physical argument and experimental verification (Jaynes 1957). Thus if the probability distribution generated from the principle of maximum entropy agrees with the empirical one, a subjectivist can interpret it as an independent confirmation. But this interpretation depends critically on the existence of a unique and invariant entropy. If the entropy lacks invariance under coordinate transformations, the probability distribution generated from the maximum entropy principle would not be unique, and we must take the objective view.

Shannon (1948) chose (4) as the entropy for a continuous distribution p(x). This entropy is a measure of uncertainty relative to the coordinate x and is not invariant under coordinate transformations. Shannon, however, was not concerned with this limitation because in information theory the quantity of interest is the rate of information, or channel capacity. This quantity is the entropy difference of two distributions which are absolutely continuous with respect to each other, and consequently does not depend on the choice of parameters. In maximum entropy inference, the quantity of interest is the absolute entropy itself. Probability distributions are generated by maximizing the entropy. Thus a lack of invariance of entropy under coordinate transformations implies non-uniqueness in the derived probability distribution. As we have mentioned earlier, the correct extension of (1) to a continuous random variable is given by (9). Let us examine the intuitive meaning of

the prior distribution m(x) by considering a case in which there are no constraints on p(x) except that the integral of p(x) over all x is equal to 1. Using (9), instead of (4), for the entropy, the principle of maximum entropy generates the following probability distribution:

p(x) = m(x) / \int_{x_1}^{x_2} m(x) \, dx.

Now, the presence of no constraints means that we have no prior information about the random variable x. Thus, except for a constant factor, m(x) is the distribution representing our complete ignorance about x (Jaynes 1968). Expression (9) satisfies Shannon's (1948) requirements for an information measure and is invariant under any coordinate transformation because p(x) and m(x) transform in the same way. It is an absolute measure of uncertainty about the random variable x and can be interpreted as the information needed to change the probability distribution from m(x) to p(x) (Good 1950; Kullback & Leibler 1951; Kullback 1959). Thus the problem of defining entropy for a continuous distribution is reduced to one of assigning the prior distribution objectively. Once the prior distribution is assigned, the principle of maximum entropy will provide a unique, parameter-independent method for selecting probability distributions.

But how do we find the prior distribution that represents complete ignorance? The suggestion of T. Bayes and P. S. Laplace is that, in some cases, we express complete ignorance by assigning a uniform prior probability density. Jeffreys (1939, 1957), on the other hand, suggested a dx/x prior probability for a continuous variable x known to be positive. Each of these rules has its domain of application. The problem is that there appears to be no obvious criterion for choosing any particular prior distribution. Some ideas of utilizing transformation groups to classify parameters according to their prior distributions have recently been discussed by Jaynes (1968).
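This result is easy to verify in a discrete setting. The sketch below (the prior and the number of trials are arbitrary) checks that, with normalization as the only constraint, the discrete analogue of (9) attains its maximum, zero, at p = m, and that randomly drawn competing distributions always score lower.

```python
import numpy as np

rng = np.random.default_rng(42)
m = rng.random(6)
m /= m.sum()                      # an arbitrary discrete prior

def jaynes_entropy(p, m):
    """Discrete analogue of (9): S = -sum p ln(p/m)."""
    return float(-np.sum(p * np.log(p / m)))

# S vanishes at p = m itself ...
best = jaynes_entropy(m, m)

# ... and is negative for every other distribution: 10,000 random
# distributions drawn from a flat Dirichlet never beat the prior.
trials = rng.dirichlet(np.ones(6), size=10000)
worst_gap = max(jaynes_entropy(p, m) for p in trials)
print(best)        # zero at p = m: the maximizer is the prior
print(worst_gap)   # negative for every competitor
```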
The group-theoretical approach, however, is not essential when the independent variable is a physical quantity that can be measured to only limited accuracy. For such physical systems, we can divide the range of the independent variable x into intervals, according to the accuracy with which x is measured, so that two adjacent intervals are only just discernible. The entropy for the continuous variable can then be thought of as that of a discrete one and given by (Good 1963; Jaynes 1963; Purcaru 1973)

S = -\sum p(x) \ln [p(x) \Delta x] \, \Delta x.    (10)

Each of the just discernible intervals has equal prior probability (Lindley 1961; Good 1962). The length of these intervals, however, may vary from one part of the range to another, and will certainly vary if the range is infinite. In other words, intervals of equal length may have unequal prior probabilities, depending on the variation of measurement error across the range. Let us denote the measurement error of x by \sigma(x). Passing (10) to the continuous limit, we get

S = -\int_{x_1}^{x_2} p(x) \ln [p(x) \sigma(x)] \, dx,  x_1 \le x \le x_2.

Comparing this expression with (9), we see that m(x) = 1/\sigma(x), i.e. the prior probability density is inversely proportional to the error with which the independent parameter is measured.

Let us illustrate the above results by a simple example. Consider a parameter x which is known to have a uniform distribution between, say, x_1 = 0 and x_2 = 100. x is also known to be measured to only limited accuracy, the measurement error being \sigma(x) = \lambda x, where \lambda is a constant. We assume, for convenience, that \lambda = 0.1 and consider 100 samples of x. Since x

is uniformly distributed, we would expect one sample to lie in each of the 100 intervals (0, 1), (1, 2), ..., (99, 100). However, the samples which have values within the interval (x_0 - 0.1 x_0, x_0 + 0.1 x_0) are indistinguishable, owing to the measurement error. Thus we could conceivably assign the value x_0 to all these samples. For example, we could have nine samples with the value x = 90, eight samples with x = 80, and so on. The resulting distribution, then, is not p(x) = constant but p(x) = 0.01 x. This is proportional to \sigma(x). Clearly, if we want to obtain the correct distribution, we must assign to each sample a weight in inverse proportion to the measurement error. In other words, we must assign a prior distribution of 1/\sigma(x). Notice that the important quantity here is the variation of measurement error across the range of the variable, and not the absolute value of the error. In other words, the prior distribution of a physical parameter depends not on the method of measurement but on the method's capability (or incapability) of achieving uniform measurement accuracy across the parameter's range.

When m(x) = constant, equation (9) reduces to (4) except for an additive constant. Thus Shannon's (1948) definition of entropy can be adopted for maximum entropy inference when the independent variable is measured to equal accuracy throughout its range. The same definition must also be used when no information is available about the accuracy of measurement (Good 1963). This is the approach generally adopted in maximum entropy inference. Some recent examples are Rietsch (1977), Gull & Daniell (1978), Rubincam (1979, 1981) and Berrill & Davis (1980). Since Shannon's entropy is defined relative to the arbitrarily chosen independent variable, the probability distribution generated from the principle will be parameter-dependent.
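The example above can be reproduced by simulation. In the sketch below (sample size and random seed are arbitrary), uniform samples are grouped into just discernible intervals of the form (x0 - 0.1 x0, x0 + 0.1 x0); the count per interval, which is the apparent density after measurement, grows in proportion to x instead of staying constant.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(1.0, 100.0, 200_000)   # a genuinely uniform parameter

# Tile (1, 100) with just-discernible intervals: each bin spans
# (x0 - 0.1 x0, x0 + 0.1 x0), giving geometric edges with ratio 1.1/0.9.
r = 1.1 / 0.9
edges = [1.0]
while edges[-1] * r <= 100.0:
    edges.append(edges[-1] * r)
edges = np.array(edges)

counts, _ = np.histogram(x, bins=edges)
centers = 0.5 * (edges[:-1] + edges[1:])

# Counts per indistinguishable interval are proportional to the bin width,
# hence to x itself: the apparent density is p(x) proportional to 0.1 x.
ratio = counts / centers                  # roughly constant if counts grow like x
print(ratio.std() / ratio.mean())         # small relative scatter
print(counts[0], counts[-1])              # counts grow with x across the range
```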
Thus we must take an objectivist's view and consider the generated distribution as simply a null hypothesis to be tested by the empirical data. If the generated distribution agrees with the empirical one, we interpret it as implying that the parameter chosen is the one which is measured to equal accuracy throughout its range. It is interesting to point out that although Gull & Daniell (1978) chose Shannon's definition for entropy, their constraint that the data have Gaussian errors is equivalent to the assignment of a prior distribution.

Maximum entropy principle and earthquake frequency-magnitude relations

In applying the maximum entropy principle to the inference of earthquake frequency-magnitude relations, Berrill & Davis (1980) chose the surface wave magnitude M as the independent parameter. Using Shannon's definition for entropy and assuming the existence of a mean magnitude \bar{M}, they obtained the Gutenberg-Richter (1944) relation (7). This, however, does not constitute a confirmation of the Gutenberg-Richter relation. Instead, the result simply implies that the surface wave magnitude is a parameter measured to equal accuracy throughout its range.

Suppose, instead of the surface wave magnitude, we had chosen the seismic moment M_0 as the independent variable, and we assume that a mean seismic moment exists. Using Shannon's (1948) entropy would then lead to the following exponential distribution:

\ln N(M_0) = a - b M_0,    (11)

provided that the seismic moment is not bounded from above. Clearly, this is incompatible with the Gutenberg-Richter relation (7), because M_0 and M are not linearly related. The seismic moment M_0 for up to moderately high magnitude earthquakes is considered to be given by (e.g. Wyss & Brune 1968; Thatcher & Hanks 1973; Kanamori & Anderson 1975; Kanamori 1977)

\ln M_0 = \alpha M + \beta,    (12)

where \alpha and \beta are constants (subject to regional variations). Substituting (12) into (7), we get the correct frequency-seismic moment relation (Molnar 1979):

\ln N(M_0) = a_G + b_G \beta / \alpha - (b_G / \alpha) \ln M_0.    (13)

The distribution (11) fails because, in its derivation, the information expressed by equation (12) has not been incorporated. To generate (13) from the principle of maximum entropy with M_0 as the independent variable, we must use Jaynes' entropy and assign to M_0 a prior distribution of

m(M_0) \propto 1 / M_0.    (14)

Also, we must impose the constraint that a mean magnitude \bar{M} exists, instead of the constraint that a mean seismic moment exists. Expressed in terms of M_0, this is

\overline{(\ln M_0 - \beta) / \alpha} = \bar{M}.    (15)

The prior distribution (14) implies that the measurement error of M_0 grows linearly with M_0. This is certainly compatible with the interpretation that the surface wave magnitude M can be measured to equal accuracy throughout its range, provided that (12) holds. In fact, m(M_0) is the Jacobian of the coordinate transformation from M to M_0.

The above discussion exemplifies the use of Jaynes' entropy (9) when the measurement accuracy of the independent variable is known. In general, however, the measurement error is not known and the principle of maximum entropy must be applied objectively, using Shannon's definition (4) for entropy. This is the case in the following discussions.

In generating the Gutenberg-Richter relation from the principle of maximum entropy, we have imposed a lower bound on the magnitude (M \ge 0), as well as the existence of a mean. The mean magnitude \bar{M} is the first-order moment of the random variable M. But it is not the only moment that can be assigned. In fact, if the magnitude M is not considered to be bounded from below, we must assign moments in such a way that the highest moment assigned is of even order. Otherwise, the upper bound of entropy would not be attainable.
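Two claims in this passage, that substituting (12) into (7) yields (13), and that the prior (14) is the Jacobian of the transformation from M to M_0, can be verified with a few lines of arithmetic. All constants below are illustrative only; the values of alpha and beta are merely chosen near commonly quoted values for the moment-magnitude relation and carry no weight in the check.

```python
import math

a_G, b_G = 8.0, math.log(10.0)                              # illustrative G-R constants
alpha, beta = 1.5 * math.log(10.0), 16.1 * math.log(10.0)   # illustrative constants of (12)

def ln_N_of_M(M):
    return a_G - b_G * M                                    # relation (7)

def ln_N_of_M0(M0):
    return a_G + b_G * beta / alpha - (b_G / alpha) * math.log(M0)   # relation (13)

def M_of_M0(M0):
    return (math.log(M0) - beta) / alpha                    # inverse of relation (12)

# 1. (13) reproduces (7) exactly once M0 = exp(alpha*M + beta) is substituted.
subst_errors = []
for M in (4.0, 5.5, 7.0):
    M0 = math.exp(alpha * M + beta)
    subst_errors.append(abs(ln_N_of_M(M) - ln_N_of_M0(M0)))
print(max(subst_errors))     # essentially zero

# 2. The Jacobian dM/dM0 equals 1/(alpha*M0), i.e. the prior (14) up to a constant.
jac_errors = []
for M0 in (1e20, 1e24, 1e28):
    h = 1e-6 * M0                                           # step scaled to M0
    jacobian = (M_of_M0(M0 + h) - M_of_M0(M0 - h)) / (2.0 * h)
    jac_errors.append(abs(jacobian * alpha * M0 - 1.0))     # relative deviation from 1/(alpha*M0)
print(max(jac_errors))       # essentially zero
```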
If we choose to assign moments of up to second order, the probability density generated would be of the form

p(M) \propto \exp (a_N + b_N M + c_N M^2),  c_N < 0.    (16)

The seismicity data, however, are available for only a finite range of the magnitude scale. It is unreasonable to extrapolate beyond this range and, consequently, moments can be assigned with any arbitrary highest order. In general, if k is the highest order, the probability density generated by the maximum entropy principle will be the exponential of a k-degree polynomial. The hypothesis that k is the highest order can be tested within the wider hypothesis that k + 1 is the highest order by means of the likelihood-ratio criterion (Good 1963). Note that if any of the moments is unspecified, the corresponding term in the polynomial will be missing. This result can be extended to the case where generalized moments \bar{g}_j, as defined in equations (3) or (6), are assigned. The probability density then becomes the exponential of a generalized polynomial:

p(M) \propto \exp [\lambda_1 g_1(M) + \lambda_2 g_2(M) + ... + \lambda_J g_J(M)].    (17)

In this way, any probability distribution can be generated, provided that the generalized moments exist and, always, that the distribution conforms with the empirical data. We have already seen the assignment of a generalized moment in equation (15). Now consider the cumulative frequency-magnitude relation of Utsu (1971) and Purcaru (1975):

\ln N(M) = a_U - b_U M + \kappa \ln (c_U - M),  M < c_U    (18)

where \kappa = 1 in Utsu (1971). The probability density is proportional to

\exp [-b_U M + \ln (b_U (c_U - M) + \kappa) + (\kappa - 1) \ln (c_U - M)].

Thus (18) can be generated by assigning the first-order moment (the mean magnitude) and two generalized moments \bar{g}_1 and \bar{g}_2, where

g_1(M) = \ln [b_U (c_U - M) + \kappa]

and

g_2(M) = (\kappa - 1) \ln (c_U - M).

Notice that, unlike \bar{M} defined in (15), the two generalized moments have no clear physical interpretation. This reflects the purely empirical nature of (18).

As another example, consider the normal cumulative distribution of Shlien & Toksoz (1970):

\ln N(M) = a_S + b_S M + c_S M^2,  c_S < 0.    (19)

The probability density is proportional to

\exp [b_S M + c_S M^2 + \ln (-b_S - 2 c_S M)].

Thus (19) can be generated by assigning the first- and second-order moments, and a generalized moment \bar{g}_1, where

g_1(M) = \ln (-b_S - 2 c_S M).

Notice that equation (19) gives a normal distribution for the cumulative frequency, or exceedance frequency, while equation (16) expresses a normal distribution for the frequency. Equation (16) is generated from the maximum entropy principle by assigning moments which have simple physical interpretations. To get (19), on the other hand, a physically unclear moment g_1 must be assigned.

Equations (18), (19) and some other frequency-magnitude relations, proposed to fit data from various seismic regions of the world, have been discussed and classified, among others, by Utsu (1971) and Purcaru (1973, 1975). Purcaru (1973) also computed the entropy for several distributions. In his calculation, however, the probability densities were taken as proportional to the cumulative frequencies.
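Since the density follows from the cumulative relation as p(M) proportional to -dN/dM, the density quoted for (18) can be checked against a numerical derivative. The parameter values in the sketch below are invented for illustration only.

```python
import math

a_U, b_U, c_U, kappa = 6.0, 2.0, 9.0, 1.5     # illustrative Utsu-Purcaru parameters

def N(M):
    """Cumulative relation (18), valid for M < c_U."""
    return math.exp(a_U - b_U * M + kappa * math.log(c_U - M))

def density(M):
    """-dN/dM written as the exponential of a generalized polynomial."""
    return math.exp(a_U - b_U * M + (kappa - 1.0) * math.log(c_U - M)
                    + math.log(b_U * (c_U - M) + kappa))

h = 1e-6
rel_errors = []
for M in (2.0, 5.0, 8.0):
    numeric = -(N(M + h) - N(M - h)) / (2.0 * h)   # central-difference -dN/dM
    rel_errors.append(abs(numeric - density(M)) / density(M))
print(max(rel_errors))   # small: the closed form is indeed -dN/dM
```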
This is true for the Gutenberg-Richter relation, but incorrect for the distributions given by (18) and (19).

The last example we will consider here is the generalized Gutenberg-Richter relation, proposed recently by Lomnitz-Adler & Lomnitz (1978, 1979):

\ln N(M) = a_L - c_L \exp (\alpha_L M).    (20)

The probability density for this distribution has the form

p(M) \propto \exp [a_L + \alpha_L M - c_L \exp (\alpha_L M)].    (21)
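The density (21) can be checked in the same way, as -dN/dM of the cumulative relation (20), again with invented parameter values.

```python
import math

a_L, c_L, alpha_L = 5.0, 0.01, 1.7            # illustrative constants only

def N(M):
    """Generalized Gutenberg-Richter relation (20)."""
    return math.exp(a_L - c_L * math.exp(alpha_L * M))

def density(M):
    """Form (21) with its normalizing factor c_L * alpha_L restored."""
    return c_L * alpha_L * math.exp(a_L + alpha_L * M - c_L * math.exp(alpha_L * M))

h = 1e-7
rel_errors = []
for M in (1.0, 3.0, 5.0):
    numeric = -(N(M + h) - N(M - h)) / (2.0 * h)   # central-difference -dN/dM
    rel_errors.append(abs(numeric - density(M)) / density(M))
print(max(rel_errors))   # small: (21) is -dN/dM of (20)
```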

It is interesting to note that (21) is formally similar to the probability density generated from the maximum entropy principle by assigning both the mean magnitude \bar{M} and the mean seismic moment \bar{M}_0. The latter is given by

p(M) \propto \exp [a_0 + b_0 M - c_0 \exp (\alpha M)].    (22)

However, the adjustable constants in (21) are a_L, \alpha_L and c_L. On the other hand, in (22) the constant \alpha is fixed, while a_0, b_0 and c_0 are adjustable. In order to conform with the seismicity data and the Gutenberg-Richter relation at low to moderate magnitudes, the constant \alpha_L in (21) must have a value of about 0.17 (Lomnitz-Adler & Lomnitz 1979). On the other hand, the constant \alpha in (22) is precisely the same one as in equation (12), which relates seismic moment to magnitude. The empirically determined value of \alpha is about 1.5 ln 10 ≈ 3.45, except for very large earthquakes (e.g. Kanamori & Anderson 1975). Equation (22) is derived from the maximum entropy principle under physically reasonable constraints (the existence of a mean magnitude and a mean seismic moment). The foundation of (20), on the other hand, has been questioned by several researchers (see Lomnitz-Adler & Lomnitz 1978; Jones, Mansinha & Shen 1982).

Conclusions

The principle of maximum entropy can be interpreted subjectively as a heuristic principle for encoding knowledge with minimum prejudice, or objectively as a heuristic principle for generating null hypotheses. Under the subjective view, the probability distributions generated from the principle are regarded as justified independently of any empirical verification. Consequently, it is imperative for the subjectivist to have an entropy which is an absolute measure of uncertainty about the distribution and invariant under any coordinate transformation. Otherwise, the probability distribution would not be uniquely determined.
Such an entropy is given by (9), where the prior probability density m(x) is found to be inversely proportional to the measurement error of the variable x. However, when the measurement error of x is not known, or when x is a parameter without clear physical interpretation, the entropy cannot be uniquely defined. In this case it is necessary to adopt Shannon's (1948) definition (equation 4) for entropy. But then the probability distribution generated from the principle of maximum entropy will depend on the choice of the independent variable, and consequently must be subjected to empirical verification.

A case in point is the application of the maximum entropy principle to the determination of the earthquake frequency-magnitude relationship. The principle of maximum entropy generates the Gutenberg-Richter relation when the surface wave magnitude M is used as the independent variable. This is a fortuitous result, arising from the fact that M happens to be the parameter which is measured to equal accuracy throughout its range. We have shown that if, for example, the seismic moment were chosen as the independent variable, a frequency-magnitude relation incompatible with the empirical data would have been generated.

Acknowledgment

This work was supported by the Natural Sciences and Engineering Research Council of Canada.

References

Berrill, J. B. & Davis, R. O., 1980. Maximum entropy and the magnitude distribution, Bull. seism. Soc. Am., 70.

Cornell, C. A. & Vanmarcke, E. H., 1969. The major influences on seismic risk, Proc. 4th World Conf. Earthquake Engineering, Santiago, Chile, 1.
Feller, W., 1950. An Introduction to Probability Theory and its Applications, Wiley, New York.
Good, I. J., 1950. Probability and the Weighing of Evidence, Griffin, London.
Good, I. J., 1962. Contribution to the discussion of a paper by C. Stein, J. R. statist. Soc. B, 24.
Good, I. J., 1963. Maximum entropy for hypothesis formulation, especially for multidimensional contingency tables, Ann. math. Statist., 34.
Gull, S. F. & Daniell, G. J., 1978. Image reconstruction from incomplete and noisy data, Nature, 272.
Gutenberg, B. & Richter, C. F., 1944. Frequency of earthquakes in California, Bull. seism. Soc. Am., 34.
Jaynes, E. T., 1957. Information theory and statistical mechanics, Phys. Rev., 106 and 108.
Jaynes, E. T., 1963. Information theory and statistical mechanics, in Statistical Physics, ed. Ford, K. W., W. A. Benjamin, New York.
Jaynes, E. T., 1968. Prior probabilities, IEEE Trans. Systems Science and Cybernetics, SSC-4.
Jeffreys, H., 1939. Theory of Probability, Oxford University Press, London.
Jeffreys, H., 1957. Scientific Inference, Cambridge University Press, London.
Jones, I. F., Mansinha, L. & Shen, P. Y., 1982. On the double exponential frequency-magnitude relation of earthquakes, Bull. seism. Soc. Am., 72.
Kanamori, H., 1977. The energy release in great earthquakes, J. geophys. Res., 82.
Kanamori, H. & Anderson, D. L., 1975. Theoretical basis of some empirical relations in seismology, Bull. seism. Soc. Am., 65.
Knopoff, L. & Kagan, Y., 1977. Analysis of the theory of extremes as applied to earthquake problems, J. geophys. Res., 82.
Kullback, S., 1959. Information Theory and Statistics, Wiley, New York.
Kullback, S. & Leibler, R. A., 1951. On information and sufficiency, Ann. math. Statist., 22.
Lindley, D. V., 1961. The use of prior probability distributions in statistical inference and decision, Proc. 4th Berkeley Symp. Math. Statist. Prob., 1, University of California Press.
Lomnitz-Adler, J. & Lomnitz, C., 1978. A new magnitude-frequency relation, Tectonophys., 49.
Lomnitz-Adler, J. & Lomnitz, C., 1979. A modified form of the Gutenberg-Richter magnitude-frequency relation, Bull. seism. Soc. Am., 69.
Molnar, P., 1979. Earthquake recurrence intervals and plate tectonics, Bull. seism. Soc. Am., 69.
Purcaru, G., 1973. The informational energy and entropy in earthquake statistics and prediction of earthquakes, Riv. ital. Geofis., 22.
Purcaru, G., 1975. A new magnitude-frequency relation for earthquakes and a classification of relation types, Geophys. J. R. astr. Soc., 42.
Rietsch, E., 1977. The maximum entropy approach to inverse problems, J. Geophys., 42.
Riznichenko, Yu. V., 1964. Cumulative method of earthquakes for the study of the seismic activity, Izv. Akad. Nauk SSSR, Ser. Geophys., 7.
Rubincam, D. P., 1979. Information theory and the earth's density distribution, NASA/GSFC Tech. Memo.
Rubincam, D. P., 1981. Information theory lateral density distribution for earth inferred from global gravity field, NASA/GSFC Tech. Memo.
Shannon, C. E., 1948. A mathematical theory of communication, Bell Syst. tech. J., 27; reprinted in The Mathematical Theory of Communication by C. E. Shannon and W. Weaver, University of Illinois Press, Urbana.
Shlien, S. & Toksoz, M. N., 1970. Frequency-magnitude statistics of earthquake occurrences, Earthq. Notes, 41, 5-18.
Thatcher, W. & Hanks, T. C., 1973. Source parameters of Southern California earthquakes, J. geophys. Res., 78.
Utsu, T., 1971. Aftershocks and earthquake statistics (III), J. Fac. Sci., Hokkaido Univ., Ser. VII, 3.
Wyss, M. & Brune, J. N., 1968. Seismic moment, stress and source dimensions for earthquakes in the California-Nevada region, J. geophys. Res., 73.


The sequential decoding metric for detection in sensor networks The sequential decoding metric for detection in sensor networks B. Narayanaswamy, Yaron Rachlin, Rohit Negi and Pradeep Khosla Department of ECE Carnegie Mellon University Pittsburgh, PA, 523 Email: {bnarayan,rachlin,negi,pkk}@ece.cmu.edu

More information

Some New Information Inequalities Involving f-divergences

Some New Information Inequalities Involving f-divergences BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 12, No 2 Sofia 2012 Some New Information Inequalities Involving f-divergences Amit Srivastava Department of Mathematics, Jaypee

More information

The Role of Asperities in Aftershocks

The Role of Asperities in Aftershocks The Role of Asperities in Aftershocks James B. Silva Boston University April 7, 2016 Collaborators: William Klein, Harvey Gould Kang Liu, Nick Lubbers, Rashi Verma, Tyler Xuan Gu OUTLINE Introduction The

More information

Bulletin of the Seismological Society of America, Vol. 75, No. 4, pp , August 1985

Bulletin of the Seismological Society of America, Vol. 75, No. 4, pp , August 1985 Bulletin of the Seismological Society of America, Vol. 75, No. 4, pp. 939-964, August 1985 IMPLICATIONS OF FAULT SLIP RATES AND EARTHQUAKE RECURRENCE MODELS TO PROBABILISTIC SEISMIC HAZARD ESTIMATES BY

More information

TWELVE LIMIT CYCLES IN A CUBIC ORDER PLANAR SYSTEM WITH Z 2 -SYMMETRY. P. Yu 1,2 and M. Han 1

TWELVE LIMIT CYCLES IN A CUBIC ORDER PLANAR SYSTEM WITH Z 2 -SYMMETRY. P. Yu 1,2 and M. Han 1 COMMUNICATIONS ON Website: http://aimsciences.org PURE AND APPLIED ANALYSIS Volume 3, Number 3, September 2004 pp. 515 526 TWELVE LIMIT CYCLES IN A CUBIC ORDER PLANAR SYSTEM WITH Z 2 -SYMMETRY P. Yu 1,2

More information

Performance of national scale smoothed seismicity estimates of earthquake activity rates. Abstract

Performance of national scale smoothed seismicity estimates of earthquake activity rates. Abstract Performance of national scale smoothed seismicity estimates of earthquake activity rates Jonathan Griffin 1, Graeme Weatherill 2 and Trevor Allen 3 1. Corresponding Author, Geoscience Australia, Symonston,

More information

Preparation of a Comprehensive Earthquake Catalog for Northeast India and its completeness analysis

Preparation of a Comprehensive Earthquake Catalog for Northeast India and its completeness analysis IOSR Journal of Applied Geology and Geophysics (IOSR-JAGG) e-issn: 2321 0990, p-issn: 2321 0982.Volume 2, Issue 6 Ver. I (Nov-Dec. 2014), PP 22-26 Preparation of a Comprehensive Earthquake Catalog for

More information

Goodness of Fit Test and Test of Independence by Entropy

Goodness of Fit Test and Test of Independence by Entropy Journal of Mathematical Extension Vol. 3, No. 2 (2009), 43-59 Goodness of Fit Test and Test of Independence by Entropy M. Sharifdoost Islamic Azad University Science & Research Branch, Tehran N. Nematollahi

More information

GLOBAL SOURCE PARAMETERS OF FINITE FAULT MODEL FOR STRONG GROUND MOTION SIMULATIONS OR PREDICTIONS

GLOBAL SOURCE PARAMETERS OF FINITE FAULT MODEL FOR STRONG GROUND MOTION SIMULATIONS OR PREDICTIONS 13 th orld Conference on Earthquake Engineering Vancouver, B.C., Canada August 1-6, 2004 Paper No. 2743 GLOBAL SOURCE PARAMETERS OF FINITE FAULT MODEL FOR STRONG GROUND MOTION SIMULATIONS OR PREDICTIONS

More information

Shaking Hazard Compatible Methodology for Probabilistic Assessment of Fault Displacement Hazard

Shaking Hazard Compatible Methodology for Probabilistic Assessment of Fault Displacement Hazard Surface Fault Displacement Hazard Workshop PEER, Berkeley, May 20-21, 2009 Shaking Hazard Compatible Methodology for Probabilistic Assessment of Fault Displacement Hazard Maria Todorovska Civil & Environmental

More information

A Measure of the Goodness of Fit in Unbinned Likelihood Fits; End of Bayesianism?

A Measure of the Goodness of Fit in Unbinned Likelihood Fits; End of Bayesianism? A Measure of the Goodness of Fit in Unbinned Likelihood Fits; End of Bayesianism? Rajendran Raja Fermilab, Batavia, IL 60510, USA PHYSTAT2003, SLAC, Stanford, California, September 8-11, 2003 Maximum likelihood

More information

MODULE -4 BAYEIAN LEARNING

MODULE -4 BAYEIAN LEARNING MODULE -4 BAYEIAN LEARNING CONTENT Introduction Bayes theorem Bayes theorem and concept learning Maximum likelihood and Least Squared Error Hypothesis Maximum likelihood Hypotheses for predicting probabilities

More information

Probabilistic seismic hazard maps for the Japanese islands

Probabilistic seismic hazard maps for the Japanese islands Soil Dynamics and Earthquake Engineering 20 (2000) 485±491 www.elsevier.com/locate/soildyn Probabilistic seismic hazard maps for the Japanese islands A. Kijko a, A.O. OÈ ncel b, * a Council for Geoscience,

More information

HANDBOOK OF APPLICABLE MATHEMATICS

HANDBOOK OF APPLICABLE MATHEMATICS HANDBOOK OF APPLICABLE MATHEMATICS Chief Editor: Walter Ledermann Volume VI: Statistics PART A Edited by Emlyn Lloyd University of Lancaster A Wiley-Interscience Publication JOHN WILEY & SONS Chichester

More information

Y. Y. Kagan and L. Knopoff Institute of Geophysics and Planetary Physics, University of California, Los Angeles, California 90024, USA

Y. Y. Kagan and L. Knopoff Institute of Geophysics and Planetary Physics, University of California, Los Angeles, California 90024, USA Geophys. J. R. astr. SOC. (1987) 88, 723-731 Random stress and earthquake statistics: time dependence Y. Y. Kagan and L. Knopoff Institute of Geophysics and Planetary Physics, University of California,

More information

Apparent Breaks in Scaling in the Earthquake Cumulative Frequency-Magnitude Distribution: Fact or Artifact?

Apparent Breaks in Scaling in the Earthquake Cumulative Frequency-Magnitude Distribution: Fact or Artifact? Bulletin of the Seismological Society of America, 90, 1, pp. 86 97, February 2000 Apparent Breaks in Scaling in the Earthquake Cumulative Frequency-Magnitude Distribution: Fact or Artifact? by Ian Main

More information

Global regression relations for conversion of surface wave and body wave magnitudes to moment magnitude

Global regression relations for conversion of surface wave and body wave magnitudes to moment magnitude Nat Hazards (2011) 59:801 810 DOI 10.1007/s11069-011-9796-6 ORIGINAL PAPER Global regression relations for conversion of surface wave and body wave magnitudes to moment magnitude Ranjit Das H. R. Wason

More information

Comment on Systematic survey of high-resolution b-value imaging along Californian faults: inference on asperities.

Comment on Systematic survey of high-resolution b-value imaging along Californian faults: inference on asperities. Comment on Systematic survey of high-resolution b-value imaging along Californian faults: inference on asperities Yavor Kamer 1, 2 1 Swiss Seismological Service, ETH Zürich, Switzerland 2 Chair of Entrepreneurial

More information

Codal provisions of seismic hazard in Northeast India

Codal provisions of seismic hazard in Northeast India Codal provisions of seismic hazard in Northeast India Sandip Das 1, Vinay K. Gupta 1, * and Ishwer D. Gupta 2 1 Department of Civil Engineering, Indian Institute of Technology, Kanpur 208 016, India 2

More information

A GENERAL CLASS OF LOWER BOUNDS ON THE PROBABILITY OF ERROR IN MULTIPLE HYPOTHESIS TESTING. Tirza Routtenberg and Joseph Tabrikian

A GENERAL CLASS OF LOWER BOUNDS ON THE PROBABILITY OF ERROR IN MULTIPLE HYPOTHESIS TESTING. Tirza Routtenberg and Joseph Tabrikian A GENERAL CLASS OF LOWER BOUNDS ON THE PROBABILITY OF ERROR IN MULTIPLE HYPOTHESIS TESTING Tirza Routtenberg and Joseph Tabrikian Department of Electrical and Computer Engineering Ben-Gurion University

More information

Information Theory, Statistics, and Decision Trees

Information Theory, Statistics, and Decision Trees Information Theory, Statistics, and Decision Trees Léon Bottou COS 424 4/6/2010 Summary 1. Basic information theory. 2. Decision trees. 3. Information theory and statistics. Léon Bottou 2/31 COS 424 4/6/2010

More information

Because of its reputation of validity over a wide range of

Because of its reputation of validity over a wide range of The magnitude distribution of declustered earthquakes in Southern California Leon Knopoff* Department of Physics and Astronomy and Institute of Geophysics and Planetary Physics, University of California,

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

Lecture 1: Introduction, Entropy and ML estimation

Lecture 1: Introduction, Entropy and ML estimation 0-704: Information Processing and Learning Spring 202 Lecture : Introduction, Entropy and ML estimation Lecturer: Aarti Singh Scribes: Min Xu Disclaimer: These notes have not been subjected to the usual

More information

Aspects of risk assessment in power-law distributed natural hazards

Aspects of risk assessment in power-law distributed natural hazards Natural Hazards and Earth System Sciences (2004) 4: 309 313 SRef-ID: 1684-9981/nhess/2004-4-309 European Geosciences Union 2004 Natural Hazards and Earth System Sciences Aspects of risk assessment in power-law

More information

UNIFORMLY MOST POWERFUL CYCLIC PERMUTATION INVARIANT DETECTION FOR DISCRETE-TIME SIGNALS

UNIFORMLY MOST POWERFUL CYCLIC PERMUTATION INVARIANT DETECTION FOR DISCRETE-TIME SIGNALS UNIFORMLY MOST POWERFUL CYCLIC PERMUTATION INVARIANT DETECTION FOR DISCRETE-TIME SIGNALS F. C. Nicolls and G. de Jager Department of Electrical Engineering, University of Cape Town Rondebosch 77, South

More information

Hypothesis testing (cont d)

Hypothesis testing (cont d) Hypothesis testing (cont d) Ulrich Heintz Brown University 4/12/2016 Ulrich Heintz - PHYS 1560 Lecture 11 1 Hypothesis testing Is our hypothesis about the fundamental physics correct? We will not be able

More information

A Bayesian Change Point Model To Detect Changes In Event Occurrence Rates, With Application To Induced Seismicity

A Bayesian Change Point Model To Detect Changes In Event Occurrence Rates, With Application To Induced Seismicity Vancouver, Canada, July -5, 5 A Bayesian Change Point Model To Detect Changes In Event Occurrence Rates, With Application To Induced Seismicity Abhineet Gupta Graduate Student, Dept. of Civil and Environmental

More information

Chapter ML:IV. IV. Statistical Learning. Probability Basics Bayes Classification Maximum a-posteriori Hypotheses

Chapter ML:IV. IV. Statistical Learning. Probability Basics Bayes Classification Maximum a-posteriori Hypotheses Chapter ML:IV IV. Statistical Learning Probability Basics Bayes Classification Maximum a-posteriori Hypotheses ML:IV-1 Statistical Learning STEIN 2005-2017 Area Overview Mathematics Statistics...... Stochastics

More information

Recurrence Times for Parkfield Earthquakes: Actual and Simulated. Paul B. Rundle, Donald L. Turcotte, John B. Rundle, and Gleb Yakovlev

Recurrence Times for Parkfield Earthquakes: Actual and Simulated. Paul B. Rundle, Donald L. Turcotte, John B. Rundle, and Gleb Yakovlev Recurrence Times for Parkfield Earthquakes: Actual and Simulated Paul B. Rundle, Donald L. Turcotte, John B. Rundle, and Gleb Yakovlev 1 Abstract In this paper we compare the sequence of recurrence times

More information

The Method of Types and Its Application to Information Hiding

The Method of Types and Its Application to Information Hiding The Method of Types and Its Application to Information Hiding Pierre Moulin University of Illinois at Urbana-Champaign www.ifp.uiuc.edu/ moulin/talks/eusipco05-slides.pdf EUSIPCO Antalya, September 7,

More information

Chapter 9. Non-Parametric Density Function Estimation

Chapter 9. Non-Parametric Density Function Estimation 9-1 Density Estimation Version 1.2 Chapter 9 Non-Parametric Density Function Estimation 9.1. Introduction We have discussed several estimation techniques: method of moments, maximum likelihood, and least

More information

An earthquake is the result of a sudden displacement across a fault that releases stresses that have accumulated in the crust of the earth.

An earthquake is the result of a sudden displacement across a fault that releases stresses that have accumulated in the crust of the earth. An earthquake is the result of a sudden displacement across a fault that releases stresses that have accumulated in the crust of the earth. Measuring an Earthquake s Size Magnitude and Moment Each can

More information

Derivation of Table 2 in Okada (1992)

Derivation of Table 2 in Okada (1992) Derivation of Table 2 in Okada (1992) [ I ] Derivation of Eqs.(4) through (6) Eq.(1) of Okada (1992) can be rewritten, where is the displacement at due to a j-th direction single force of unit magnitude

More information

Introduction to Information Entropy Adapted from Papoulis (1991)

Introduction to Information Entropy Adapted from Papoulis (1991) Introduction to Information Entropy Adapted from Papoulis (1991) Federico Lombardo Papoulis, A., Probability, Random Variables and Stochastic Processes, 3rd edition, McGraw ill, 1991. 1 1. INTRODUCTION

More information

Gutenberg-Richter Relationship: Magnitude vs. frequency of occurrence

Gutenberg-Richter Relationship: Magnitude vs. frequency of occurrence Quakes per year. Major = 7-7.9; Great = 8 or larger. Year Major quakes Great quakes 1969 15 1 1970 20 0 1971 19 1 1972 15 0 1973 13 0 1974 14 0 1975 14 1 1976 15 2 1977 11 2 1978 16 1 1979 13 0 1980 13

More information

A note on the asymptotic distribution of Berk-Jones type statistics under the null hypothesis

A note on the asymptotic distribution of Berk-Jones type statistics under the null hypothesis A note on the asymptotic distribution of Berk-Jones type statistics under the null hypothesis Jon A. Wellner and Vladimir Koltchinskii Abstract. Proofs are given of the limiting null distributions of the

More information

P33 Correlation between the Values of b and DC for the Different Regions in the Western Anatolia

P33 Correlation between the Values of b and DC for the Different Regions in the Western Anatolia P33 Correlation between the Values of b and DC for the Different Regions in the Western Anatolia E. Bayrak* (Karadeniz Technical University) & Y. Bayrak (Karadeniz Technical University) SUMMARY The b-value

More information

Chaos, Complexity, and Inference (36-462)

Chaos, Complexity, and Inference (36-462) Chaos, Complexity, and Inference (36-462) Lecture 7: Information Theory Cosma Shalizi 3 February 2009 Entropy and Information Measuring randomness and dependence in bits The connection to statistics Long-run

More information

Bayesian Learning (II)

Bayesian Learning (II) Universität Potsdam Institut für Informatik Lehrstuhl Maschinelles Lernen Bayesian Learning (II) Niels Landwehr Overview Probabilities, expected values, variance Basic concepts of Bayesian learning MAP

More information

Chapter 9. Non-Parametric Density Function Estimation

Chapter 9. Non-Parametric Density Function Estimation 9-1 Density Estimation Version 1.1 Chapter 9 Non-Parametric Density Function Estimation 9.1. Introduction We have discussed several estimation techniques: method of moments, maximum likelihood, and least

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola Computer Science & Engineering State University of New York at Buffalo Buffalo, NY, USA chandola@buffalo.edu Chandola@UB

More information

On Dependence Balance Bounds for Two Way Channels

On Dependence Balance Bounds for Two Way Channels On Dependence Balance Bounds for Two Way Channels Ravi Tandon Sennur Ulukus Department of Electrical and Computer Engineering University of Maryland, College Park, MD 20742 ravit@umd.edu ulukus@umd.edu

More information

G. Larry Bretthorst. Washington University, Department of Chemistry. and. C. Ray Smith

G. Larry Bretthorst. Washington University, Department of Chemistry. and. C. Ray Smith in Infrared Systems and Components III, pp 93.104, Robert L. Caswell ed., SPIE Vol. 1050, 1989 Bayesian Analysis of Signals from Closely-Spaced Objects G. Larry Bretthorst Washington University, Department

More information

Department of Statistics, School of Mathematical Sciences, Ferdowsi University of Mashhad, Iran.

Department of Statistics, School of Mathematical Sciences, Ferdowsi University of Mashhad, Iran. JIRSS (2012) Vol. 11, No. 2, pp 191-202 A Goodness of Fit Test For Exponentiality Based on Lin-Wong Information M. Abbasnejad, N. R. Arghami, M. Tavakoli Department of Statistics, School of Mathematical

More information

Uncertainties in a probabilistic model for seismic hazard analysis in Japan

Uncertainties in a probabilistic model for seismic hazard analysis in Japan Uncertainties in a probabilistic model for seismic hazard analysis in Japan T. Annaka* and H. Yashiro* * Tokyo Electric Power Services Co., Ltd., Japan ** The Tokio Marine and Fire Insurance Co., Ltd.,

More information

A TESTABLE FIVE-YEAR FORECAST OF MODERATE AND LARGE EARTHQUAKES. Yan Y. Kagan 1,David D. Jackson 1, and Yufang Rong 2

A TESTABLE FIVE-YEAR FORECAST OF MODERATE AND LARGE EARTHQUAKES. Yan Y. Kagan 1,David D. Jackson 1, and Yufang Rong 2 Printed: September 1, 2005 A TESTABLE FIVE-YEAR FORECAST OF MODERATE AND LARGE EARTHQUAKES IN SOUTHERN CALIFORNIA BASED ON SMOOTHED SEISMICITY Yan Y. Kagan 1,David D. Jackson 1, and Yufang Rong 2 1 Department

More information

Lecture 2: August 31

Lecture 2: August 31 0-704: Information Processing and Learning Fall 206 Lecturer: Aarti Singh Lecture 2: August 3 Note: These notes are based on scribed notes from Spring5 offering of this course. LaTeX template courtesy

More information

Rotation of the Principal Stress Directions Due to Earthquake Faulting and Its Seismological Implications

Rotation of the Principal Stress Directions Due to Earthquake Faulting and Its Seismological Implications Bulletin of the Seismological Society of America, Vol. 85, No. 5, pp. 1513-1517, October 1995 Rotation of the Principal Stress Directions Due to Earthquake Faulting and Its Seismological Implications by

More information

Verification of Predictions of Magnitude of Completeness Using an Earthquake Catalog

Verification of Predictions of Magnitude of Completeness Using an Earthquake Catalog Verification of Predictions of Magnitude of Completeness Using an Earthquake Catalog Designing induced seismic monitoring networks to meet regulations Dario Baturan Presented at GeoConvention 2015 Introduction

More information

A review of earthquake occurrence models for seismic hazard analysis

A review of earthquake occurrence models for seismic hazard analysis A review of earthquake occurrence models for seismic hazard analysis Thalia Anagnos Department of Civil Engineering, San Jose State University, San Jose, CA 95192, USA Anne S. Kiremidjian Department of

More information

Scaling of apparent stress from broadband radiated energy catalogue and seismic moment catalogue and its focal mechanism dependence

Scaling of apparent stress from broadband radiated energy catalogue and seismic moment catalogue and its focal mechanism dependence Earth Planets Space, 53, 943 948, 2001 Scaling of apparent stress from broadband radiated energy catalogue and seismic moment catalogue and its focal mechanism dependence Z. L. Wu Institute of Geophysics,

More information

Information measures in simple coding problems

Information measures in simple coding problems Part I Information measures in simple coding problems in this web service in this web service Source coding and hypothesis testing; information measures A(discrete)source is a sequence {X i } i= of random

More information

PRIOR PROBABILITIES: AN INFORMATION-THEORETIC APPROACH

PRIOR PROBABILITIES: AN INFORMATION-THEORETIC APPROACH PRIOR PROBABILITIES: AN INFORMATION-THEORETIC APPROACH Philip Goyal Cavendish Laboratory, Cambridge University. Abstract. General theoretical principles that enable the derivation of prior probabilities

More information

Testing Goodness-of-Fit for Exponential Distribution Based on Cumulative Residual Entropy

Testing Goodness-of-Fit for Exponential Distribution Based on Cumulative Residual Entropy This article was downloaded by: [Ferdowsi University] On: 16 April 212, At: 4:53 Publisher: Taylor & Francis Informa Ltd Registered in England and Wales Registered Number: 172954 Registered office: Mortimer

More information

CHAPTER 4 ENTROPY AND INFORMATION

CHAPTER 4 ENTROPY AND INFORMATION 4-1 CHAPTER 4 ENTROPY AND INFORMATION In the literature one finds the terms information theory and communication theory used interchangeably. As there seems to be no wellestablished convention for their

More information

Preliminary test of the EEPAS long term earthquake forecast model in Australia

Preliminary test of the EEPAS long term earthquake forecast model in Australia Preliminary test of the EEPAS long term earthquake forecast model in Australia Paul Somerville 1, Jeff Fisher 1, David Rhoades 2 and Mark Leonard 3 Abstract 1 Risk Frontiers 2 GNS Science 3 Geoscience

More information

Robustness and duality of maximum entropy and exponential family distributions

Robustness and duality of maximum entropy and exponential family distributions Chapter 7 Robustness and duality of maximum entropy and exponential family distributions In this lecture, we continue our study of exponential families, but now we investigate their properties in somewhat

More information

Calculation of maximum entropy densities with application to income distribution

Calculation of maximum entropy densities with application to income distribution Journal of Econometrics 115 (2003) 347 354 www.elsevier.com/locate/econbase Calculation of maximum entropy densities with application to income distribution Ximing Wu Department of Agricultural and Resource

More information

DEEP LEARNING CHAPTER 3 PROBABILITY & INFORMATION THEORY

DEEP LEARNING CHAPTER 3 PROBABILITY & INFORMATION THEORY DEEP LEARNING CHAPTER 3 PROBABILITY & INFORMATION THEORY OUTLINE 3.1 Why Probability? 3.2 Random Variables 3.3 Probability Distributions 3.4 Marginal Probability 3.5 Conditional Probability 3.6 The Chain

More information

Physics 509: Bootstrap and Robust Parameter Estimation

Physics 509: Bootstrap and Robust Parameter Estimation Physics 509: Bootstrap and Robust Parameter Estimation Scott Oser Lecture #20 Physics 509 1 Nonparametric parameter estimation Question: what error estimate should you assign to the slope and intercept

More information

The Bayesian Choice. Christian P. Robert. From Decision-Theoretic Foundations to Computational Implementation. Second Edition.

The Bayesian Choice. Christian P. Robert. From Decision-Theoretic Foundations to Computational Implementation. Second Edition. Christian P. Robert The Bayesian Choice From Decision-Theoretic Foundations to Computational Implementation Second Edition With 23 Illustrations ^Springer" Contents Preface to the Second Edition Preface

More information

EE229B - Final Project. Capacity-Approaching Low-Density Parity-Check Codes

EE229B - Final Project. Capacity-Approaching Low-Density Parity-Check Codes EE229B - Final Project Capacity-Approaching Low-Density Parity-Check Codes Pierre Garrigues EECS department, UC Berkeley garrigue@eecs.berkeley.edu May 13, 2005 Abstract The class of low-density parity-check

More information

Module 7 SEISMIC HAZARD ANALYSIS (Lectures 33 to 36)

Module 7 SEISMIC HAZARD ANALYSIS (Lectures 33 to 36) Lecture 34 Topics Module 7 SEISMIC HAZARD ANALYSIS (Lectures 33 to 36) 7.3 DETERMINISTIC SEISMIC HAZARD ANALYSIS 7.4 PROBABILISTIC SEISMIC HAZARD ANALYSIS 7.4.1 Earthquake Source Characterization 7.4.2

More information

ON LARGE SAMPLE PROPERTIES OF CERTAIN NONPARAMETRIC PROCEDURES

ON LARGE SAMPLE PROPERTIES OF CERTAIN NONPARAMETRIC PROCEDURES ON LARGE SAMPLE PROPERTIES OF CERTAIN NONPARAMETRIC PROCEDURES 1. Summary and introduction HERMAN RUBIN PURDUE UNIVERSITY Efficiencies of one sided and two sided procedures are considered from the standpoint

More information

Proceedings of the 2016 Winter Simulation Conference T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick, eds.

Proceedings of the 2016 Winter Simulation Conference T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick, eds. Proceedings of the 2016 Winter Simulation Conference T. M. K. Roeder, P. I. Frazier, R. Szechtman, E. Zhou, T. Huschka, and S. E. Chick, eds. A SIMULATION-BASED COMPARISON OF MAXIMUM ENTROPY AND COPULA

More information

PROPERTIES OF THE EMPIRICAL CHARACTERISTIC FUNCTION AND ITS APPLICATION TO TESTING FOR INDEPENDENCE. Noboru Murata

PROPERTIES OF THE EMPIRICAL CHARACTERISTIC FUNCTION AND ITS APPLICATION TO TESTING FOR INDEPENDENCE. Noboru Murata ' / PROPERTIES OF THE EMPIRICAL CHARACTERISTIC FUNCTION AND ITS APPLICATION TO TESTING FOR INDEPENDENCE Noboru Murata Waseda University Department of Electrical Electronics and Computer Engineering 3--

More information

Information Theory. Mark van Rossum. January 24, School of Informatics, University of Edinburgh 1 / 35

Information Theory. Mark van Rossum. January 24, School of Informatics, University of Edinburgh 1 / 35 1 / 35 Information Theory Mark van Rossum School of Informatics, University of Edinburgh January 24, 2018 0 Version: January 24, 2018 Why information theory 2 / 35 Understanding the neural code. Encoding

More information

Monte Carlo simulations for analysis and prediction of nonstationary magnitude-frequency distributions in probabilistic seismic hazard analysis

Monte Carlo simulations for analysis and prediction of nonstationary magnitude-frequency distributions in probabilistic seismic hazard analysis Monte Carlo simulations for analysis and prediction of nonstationary magnitude-frequency distributions in probabilistic seismic hazard analysis Mauricio Reyes Canales and Mirko van der Baan Dept. of Physics,

More information

Scaling Relationship between the Number of Aftershocks and the Size of the Main

Scaling Relationship between the Number of Aftershocks and the Size of the Main J. Phys. Earth, 38, 305-324, 1990 Scaling Relationship between the Number of Aftershocks and the Size of the Main Shock Yoshiko Yamanaka* and Kunihiko Shimazaki Earthquake Research Institute, The University

More information

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy Coding and Information Theory Chris Williams, School of Informatics, University of Edinburgh Overview What is information theory? Entropy Coding Information Theory Shannon (1948): Information theory is

More information

EECS 750. Hypothesis Testing with Communication Constraints

EECS 750. Hypothesis Testing with Communication Constraints EECS 750 Hypothesis Testing with Communication Constraints Name: Dinesh Krithivasan Abstract In this report, we study a modification of the classical statistical problem of bivariate hypothesis testing.

More information

Lecture 7 Introduction to Statistical Decision Theory

Lecture 7 Introduction to Statistical Decision Theory Lecture 7 Introduction to Statistical Decision Theory I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw December 20, 2016 1 / 55 I-Hsiang Wang IT Lecture 7

More information

On the Duality between Multiple-Access Codes and Computation Codes

On the Duality between Multiple-Access Codes and Computation Codes On the Duality between Multiple-Access Codes and Computation Codes Jingge Zhu University of California, Berkeley jingge.zhu@berkeley.edu Sung Hoon Lim KIOST shlim@kiost.ac.kr Michael Gastpar EPFL michael.gastpar@epfl.ch

More information

log (N) 2.9<M< <M< <M< <M<4.9 tot in bin [N] = Mid Point M log (N) =

log (N) 2.9<M< <M< <M< <M<4.9 tot in bin [N] = Mid Point M log (N) = Solution Set for Assignment Exercise : Gutenberg-Richter relationship: log() = a + b. M A) For a time period between January, 90 to December 3, 998 tot in bin [] = 450 6 57 22 7 5 Mid Point M 3.5 3.65

More information

Hypothesis Testing. Part I. James J. Heckman University of Chicago. Econ 312 This draft, April 20, 2006

Hypothesis Testing. Part I. James J. Heckman University of Chicago. Econ 312 This draft, April 20, 2006 Hypothesis Testing Part I James J. Heckman University of Chicago Econ 312 This draft, April 20, 2006 1 1 A Brief Review of Hypothesis Testing and Its Uses values and pure significance tests (R.A. Fisher)

More information

Scale-free network of earthquakes

Scale-free network of earthquakes Scale-free network of earthquakes Sumiyoshi Abe 1 and Norikazu Suzuki 2 1 Institute of Physics, University of Tsukuba, Ibaraki 305-8571, Japan 2 College of Science and Technology, Nihon University, Chiba

More information

Bayes Decision Theory

Bayes Decision Theory Bayes Decision Theory Minimum-Error-Rate Classification Classifiers, Discriminant Functions and Decision Surfaces The Normal Density 0 Minimum-Error-Rate Classification Actions are decisions on classes

More information

Parameter Estimation. William H. Jefferys University of Texas at Austin Parameter Estimation 7/26/05 1

Parameter Estimation. William H. Jefferys University of Texas at Austin Parameter Estimation 7/26/05 1 Parameter Estimation William H. Jefferys University of Texas at Austin bill@bayesrules.net Parameter Estimation 7/26/05 1 Elements of Inference Inference problems contain two indispensable elements: Data

More information

A Note on Hypothesis Testing with Random Sample Sizes and its Relationship to Bayes Factors

A Note on Hypothesis Testing with Random Sample Sizes and its Relationship to Bayes Factors Journal of Data Science 6(008), 75-87 A Note on Hypothesis Testing with Random Sample Sizes and its Relationship to Bayes Factors Scott Berry 1 and Kert Viele 1 Berry Consultants and University of Kentucky

More information

PATTERN RECOGNITION AND MACHINE LEARNING

PATTERN RECOGNITION AND MACHINE LEARNING PATTERN RECOGNITION AND MACHINE LEARNING Chapter 1. Introduction Shuai Huang April 21, 2014 Outline 1 What is Machine Learning? 2 Curve Fitting 3 Probability Theory 4 Model Selection 5 The curse of dimensionality

More information