CEE 522 Autumn 2005 Uncertainty Concepts for Geotechnical Engineering


Basic Terminology

Set
A set is a collection of (mutually exclusive) objects or events. The sample space is the (collectively exhaustive) collection of all objects or events; it may be discrete (when there are a finite number of possible objects or events) or continuous (when there are an infinite number of possible objects or events). Two sets are particularly important. The universal set, S, is the set of all objects or events within the sample space, and the empty set (or null set), φ, is the set containing no objects or events.

Venn Diagram
A diagram depicting sets in such a way that the area of the sample space is equal to 1, and the areas of individual sets are proportional to the relative likelihoods of various sets within the sample space. An example of a Venn diagram showing the sample space, S, and two sets, A and B, is shown below. The sets contain no common elements; therefore, they are mutually exclusive.

[Figure: Venn diagrams of the sample space S with two mutually exclusive sets A and B: (left) mutually exclusive only, and (right) mutually exclusive and collectively exhaustive.]

Venn diagrams can show relationships between sets graphically. When sets are not mutually exclusive, they share elements; in terms of a Venn diagram, the spaces occupied by the sets overlap each other. In such cases, we can define a set C that consists of the elements that are in set A or set B as the union of sets A and B, C = A ∪ B. We can also define a set D that consists of the elements that are in set A and set B as the intersection of sets A and B, D = A ∩ B. Some useful relationships between sets include:

A ∪ B = B ∪ A
A ∩ B = B ∩ A
A ∪ (B ∪ C) = (A ∪ B) ∪ C
A ∩ (B ∩ C) = (A ∩ B) ∩ C

The upper relationships are referred to as the commutative laws and the lower as the associative laws.

[Figure: Notation for sets using Venn diagrams: (left) union of A and B, C = A ∪ B, and (right) intersection of A and B, D = A ∩ B.]

Other useful identities/rules, each of which can easily be checked with the aid of a Venn diagram, include:

A ∪ A = A
A ∩ A = A
A ∪ φ = A
A ∩ φ = φ
A ∪ S = S
A ∩ S = A
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)

The complement of a set is the set of all objects that are not part of the set, and is usually indicated by an overbar; the complement of A is Ā. This gives rise to some additional identities:

A ∪ Ā = S
A ∩ Ā = φ
complement of (A ∪ B) = Ā ∩ B̄
complement of (A ∩ B) = Ā ∪ B̄

Axioms of Probability
There are three axioms upon which all of probability theory is based. The axioms specify properties that probability must have, but do not say what probability is; as a result, there are different interpretations of probability. The axioms are:

1. The probability, P[A], of event A has a value between 0 and 1, i.e. 0 ≤ P[A] ≤ 1.
2. The sum of the respective probabilities of a set of mutually exclusive and collectively exhaustive events {A_i} is 1, i.e. Σ_i P[A_i] = 1.
3. The probability that one or the other of two mutually exclusive events, A and B, occurs is equal to the sum of their individual probabilities, i.e. P[A ∪ B] = P[A] + P[B].
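
To make the set algebra concrete, here is a minimal Python sketch that checks several of the identities above, including the complement rules, using Python's built-in set type; the particular universal set S and the members of A and B are arbitrary choices for illustration.

    # Numerical check of the set identities above; the universal set S and the
    # elements of A and B are arbitrary, chosen only for illustration.
    S = set(range(10))          # universal set
    A = {1, 2, 3, 4}
    B = {3, 4, 5, 6}
    A_c, B_c = S - A, S - B     # complements of A and B

    assert A | B == B | A and A & B == B & A        # commutative laws
    assert A | S == S and A & S == A                # identities with the universal set
    assert A | set() == A and A & set() == set()    # identities with the empty set
    assert S - (A | B) == A_c & B_c                 # complement of a union
    assert S - (A & B) == A_c | B_c                 # complement of an intersection
    print("All identities hold for this example.")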

The third axiom can be generalized to the case of n mutually exclusive events as

P[A_1 ∪ A_2 ∪ A_3 ∪ … ∪ A_n] = P[A_1] + P[A_2] + P[A_3] + … + P[A_n]

Most events are not mutually exclusive; they have some common points and consequently overlap in a Venn diagram. Since Venn diagrams are drawn such that the area of the sample space is unity, the area of a set (representing an event) is equal to the probability of the event. On that basis, it is easy to see in the diagram below that

P[A ∪ B] = P[A] + P[B] − P[A ∩ B]

[Figure: Venn diagram showing the relationship between area and probability, with regions labeled P[A], P[A ∩ B], and P[B].]

Conditional Probability
One of the most important concepts in probability theory is that of conditional probability, along with the related concepts of dependence and independence. The conditional probability of event A given event B, written as A|B, is the probability that event A occurs given that event B occurs. For the case shown in the Venn diagram immediately above, we can see that

P[A|B] = P[A ∩ B] / P[B]

Two events are said to be independent if

P[A|B] = P[A]

which simply says that the probability of A is the same regardless of B. Events are dependent if they aren't independent.

Total Probability Theorem
A very useful result of the definition of conditional probability is analogous to the chain rule for partial differentiation. If B_1, B_2, B_3, …, B_n is a set of mutually exclusive and collectively exhaustive events, and A is another event that is conditional on one or more of the B_i, then

P[A] = P[A ∩ B_1] + P[A ∩ B_2] + P[A ∩ B_3] + … + P[A ∩ B_n] = Σ_i P[A|B_i] P[B_i]
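
As a small numerical illustration of the total probability theorem (all probabilities here are invented for the example), suppose a site can be in one of three mutually exclusive, collectively exhaustive conditions B_1, B_2, B_3, and the probability of an adverse event A differs under each condition:

    # Hypothetical numbers for illustration: P[B_i] and P[A | B_i]
    P_B = [0.5, 0.3, 0.2]               # prior probabilities of the three conditions (sum to 1)
    P_A_given_B = [0.01, 0.05, 0.20]    # conditional probabilities of event A

    # Total probability theorem: P[A] = sum_i P[A | B_i] * P[B_i]
    P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))
    print(P_A)                          # 0.5*0.01 + 0.3*0.05 + 0.2*0.20 = 0.060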

This result, known as the total probability theorem, is illustrated graphically below.

[Figure: Venn diagram illustrating the total probability theorem: event A overlaps the mutually exclusive, collectively exhaustive events B_1 through B_5.]

Bayes' Theorem
Using the total probability theorem, the probability of occurrence of an event, A, can be calculated when that probability depends on other mutually exclusive and collectively exhaustive events B_i. Sometimes, it is also useful to know the probability of an event B_i given that A has occurred. From our previous discussion of conditional probability, we know that

P[A|B_i] = P[A ∩ B_i] / P[B_i]

Similarly,

P[B_i|A] = P[A ∩ B_i] / P[A]

from which we can then see that

P[A ∩ B_i] = P[A|B_i] P[B_i] = P[B_i|A] P[A]

Then, we can write the conditional probability of B_i given A as

P[B_i|A] = P[A|B_i] P[B_i] / P[A]

which is known as Bayes' Theorem. Using the total probability theorem, the denominator can be expanded to write Bayes' Theorem in the commonly used form

P[B_i|A] = P[A|B_i] P[B_i] / Σ_j P[A|B_j] P[B_j]

Bayes' Theorem is often applied in contexts where some series of events, B_i, with estimated probabilities exists and an event, A, occurs. Bayes' Theorem can then be used to update the estimated probabilities of the events B_i based on the information gained from event A. For example, the B_i may be random variables that represent the strengths of a series of layers in a soil profile, and A might be the results of a set of strength tests, or the occurrence of a failure. The new information from the strength tests or the failure can be used to improve (i.e. update) the estimated distributions of layer strengths; this process is often referred to as Bayesian updating.
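
Continuing the hypothetical example above, Bayes' Theorem turns the same numbers around to give updated probabilities of each condition once A is observed. This is a minimal sketch of Bayesian updating, not a full geotechnical reliability calculation:

    # Hypothetical priors P[B_i] and likelihoods P[A | B_i], as before
    priors = [0.5, 0.3, 0.2]
    likelihoods = [0.01, 0.05, 0.20]

    # Denominator of Bayes' Theorem via the total probability theorem
    P_A = sum(l * p for l, p in zip(likelihoods, priors))

    # Posterior probabilities P[B_i | A]
    posteriors = [l * p / P_A for l, p in zip(likelihoods, priors)]
    print(posteriors)   # observing A shifts probability toward B_3, which makes A most likely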

Random Variable
A random variable is a quantity that can take on multiple values, i.e. a quantity whose exact value is not known with certainty. An upper case character is generally used to denote a random variable; lower case characters are used to denote the values that the random variable may take. Random variables can be divided into two main types. Continuous random variables are those that can have an infinite number of possible values, like a person's height or weight, or the shear strength of an element of soil. Continuous random variables are frequently described by distributions such as the normal, lognormal, beta, and exponential distributions. Discrete random variables are those that can take on only a finite (or countably infinite) number of possible values, like the sum of the numbers on two rolled dice or the number of landslides in a year. Discrete random variables are described by distributions such as the binomial and Poisson distributions.

Probability Density Function
The randomness of a continuous random variable, X, can be described by a probability density function, represented by f_X(x) and often referred to as a pdf. The pdf is a function for which all values are non-negative and the area under which is equal to 1. The pdf does not directly provide information on probabilities, but it does indicate the nature of the randomness of the variable. The probability of the random variable, X, taking on values between two limits, x_1 and x_2, is simply equal to the area under the pdf between those limits, i.e.

P[x_1 < X ≤ x_2] = ∫_{x_1}^{x_2} f_X(x) dx

Cumulative Distribution Function
The randomness of a continuous random variable can be described in another way using the cumulative distribution function, F_X(x), often referred to as the CDF, which is defined as

F_X(x) = ∫_{-∞}^{x} f_X(u) du

and is simply the area under the pdf to the left of x (obviously, the pdf is the derivative of the CDF). Therefore, F_X(x) = P[X ≤ x]. From this, we can see that the probability of X being between two limits, x_1 and x_2, is just the difference in the CDF values at x_1 and x_2, i.e.

P[x_1 < X ≤ x_2] = F_X(x_2) − F_X(x_1)

Probability Mass Function
Since discrete random variables can take on only certain values, their relative frequencies of occurrence can only be evaluated at those values. Therefore, the randomness of a discrete random variable is described by a probability mass function, represented by p_X(x) and often referred to as a PMF. A PMF is not continuous; graphically, it appears as a spiky histogram. However, a CDF can still be defined from a PMF as

F_X(x) = P[X ≤ x] = Σ_{x_i ≤ x} p_X(x_i)
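
The pdf/CDF/PMF relationships above can be checked numerically with scipy.stats; the distributions and parameters below (a normal "shear strength" with mean 100 kPa and standard deviation 20 kPa, and a Poisson "landslides per year" with mean 2) are assumed values chosen only for illustration.

    # Continuous case: area under the pdf vs. difference of CDF values
    from scipy import stats
    from scipy.integrate import quad

    X = stats.norm(loc=100, scale=20)       # assumed shear strength distribution (kPa)
    x1, x2 = 90, 120
    area, _ = quad(X.pdf, x1, x2)           # P[x1 < X <= x2] as an integral of the pdf
    print(area, X.cdf(x2) - X.cdf(x1))      # the two values agree

    # Discrete case: PMF and CDF of a Poisson random variable
    N = stats.poisson(mu=2.0)               # assumed mean number of landslides per year
    print(N.pmf(0), N.pmf(1), N.cdf(1))     # P[N=0], P[N=1], P[N<=1] = P[N=0] + P[N=1]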

Moments
The randomness of a random variable can be characterized by a number of discrete, scalar quantities called moments, which can be computed from a pdf or PMF. The k-th moment of a distribution is defined as

m_k = Σ_i x_i^k p_X(x_i)                   for discrete random variables
m_k = ∫_{-∞}^{+∞} x^k f_X(x) dx            for continuous random variables

The first moment (k = 1) is the mean, or expected value, of the random variable, and is normally represented as μ_X. Higher moments are generally taken about the mean value, i.e. as

m_k = Σ_i (x_i − μ_X)^k p_X(x_i)               for discrete random variables
m_k = ∫_{-∞}^{+∞} (x − μ_X)^k f_X(x) dx        for continuous random variables

The second moment is particularly important; it is called the variance, Var(X) or σ_X², and its square root is the standard deviation, σ_X. The standard deviation is a convenient measure of dispersion about the mean because it has the same units as the random variable itself. Therefore, the variance is computed as

σ_X² = Σ_i (x_i − μ_X)² p_X(x_i)               for discrete random variables
σ_X² = ∫_{-∞}^{+∞} (x − μ_X)² f_X(x) dx        for continuous random variables

The coefficient of variation, COV, is a useful normalized (dimensionless) measure of dispersion defined as the ratio of the standard deviation to the mean, i.e.

COV = σ_X / μ_X

Higher order moments are used less frequently; the third moment describes the skewness (asymmetry) of the distribution, and the fourth moment describes the kurtosis (sharpness of peak). The skewness coefficient and the coefficient of kurtosis are defined as

ν = m_3 / σ_X³    and    κ = m_4 / σ_X⁴

respectively.
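
The moment definitions above translate directly into sample estimates. The sketch below uses numpy/scipy on simulated data; the lognormal parameters and sample size are arbitrary assumptions for illustration.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = rng.lognormal(mean=3.0, sigma=0.3, size=100_000)   # e.g. simulated strength values

    mu = x.mean()                            # first moment (mean)
    sigma = x.std()                          # square root of the second central moment
    cov = sigma / mu                         # coefficient of variation
    skew = stats.skew(x)                     # m3 / sigma^3
    kurt = stats.kurtosis(x, fisher=False)   # m4 / sigma^4 (not the "excess" kurtosis)
    print(mu, sigma, cov, skew, kurt)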

Functions of Random Variables
Much of what we do in probabilistic analysis involves manipulating functions of random variables. The simplest functions we can deal with are linear combinations of random variables. If X_1, X_2, …, X_n is a set of random variables and a_1, a_2, …, a_n is a set of constants, another random variable, Y, can be defined as a linear combination of the X's, i.e. as

Y = Σ_i a_i X_i

For this case, the mean and variance of Y are given by

μ_Y = Σ_i a_i μ_{X_i}

Var[Y] = Σ_i a_i² Var[X_i] + 2 Σ_i Σ_{j=i+1} a_i a_j Cov[X_i, X_j]

Note that the second part of the variance equation drops out if the variables are uncorrelated.

Multiple Random Variables
It is often necessary to consider more than one random variable to formulate a particular problem. The strength and stiffness of an element of soil may be modeled as separate random variables, but it is preferable to model their uncertainties jointly, since more information can be extracted from their joint distribution. If two continuous random variables, X and Y, exist, the joint pdf, f_{X,Y}(x, y), can be used to compute the (joint) probability that X is between x_1 and x_2 and Y is between y_1 and y_2:

P[(x_1 ≤ X ≤ x_2) ∩ (y_1 ≤ Y ≤ y_2)] = ∫_{x_1}^{x_2} ∫_{y_1}^{y_2} f_{X,Y}(x, y) dy dx

where f_{X,Y}(x, y) is the joint probability density function of X and Y. If the variables are discrete, the joint PMF, p_{X,Y}(x, y), has the property

p_{X,Y}(x, y) = P[(X = x) ∩ (Y = y)]

The joint CDF for the continuous case is then

F_{X,Y}(x, y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f_{X,Y}(u, v) dv du

For the special case in which X and Y are independent, the following relations (the last two of which are the most important) hold:

f_{X|Y}(x|y) = f_X(x)        f_{Y|X}(y|x) = f_Y(y)
F_{X|Y}(x|y) = F_X(x)        F_{Y|X}(y|x) = F_Y(y)
f_{X,Y}(x, y) = f_X(x) f_Y(y)
F_{X,Y}(x, y) = F_X(x) F_Y(y)
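
Referring back to the linear-combination formulas above, here is a minimal numerical check (the means, standard deviations, correlation, and coefficients a_i are all invented for the example) that compares the closed-form mean and variance of Y = a_1 X_1 + a_2 X_2 against a Monte Carlo sample of correlated normal variables:

    import numpy as np

    a = np.array([2.0, -1.0])                 # constants a_i
    mu = np.array([10.0, 5.0])                # means of X1 and X2
    sig = np.array([2.0, 1.5])                # standard deviations
    rho = 0.6                                 # correlation between X1 and X2

    cov12 = rho * sig[0] * sig[1]             # Cov[X1, X2]
    C = np.array([[sig[0]**2, cov12],
                  [cov12, sig[1]**2]])        # covariance matrix

    mu_Y = a @ mu                             # sum_i a_i * mu_i
    var_Y = a @ C @ a                         # includes the 2*a1*a2*Cov[X1, X2] term

    X = np.random.default_rng(1).multivariate_normal(mu, C, size=200_000)
    Y = X @ a
    print(mu_Y, Y.mean())                     # analytical vs. simulated mean
    print(var_Y, Y.var())                     # analytical vs. simulated variance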

If we define some random variable Z to be a function of X and Y, say Z = g(X, Y), then the mean (or expected value) of Z is

μ_Z = E[g(X, Y)] = ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} g(x, y) f_{X,Y}(x, y) dx dy

Now, however, there are multiple second moments of a function of multiple variables; for the case of two variables, there are three second moments of g(X, Y). The second moments can be computed as

Cov[X, Y] = E[(X − μ_X)(Y − μ_Y)] = ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} (x − μ_X)(y − μ_Y) f_{X,Y}(x, y) dx dy

where Cov stands for covariance (not to be confused with the previously defined coefficient of variation, for which the notation COV is used). Note that the three second moments are Cov[X, X], Cov[Y, Y], and Cov[X, Y]; the first two are equal to the variances of X and Y, respectively. When there are two or more random variables, their covariances are often expressed in terms of a covariance matrix, e.g.

C = | σ_1,1   σ_1,2   …   σ_1,n |
    | σ_2,1   σ_2,2   …   σ_2,n |
    |   ⋮       ⋮     ⋱     ⋮   |
    | σ_n,1   σ_n,2   …   σ_n,n |

where σ_i,j = Cov[X_i, X_j], so the diagonal terms σ_i,i are the variances. The degree of correlation between the variables can also be expressed in dimensionless form by the correlation coefficient, defined as

ρ_{X,Y} = Cov(X, Y) / (σ_X σ_Y)

The correlation coefficient describes the degree of linear relationship between two variables. The correlation coefficient can range from −1 (perfect negative correlation: one variable increases while the other decreases, in a linear fashion) to +1 (perfect positive correlation: both increase or decrease together, in a linear fashion). A correlation coefficient of zero means that there is no linear relationship between the variables; note that a strong nonlinear relationship could still exist. Sometimes, you will see these types of relationships expressed in terms of a correlation matrix

K = | 1       ρ_1,2   …   ρ_1,n |
    | ρ_2,1   1       …   ρ_2,n |
    |   ⋮       ⋮     ⋱     ⋮   |
    | ρ_n,1   ρ_n,2   …   1     |

The ones on the diagonal indicate that a random variable is perfectly correlated with itself.
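
In practice the covariance matrix and correlation coefficient are usually estimated from paired observations. The numpy sketch below does this for simulated, linearly related "strength" and "stiffness" values; the numbers are purely illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    strength = rng.normal(100, 20, size=5000)                        # assumed strength data
    stiffness = 50 + 0.4 * strength + rng.normal(0, 5, size=5000)    # linearly related, plus noise

    C = np.cov(strength, stiffness)       # 2x2 covariance matrix; diagonal terms are the variances
    K = np.corrcoef(strength, stiffness)  # correlation matrix; ones on the diagonal
    rho = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])
    print(rho, K[0, 1])                   # same value either way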