Course on Inverse Problems

Slide 1. Stanford University, School of Earth Sciences. Course on Inverse Problems, Albert Tarantola. Third Lesson: Probability (Elementary Notions).

Slide 2. Let u and v be two Cartesian parameters (then volumetric probabilities and probability densities are identical). Given a probability density f(u, v), one defines the two marginal probability densities

f_u(u) = \int dv \, f(u, v) ,
f_v(v) = \int du \, f(u, v) ,

and the two conditional probability densities

f_{u|v}(u \mid v = v_0) = \frac{f(u, v_0)}{\int du \, f(u, v_0)} ,
f_{v|u}(v \mid u = u_0) = \frac{f(u_0, v)}{\int dv \, f(u_0, v)} .
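As a minimal numerical illustration (not part of the course material), one can discretize a joint density on a rectangular grid and obtain marginals and conditionals by summation. The density, the grid, and all variable names below are assumptions chosen for this sketch, written in Python with NumPy.

import numpy as np

# Discretize an illustrative joint density f(u, v) on a rectangular grid
# (here a correlated Gaussian; any non-negative, normalizable f would do).
u = np.linspace(-5.0, 5.0, 201)
v = np.linspace(-5.0, 5.0, 201)
du = u[1] - u[0]
dv = v[1] - v[0]
U, V = np.meshgrid(u, v, indexing="ij")
f = np.exp(-0.5 * (U**2 - 1.2 * U * V + V**2))
f /= f.sum() * du * dv                      # normalize so the double integral is 1

# Marginal densities: integrate the joint over the other variable.
f_u = f.sum(axis=1) * dv                    # f_u(u) = ∫ dv f(u, v)
f_v = f.sum(axis=0) * du                    # f_v(v) = ∫ du f(u, v)

# Conditional density of u given v = v0: slice the joint at v0 and renormalize.
i0 = np.argmin(np.abs(v - 1.0))             # grid index of the conditioning value v0 ≈ 1
f_u_given_v0 = f[:, i0] / (f[:, i0].sum() * du)

print(f_u.sum() * du, f_u_given_v0.sum() * du)   # both ≈ 1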

Slide 3. [Figure: a joint probability density together with its two marginal densities and two conditional densities (panels labeled "joint", "marginal", "conditional").]

Slides 4-6 repeat the definitions of slide 2, progressively abbreviating the notation for the conditional densities: from f_{u|v}(u \mid v = v_0) to f_{u|v}(u \mid v_0) and, finally, to f_{u|v}(u \mid v) and f_{v|u}(v \mid u).

Slide 7. One has

f_{u|v}(u \mid v) = \frac{f(u, v)}{f_v(v)} ,
f_{v|u}(v \mid u) = \frac{f(u, v)}{f_u(u)} ,

from where (a joint distribution can be expressed as a conditional distribution times a marginal distribution)

f(u, v) = f_{u|v}(u \mid v) \, f_v(v) = f_{v|u}(v \mid u) \, f_u(u) ,

from where (Bayes theorem)

f_{u|v}(u \mid v) = \frac{f_{v|u}(v \mid u) \, f_u(u)}{f_v(v)} .
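Continuing the illustrative grid sketch above (it reuses the arrays f, f_u, f_v defined there), Bayes' theorem can be restated numerically; this is only a consistency check, not part of the original lesson.

# Conditionals on the whole grid, obtained by dividing the joint by a marginal.
f_u_given_v = f / f_v[np.newaxis, :]            # f_{u|v}(u|v) = f(u, v) / f_v(v)
f_v_given_u = f / f_u[:, np.newaxis]            # f_{v|u}(v|u) = f(u, v) / f_u(u)

# Bayes theorem: f_{u|v}(u|v) = f_{v|u}(v|u) f_u(u) / f_v(v).
bayes_rhs = f_v_given_u * f_u[:, np.newaxis] / f_v[np.newaxis, :]
print(abs(f_u_given_v - bayes_rhs).max())       # ≈ 0, up to round-off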

Slide 8. Recall: f(u, v) = f_{u|v}(u \mid v) \, f_v(v) = f_{v|u}(v \mid u) \, f_u(u). The two quantities u and v are said to have independent uncertainties if, in fact,

f(u, v) = f_u(u) \, f_v(v)

(the joint distribution equals the product of the two marginal distributions). This implies (and is implied by)

f_{u|v}(u \mid v) = f_u(u) ,
f_{v|u}(v \mid u) = f_v(v) .
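Again continuing the grid sketch (an illustration with assumed names), independence can be tested by comparing the joint density with the product of its marginals:

# The correlated density f is NOT a product of its marginals ...
product_of_marginals = np.outer(f_u, f_v)       # f_u(u) f_v(v) on the grid
print(abs(f - product_of_marginals).max())      # clearly non-zero: dependent

# ... whereas a separable density is (up to floating-point error).
f_indep = np.exp(-0.5 * (U**2 + V**2))
f_indep /= f_indep.sum() * du * dv
g_u = f_indep.sum(axis=1) * dv
g_v = f_indep.sum(axis=0) * du
print(abs(f_indep - np.outer(g_u, g_v)).max())  # ≈ 0: independent uncertainties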

Slide 9. [Figure: two quantities with independent uncertainties (the joint distribution is the product of the two marginal distributions).]

Slide 10. Let u and v be two Cartesian parameters (then volumetric probabilities and probability densities are identical). Let f(u, v) be a probability density that is not qualitatively different from a two-dimensional Gaussian. The mean values are

\bar{u} = \int du \int dv \; u \, f(u, v) ,
\bar{v} = \int du \int dv \; v \, f(u, v) ,

the variances are

c_{uu} = \sigma_u^2 = \int du \int dv \; (u - \bar{u})^2 \, f(u, v) ,
c_{vv} = \sigma_v^2 = \int du \int dv \; (v - \bar{v})^2 \, f(u, v) ,

and the covariance is

c_{uv} = \int du \int dv \; (u - \bar{u})(v - \bar{v}) \, f(u, v) .
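These integrals can be evaluated on the same illustrative grid (names as in the sketch above; the numbers are only a demonstration):

# Means, variances and covariance of the gridded density by numerical integration.
u_bar = (U * f).sum() * du * dv
v_bar = (V * f).sum() * du * dv
c_uu = ((U - u_bar) ** 2 * f).sum() * du * dv
c_vv = ((V - v_bar) ** 2 * f).sum() * du * dv
c_uv = ((U - u_bar) * (V - v_bar) * f).sum() * du * dv
print(u_bar, v_bar, c_uu, c_vv, c_uv)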

Slide 11. The covariance matrix is

C = \begin{pmatrix} c_{uu} & c_{uv} \\ c_{vu} & c_{vv} \end{pmatrix}
  = \begin{pmatrix} \sigma_u^2 & c_{uv} \\ c_{vu} & \sigma_v^2 \end{pmatrix} .

It is symmetric and positive definite (or, at least, non-negative). Note: the correlation, defined as

\rho_{uv} = \frac{c_{uv}}{\sigma_u \sigma_v} = \frac{c_{uv}}{\sqrt{c_{uu} \, c_{vv}}} ,

has the property -1 \le \rho_{uv} \le +1 .
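With the moments computed above, the 2x2 covariance matrix and the correlation can be assembled and the stated properties checked (still within the illustrative sketch):

# Assemble the covariance matrix and the correlation, and check the stated properties.
C = np.array([[c_uu, c_uv],
              [c_uv, c_vv]])
rho_uv = c_uv / np.sqrt(c_uu * c_vv)
print(rho_uv, -1.0 <= rho_uv <= 1.0)
print(np.all(np.linalg.eigvalsh(C) >= 0.0))     # symmetric, non-negative definite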

Slide 12. The general form of a covariance matrix is

C = \begin{pmatrix} c_{11} & c_{12} & c_{13} & \cdots \\ c_{21} & c_{22} & c_{23} & \cdots \\ c_{31} & c_{32} & c_{33} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}
  = \begin{pmatrix} \sigma_1^2 & c_{12} & c_{13} & \cdots \\ c_{21} & \sigma_2^2 & c_{23} & \cdots \\ c_{31} & c_{32} & \sigma_3^2 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix} .

The quantities with immediate interpretation are the standard deviations \{\sigma_1, \sigma_2, \sigma_3, \dots\} and the correlation matrix

R = \begin{pmatrix} 1 & \rho_{12} & \rho_{13} & \cdots \\ \rho_{21} & 1 & \rho_{23} & \cdots \\ \rho_{31} & \rho_{32} & 1 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix} .
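Since \rho_{ij} = c_{ij} / (\sigma_i \sigma_j), a covariance matrix splits readily into standard deviations and a correlation matrix; the 3x3 matrix below is an invented example (NumPy assumed as above):

# Split an (invented) covariance matrix into standard deviations and correlations.
C3 = np.array([[ 4.0, 1.2, -0.8],
               [ 1.2, 9.0,  0.6],
               [-0.8, 0.6,  1.0]])
sigma = np.sqrt(np.diag(C3))                    # standard deviations sigma_i
R = C3 / np.outer(sigma, sigma)                 # rho_ij = c_ij / (sigma_i sigma_j)
print(sigma)
print(R)                                        # unit diagonal, entries in [-1, +1]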

Slide 13. The multidimensional Gaussian distribution is defined as

f(x_1, x_2, \dots, x_n) \equiv f(\mathbf{x}) = k \, \exp\!\left( -\tfrac{1}{2} \, (\mathbf{x} - \bar{\mathbf{x}})^t \, \mathbf{C}^{-1} \, (\mathbf{x} - \bar{\mathbf{x}}) \right) .

Its mean is \bar{\mathbf{x}} and its covariance is \mathbf{C} (not obvious!).
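One quick way to make the last claim plausible (not a proof) is to sample such a Gaussian and compare the empirical mean and covariance with \bar{\mathbf{x}} and \mathbf{C}; the sketch below reuses the invented matrix C3 and assumes NumPy as before.

# Empirical check: sample mean and covariance of N(x_bar, C3) approach x_bar and C3.
rng = np.random.default_rng(0)
x_bar = np.array([1.0, -2.0, 0.5])
samples = rng.multivariate_normal(x_bar, C3, size=200_000)
print(samples.mean(axis=0))                     # ≈ x_bar
print(np.cov(samples, rowvar=False))            # ≈ C3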
