ESSENTIALS OF PROBABILITY THEORY
1 ESSENTIALS OF PROBABILITY THEORY

Michele TARAGNA
Dipartimento di Elettronica e Telecomunicazioni, Politecnico di Torino
michele.taragna@polito.it

Master Course in Mechatronic Engineering
Master Course in Computer Engineering
01RKYQW / 01RKYOV "Estimation, Filtering and System Identification"
Academic Year 2017/2018
2 Random experiment and random source of data

S : outcome space, i.e., the set of possible outcomes s of the random experiment;
F : space of events (or results) of interest, i.e., the set of the combinations of interest into which the outcomes in S can be clustered;
P(·) : probability function defined on F that associates to any event in F a real number between 0 and 1.
E = (S, F, P(·)) : random experiment.

Example: roll a six-sided die to see whether an odd or an even side appears:
S = {1, 2, 3, 4, 5, 6} is the set of the six sides of the die;
F = {A, B, S, ∅}, where A = {2, 4, 6} and B = {1, 3, 5} are the events of interest, i.e., the even and odd number sets;
P(A) = P(B) = 1/2 (if the die is fair), P(S) = 1, P(∅) = 0.
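The die experiment E = (S, F, P(·)) above can be sketched in a few lines of Python (the slides contain no code; the names S, A, B, F and the function P below are illustrative choices, with P obtained by counting equally likely outcomes):

```python
from fractions import Fraction

# Sketch of the random experiment E = (S, F, P) for a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}             # outcome space
A = {2, 4, 6}                      # event "even side"
B = {1, 3, 5}                      # event "odd side"
F = [A, B, S, set()]               # space of events of interest

def P(event):
    # For a fair die every outcome is equally likely: P(event) = |event| / |S|
    return Fraction(len(event), len(S))

print(P(A), P(B), P(S), P(set()))  # 1/2 1/2 1 0
```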
3 A random variable of the experiment E is a variable v whose values depend on the outcome s of E through a suitable function ϕ(·) : S → V, where V is the set of possible values of v:

v = ϕ(s)

Example: the random variable depending on the outcome of the roll of a six-sided die can be defined as

v = ϕ(s) = +1 if s ∈ A = {2, 4, 6}
           −1 if s ∈ B = {1, 3, 5}

A random source of data produces data that, besides the process under investigation characterized by the unknown true value θo of the variable to be estimated, are also functions of a random variable; in particular, at the time instant t, the datum d(t) depends on the random variable v(t).
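A minimal Python sketch of this random variable (the function name phi and the simulation itself are ours, not part of the slides):

```python
import random

# The random variable v = phi(s) of the die example: +1 on even, -1 on odd sides.
def phi(s):
    return +1 if s in {2, 4, 6} else -1

random.seed(0)                                   # reproducible draws
draws = [phi(random.randint(1, 6)) for _ in range(10_000)]
# For a fair die P(v = +1) = P(v = -1) = 1/2, so the sample mean is near 0.
print(sum(draws) / len(draws))
```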
4 Probability distribution and density functions

Let us consider a real scalar variable x ∈ R. The (cumulative) probability distribution function F(·) : R → R of the scalar random variable v is defined as:

F(x) = P(v ≤ x)

Main properties of the function F(·):
- F(−∞) = 0
- F(+∞) = 1
- F(·) is a monotonic nondecreasing function: F(x1) ≤ F(x2), ∀x1 < x2
- F(·) is almost everywhere continuous and, in particular, it is continuous from the right: F(x⁺) = F(x)
- P(x1 ≤ v < x2) = F(x2) − F(x1)
- F(·) is almost everywhere differentiable
5 The p.d.f. or probability density function f(·) : R → R is defined as:

f(x) = dF(x)/dx

Main properties of the function f(·):
- f(x) ≥ 0, ∀x ∈ R
- f(x)dx = P(x ≤ v < x + dx)
- ∫_{−∞}^{+∞} f(x) dx = 1
- F(x) = ∫_{−∞}^{x} f(ξ) dξ
- P(x1 ≤ v < x2) = F(x2) − F(x1) = ∫_{x1}^{x2} f(x) dx
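The relation between f(·) and F(·) can be checked numerically. The sketch below (ours, not from the slides) uses an exponential p.d.f. with an arbitrarily chosen rate λ = 2 and verifies that the integral of f over [x1, x2] equals F(x2) − F(x1):

```python
import math

# Exponential p.d.f. f(x) = lam*exp(-lam*x), x >= 0, with distribution
# function F(x) = 1 - exp(-lam*x); lam = 2.0 is an arbitrary choice.
lam = 2.0
f = lambda x: lam * math.exp(-lam * x) if x >= 0 else 0.0
F = lambda x: 1 - math.exp(-lam * x) if x >= 0 else 0.0

def integrate(g, a, b, n=100_000):
    # Midpoint-rule approximation of the integral of g over [a, b].
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

x1, x2 = 0.3, 1.7
print(integrate(f, x1, x2), F(x2) - F(x1))  # nearly equal
```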
6 Characteristic elements of a probability distribution

Let us consider a scalar random variable v.

- Mean or mean value or expected value or expectation:
  E[v] = ∫_{−∞}^{+∞} x f(x) dx = v̄
  Note that E[·] is a linear operator, i.e.: E[αv + β] = αE[v] + β, ∀α, β ∈ R.
- Variance:
  Var[v] = E[(v − E[v])²] = ∫_{−∞}^{+∞} (x − E[v])² f(x) dx = σv² ≥ 0
- Standard deviation or root mean square deviation:
  σv = √Var[v] ≥ 0
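A Monte Carlo check of these definitions (our illustration; the uniform variable on [0, 1] is a convenient test case, with E[v] = 1/2 and Var[v] = 1/12):

```python
import random
import statistics

# Sample mean and variance of a uniform variable on [0, 1],
# plus a check of the linearity of the expectation operator.
random.seed(1)
v = [random.random() for _ in range(200_000)]
mean_v = statistics.fmean(v)          # should be close to 1/2
var_v = statistics.pvariance(v)       # should be close to 1/12
alpha, beta = 3.0, -2.0
mean_w = statistics.fmean(alpha * x + beta for x in v)
print(mean_v, var_v, mean_w)          # mean_w = alpha*mean_v + beta (linearity)
```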
7 - k-th order (raw) moment:
  m_k[v] = E[v^k] = ∫_{−∞}^{+∞} x^k f(x) dx
  In particular: m_0[v] = E[1] = 1, m_1[v] = E[v] = v̄
- k-th order central moment:
  μ_k[v] = E[(v − E[v])^k] = ∫_{−∞}^{+∞} (x − E[v])^k f(x) dx
  In particular: μ_0[v] = E[1] = 1, μ_1[v] = E[v − E[v]] = 0,
  μ_2[v] = E[(v − E[v])²] = Var[v] = σv²
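For a discrete variable the integrals become sums, so the moments of the fair-die outcome can be computed exactly (our illustration, in exact rational arithmetic):

```python
from fractions import Fraction

# Raw and central moments of the fair-die outcome (each side has probability 1/6).
outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)

def raw_moment(k):
    return sum(p * x**k for x in outcomes)

def central_moment(k):
    m1 = raw_moment(1)
    return sum(p * (x - m1)**k for x in outcomes)

print(raw_moment(0), raw_moment(1))                              # 1, 7/2
print(central_moment(0), central_moment(1), central_moment(2))   # 1, 0, 35/12
```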
8 Vector random variables

A vector v = [v1 v2 ⋯ vn]^T is a vector random variable if it depends on the outcomes of a random experiment E through a vector function ϕ(·) : S → R^n such that

ϕ⁻¹(v1 ≤ x1, v2 ≤ x2, ..., vn ≤ xn) ∈ F, ∀x = [x1 x2 ⋯ xn]^T ∈ R^n

The joint (cumulative) probability distribution function F(·) : R^n → [0, 1] is defined as:

F(x1, ..., xn) = P(v1 ≤ x1, v2 ≤ x2, ..., vn ≤ xn)

with x1, ..., xn ∈ R and with all the inequalities simultaneously satisfied.

The i-th marginal probability distribution function F_i(·) : R → [0, 1] is defined as:

F_i(xi) = F(+∞, ..., +∞, xi, +∞, ..., +∞)   (xi in the i-th position, i.e., i−1 arguments before and n−i after)
        = P(v1 < +∞, ..., v_{i−1} < +∞, vi ≤ xi, v_{i+1} < +∞, ..., vn < +∞)
9 The joint p.d.f. or joint probability density function f(·) : R^n → R is defined as:

f(x1, ..., xn) = ∂^n F(x1, ..., xn) / (∂x1 ∂x2 ⋯ ∂xn)

and it is such that:

f(x1, ..., xn) dx1 dx2 ⋯ dxn = P(x1 < v1 ≤ x1 + dx1, ..., xn < vn ≤ xn + dxn)

The i-th marginal probability density function f_i(·) : R → R is defined as the (n−1)-fold integral:

f_i(xi) = ∫_{−∞}^{+∞} ⋯ ∫_{−∞}^{+∞} f(x1, ..., xn) dx1 ⋯ dx_{i−1} dx_{i+1} ⋯ dxn

The n components of the vector random variable v are (mutually) independent if and only if:

f(x1, ..., xn) = ∏_{i=1}^{n} f_i(xi)
10 - Mean or mean value or expected value or expectation:
  E[v] = [E[v1] E[v2] ⋯ E[vn]]^T ∈ R^n, with E[vi] = ∫_{−∞}^{+∞} xi f_i(xi) dxi
- Variance matrix or covariance matrix:
  Σv = Var[v] = E[(v − E[v])(v − E[v])^T] = ∫_{R^n} (x − E[v])(x − E[v])^T f(x) dx ∈ R^{n×n}

Main properties of Σv:
- it is symmetric, i.e., Σv = Σv^T
- it is positive semidefinite, i.e., Σv ≥ 0, since the quadratic form
  x^T Σv x = E[(x^T (v − E[v]))²] ≥ 0, ∀x ∈ R^n
- the eigenvalues λ_i(Σv) ≥ 0, ∀i = 1, ..., n
- det(Σv) = ∏_{i=1}^{n} λ_i(Σv) ≥ 0
- [Σv]_ii = E[(vi − E[vi])²] = σvi² = σi² = variance of vi
- [Σv]_ij = E[(vi − E[vi])(vj − E[vj])] = σvivj = σij = covariance of vi and vj
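A sample version of the covariance matrix can be computed without any external library; the sketch below (our illustration, with an arbitrarily constructed correlated pair) also checks symmetry and, for the 2×2 case, positive semidefiniteness via the determinant:

```python
import random

# Sample covariance matrix of a 2-dimensional random vector v = [v1 v2]^T.
random.seed(2)
n = 100_000
v1 = [random.gauss(0, 1) for _ in range(n)]
v2 = [x + 0.5 * random.gauss(0, 1) for x in v1]   # v2 correlated with v1

def cov(a, b):
    ma, mb = sum(a) / n, sum(b) / n
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n

Sigma = [[cov(v1, v1), cov(v1, v2)],
         [cov(v2, v1), cov(v2, v2)]]
# Symmetry, nonnegative diagonal, nonnegative determinant (2x2 PSD test).
print(Sigma[0][1], Sigma[1][0])
print(Sigma[0][0] * Sigma[1][1] - Sigma[0][1] * Sigma[1][0])
```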
11 Example: let v = [v1 v2]^T be a bidimensional vector random variable. Then (matrix rows are separated by ";"):

E[v] = v̄ = [E[v1] E[v2]]^T = [v̄1 v̄2]^T

Σv = E[(v − v̄)(v − v̄)^T] = E[ [v1 − v̄1 ; v2 − v̄2] · [v1 − v̄1  v2 − v̄2] ] =
   = E[ [(v1 − v̄1)²  (v1 − v̄1)(v2 − v̄2) ; (v2 − v̄2)(v1 − v̄1)  (v2 − v̄2)²] ] =
   = [ E[(v1 − v̄1)²]  E[(v1 − v̄1)(v2 − v̄2)] ; E[(v1 − v̄1)(v2 − v̄2)]  E[(v2 − v̄2)²] ] =
   = [ σv1²  σv1v2 ; σv1v2  σv2² ] = [ σ1²  σ12 ; σ12  σ2² ] = Σv^T
12 Correlation coefficient and correlation matrix

Let us consider any two components vi and vj of a vector random variable v. The (linear) correlation coefficient ρij ∈ R of the scalar random variables vi and vj is defined as:

ρij = E[(vi − E[vi])(vj − E[vj])] / √( E[(vi − E[vi])²] · E[(vj − E[vj])²] ) = σij / (σi σj)

Note that |ρij| ≤ 1, since the vector random variable w = [vi vj]^T has:

Σw = Var[w] = [ σi²  σij ; σij  σj² ] = [ σi²  ρij σi σj ; ρij σi σj  σj² ] ≥ 0

so that:

det(Σw) = σi² σj² − ρij² σi² σj² = (1 − ρij²) σi² σj² ≥ 0  ⇒  ρij² ≤ 1
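The sample counterpart of ρij can be computed directly from the definition; the sketch below (ours, with an arbitrarily chosen linear-plus-noise pair) also confirms that the estimate stays in [−1, 1]:

```python
import math
import random

# Sample correlation coefficient rho_ij = sigma_ij / (sigma_i * sigma_j).
random.seed(3)
n = 50_000
vi = [random.gauss(0, 2) for _ in range(n)]            # sigma_i = 2
vj = [0.8 * x + random.gauss(0, 1) for x in vi]        # linear part + noise

mi, mj = sum(vi) / n, sum(vj) / n
sij = sum((x - mi) * (y - mj) for x, y in zip(vi, vj)) / n
si = math.sqrt(sum((x - mi) ** 2 for x in vi) / n)
sj = math.sqrt(sum((y - mj) ** 2 for y in vj) / n)
rho = sij / (si * sj)
# True value: cov = 0.8*4 = 3.2, sigma_j = sqrt(0.64*4 + 1), rho ≈ 0.848
print(rho)
```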
13 The random variables vi and vj are uncorrelated if and only if ρij = 0, i.e., if and only if σij = E[(vi − E[vi])(vj − E[vj])] = 0. Note that:

ρij = 0  ⇔  E[vi vj] = E[vi] E[vj]

since

σij = E[(vi − E[vi])(vj − E[vj])] = E[vi vj − vi E[vj] − E[vi] vj + E[vi] E[vj]] =
    = E[vi vj] − 2 E[vi] E[vj] + E[vi] E[vj] = E[vi vj] − E[vi] E[vj]

and then σij = 0 ⇔ E[vi vj] = E[vi] E[vj].

If vi and vj are linearly dependent, i.e., vj = α vi + β with α, β ∈ R and α ≠ 0, then

ρij = α / |α| = sgn(α) = +1 if α > 0, −1 if α < 0

and then |ρij| = 1. In fact:

σi² = E[(vi − E[vi])²] = E[vi² − 2 vi E[vi] + E[vi]²] = E[vi²] − 2 E[vi]² + E[vi]² = E[vi²] − E[vi]²

σj² = E[(vj − E[vj])²] = E[(α vi + β − E[α vi + β])²] = E[(α vi + β − α E[vi] − β)²] =
    = E[(α vi − α E[vi])²] = E[α² (vi − E[vi])²] = α² E[(vi − E[vi])²] = α² σi²

σij = E[vi vj] − E[vi] E[vj] = E[vi (α vi + β)] − E[vi] E[α vi + β] =
    = α E[vi²] + β E[vi] − E[vi] (α E[vi] + β) = α E[vi²] − α E[vi]² = α (E[vi²] − E[vi]²) = α σi²

so that ρij = σij / (σi σj) = α σi² / (σi · |α| σi) = α / |α|.
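The linear-dependence result can be verified on samples; in the sketch below (ours; the coefficients −2.5 and 0.5 are arbitrary) the sample correlation of exactly linearly related data comes out as ±1 up to floating-point rounding:

```python
import math
import random

# If vj = alpha*vi + beta exactly, the correlation coefficient equals sgn(alpha).
random.seed(4)
vi = [random.gauss(1, 3) for _ in range(10_000)]

def corr(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sab = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    sa = math.sqrt(sum((x - ma) ** 2 for x in a) / n)
    sb = math.sqrt(sum((y - mb) ** 2 for y in b) / n)
    return sab / (sa * sb)

rho_neg = corr(vi, [-2.5 * x + 7 for x in vi])   # alpha = -2.5 < 0
rho_pos = corr(vi, [0.5 * x - 1 for x in vi])    # alpha = +0.5 > 0
print(rho_neg, rho_pos)  # -1 and +1 (up to rounding)
```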
14 Note that, if the random variables vi and vj are mutually stochastically independent, they are also uncorrelated, while the converse is not always true. In fact, if vi and vj are mutually stochastically independent, then f(xi, xj) = f_i(xi) f_j(xj) and:

E[vi vj] = ∫∫ xi xj f(xi, xj) dxi dxj = ∫∫ xi xj f_i(xi) f_j(xj) dxi dxj =
         = ( ∫ xi f_i(xi) dxi ) · ( ∫ xj f_j(xj) dxj ) = E[vi] E[vj]  ⇒  ρij = 0

If vi and vj are jointly Gaussian and uncorrelated, they are also mutually independent.
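A classical counterexample showing that uncorrelated does not imply independent: if v is symmetric around 0, then v and v² are uncorrelated (E[v·v²] = E[v³] = 0 = E[v]E[v²]) yet clearly dependent. A quick numerical check (our sketch):

```python
import random

# v standard normal (symmetric around 0), w = v^2: uncorrelated but dependent.
random.seed(5)
v = [random.gauss(0, 1) for _ in range(200_000)]
w = [x * x for x in v]
n = len(v)
mv, mw = sum(v) / n, sum(w) / n
cov_vw = sum((x - mv) * (y - mw) for x, y in zip(v, w)) / n
print(cov_vw)  # close to 0, even though w is a deterministic function of v
```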
15 Let us consider a vector random variable v = [v1 v2 ⋯ vn]^T. The correlation matrix or normalized covariance matrix ρv ∈ R^{n×n} is defined as:

ρv = [ ρ11 ρ12 ⋯ ρ1n ; ρ12 ρ22 ⋯ ρ2n ; ⋯ ; ρ1n ρ2n ⋯ ρnn ] = [ 1 ρ12 ⋯ ρ1n ; ρ12 1 ⋯ ρ2n ; ⋯ ; ρ1n ρ2n ⋯ 1 ]

Main properties of ρv:
- it is symmetric, i.e., ρv = ρv^T
- it is positive semidefinite, i.e., ρv ≥ 0, since x^T ρv x ≥ 0, ∀x ∈ R^n
- the eigenvalues λ_i(ρv) ≥ 0, ∀i = 1, ..., n
- det(ρv) = ∏_{i=1}^{n} λ_i(ρv) ≥ 0
- [ρv]_ii = ρii = σii / σi² = σi² / σi² = 1
- [ρv]_ij = ρij = correlation coefficient of vi and vj, i ≠ j
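The normalization [ρv]_ij = σij / (σi σj) is easy to carry out given a covariance matrix; the sketch below (the 3×3 matrix is an arbitrary positive-semidefinite example of ours) shows the diagonal becoming 1 by construction:

```python
import math

# Correlation (normalized covariance) matrix from a given covariance matrix:
# rho[i][j] = Sigma[i][j] / (sigma_i * sigma_j), with sigma_i = sqrt(Sigma[i][i]).
Sigma = [[ 4.0, 1.2, -0.8],
         [ 1.2, 1.0,  0.3],
         [-0.8, 0.3,  2.25]]
s = [math.sqrt(Sigma[i][i]) for i in range(3)]
rho = [[Sigma[i][j] / (s[i] * s[j]) for j in range(3)] for i in range(3)]
print(rho[0][0], rho[1][1], rho[2][2])  # unit diagonal
print(rho[0][1], 1.2 / (2.0 * 1.0))     # equal by definition
```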
16 Relevant case #1: if a vector random variable v = [v1 v2 ⋯ vn]^T is such that all its components are mutually uncorrelated (i.e., σij = ρij = 0, ∀i ≠ j), then:

Σv = diag(σ1², σ2², ..., σn²)
ρv = I_{n×n}

Obviously, the same result holds if all the components of v are mutually independent.
17 Relevant case #2: if a vector random variable v = [v1 v2 ⋯ vn]^T is such that all its components are mutually uncorrelated (i.e., σij = ρij = 0, ∀i ≠ j) and have the same standard deviation (i.e., σi = σ, ∀i), then:

Σv = σ² I_{n×n}
ρv = I_{n×n}

Obviously, the same result holds if all the components of v are mutually independent.
18 Gaussian or normal random variables

A scalar Gaussian or normal random variable v is such that its p.d.f. turns out to be:

f(x) = 1 / (√(2π) σv) · exp( −(x − v̄)² / (2σv²) ), with v̄ = E[v] and σv² = Var[v]

and the notations v ∼ N(v̄, σv²) or v ∼ G(v̄, σv²) are used.

If w = αv + β, where v is a scalar normal random variable and α, β ∈ R, then:

w ∼ N(w̄, σw²) = N(α v̄ + β, α² σv²)

Note that, if α = 1/σv and β = −v̄/σv, then w ∼ N(0, 1), i.e., w has a normalized p.d.f.:

f(x) = 1/√(2π) · exp(−x²/2)
19 The probability P_k that the outcome of a scalar normal random variable v differs from the mean value v̄ by no more than k times the standard deviation σv is equal to:

P_k = P(v̄ − k σv ≤ v ≤ v̄ + k σv) = P(|v − v̄| ≤ k σv) = ∫_{−k}^{+k} 1/√(2π) · exp(−x²/2) dx

In particular, it turns out that:

k = 1: P_k ≈ 68.27%
k = 2: P_k ≈ 95.45%
k = 3: P_k ≈ 99.73%

and this allows one to define suitable confidence intervals for the random variable v.
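These percentages can be reproduced with the error function, since the integral above equals erf(k/√2) (a standard identity; the short check below is ours):

```python
import math

# P_k = P(|v - vbar| <= k*sigma) for a normal variable, via P_k = erf(k/sqrt(2)).
for k in (1, 2, 3):
    print(k, 100 * math.erf(k / math.sqrt(2)))  # ≈ 68.27, 95.45, 99.73
```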
20 A vector normal random variable v = [v1 v2 ⋯ vn]^T is such that its p.d.f. is:

f(x) = 1 / ( (2π)^{n/2} √(det Σv) ) · exp( −(1/2) (x − v̄)^T Σv⁻¹ (x − v̄) )

where v̄ = E[v] ∈ R^n and Σv = Var[v] ∈ R^{n×n}.

n scalar normal variables vi, i = 1, ..., n, are said to be jointly Gaussian if the vector random variable v = [v1 v2 ⋯ vn]^T is normal.

Main properties:
- if v1, ..., vn are jointly Gaussian, then any vi, i = 1, ..., n, is also normal, while the converse is not always true
- if v1, ..., vn are normal and independent, then they are also jointly Gaussian
- if v1, ..., vn are jointly Gaussian and uncorrelated, they are also independent
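For n = 2 the formula can be evaluated directly; the sketch below (our illustration, with an arbitrarily chosen mean and covariance) computes the bivariate normal p.d.f. and shows that its maximum 1/(2π √(det Σv)) is attained at x = v̄:

```python
import math

# Bivariate (n = 2) normal p.d.f. evaluated from the formula above.
vbar = (1.0, -1.0)                    # mean vector (illustrative values)
Sigma = ((2.0, 0.6),
         (0.6, 1.0))                  # covariance matrix (illustrative values)

def normal_pdf_2d(x):
    det = Sigma[0][0] * Sigma[1][1] - Sigma[0][1] * Sigma[1][0]
    inv = (( Sigma[1][1] / det, -Sigma[0][1] / det),
           (-Sigma[1][0] / det,  Sigma[0][0] / det))
    d = (x[0] - vbar[0], x[1] - vbar[1])
    q = (d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1])
         + d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))   # (x-vbar)^T Sigma^-1 (x-vbar)
    return math.exp(-0.5 * q) / (2 * math.pi * math.sqrt(det))

print(normal_pdf_2d(vbar))  # peak value: 1 / (2*pi*sqrt(det Sigma))
```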
UCSD ECE53 Handout #34 Prof Young-Han Kim Tuesday, May 7, 04 Solutions to Homework Set #6 (Prepared by TA Fatemeh Arbabjolfaei) Linear estimator Consider a channel with the observation Y XZ, where the
More informationReview of Statistics
Economics 672 Fall 2017 Tauchen 1 Random Variables Review of Statistics A random variable takes on values as determined by its probability distribution. The probability distribution can be discrete, continuous,
More informationChapter 6. Random Processes
Chapter 6 Random Processes Random Process A random process is a time-varying function that assigns the outcome of a random experiment to each time instant: X(t). For a fixed (sample path): a random process
More informationFINAL EXAM: Monday 8-10am
ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 016 Instructor: Prof. A. R. Reibman FINAL EXAM: Monday 8-10am Fall 016, TTh 3-4:15pm (December 1, 016) This is a closed book exam.
More informationStatistical signal processing
Statistical signal processing Short overview of the fundamentals Outline Random variables Random processes Stationarity Ergodicity Spectral analysis Random variable and processes Intuition: A random variable
More informationReview of Probability Theory
Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving
More informationChp 4. Expectation and Variance
Chp 4. Expectation and Variance 1 Expectation In this chapter, we will introduce two objectives to directly reflect the properties of a random variable or vector, which are the Expectation and Variance.
More informationChapter 1. Sets and probability. 1.3 Probability space
Random processes - Chapter 1. Sets and probability 1 Random processes Chapter 1. Sets and probability 1.3 Probability space 1.3 Probability space Random processes - Chapter 1. Sets and probability 2 Probability
More informationUQ, Semester 1, 2017, Companion to STAT2201/CIVL2530 Exam Formulae and Tables
UQ, Semester 1, 2017, Companion to STAT2201/CIVL2530 Exam Formulae and Tables To be provided to students with STAT2201 or CIVIL-2530 (Probability and Statistics) Exam Main exam date: Tuesday, 20 June 1
More information2.7 The Gaussian Probability Density Function Forms of the Gaussian pdf for Real Variates
.7 The Gaussian Probability Density Function Samples taken from a Gaussian process have a jointly Gaussian pdf (the definition of Gaussian process). Correlator outputs are Gaussian random variables if
More informationLIST OF FORMULAS FOR STK1100 AND STK1110
LIST OF FORMULAS FOR STK1100 AND STK1110 (Version of 11. November 2015) 1. Probability Let A, B, A 1, A 2,..., B 1, B 2,... be events, that is, subsets of a sample space Ω. a) Axioms: A probability function
More informationQuestion 1. Question 4. Question 2. Question 5. Question 3. Question 6.
İstanbul Kültür University Faculty of Engineering MCB17 Introduction to Probability Statistics Second Midterm Fall 21-21 Number: Name: Department: Section: Directions You have 9 minutes to complete the
More informationIf we want to analyze experimental or simulated data we might encounter the following tasks:
Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction
More informationStochastic Process II Dr.-Ing. Sudchai Boonto
Dr-Ing Sudchai Boonto Department of Control System and Instrumentation Engineering King Mongkuts Unniversity of Technology Thonburi Thailand Random process Consider a random experiment specified by the
More informationTransformation of Probability Densities
Transformation of Probability Densities This Wikibook shows how to transform the probability density of a continuous random variable in both the one-dimensional and multidimensional case. In other words,
More informationProbability Theory Review Reading Assignments
Probability Theory Review Reading Assignments R. Duda, P. Hart, and D. Stork, Pattern Classification, John-Wiley, 2nd edition, 2001 (appendix A.4, hard-copy). "Everything I need to know about Probability"
More informationThe random variable 1
The random variable 1 Contents 1. Definition 2. Distribution and density function 3. Specific random variables 4. Functions of one random variable 5. Mean and variance 2 The random variable A random variable
More informationProbability. Carlo Tomasi Duke University
Probability Carlo Tomasi Due University Introductory concepts about probability are first explained for outcomes that tae values in discrete sets, and then extended to outcomes on the real line 1 Discrete
More informationEXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, Statistical Theory and Methods I. Time Allowed: Three Hours
EXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, 008 Statistical Theory and Methods I Time Allowed: Three Hours Candidates should answer FIVE questions. All questions carry equal marks.
More information