Statistics for scientists and engineers


February 2006

Contents

1 Introduction
  1.1 Motivation - why study statistics?
  1.2 Examples
  1.3 Statistics

2 Probability theory
  2.1 Random variables
  2.2 Vectors of random variables
  2.3 Conditioning
  2.4 Moments and expected values
  2.5 Transformation of random variables
      2.5.1 One-dimensional case
      2.5.2 Multi-variate case
      2.5.3 Example
      2.5.4 Special case: affine transformations

3 Some useful distributions
  3.1 Continuous RVs
      3.1.1 Dirac delta distribution
      3.1.2 Uniform distribution
      3.1.3 Normal distribution
      3.1.4 Multi-variate normal distribution
      3.1.5 Exponential distribution
      3.1.6 Chi-squared distribution
      3.1.7 Beta distribution
      3.1.8 Gamma distribution
  3.2 Discrete RVs
      3.2.1 Kronecker delta distribution
      3.2.2 Bernoulli distribution
      3.2.3 Binomial distribution
      3.2.4 Poisson distribution

4 Some important relations
  4.1 Orthonormal transformation of independent normal RVs
  4.2 Statistics of normal random variables
  4.3 Chi-squared distribution
  4.4 Relation between Chi-squared and Gamma distribution
  4.5 Relation between Gamma and Beta distributions
  4.6 Statistics from normal RVs

1 Introduction

1.1 Motivation - why study statistics?

Statistical modeling and analysis, including the collection and interpretation of data, form an essential part of the scientific method in fields as diverse as the social, biological, and physical sciences. Statistical theory is primarily based on the mathematical theory of probability, and covers a wide range of topics, from highly abstract areas to topics directly relevant for applications. The main goal of the theory of statistics is to draw information from data. Data can come in a variety of forms, including signals in continuous time and lists of discrete-time values in several dimensions. Important aspects include:

1. Model construction: to get insight into a problem, we build a (generally drastically oversimplified) model of the problem. This model should capture the properties in which we are interested and abstract away everything else. Example: Shannon's binary symmetric channel (BSC) in information theory.

2. Methods: given a certain model, we derive methods for extracting useful information from the data. Example: MMSE estimation.

3. Performance comparison: different methods need to be compared in terms of certain performance criteria. We also introduce notions of optimality. Example: the information inequality.

4. Algorithm design: in most cases, estimation methods do not lead to closed-form solutions. We need to develop clever numerical methods to solve these problems. Examples: Newton-Raphson, Expectation-Maximization, Turbo codes.

1.2 Examples

- A sequence of N elements comes off an assembly line. An unknown number θ of these elements is defective. We would like to know what θ is, but we do not have the time or resources to investigate each of the elements. We draw n elements without replacement and try to determine θ.

- We wish to study how the income of a large population (e.g., grad students) is distributed. An exhaustive study of the entire population is impossible. We base our study on n samples.
- We make n observations to determine a constant µ. The observations are corrupted by random fluctuations. For instance, we transmit a symbol b ∈ {−1, +1} n times and try to recover that bit in the presence of thermal noise.

To solve these problems, we first need to construct a model. Let us consider some simple models:

- Define a random variable X_k indicating whether or not item k out of the n items is defective, so X_k ∈ {defective, OK}. A possible model could be independent observations with p_{X_k}(defective) = θ/N and p_{X_k}(OK) = 1 − θ/N. This model may not be correct: for instance, we may have drawn our samples just after a machine in the assembly line was repaired.

- We introduce a random vector containing the incomes of n people, X = [X_1, ..., X_n]. The joint distribution of these incomes is given by p_X(x). Let us assume the incomes are independent, so that p_X(x) = ∏_{k=1}^n p_{X_k}(x_k). Finally, let us model p_{X_k}(x_k) as a normal distribution with mean µ and variance σ², both independent of k.

- The observation is a random vector X = [X_1, ..., X_n], with X_k = µ + W_k, where W_k is a noise sample. Clearly, p_X(x) can be found from p_W(w). Now we can introduce some additional assumptions regarding p_W(w). We can say that the noise at time k does not depend on the noise at time l ≠ k: the noise samples are independent. In that case p_W(w) = ∏_{k=1}^n p_{W_k}(w_k). We can also assume the noise samples are identically distributed, so that p_{W_k}(w_k) is independent of k. The specific distribution p_W(w) depends on the physical properties of the noise.
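The third model can be simulated directly. Below is a minimal sketch; the Gaussian noise, the parameter values, and the sample size are assumptions for illustration (the notes deliberately leave p_W unspecified):

```python
import numpy as np

# Sketch of the observation model X_k = mu + W_k with assumed iid
# Gaussian noise W_k ~ N(0, sigma^2).
rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 0.5, 100_000

w = rng.normal(0.0, sigma, size=n)   # iid noise samples W_k
x = mu + w                           # observations X_k

mu_hat = x.mean()                    # a natural estimate of mu
```

With n this large, mu_hat lands very close to the true µ, which is the point of making many noisy observations.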

We see that we always need to introduce some simplifying assumptions. These are generally based on knowledge regarding the observations. For instance, when some observations look normally distributed, we model them as such. These models are not always 100% accurate. Look at the income distribution: a normal distribution allows some people to have negative incomes!

1.3 Statistics

Definition: a statistic T is a map from an observation space Ω to some space of values J. T(x) is usually what you compute after you observe x. J commonly lies within a Euclidean space. The choice of statistic is closely related to what we are trying to infer from the data. Examples:

- the fraction of defective items out of n samples;
- the sample mean T(x) = (1/n) Σ_{k=1}^n x_k = x̄;
- the sample variance T(x) = (1/n) Σ_{k=1}^n (x_k − x̄)².

2 Probability theory

We now give a not-too-rigorous introduction to the basics of probability.

2.1 Random variables

Given a sample space Ω (e.g., heads or tails: Ω = {H, T}), a random variable (RV) X is a mapping from Ω to the real numbers R. When Ω is finite or countably infinite, X is said to be a discrete RV; otherwise X is a continuous RV. With each RV we can associate a cumulative distribution function (CDF):

F_X(x) = P{ω ∈ Ω : X(ω) ≤ x}

where P{E} is the probability of the event E. We generally abuse notation and write P{X ≤ x} instead of P{ω ∈ Ω : X(ω) ≤ x}.

We will often use probability density functions (pdfs) p_X(x), defined through

∫_a^b p_X(x) dx = P{a ≤ X ≤ b} = P{ω ∈ Ω : a ≤ X(ω) ≤ b}.

Note that p_X(x) and P{X = x} are not necessarily the same thing! Note also that

∫_{−∞}^{+∞} p_X(x) dx = 1.

For continuous RVs, when F_X(x) is differentiable,

d F_X(x) / dx = p_X(x).

Similarly,

F_X(x) = ∫_{−∞}^x p_X(u) du.

In the case of discrete RVs we use a slightly different terminology: F_X(x) is the cumulative mass function (CMF), while p_X(x) is the probability mass function (pmf). In that case we have p_X(x) = P{X = x}.
For discrete RVs, all integrals should be replaced by summations, and the parts dealing with differentiation do not apply.
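The example statistics listed above (sample mean and sample variance) are one-liners in practice. A sketch, with hypothetical N(2, 9) observations standing in for real data:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(2.0, 3.0, size=50_000)   # hypothetical observations

# T(x) = (1/n) sum_k x_k : the sample mean
sample_mean = x.sum() / x.size

# T(x) = (1/n) sum_k (x_k - xbar)^2 : the sample variance
sample_var = ((x - sample_mean) ** 2).sum() / x.size
```

Both statistics are real numbers computed from the observation vector, as in the definition of T.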

2.2 Vectors of random variables

The concept of a RV is easily extended to the multi-dimensional case. We can group n random variables X_1, ..., X_n in a vector X. This vector is again a random variable, with probability density function p_X(x). X_1, ..., X_n are said to be mutually independent when

p_X(x) = ∏_{k=1}^n p_{X_k}(x_k).

X_1, ..., X_n are said to be identically distributed when p_{X_k}(x) = p_{X_l}(x) for any k and l. In many problems we will consider variables which are independent and identically distributed (iid). The marginal distributions p_{X_k}(x_k), k = 1, ..., n, can be obtained as follows:

p_{X_k}(x_k) = ∫ ... ∫ p_X(x) dx_1 ... dx_{k−1} dx_{k+1} ... dx_n.

2.3 Conditioning

Given two random variables X and Y, the conditional probability function p_{X|Y}(x|y) is given by

p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y)

for p_Y(y) ≠ 0. This is to be interpreted as the probability function of x, given that Y = y. In p_{X|Y}(x|y), x is the random variable, while y should be interpreted as a parameter. Also, ∫ p_{X|Y}(x|y) dx = 1 for all y, while ∫ p_{X|Y}(x|y) dy = g(x) for some function g (not necessarily equal to 1).

Examples: What happens when X and Y are independent? Two dice, with X_k being the number of eyes of die k and Y_k ∈ {even, odd}. Determine p_{X_1,X_2}(x_1, x_2) and p_{X_1,Y_1}(x_1, y_1), as well as the conditional probability functions.

Bayes' rule: probably one of the most useful results is Bayes' rule. Looking back at the definition of conditioning, we easily find that

p_{X,Y}(x, y) = p_{X|Y}(x|y) p_Y(y) = p_{Y|X}(y|x) p_X(x)

so that

p_{X|Y}(x|y) = p_{Y|X}(y|x) p_X(x) / p_Y(y).

Learn this rule by heart; you will be using it a lot! As a variation, note that p_Y(y) = ∫ p_{X,Y}(x, y) dx = ∫ p_{Y|X}(y|x) p_X(x) dx, so that

p_{X|Y}(x|y) = p_{Y|X}(y|x) p_X(x) / ∫ p_{Y|X}(y|x) p_X(x) dx

which is known as Bayes' theorem.
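The dice example above can be worked numerically with Bayes' rule. A minimal sketch in exact arithmetic (the fair-die assumption is mine):

```python
from fractions import Fraction

# X is the number of eyes of a fair die, Y its parity.
# We recover p_{X|Y} from the prior p_X and the likelihood p_{Y|X}.
p_x = {x: Fraction(1, 6) for x in range(1, 7)}
parity = {x: ('even' if x % 2 == 0 else 'odd') for x in range(1, 7)}

def p_x_given_y(x, y):
    # Bayes: p(x|y) = p(y|x) p(x) / sum_x' p(y|x') p(x')
    lik = Fraction(1) if parity[x] == y else Fraction(0)
    evid = sum(p_x[xp] for xp in p_x if parity[xp] == y)
    return lik * p_x[x] / evid
```

Conditioning on "even" concentrates the mass uniformly on {2, 4, 6}, each with probability 1/3, and the conditional distribution sums to 1 for either value of y, as it must.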

2.4 Moments and expected values

We introduce the expectation operator on a RV X: given a function g : R → Γ, the expectation (or expected value) of g w.r.t. p_X(x) is given by

E_X{g(X)} = ∫ g(x) p_X(x) dx.

Observe that E_X{g(X)} is just an element of Γ, no longer dependent on x. Special cases are the moments and the central moments:

µ_n = E_X{X^n}
µ̄_n = E_X{(X − µ_1)^n}.

The mean and variance of a distribution are given by µ_1 and µ̄_2, respectively. We will sometimes denote the mean by µ and the variance by σ². The standard deviation is given by σ.

Properties of the expectation operator:

- Linearity: E_X{g_1(X) + g_2(X)} = E_X{g_1(X)} + E_X{g_2(X)}.
- Uncorrelated RVs: X and Y are said to be uncorrelated when E_{X,Y}{XY} = E_X{X} E_Y{Y}. Show that independent RVs are uncorrelated. Show that uncorrelated RVs are not necessarily independent.
- Conditional expectation: E_{X|Y}{g(X)} = ∫ g(x) p_{X|Y}(x|Y) dx, which is a function of the RV Y.
- Iterated expectations: E_{X,Y}{g(X, Y)} = E_Y{E_{X|Y}{g(X, Y)}}.
- Expectations and functions: let Y = g(X); then E_X{g(X)} = E_Y{Y}, so we can evaluate E_Y{Y} without explicit knowledge of p_Y(y).

2.5 Transformation of random variables

The discrete case is trivial, so we will focus on continuous RVs.

2.5.1 One-dimensional case

Given a RV X and an invertible function f : R → R, we wish to determine the probability distribution of Y = f(X). We see that X = f^{−1}(Y) = g(Y). It is easily verified that

p_Y(y) = p_X(g(y)) |g′(y)|.

When f is not invertible, we use a different technique:

F_Y(y) = P{Y ≤ y} = P{f(X) ≤ y}

which should be evaluated further and then differentiated w.r.t. y.
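The one-dimensional change-of-variables formula can be sanity-checked by simulation. A sketch, with the invertible map Y = exp(X) and X ~ N(0, 1) chosen as an assumed example, so that g(y) = log y and |g′(y)| = 1/y:

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.standard_normal(500_000)
y = np.exp(x)                       # Y = f(X) = exp(X), invertible

def p_X(t):
    # standard normal pdf
    return np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

# density predicted by p_Y(y) = p_X(g(y)) |g'(y)| with g(y) = log y
y0 = 1.5
p_y0 = p_X(np.log(y0)) / y0

# empirical density near y0, estimated from the simulated samples
h = 0.05
emp = (np.abs(y - y0) < h).mean() / (2 * h)
```

The predicted density and the histogram-style estimate agree to within Monte Carlo error, which is what the formula claims.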

2.5.2 Multi-variate case

Given real-valued RVs X_1, X_2, ..., X_N and functions h_1, ..., h_N with h_k : R^N → R, we define Y = [Y_1, ..., Y_N]^T and X = [X_1, ..., X_N]^T, where Y_n = h_n(X), or simply Y = h(X). We assume h is one-to-one (invertible), so that X = h^{−1}(Y) with X_n = g_n(Y). We now introduce the Jacobian as the determinant of the matrix J(y) with

[J(y)]_{k,n} = ∂g_k(y) / ∂y_n.

Then

p_Y(y) = p_X(h^{−1}(y)) |det J(y)|.

2.5.3 Example

Problem: given X_1 and X_2 with known p_{X_1,X_2}(x_1, x_2), and Y_1 = X_1 + X_2, determine p_{Y_1}(y_1).

Solution 1: we see that the transformation is not one-to-one. So we first introduce Y_2 = X_1 − X_2. Now Y = h(X) is invertible: given Y_1 and Y_2, we find X_1 and X_2 as

X_1 = (Y_1 + Y_2)/2 = g_1(Y_1, Y_2)
X_2 = (Y_1 − Y_2)/2 = g_2(Y_1, Y_2)

so that

[J(y)]_{1,1} = ∂g_1/∂y_1 = 1/2
[J(y)]_{1,2} = ∂g_1/∂y_2 = 1/2
[J(y)]_{2,1} = ∂g_2/∂y_1 = 1/2
[J(y)]_{2,2} = ∂g_2/∂y_2 = −1/2

so that

J(y) = [ 1/2   1/2 ]
       [ 1/2  −1/2 ]

and |det J(y)| = 1/2. Hence

p_{Y_1,Y_2}(y_1, y_2) = (1/2) p_{X_1,X_2}((y_1 + y_2)/2, (y_1 − y_2)/2).

And finally,

p_{Y_1}(y_1) = ∫ p_{Y_1,Y_2}(y_1, y_2) dy_2.

Solution 2 (for X_1 and X_2 independent): since

F_{Y_1|X_2}(y_1|x_2) = P{Y_1 ≤ y_1 | X_2 = x_2} = P{X_1 + X_2 ≤ y_1 | X_2 = x_2} = P{X_1 ≤ y_1 − x_2} = F_{X_1}(y_1 − x_2)

we have

p_{Y_1|X_2}(y_1|x_2) = p_{X_1}(y_1 − x_2)

and

p_{Y_1}(y_1) = ∫ p_{X_1}(y_1 − x_2) p_{X_2}(x_2) dx_2

which can be interpreted as a convolution of two pdfs.

2.5.4 Special case: affine transformations

Introduce an N × N matrix A and an N × 1 vector c, and define Y = h(X) = AX + c; then h(·) is an affine transformation. When A is invertible, X = A^{−1}(Y − c), J(y) = A^{−1}, and

p_Y(y) = p_X(A^{−1}(y − c)) |det(A^{−1})| = p_X(A^{−1}(y − c)) / |det(A)|.

When A is an invertible square matrix with A A^T = A^T A = I, we say that A is orthonormal. Orthonormal matrices are norm-preserving: ‖AX‖ = ‖X‖, where ‖X‖² = Σ_k X_k².
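The convolution result of Solution 2 can be checked numerically. A sketch, for the assumed case of two independent U(0, 1) variables, whose sum has the triangular density on [0, 2]:

```python
import numpy as np

# Discretized check of p_Y(y) = ∫ p_{X1}(y − x2) p_{X2}(x2) dx2
# for X1, X2 ~ U(0, 1) independent.
dx = 0.001
x = np.arange(0.0, 1.0, dx)
p1 = np.ones_like(x)              # pdf of X1 ~ U(0,1) on its support
p2 = np.ones_like(x)              # pdf of X2 ~ U(0,1) on its support

p_sum = np.convolve(p1, p2) * dx  # Riemann-sum convolution
y = np.arange(p_sum.size) * dx    # grid for the sum Y = X1 + X2

# triangular density on [0, 2]: peak value 1 at y = 1
peak = p_sum[np.argmin(np.abs(y - 1.0))]
```

The discretized convolution integrates to 1 and reproduces the triangular peak, up to the grid resolution dx.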

3 Some useful distributions

For a more exhaustive list, consult a standard reference. With each distribution we will provide the mean µ and the variance σ².

3.1 Continuous RVs

3.1.1 Dirac delta distribution

The Dirac delta distribution is used when we have absolute certainty regarding a random variable. We write X ~ δ(x), where δ(x) (not a function!) is defined through

∫ f(x) δ(x) dx = f(0)

and has µ = σ² = 0.

3.1.2 Uniform distribution

X ~ U(a, b), with b > a; then

p_X(x) = 1/(b − a) for a < x < b, and 0 else.

Also,

µ = (a + b)/2
σ² = (b − a)²/12.

3.1.3 Normal distribution

Probably the most important distribution. X ~ N(µ, σ²):

p_X(x) = (1/√(2πσ²)) exp(−(x − µ)²/(2σ²)).

3.1.4 Multi-variate normal distribution

X = [X_1, ..., X_N] is said to have a multi-variate normal distribution, X ~ N(m, Σ), if its probability function has the form

p_X(x) = (2π)^{−N/2} (det Σ)^{−1/2} exp(−(1/2)(x − m)^T Σ^{−1}(x − m))

where

E_X{X} = m
E_X{(X − m)(X − m)^T} = Σ.

Properties:

- When X ~ N(m, Σ), then X_k ~ N(µ_k, σ_k²). Determine µ_k and σ_k².
- When X ~ N(m, Σ) and X_k and X_l are uncorrelated, then they are also independent.
- X_k ~ N(µ_k, σ_k²) for every k does not imply X ~ N(m, Σ)!
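Sampling from the multi-variate normal is itself an affine transformation (Section 2.5.4): if Z has iid N(0, 1) components and A A^T = Σ, then AZ + m ~ N(m, Σ). A sketch with assumed values of m and Σ, using the Cholesky factor for A:

```python
import numpy as np

rng = np.random.default_rng(2)
m = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

A = np.linalg.cholesky(Sigma)      # A @ A.T == Sigma
Z = rng.standard_normal((2, 200_000))
X = (A @ Z).T + m                  # samples from N(m, Sigma)

emp_mean = X.mean(axis=0)
emp_cov = np.cov(X.T)
```

The empirical mean and covariance recover m and Σ, matching the two expectation identities above.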

3.1.5 Exponential distribution

X has an exponential distribution with rate parameter λ > 0 when

p_X(x) = λ e^{−λx} for x > 0, and 0 else.

Then

µ = 1/λ
σ² = 1/λ².

3.1.6 Chi-squared distribution

When Y_k ~ N(0, 1), with Y_1, ..., Y_n independent, then X = Σ_{k=1}^n Y_k² has a chi-squared distribution with n degrees of freedom. We write X ~ χ²_n, with

p_X(x) = x^{n/2 − 1} e^{−x/2} / (Γ(n/2) 2^{n/2}), x > 0

where Γ(z) is the Gamma function. Also,

µ = n
σ² = 2n.

Properties:

- X_k ~ χ²_{n_k} with X_1, ..., X_L independent; then Σ_{k=1}^L X_k ~ χ²_{n_1 + ... + n_L}.
- Γ(z + 1) = z Γ(z) and Γ(1) = 1, so Γ(n) = (n − 1)!. Also Γ(1/2) = √π.

3.1.7 Beta distribution

X ~ β(r, s) is defined for x ∈ [0, 1], with

p_X(x) = x^{r−1} (1 − x)^{s−1} / B(r, s)

where B(r, s) is the beta function, given by

B(r, s) = Γ(r) Γ(s) / Γ(r + s).

Mean and variance are given by µ = r/(r + s) and σ² = rs / ((r + s)²(r + s + 1)).

3.1.8 Gamma distribution

X ~ Γ(k, θ) for x > 0, k > 0 and θ > 0, with

p_X(x) = x^{k−1} e^{−x/θ} / (Γ(k) θ^k).

Mean and variance are given by µ = kθ and σ² = kθ². Somewhat confusingly, you will sometimes see the rate parameterization λ = 1/θ, writing X ~ Γ(k, λ) with p_X(x) = x^{k−1} e^{−λx} λ^k / Γ(k). Beware!
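The defining construction of χ²_n (a sum of n squared iid standard normals) is easy to check against the quoted moments µ = n and σ² = 2n. A Monte Carlo sketch with an assumed n and trial count:

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 5, 200_000

Y = rng.standard_normal((trials, n))
X = (Y ** 2).sum(axis=1)          # each row gives one X ~ chi^2_n

chi2_mean = X.mean()              # should be close to n
chi2_var = X.var()                # should be close to 2n
```

With n = 5, the sample mean and variance come out near 5 and 10, as the table of moments predicts.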

3.2 Discrete RVs

3.2.1 Kronecker delta distribution

This is the discrete counterpart of the Dirac delta distribution: p_X(x) = δ(x), with

δ(x) = 1 for x = 0, and 0 for x ≠ 0,

and µ = σ² = 0.

3.2.2 Bernoulli distribution

There are two possible outcomes, 0 (failure) and 1 (success), with probabilities 1 − p and p, respectively:

p_X(0) = 1 − p
p_X(1) = p

with

µ = p
σ² = p(1 − p).

3.2.3 Binomial distribution

X is the number of successes out of N Bernoulli trials:

p_X(x) = (N choose x) p^x (1 − p)^{N−x}

for x ∈ {0, 1, ..., N}, where

(N choose x) = N! / (x!(N − x)!)

and

µ = Np
σ² = Np(1 − p).

3.2.4 Poisson distribution

Events occur with a known average rate of λ events per unit of time. Then the number of events X in a unit of time has the following distribution for x ∈ {0, 1, 2, ...}:

p_X(x) = e^{−λ} λ^x / x!

with

µ = λ
σ² = λ.
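The binomial mean and variance quoted above follow directly from the mass function. A small sketch (N and p are arbitrary choices) computing µ and σ² by brute force over the support:

```python
from math import comb

N, p = 10, 0.3

def pmf(x):
    # binomial mass function: C(N, x) p^x (1-p)^(N-x)
    return comb(N, x) * p**x * (1 - p)**(N - x)

mean = sum(x * pmf(x) for x in range(N + 1))            # expect N p
var = sum((x - mean)**2 * pmf(x) for x in range(N + 1)) # expect N p (1-p)
```

With N = 10 and p = 0.3 this gives µ = 3 and σ² = 2.1, exactly Np and Np(1 − p).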

4 Some important relations

4.1 Orthonormal transformation of independent normal RVs

Theorem: let Z = [Z_1, Z_2, ..., Z_n]^T have independent, normally distributed elements with the same variance σ² and expected value E{Z} = d, and let Y = g(Z) = AZ + c, where A is an n × n orthonormal matrix. Then Y has independent normal components with the same variance σ² and E{Y} = Ad + c.

Proof: we know that

p_Y(y) = p_Z(A^{−1}(y − c)) / |det A| = p_Z(A^{−1}(y − c))

since |det A| = 1 for orthonormal matrices. Now, since

p_Z(z) = (2πσ²)^{−n/2} exp(−(1/(2σ²)) Σ_i (z_i − d_i)²) = (2πσ²)^{−n/2} exp(−‖z − d‖²/(2σ²)),

and since A^{−1} = A^T is norm-preserving,

‖A^{−1}(y − c) − d‖ = ‖y − (Ad + c)‖.

This leads to

p_Y(y) = (2πσ²)^{−n/2} exp(−‖y − (Ad + c)‖²/(2σ²))

which proves that the Y_k are independent and normally distributed, with variance σ² and E{Y} = Ad + c. QED.

4.2 Statistics of normal random variables

Theorem: let Z = [Z_1, ..., Z_n] be an iid sample from a N(µ, σ²) population. Then

1. Z̄ = (1/n) Σ_i Z_i and Σ_i (Z_i − Z̄)² are independent;
2. Z̄ ~ N(µ, σ²/n);
3. (1/σ²) Σ_i (Z_i − Z̄)² ~ χ²_{n−1}.

Proof: we introduce an orthonormal matrix A and Y = AZ, with Y_1 = √n Z̄ and Σ_{i=2}^n Y_i² = Σ_i (Z_i − Z̄)². This matrix is constructed as follows: select the first row as [1/√n, 1/√n, ..., 1/√n]; the remaining rows are then obtained by the Gram-Schmidt orthogonalization procedure. Then

Y_1 = (1/√n) Σ_i Z_i = √n Z̄

and

Σ_{i=1}^n Y_i² = ‖Y‖² = ‖AZ‖² = ‖Z‖² = Σ_i Z_i².

We find that

Σ_{i=2}^n Y_i² = Σ_i Z_i² − n Z̄² = Σ_i (Z_i − Z̄)².

Since A is an orthonormal matrix, a_i^T a_j = δ(i − j), where a_i^T denotes the i-th row of A. Since a_1^T = [1/√n, ..., 1/√n], we see that for j ≠ 1

a_j^T a_1 = (1/√n) Σ_i a_{ji} = 0

so that for j ≠ 1

E{Y_j} = E{a_j^T Z} = a_j^T E{Z} = µ Σ_i a_{ji} = 0.

We can draw the following conclusions:

- Y_1, Y_2, ..., Y_n are independent normal RVs with variance σ², with E{Y_j} = 0 for j ≠ 1 and E{Y_1} = √n µ.
- Y_1 ~ N(√n µ, σ²), so that Z̄ = Y_1/√n ~ N(µ, σ²/n). This proves the second part of the theorem.
- Y_2, ..., Y_n ~ N(0, σ²) iid, so that Y_2/σ, ..., Y_n/σ ~ N(0, 1). We see that (1/σ²) Σ_{i=2}^n Y_i² ~ χ²_{n−1}. This proves the third part of the theorem.
- Since Y_2, ..., Y_n are independent of Y_1, it follows that Z̄ is independent of Σ_i (Z_i − Z̄)². This proves the first part of the theorem.

4.3 Chi-squared distribution

Problem: Z ~ N(µ, σ²). Determine the distribution of X = ((Z − µ)/σ)². Relate the result to the χ² distribution.

Solution:

Let Y = (Z − µ)/σ, with Y ~ N(0, 1). Since X = Y² is a non-invertible function of Y, we cannot use the Jacobian. However,

F_X(x) = P{X ≤ x} = P{Y² ≤ x} = P{−√x ≤ Y ≤ √x} = F_Y(√x) − F_Y(−√x).

Taking the derivative w.r.t. x, and noting that p_Y(y) is an even function:

p_X(x) = p_Y(√x)/(2√x) + p_Y(−√x)/(2√x) = p_Y(√x)/√x = exp(−x/2)/√(2πx).

If we consider the χ²_1 distribution, this gives us the same result:

x^{1/2 − 1} e^{−x/2} / (Γ(1/2) 2^{1/2}) = e^{−x/2} / √(2πx).

4.4 Relation between Chi-squared and Gamma distribution

Problem: how are the gamma distribution and the χ² distribution related?

Solution: Γ(k, θ) with k = n/2 and θ = 2 yields

p_X(x) = x^{n/2 − 1} e^{−x/2} / (Γ(n/2) 2^{n/2})

which is clearly equal to the χ²_n distribution, i.e. χ²_n = Γ(n/2, 2).
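The theorem of Section 4.2 can be checked by simulation: for iid N(µ, σ²) samples, the sample mean and Σ_i (Z_i − Z̄)² should be uncorrelated (they are in fact independent), and Σ_i (Z_i − Z̄)²/σ² should have the χ²_{n−1} mean, n − 1. A Monte Carlo sketch with assumed parameters:

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, n, trials = 2.0, 1.5, 8, 100_000

Z = rng.normal(mu, sigma, size=(trials, n))
Zbar = Z.mean(axis=1)
SS = ((Z - Zbar[:, None]) ** 2).sum(axis=1)

corr = np.corrcoef(Zbar, SS)[0, 1]    # should be close to 0
mean_SS = (SS / sigma**2).mean()      # should be close to n - 1 = 7
```

Uncorrelatedness is only a necessary condition for independence, but it is the part that a two-line check can see; the full independence is what the orthonormal-transformation proof delivers.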

4.5 Relation between Gamma and Beta distributions

Problem: X_1 ~ Γ(k_1, θ) and X_2 ~ Γ(k_2, θ), independent. (A) Determine the distribution of Y_1 = X_1 + X_2 and of Y_2 = X_1/(X_1 + X_2). (B) Show that Y_1 and Y_2 are independent. (C) Show from this result that the sum of n squared iid zero-mean, unit-variance normal RVs has a χ²_n distribution.

Solution: (A) Since we can always introduce Z_i = X_i/θ, with Z_i ~ Γ(k_i, 1), we can assume θ = 1 without loss of generality. Since X_1 and X_2 are independent,

p_{X_1,X_2}(x_1, x_2) = (x_1^{k_1−1} e^{−x_1} / Γ(k_1)) (x_2^{k_2−1} e^{−x_2} / Γ(k_2)).

Since Y_1 = X_1 + X_2 and Y_2 = X_1/(X_1 + X_2), we have X_1 = Y_1 Y_2 and X_2 = Y_1(1 − Y_2). Hence

J(y) = [ y_2       y_1 ]
       [ 1 − y_2  −y_1 ]

and |det J(y)| = |−y_1 y_2 − y_1(1 − y_2)| = y_1. We find that

p_{Y_1,Y_2}(y_1, y_2) = p_{X_1,X_2}(y_1 y_2, y_1(1 − y_2)) y_1
                      = (y_1 y_2)^{k_1−1} (y_1(1 − y_2))^{k_2−1} e^{−y_1} y_1 / (Γ(k_1) Γ(k_2))
                      = (y_1^{k_1+k_2−1} e^{−y_1} / Γ(k_1 + k_2)) (y_2^{k_1−1} (1 − y_2)^{k_2−1} / B(k_1, k_2))

so that Y_1 ~ Γ(k_1 + k_2, 1) and Y_2 ~ β(k_1, k_2).

(B) This follows from (A), since p_{Y_1,Y_2}(y_1, y_2) can be written as p_{Y_1}(y_1) p_{Y_2}(y_2).

(C) When X_l ~ Γ(1/2, 2), independent, we now know that Σ_{l=1}^n X_l ~ Γ(n/2, 2). When we introduce Y_l ~ N(0, 1) and X_l = Y_l², then X_l ~ χ²_1 = Γ(1/2, 2). So Σ_{l=1}^n Y_l² ~ Γ(n/2, 2) = χ²_n.

4.6 Statistics from normal RVs

Problem: X = [X_1, ..., X_n] is a vector of iid RVs with X_k ~ N(µ, σ²). Compute the expected value of the following RVs:

(A) the sample mean M = (1/n) Σ_k X_k;
(B) the sample variance with known mean, R = (1/n) Σ_k (X_k − µ)²;
(C) the sample variance with unknown mean, S = (1/n) Σ_k (X_k − M)².

Solution:

(A) E{M} = (1/n) Σ_k E{X_k} = µ.

(B) E{R} = (1/n) Σ_k E{(X_k − µ)²} = (1/n) Σ_k E{X_k² − 2µX_k + µ²} = (1/n) Σ_k (µ² + σ² − 2µ² + µ²) = σ².

(C)

E{S} = E{(1/n) Σ_k (X_k − M)²} = (1/n) Σ_k E{X_k² − 2 X_k M + M²} = (1/n) Σ_k E{X_k²} − E{M²},

where we used Σ_k X_k = nM, so that (1/n) Σ_k E{−2 X_k M} = −2 E{M²}. With E{X_k²} = µ² + σ² and E{M²} = µ² + σ²/n, this gives

E{S} = µ² + σ² − (µ² + σ²/n) = ((n − 1)/n) σ².

Verify that E{X_k²} = µ² + σ² and that E{M²} = σ²/n + µ².
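Two of the results above lend themselves to a quick Monte Carlo check: the Gamma/Beta relation of Section 4.5 and the bias E{S} = ((n − 1)/n) σ² just derived. A sketch (all parameter values and sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
trials = 200_000

# 4.5: for independent X1 ~ Gamma(k1, theta), X2 ~ Gamma(k2, theta),
# Y1 = X1 + X2 has mean (k1 + k2) theta and Y2 = X1/(X1 + X2) has the
# Beta(k1, k2) mean k1/(k1 + k2).
k1, k2, theta = 2.0, 3.0, 1.5
X1 = rng.gamma(k1, theta, size=trials)
X2 = rng.gamma(k2, theta, size=trials)
mean_Y1 = (X1 + X2).mean()            # expect (k1 + k2) theta = 7.5
mean_Y2 = (X1 / (X1 + X2)).mean()     # expect k1/(k1 + k2) = 0.4

# 4.6 (C): the sample variance with unknown mean is biased,
# E{S} = (n - 1)/n * sigma^2.
mu, sigma, n = 0.5, 2.0, 4
X = rng.normal(mu, sigma, size=(trials, n))
M = X.mean(axis=1)
S = ((X - M[:, None]) ** 2).mean(axis=1)
mean_S = S.mean()                     # expect (n - 1)/n * sigma^2 = 3.0
```

The visible bias of S for small n (here a factor 3/4) is exactly why the unbiased estimator divides by n − 1 instead of n.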


More information

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder

More information

Statistical Methods in Particle Physics

Statistical Methods in Particle Physics Statistical Methods in Particle Physics Lecture 3 October 29, 2012 Silvia Masciocchi, GSI Darmstadt s.masciocchi@gsi.de Winter Semester 2012 / 13 Outline Reminder: Probability density function Cumulative

More information

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets

More information

Review of Probability Theory

Review of Probability Theory Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving

More information

Introduction to Probability and Statistics (Continued)

Introduction to Probability and Statistics (Continued) Introduction to Probability and Statistics (Continued) Prof. icholas Zabaras Center for Informatics and Computational Science https://cics.nd.edu/ University of otre Dame otre Dame, Indiana, USA Email:

More information

Gaussian vectors and central limit theorem

Gaussian vectors and central limit theorem Gaussian vectors and central limit theorem Samy Tindel Purdue University Probability Theory 2 - MA 539 Samy T. Gaussian vectors & CLT Probability Theory 1 / 86 Outline 1 Real Gaussian random variables

More information

This does not cover everything on the final. Look at the posted practice problems for other topics.

This does not cover everything on the final. Look at the posted practice problems for other topics. Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry

More information

Lecture 6 Basic Probability

Lecture 6 Basic Probability Lecture 6: Basic Probability 1 of 17 Course: Theory of Probability I Term: Fall 2013 Instructor: Gordan Zitkovic Lecture 6 Basic Probability Probability spaces A mathematical setup behind a probabilistic

More information

Sampling Distributions

Sampling Distributions Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of

More information

EEL 5544 Noise in Linear Systems Lecture 30. X (s) = E [ e sx] f X (x)e sx dx. Moments can be found from the Laplace transform as

EEL 5544 Noise in Linear Systems Lecture 30. X (s) = E [ e sx] f X (x)e sx dx. Moments can be found from the Laplace transform as L30-1 EEL 5544 Noise in Linear Systems Lecture 30 OTHER TRANSFORMS For a continuous, nonnegative RV X, the Laplace transform of X is X (s) = E [ e sx] = 0 f X (x)e sx dx. For a nonnegative RV, the Laplace

More information

Probability Distributions Columns (a) through (d)

Probability Distributions Columns (a) through (d) Discrete Probability Distributions Columns (a) through (d) Probability Mass Distribution Description Notes Notation or Density Function --------------------(PMF or PDF)-------------------- (a) (b) (c)

More information

Contents 1. Contents

Contents 1. Contents Contents 1 Contents 6 Distributions of Functions of Random Variables 2 6.1 Transformation of Discrete r.v.s............. 3 6.2 Method of Distribution Functions............. 6 6.3 Method of Transformations................

More information

2. The CDF Technique. 1. Introduction. f X ( ).

2. The CDF Technique. 1. Introduction. f X ( ). Week 5: Distributions of Function of Random Variables. Introduction Suppose X,X 2,..., X n are n random variables. In this chapter, we develop techniques that may be used to find the distribution of functions

More information

5. Random Vectors. probabilities. characteristic function. cross correlation, cross covariance. Gaussian random vectors. functions of random vectors

5. Random Vectors. probabilities. characteristic function. cross correlation, cross covariance. Gaussian random vectors. functions of random vectors EE401 (Semester 1) 5. Random Vectors Jitkomut Songsiri probabilities characteristic function cross correlation, cross covariance Gaussian random vectors functions of random vectors 5-1 Random vectors we

More information

Exercises and Answers to Chapter 1

Exercises and Answers to Chapter 1 Exercises and Answers to Chapter The continuous type of random variable X has the following density function: a x, if < x < a, f (x), otherwise. Answer the following questions. () Find a. () Obtain mean

More information

CME 106: Review Probability theory

CME 106: Review Probability theory : Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:

More information

Chapter 4 Multiple Random Variables

Chapter 4 Multiple Random Variables Review for the previous lecture Theorems and Examples: How to obtain the pmf (pdf) of U = g ( X Y 1 ) and V = g ( X Y) Chapter 4 Multiple Random Variables Chapter 43 Bivariate Transformations Continuous

More information

Introduction to Machine Learning

Introduction to Machine Learning What does this mean? Outline Contents Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola December 26, 2017 1 Introduction to Probability 1 2 Random Variables 3 3 Bayes

More information

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/ STA263/25//24 Tutorial letter 25// /24 Distribution Theor ry II STA263 Semester Department of Statistics CONTENTS: Examination preparation tutorial letterr Solutions to Assignment 6 2 Dear Student, This

More information

[POLS 8500] Review of Linear Algebra, Probability and Information Theory

[POLS 8500] Review of Linear Algebra, Probability and Information Theory [POLS 8500] Review of Linear Algebra, Probability and Information Theory Professor Jason Anastasopoulos ljanastas@uga.edu January 12, 2017 For today... Basic linear algebra. Basic probability. Programming

More information

EE4601 Communication Systems

EE4601 Communication Systems EE4601 Communication Systems Week 2 Review of Probability, Important Distributions 0 c 2011, Georgia Institute of Technology (lect2 1) Conditional Probability Consider a sample space that consists of two

More information

ECE 4400:693 - Information Theory

ECE 4400:693 - Information Theory ECE 4400:693 - Information Theory Dr. Nghi Tran Lecture 8: Differential Entropy Dr. Nghi Tran (ECE-University of Akron) ECE 4400:693 Lecture 1 / 43 Outline 1 Review: Entropy of discrete RVs 2 Differential

More information

FINAL EXAM: Monday 8-10am

FINAL EXAM: Monday 8-10am ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 016 Instructor: Prof. A. R. Reibman FINAL EXAM: Monday 8-10am Fall 016, TTh 3-4:15pm (December 1, 016) This is a closed book exam.

More information

Probability Review. Gonzalo Mateos

Probability Review. Gonzalo Mateos Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction

More information

[Chapter 6. Functions of Random Variables]

[Chapter 6. Functions of Random Variables] [Chapter 6. Functions of Random Variables] 6.1 Introduction 6.2 Finding the probability distribution of a function of random variables 6.3 The method of distribution functions 6.5 The method of Moment-generating

More information

Discrete Random Variables

Discrete Random Variables CPSC 53 Systems Modeling and Simulation Discrete Random Variables Dr. Anirban Mahanti Department of Computer Science University of Calgary mahanti@cpsc.ucalgary.ca Random Variables A random variable is

More information

Chapter 5. Random Variables (Continuous Case) 5.1 Basic definitions

Chapter 5. Random Variables (Continuous Case) 5.1 Basic definitions Chapter 5 andom Variables (Continuous Case) So far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions on

More information

Things to remember when learning probability distributions:

Things to remember when learning probability distributions: SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions

More information

Sampling Distributions

Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of random sample. For example,

More information

for valid PSD. PART B (Answer all five units, 5 X 10 = 50 Marks) UNIT I

for valid PSD. PART B (Answer all five units, 5 X 10 = 50 Marks) UNIT I Code: 15A04304 R15 B.Tech II Year I Semester (R15) Regular Examinations November/December 016 PROBABILITY THEY & STOCHASTIC PROCESSES (Electronics and Communication Engineering) Time: 3 hours Max. Marks:

More information

Fundamentals. CS 281A: Statistical Learning Theory. Yangqing Jia. August, Based on tutorial slides by Lester Mackey and Ariel Kleiner

Fundamentals. CS 281A: Statistical Learning Theory. Yangqing Jia. August, Based on tutorial slides by Lester Mackey and Ariel Kleiner Fundamentals CS 281A: Statistical Learning Theory Yangqing Jia Based on tutorial slides by Lester Mackey and Ariel Kleiner August, 2011 Outline 1 Probability 2 Statistics 3 Linear Algebra 4 Optimization

More information

Chapter 7. Basic Probability Theory

Chapter 7. Basic Probability Theory Chapter 7. Basic Probability Theory I-Liang Chern October 20, 2016 1 / 49 What s kind of matrices satisfying RIP Random matrices with iid Gaussian entries iid Bernoulli entries (+/ 1) iid subgaussian entries

More information

1 Presessional Probability

1 Presessional Probability 1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional

More information

1 Probability and Random Variables

1 Probability and Random Variables 1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in

More information

4. Distributions of Functions of Random Variables

4. Distributions of Functions of Random Variables 4. Distributions of Functions of Random Variables Setup: Consider as given the joint distribution of X 1,..., X n (i.e. consider as given f X1,...,X n and F X1,...,X n ) Consider k functions g 1 : R n

More information

Probability Background

Probability Background Probability Background Namrata Vaswani, Iowa State University August 24, 2015 Probability recap 1: EE 322 notes Quick test of concepts: Given random variables X 1, X 2,... X n. Compute the PDF of the second

More information

Lecture 11. Probability Theory: an Overveiw

Lecture 11. Probability Theory: an Overveiw Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the

More information

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3. Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae

More information

Probability Background

Probability Background CS76 Spring 0 Advanced Machine Learning robability Background Lecturer: Xiaojin Zhu jerryzhu@cs.wisc.edu robability Meure A sample space Ω is the set of all possible outcomes. Elements ω Ω are called sample

More information

Algorithms for Uncertainty Quantification

Algorithms for Uncertainty Quantification Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example

More information

ECE531: Principles of Detection and Estimation Course Introduction

ECE531: Principles of Detection and Estimation Course Introduction ECE531: Principles of Detection and Estimation Course Introduction D. Richard Brown III WPI 22-January-2009 WPI D. Richard Brown III 22-January-2009 1 / 37 Lecture 1 Major Topics 1. Web page. 2. Syllabus

More information

Multivariate distributions

Multivariate distributions CHAPTER Multivariate distributions.. Introduction We want to discuss collections of random variables (X, X,..., X n ), which are known as random vectors. In the discrete case, we can define the density

More information

Mathematical Statistics 1 Math A 6330

Mathematical Statistics 1 Math A 6330 Mathematical Statistics 1 Math A 6330 Chapter 3 Common Families of Distributions Mohamed I. Riffi Department of Mathematics Islamic University of Gaza September 28, 2015 Outline 1 Subjects of Lecture 04

More information

A Probability Review

A Probability Review A Probability Review Outline: A probability review Shorthand notation: RV stands for random variable EE 527, Detection and Estimation Theory, # 0b 1 A Probability Review Reading: Go over handouts 2 5 in

More information

1 Random variables and distributions

1 Random variables and distributions Random variables and distributions In this chapter we consider real valued functions, called random variables, defined on the sample space. X : S R X The set of possible values of X is denoted by the set

More information

Review: mostly probability and some statistics

Review: mostly probability and some statistics Review: mostly probability and some statistics C2 1 Content robability (should know already) Axioms and properties Conditional probability and independence Law of Total probability and Bayes theorem Random

More information

Probability and Estimation. Alan Moses

Probability and Estimation. Alan Moses Probability and Estimation Alan Moses Random variables and probability A random variable is like a variable in algebra (e.g., y=e x ), but where at least part of the variability is taken to be stochastic.

More information

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES VARIABLE Studying the behavior of random variables, and more importantly functions of random variables is essential for both the

More information

Chapter 5 continued. Chapter 5 sections

Chapter 5 continued. Chapter 5 sections Chapter 5 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

Continuous random variables

Continuous random variables Continuous random variables Continuous r.v. s take an uncountably infinite number of possible values. Examples: Heights of people Weights of apples Diameters of bolts Life lengths of light-bulbs We cannot

More information

Multivariate Distributions (Hogg Chapter Two)

Multivariate Distributions (Hogg Chapter Two) Multivariate Distributions (Hogg Chapter Two) STAT 45-1: Mathematical Statistics I Fall Semester 15 Contents 1 Multivariate Distributions 1 11 Random Vectors 111 Two Discrete Random Variables 11 Two Continuous

More information