Analysis of Experimental Designs
1 Analysis of Experimental Designs
Gilles Lamothe, Mathematics and Statistics, University of Ottawa
2 Review of Probability
- A short introduction to SAS
- Probability
- Random variables
- The expectation operator E{·}
- The variance operator σ²{·}
- Independence
- A few important models
3 Short Intro to SAS
Here is the URL for the course Web page: glamothe/mat3378spring2012
Scroll down to the updates on the course Web page and you will find a link to an Introduction to SAS.
Set up a computer account at the CUBE computer lab on the 2nd floor.
The SAS windowing environment is composed of five types of windows: Results, Explorer, Log, Output and Enhanced Editor.
4 Short Intro to SAS
Consider a standard normal random variable $Z \sim N(0,1)$. Its p.d.f. is
$$f(z) = \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{z^2}{2}\right), \quad -\infty < z < \infty.$$
We say that the quantile 1.96 is the 97.5th percentile of $N(0,1)$ since $P(Z < 1.96) = 0.975$.
5 Short Intro to SAS
SAS program: We enter a SAS program into the editor window. Consider the following program:

    data quantiles;
      z = probit(0.975);
    run;
    proc print data=quantiles;
    run;

Submitting: In the menu, select Run and then select Submit. The output will be sent to the output window. We got:

    Obs      z
      1    1.95996
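The same quantile can be cross-checked outside SAS; Python's standard library exposes the normal inverse c.d.f. (this snippet is a hypothetical equivalent, not part of the course code):

```python
from statistics import NormalDist

# Inverse c.d.f. of N(0,1) (the analogue of SAS probit):
# the 97.5th percentile of the standard normal
z = NormalDist().inv_cdf(0.975)
print(round(z, 5))  # 1.95996
```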
6 Short Intro to SAS
Suppose now that we want to compute P(Z > 1.96) = 1 − P(Z ≤ 1.96).
SAS program:

    data prob;
      p = 1 - cdf('NORMAL', 1.96);
    run;
    proc print data=prob;
    run;

Submitting: In the menu, select Run and then select Submit. The output will be sent to the output window. We got:

    Obs      p
      1    0.025

Thus, P(Z > 1.96) = 0.025.
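Again, a hypothetical Python equivalent of the SAS computation, using only the standard library:

```python
from statistics import NormalDist

# P(Z > 1.96) = 1 - Phi(1.96), the analogue of 1 - cdf('NORMAL', 1.96) in SAS
p = 1 - NormalDist().cdf(1.96)
print(round(p, 4))  # 0.025
```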
7 Short Intro to SAS
Example: Write a SAS program to compute the probability that a standard normal random variable will be as far away from 0 as 1.56.
By symmetry, we want 2P(Z > 1.56) = 2[1 − P(Z ≤ 1.56)].
SAS program:

    data prob;
      p = 2*(1 - cdf('NORMAL', 1.56));
    run;
    proc print data=prob;
    run;

From the output, we get 2P(Z > 1.56) = 0.11876.
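And the two-sided probability, as a hypothetical Python cross-check:

```python
from statistics import NormalDist

# P(|Z| >= 1.56) = 2 * P(Z > 1.56) by symmetry of the standard normal
p = 2 * (1 - NormalDist().cdf(1.56))
print(round(p, 4))  # 0.1188
```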
10 Terminology from Probability Theory
Consider a random experiment, that is, a process resulting in outcomes that we cannot predict with certainty.
S = the set of all possible outcomes, called the sample space.
A subset E of the sample space S is called an event.
Example: Consider the set of possible grades for this course, S = {A+, A, A−, ..., E, F}. The event that you get at least an A− is E = {A+, A, A−}.
We say that the event has occurred if the observed outcome is in the event.
11 Terminology from Probability Theory
The goal of probability theory is to construct measures of the chances that an event will occur. According to Kolmogorov, such a measure should satisfy the following axioms.
Probability Axioms: Suppose that we can construct a function P such that
[Certainty] P(S) = 1;
[Positivity] P(E) ≥ 0 for all events E;
[Additivity] if E₁, E₂, ... are mutually exclusive events, then P(E₁ ∪ E₂ ∪ ...) = P(E₁) + P(E₂) + ...;
then we say that P is a probability measure.
12 Terminology from Probability Theory
Consequences of the axioms:
- P(Aᶜ) = 1 − P(A)
- if A ⊆ B, then P(A) ≤ P(B)
- 0 ≤ P(E) ≤ 1
Addition Rule: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
[Venn diagram of events A and B.]
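The addition rule can be verified by brute-force enumeration; here is a hypothetical Python sketch (two fair dice, with events A and B chosen purely for illustration):

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 equally likely outcomes of two fair dice
S = list(product(range(1, 7), repeat=2))
A = {s for s in S if s[0] == 6}            # event: first die shows 6
B = {s for s in S if s[0] + s[1] == 7}     # event: the sum equals 7

def P(E):
    return Fraction(len(E), len(S))

# Addition rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
print(lhs, rhs)  # 11/36 11/36
```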
13 Terminology from Probability Theory
Law of Large Numbers: Suppose that we can repeat n independent trials of the experiment. Let Y be the number of times that E occurs and p = P(E). Then, for all ε > 0,
$$\lim_{n \to \infty} P(|Y/n - p| \ge \varepsilon) = 0.$$
The above result leads to the following way of interpreting probabilities.
Frequentist Approach: A probability refers to a limiting relative frequency.
Example: Suppose we say that the probability that a salesman will sell a car to the next customer is 20%. The frequentist interpretation is that, in the long run, this salesman sells cars to about 20% of his customers.
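The Law of Large Numbers can be illustrated by simulation (a hypothetical Python sketch; the trial count and seed are arbitrary):

```python
import random

# Simulate n independent trials with success probability p = 0.2 and
# track the relative frequency Y/n, which the LLN says converges to p.
random.seed(42)
p = 0.2
n = 100_000
successes = sum(random.random() < p for _ in range(n))
freq = successes / n
print(freq)  # close to 0.2
```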
14 Terminology from Probability Theory
A random variable X is a function that associates a real number X(s) to each outcome s. We use a cumulative distribution function (c.d.f.) $F_X$ to specify probabilities associated with X:
$$F_X(x) = P(X \le x) = P(\{s \in S : X(s) \le x\}).$$
In the discrete case, $F(x) = \sum_{x_i : x_i \le x} f(x_i)$; in the continuous case, $F(x) = \int_{-\infty}^{x} f(y)\,dy$.
15 Terminology from Probability Theory
We can also specify the probabilities with $f_X$:
- for X discrete, we use the probability mass function, $f(x_i) = P(X = x_i)$;
- for X continuous, we use the probability density function, $f(x) = \frac{d}{dx} F_X(x)$.
Recall: Areas under the density are probabilities.
16 Terminology from Probability Theory
The expectation of X (also called its mean) is defined as a weighted average of the values in its support (i.e. the possible values x taken by X):
$$\mu_X = E\{X\} = \begin{cases} \sum_x x f(x), & \text{discrete case} \\ \int_{-\infty}^{\infty} x f(x)\,dx, & \text{continuous case} \end{cases}$$
Remark: $\mu_X = E\{X\}$ is a measure of central tendency and also of location. It is a linear operator:
$$E\{aX + b\} = aE\{X\} + b.$$
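A small numerical check of the definition and of linearity, using a fair die as a hypothetical example (exact arithmetic via fractions):

```python
from fractions import Fraction

# Expectation of a fair die, then a check of linearity E{aX + b} = a E{X} + b
support = range(1, 7)
f = Fraction(1, 6)                       # pmf of the discrete uniform on {1,...,6}
EX = sum(x * f for x in support)         # weighted average of the support
a, b = 3, 2
E_aXb = sum((a * x + b) * f for x in support)
print(EX, E_aXb)  # 7/2 25/2
```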
17 Normal Model
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$$
The mean µ is a measure of location and of central tendency.
[Plots: the density is centered at µ; a mean µ₁ > µ shifts the curve to the right, and a mean µ₁ < µ shifts it to the left.]
20 Terminology from Probability Theory
The variance of X is defined as the mean squared deviation from the mean:
$$\sigma^2\{X\} = E[(X - \mu_X)^2] = E[X^2] - \mu_X^2.$$
Remark: $\sigma^2\{X\}$ is a measure of dispersion and variability. It is not a linear operator; in fact,
$$\sigma^2\{aX + b\} = a^2 \sigma^2\{X\}.$$
Squared units of σ²: Since the units of the variance are squared, we often refer to the standard deviation
$$\sigma\{X\} = \sqrt{\sigma^2\{X\}}.$$
Remark: $\sigma\{X\}$ is a measure of scale. Of course, it also measures dispersion and variability.
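Analogously, the variance definition and the scaling rule can be checked on a fair die (a hypothetical Python sketch with exact arithmetic):

```python
from fractions import Fraction

# Variance of a fair die via sigma^2{X} = E{X^2} - mu^2, and the scaling
# rule sigma^2{aX + b} = a^2 sigma^2{X} (the shift b drops out).
support = range(1, 7)
f = Fraction(1, 6)
EX = sum(x * f for x in support)               # 7/2
EX2 = sum(x * x * f for x in support)          # 91/6
var_X = EX2 - EX**2                            # 35/12
a, b = 2, 5
var_aXb = sum(((a * x + b) - (a * EX + b))**2 * f for x in support)
print(var_X, var_aXb)  # 35/12 35/3
```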
21 Normal Model
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$$
The standard deviation σ is a measure of dispersion and variability.
[Plots: a standard deviation σ₁ > σ gives a flatter, more spread-out curve; σ₁ < σ gives a taller, more concentrated one.]
24 Joint Distributions
Consider two random variables X₁ and X₂. Their joint c.d.f. is
$$F(x_1, x_2) = P(\{X_1 \le x_1\} \cap \{X_2 \le x_2\}).$$
Discrete case: The joint probability mass function is
$$f_{X_1,X_2}(x_1, x_2) = P(\{X_1 = x_1\} \cap \{X_2 = x_2\}).$$
Frequentist interpretation: $f_{X_1,X_2}(1,5) = 0.2$ means that if we repeat the experiment a large number of times, then for about 20% of the trials we will observe {X₁ = 1, X₂ = 5}.
Continuous case: The joint probability density function is
$$f_{X_1,X_2}(x_1, x_2) = \frac{\partial^2}{\partial x_1\, \partial x_2} F_{X_1,X_2}(x_1, x_2).$$
25 Expectation
The expectation of h(X₁, ..., Xₙ) is
$$E\{h(X_1,\ldots,X_n)\} = \begin{cases} \sum h(x_1,\ldots,x_n)\, f(x_1,\ldots,x_n), & \text{discrete case} \\ \int \cdots \int h(x_1,\ldots,x_n)\, f(x_1,\ldots,x_n)\,dx_1 \cdots dx_n, & \text{continuous case} \end{cases}$$
It is a linear operator:
$$E\Big\{b + \sum_{i=1}^n a_i X_i\Big\} = b + \sum_{i=1}^n a_i E\{X_i\}.$$
26 Independence
Consider two random variables X and Y. We say that they are independent if
$$P(\{X \in A\} \cap \{Y \in B\}) = P(X \in A)\, P(Y \in B)$$
for all events {X ∈ A} and {Y ∈ B}.
Why call this independence? Assuming that P(X ∈ A | Y ∈ B) and P(Y ∈ B | X ∈ A) exist, independence is equivalent to the following:
$$P(X \in A \mid Y \in B) = P(X \in A), \qquad P(Y \in B \mid X \in A) = P(Y \in B).$$
Consequences of independence:
$$f(x,y) = f_X(x)\, f_Y(y), \qquad E[h_1(X)\, h_2(Y)] = E[h_1(X)]\, E[h_2(Y)].$$
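The product rule for expectations under independence can be checked exhaustively for two independent fair dice (a hypothetical Python sketch; h1 and h2 are arbitrary illustrative functions):

```python
from fractions import Fraction
from itertools import product

# For two independent fair dice the joint pmf factors, so expectations of
# products factor as well: E[h1(X) h2(Y)] = E[h1(X)] E[h2(Y)].
f = Fraction(1, 36)                      # joint pmf under independence
h1 = lambda x: x * x                     # arbitrary function of X
h2 = lambda y: y + 1                     # arbitrary function of Y
E_prod = sum(h1(x) * h2(y) * f for x, y in product(range(1, 7), repeat=2))
E_h1 = sum(h1(x) * Fraction(1, 6) for x in range(1, 7))
E_h2 = sum(h2(y) * Fraction(1, 6) for y in range(1, 7))
print(E_prod == E_h1 * E_h2)  # True
```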
27 Independent Random Variables
We can accumulate the variance from independent random variables. Let X and Y be independent random variables; then
$$\sigma^2\{X+Y\} = E\{[(X+Y) - (\mu_X + \mu_Y)]^2\} = E\{(X-\mu_X)^2\} + E\{(Y-\mu_Y)^2\} + 2\,E\{X-\mu_X\}\,E\{Y-\mu_Y\},$$
where the cross term factors because of independence. Since $E\{X-\mu_X\} = E\{Y-\mu_Y\} = 0$, it follows that
$$\sigma^2\{X+Y\} = \sigma^2\{X\} + \sigma^2\{Y\}.$$
Linear functions of independent random variables: Consider independent random variables Y₁, Y₂, ..., Yₙ. Then
$$\sigma^2\Big\{b + \sum_{i=1}^n a_i Y_i\Big\} = \sum_{i=1}^n a_i^2\, \sigma^2\{Y_i\}.$$
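A quick numerical confirmation of σ²{X + Y} = σ²{X} + σ²{Y} for two independent fair dice (a hypothetical Python sketch, exact arithmetic):

```python
from fractions import Fraction
from itertools import product

# Variance of the sum of two independent fair dice computed from the joint
# distribution, compared with sigma^2{X} + sigma^2{Y}.
f = Fraction(1, 36)
outcomes = list(product(range(1, 7), repeat=2))
mu_sum = sum((x + y) * f for x, y in outcomes)                # 7
var_sum = sum((x + y - mu_sum)**2 * f for x, y in outcomes)   # 35/6
var_one = Fraction(35, 12)                                    # variance of one die
print(var_sum, 2 * var_one)  # 35/6 35/6
```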
28 Random Sample
Consider a random sample X₁, ..., Xₙ from a population with mean µ and variance σ². In other words, the random variables are independent with $E[X_i] = \mu$ and $\sigma^2\{X_i\} = \sigma^2$.
$\bar{X} = (1/n)\sum_{i=1}^n X_i$ is the sample mean and
$$S^2 = \frac{\sum_{i=1}^n (X_i - \bar{X})^2}{n-1} = \frac{\sum_{i=1}^n X_i^2 - n\bar{X}^2}{n-1}$$
is the sample variance.
Exercise: Show that $E\{\bar{X}\} = \mu$ and $\sigma^2\{\bar{X}\} = \sigma^2/n$. Furthermore, show that $E\{S^2\} = \sigma^2$.
Hint: $E\{\bar{X}^2\} = \sigma^2\{\bar{X}\} + (E\{\bar{X}\})^2$.
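For a very small case, the unbiasedness of S² can be verified by enumerating every possible sample (a hypothetical Python sketch: samples of size n = 2 from a fair die, where σ² = 35/12):

```python
from fractions import Fraction
from itertools import product

# Exhaustive check that S^2 is unbiased in a small case: all equally likely
# samples of size n = 2 from the discrete uniform on {1,...,6}.
n = 2
samples = list(product(range(1, 7), repeat=n))

def s2(xs):
    xbar = Fraction(sum(xs), n)
    return sum((x - xbar)**2 for x in xs) / (n - 1)

E_S2 = sum(s2(xs) for xs in samples) * Fraction(1, len(samples))
print(E_S2)  # 35/12
```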
29 A Few Probability Models
There are many important probability models used in statistics. We will present a few of them:
- discrete uniform
- normal
- chi-square
- Student's t
- Snedecor's F
30 A Few Probability Models
The discrete uniform distribution is important for rank-based statistics. We say that X follows a discrete uniform distribution on {1, 2, ..., m} if its probability mass function is f(x) = 1/m for x ∈ {1, 2, ..., m}.
$$E\{X\} = \sum_{i=1}^m i\,(1/m) = \frac{m+1}{2}$$
$$\sigma^2\{X\} = E\{X^2\} - \mu_X^2 = \sum_{i=1}^m i^2 (1/m) - \left[\frac{m+1}{2}\right]^2 = \frac{m^2 - 1}{12}$$
Example: The uniform distribution on {1, 2, 3, 4, 5} has mean 3 and variance 2.
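These closed forms can be checked directly (a hypothetical Python sketch for m = 5):

```python
from fractions import Fraction

# Mean and variance of the discrete uniform on {1,...,m}, checked against
# the closed forms (m+1)/2 and (m^2 - 1)/12 for m = 5.
m = 5
f = Fraction(1, m)
mean = sum(i * f for i in range(1, m + 1))
var = sum(i * i * f for i in range(1, m + 1)) - mean**2
print(mean, var)  # 3 2
```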
31 Normal Model
A normal random variable X with mean µ and variance σ², i.e. $X \sim N(\mu, \sigma^2)$, has the following density:
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}, \quad -\infty < x < \infty.$$
The normal (Gaussian) distribution is often used as an approximate model for quantitative variables. It is often a reasonable model since many variables are the result of many different additive effects.
32 Normal Model
Is the normal curve a reasonable model? [Portraits: Abraham de Moivre, Pierre-Simon Laplace.]
CAUTION: Assuming normality is not always appropriate! However, due to the Central Limit Theorem, the assumption of normality is often reasonable.
33 A Few Probability Models
The family of normal distributions is closed under linear transformations of independent normal random variables: if $Y_i \sim N(\mu_i, \sigma_i^2)$, i = 1, ..., n, and Y₁, ..., Yₙ are independent, then
$$W = b + \sum_{i=1}^n a_i Y_i \sim N(E\{W\}, \sigma^2\{W\}),$$
where $E\{W\} = b + \sum_{i=1}^n a_i \mu_i$ and $\sigma^2\{W\} = \sum_{i=1}^n a_i^2 \sigma_i^2$.
Consequences:
[standardizing] if $X \sim N(\mu, \sigma^2)$, then $Z = (X-\mu)/\sigma = (1/\sigma)X - \mu/\sigma \sim N(0,1)$.
[sample mean] if X₁, ..., Xₙ is a random sample from $N(\mu, \sigma^2)$, then $\bar{X} = \sum_{i=1}^n X_i / n \sim N(\mu, \sigma^2/n)$.
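The standardizing consequence can be illustrated numerically with Python's standard library (the parameters µ = 5, σ = 2 and the point x = 7 are hypothetical):

```python
from statistics import NormalDist

# Standardizing: if X ~ N(mu, sigma^2), then P(X <= x) = Phi((x - mu)/sigma).
mu, sigma, x = 5, 2, 7
direct = NormalDist(mu, sigma).cdf(x)
standardized = NormalDist().cdf((x - mu) / sigma)
print(abs(direct - standardized) < 1e-12)  # True
```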
34 A Few Probability Models
The following three distributions are the ones we will use the most throughout this course:
- the chi-square distribution
- the t distribution
- the F distribution
In the following slides we will remind you how to construct these distributions from the normal distribution.
35 Chi-Square: χ²
Let $Z \sim N(0,1)$; then $Z^2 \sim \chi^2(1)$, a chi-square distribution with ν = 1 degree of freedom.
Theorem: Let Uᵢ be a chi-square random variable with ν = νᵢ degrees of freedom (notation: χ²(νᵢ)). If U₁, ..., Uₙ are independent, then
$$U_1 + \cdots + U_n \sim \chi^2(\nu_1 + \cdots + \nu_n).$$
Remark: Assume that $X_i \sim N(\mu, \sigma^2)$ where µ is known. We can construct a χ² random variable that collects information about the unknown variance:
$$\frac{\sum_{i=1}^n (X_i - \mu)^2}{\sigma^2} = \sum_{i=1}^n \left(\frac{X_i - \mu}{\sigma}\right)^2 \sim \chi^2(n).$$
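A Monte Carlo illustration of the construction (a hypothetical Python sketch; the simulation size and seed are arbitrary): summing ν squared standard normals should give a sample mean near ν, the mean of χ²(ν).

```python
import random

# The sum of squares of nu independent N(0,1) variables is chi-square
# with nu degrees of freedom, whose mean is nu.
random.seed(0)
nu = 3
n = 20_000
samples = [sum(random.gauss(0, 1)**2 for _ in range(nu)) for _ in range(n)]
mean = sum(samples) / n
print(mean)  # close to nu = 3
```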
36 Chi-Square: χ²
Notation: We will use χ²(ν) to denote a χ² distribution with ν degrees of freedom; it will also be used as the notation for a random variable with that distribution.
Here P(χ²(6) > 12.6) = 0.05.
Quantile notation: χ²(A; ν) is the quantity such that P(χ²(ν) ≤ χ²(A; ν)) = A. Thus χ²(0.95; 6) = 12.6.
37 Student's t Distribution
Theorem: Let $Z \sim N(0,1)$ and $U \sim \chi^2(\nu)$ be independent random variables; then
$$T = \frac{Z}{\sqrt{U/\nu}} \sim t(\nu).$$
Notation: We use t(ν) to denote both a t distribution with ν degrees of freedom and a random variable with this distribution.
Here P(t(10) > 1.812) = 0.05.
Quantile notation: t(0.95; 10) = 1.812.
38 Random Sample from N(µ, σ²)
Consider X₁, ..., Xₙ, a random sample from a N(µ, σ²) population.
The sample mean: $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i$.
The sample variance: $S^2 = \frac{\sum_{i=1}^n (X_i - \bar{X})^2}{n-1}$.
Theorem:
- $\bar{X} \sim N(\mu, \sigma^2/n)$
- $(n-1)S^2/\sigma^2 \sim \chi^2(n-1)$
- $\bar{X}$ and $S^2$ are independent.
Standard application from your intro to stats class:
$$T = \frac{\bar{X} - \mu}{\sqrt{S^2/n}} = \frac{(\bar{X} - \mu)/\sqrt{\sigma^2/n}}{\sqrt{[(n-1)S^2/\sigma^2]/(n-1)}} \sim t(n-1).$$
39 Snedecor's F Distribution
Theorem: Let $U_1 \sim \chi^2(\nu_1)$ and $U_2 \sim \chi^2(\nu_2)$ be independent random variables; then
$$F = \frac{U_1/\nu_1}{U_2/\nu_2} \sim F(\nu_1, \nu_2).$$
Notation: We use F(ν₁, ν₂) to denote both an F distribution with ν₁ degrees of freedom in the numerator and ν₂ degrees of freedom in the denominator, and a random variable with this distribution.
Here P(F(10, 15) > 2.54) = 0.05.
Quantile notation: F(0.95; 10, 15) = 2.54.
40 Snedecor's F Distribution
Let X₁, ..., Xₙ be a random sample from a N(µ, σ²) population. We showed that
$$T = \frac{\bar{X} - \mu}{\sqrt{S^2/n}} \sim t(n-1).$$
Exercise: Show that
$$T^2 = \left[\frac{\bar{X} - \mu}{\sqrt{S^2/n}}\right]^2 \sim F(1, n-1).$$
Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions
More information2 (Statistics) Random variables
2 (Statistics) Random variables References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6 We will now study the main tools use for modeling experiments with unknown outcomes
More informationSampling Distributions
Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of
More informationTopic 6 Continuous Random Variables
Topic 6 page Topic 6 Continuous Random Variables Reference: Chapter 5.-5.3 Probability Density Function The Uniform Distribution The Normal Distribution Standardizing a Normal Distribution Using the Standard
More informationDiscrete Probability distribution Discrete Probability distribution
438//9.4.. Discrete Probability distribution.4.. Binomial P.D. The outcomes belong to either of two relevant categories. A binomial experiment requirements: o There is a fixed number of trials (n). o On
More informationEC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix)
1 EC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix) Taisuke Otsu London School of Economics Summer 2018 A.1. Summation operator (Wooldridge, App. A.1) 2 3 Summation operator For
More informationLimiting Distributions
Limiting Distributions We introduce the mode of convergence for a sequence of random variables, and discuss the convergence in probability and in distribution. The concept of convergence leads us to the
More informationRelationship between probability set function and random variable - 2 -
2.0 Random Variables A rat is selected at random from a cage and its sex is determined. The set of possible outcomes is female and male. Thus outcome space is S = {female, male} = {F, M}. If we let X be
More informationStatistics, Data Analysis, and Simulation SS 2015
Statistics, Data Analysis, and Simulation SS 2015 08.128.730 Statistik, Datenanalyse und Simulation Dr. Michael O. Distler Mainz, 27. April 2015 Dr. Michael O. Distler
More informationProbability Review. Chao Lan
Probability Review Chao Lan Let s start with a single random variable Random Experiment A random experiment has three elements 1. sample space Ω: set of all possible outcomes e.g.,ω={1,2,3,4,5,6} 2. event
More informationProbability Density Functions
Statistical Methods in Particle Physics / WS 13 Lecture II Probability Density Functions Niklaus Berger Physics Institute, University of Heidelberg Recap of Lecture I: Kolmogorov Axioms Ingredients: Set
More informationDef 1 A population consists of the totality of the observations with which we are concerned.
Chapter 6 Sampling Distributions 6.1 Random Sampling Def 1 A population consists of the totality of the observations with which we are concerned. Remark 1. The size of a populations may be finite or infinite.
More informationMath 362, Problem set 1
Math 6, roblem set Due //. (4..8) Determine the mean variance of the mean X of a rom sample of size 9 from a distribution having pdf f(x) = 4x, < x
More informationIntroduction to Probability and Stocastic Processes - Part I
Introduction to Probability and Stocastic Processes - Part I Lecture 1 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark
More informationIV. The Normal Distribution
IV. The Normal Distribution The normal distribution (a.k.a., the Gaussian distribution or bell curve ) is the by far the best known random distribution. It s discovery has had such a far-reaching impact
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional
More informationContinuous Random Variables
MATH 38 Continuous Random Variables Dr. Neal, WKU Throughout, let Ω be a sample space with a defined probability measure P. Definition. A continuous random variable is a real-valued function X defined
More information1 INFO Sep 05
Events A 1,...A n are said to be mutually independent if for all subsets S {1,..., n}, p( i S A i ) = p(a i ). (For example, flip a coin N times, then the events {A i = i th flip is heads} are mutually
More informationContinuous Random Variables. and Probability Distributions. Continuous Random Variables and Probability Distributions ( ) ( ) Chapter 4 4.
UCLA STAT 11 A Applied Probability & Statistics for Engineers Instructor: Ivo Dinov, Asst. Prof. In Statistics and Neurology Teaching Assistant: Christopher Barr University of California, Los Angeles,
More informationChapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University
Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real
More informationLecture 2: Review of Basic Probability Theory
ECE 830 Fall 2010 Statistical Signal Processing instructor: R. Nowak, scribe: R. Nowak Lecture 2: Review of Basic Probability Theory Probabilistic models will be used throughout the course to represent
More informationIntroduction to Statistics and Error Analysis II
Introduction to Statistics and Error Analysis II Physics116C, 4/14/06 D. Pellett References: Data Reduction and Error Analysis for the Physical Sciences by Bevington and Robinson Particle Data Group notes
More informationE X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl.
E X A M Course code: Course name: Number of pages incl. front page: 6 MA430-G Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours Resources allowed: Notes: Pocket calculator,
More informationIV. The Normal Distribution
IV. The Normal Distribution The normal distribution (a.k.a., a the Gaussian distribution or bell curve ) is the by far the best known random distribution. It s discovery has had such a far-reaching impact
More informationMath/Stat 352 Lecture 9. Section 4.5 Normal distribution
Math/Stat 352 Lecture 9 Section 4.5 Normal distribution 1 Abraham de Moivre, 1667-1754 Pierre-Simon Laplace (1749 1827) A French mathematician, who introduced the Normal distribution in his book The doctrine
More informationStatistics, Data Analysis, and Simulation SS 2017
Statistics, Data Analysis, and Simulation SS 2017 08.128.730 Statistik, Datenanalyse und Simulation Dr. Michael O. Distler Mainz, 27. April 2017 Dr. Michael O. Distler
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More informationCh. 8 Math Preliminaries for Lossy Coding. 8.4 Info Theory Revisited
Ch. 8 Math Preliminaries for Lossy Coding 8.4 Info Theory Revisited 1 Info Theory Goals for Lossy Coding Again just as for the lossless case Info Theory provides: Basis for Algorithms & Bounds on Performance
More information