Chapter 3. Julian Chan. June 29, 2012
1 Continuous variables

For a continuous random variable X there is an associated density function f(x). It satisfies many of the same properties as the mass function of a discrete random variable, except that summations are replaced by integrals. To be precise it satisfies:

f(x) ≥ 0 for all x.

∫_{-∞}^{∞} f(x) dx = 1.

The CDF is given by F(a) = P(X ≤ a) = ∫_{-∞}^{a} f(x) dx.

P(b ≤ X ≤ a) = ∫_{b}^{a} f(x) dx = F(a) − F(b).

By the fundamental theorem of calculus, F'(x) = f(x).

The main difference between continuous and discrete random variables, then, is that instead of a sum there is an integral. This is illustrated with the property of expectation:

E[g(X)] = ∫_{-∞}^{∞} g(x) f(x) dx.

Consequently the mean µ is computed as

µ = E[X] = ∫_{-∞}^{∞} x f(x) dx,

and the moment generating function is given by

M_X(t) = E[e^{tX}] = ∫_{-∞}^{∞} e^{tx} f(x) dx.

The higher moments are computed in the usual way (either with derivatives of the MGF or with expectations), and so is the variance; in particular,

var(X) = E[X²] − (E[X])² = ∫_{-∞}^{∞} x² f(x) dx − ( ∫_{-∞}^{∞} x f(x) dx )².
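These defining properties are easy to verify numerically. The sketch below uses plain Python with a midpoint-rule integrator and a hypothetical density of my own choosing (f(x) = 2x on [0, 1], not one from the notes) to check that the density integrates to 1 and that var(X) = E[X²] − (E[X])².

```python
import math

def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Illustrative density (an assumption, not from the notes): f(x) = 2x on [0, 1].
f = lambda x: 2 * x

total = integrate(f, 0, 1)                     # should be 1
mean = integrate(lambda x: x * f(x), 0, 1)     # E[X] = 2/3
ex2 = integrate(lambda x: x * x * f(x), 0, 1)  # E[X^2] = 1/2
var = ex2 - mean ** 2                          # var(X) = 1/2 - 4/9 = 1/18

print(total, mean, var)
```

The same three-integral pattern works for any density on a bounded interval, which makes it a handy check on hand computations in the examples that follow.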
Example. Suppose that X is a continuous random variable whose probability density function is given by:

f(x) = c(4x − 2x²) for 0 ≤ x ≤ 2, and f(x) = 0 otherwise.

What is the value of c that makes this a density? From the second property of a density we have

∫_0^2 c(4x − 2x²) dx = 1.

After some calculus we find that c = 3/8.

Find P(X > 1). We find that

P(X > 1) = (3/8) ∫_1^2 (4x − 2x²) dx = .5.

Example. You have asked customer service to contact you. You are told you will be given a call between 8 and 18 hours (8 a.m. and 6 p.m.). When you ask which hour block would be more likely, you are told that all are equally likely. The density function is that of a uniform distribution, which is given by

f(x) = 1/10 for 8 ≤ x ≤ 18, and f(x) = 0 otherwise,

where x denotes the hour of the day. Find the probability that the call arrives between 9 and 12 hours, and between 15 and 18 hours. We compute

P(9 < X < 12) = ∫_9^{12} (1/10) dx = .3,

while

P(15 < X < 18) = ∫_{15}^{18} (1/10) dx = .3.

We compute the mean:

µ = ∫_8^{18} (x/10) dx = 13.

The variance is 8.33, and the moment generating function is (e^{18t} − e^{8t})/(10t).

Example. Consider the density function given by

f(x) = 4x³/15 for 1 ≤ x ≤ 2, and f(x) = 0 otherwise.

Find the expectation and variance.
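The normalizing constant in the first example can be checked without the calculus. A sketch, again with midpoint-rule integration and no outside libraries: c is whatever makes the integral of 4x − 2x² over [0, 2] come out to 1, and P(X > 1) is then c times the integral over [1, 2].

```python
def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

g = lambda x: 4 * x - 2 * x ** 2

# c must satisfy c * integral_0^2 g(x) dx = 1; the integral is 8/3.
c = 1 / integrate(g, 0, 2)
p = c * integrate(g, 1, 2)   # P(X > 1)

print(c, p)   # roughly 0.375 and 0.5
```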
2 Uniform

A random variable is said to follow a uniform distribution on [a, b] if it has density given by

f(x) = 1/(b − a) for a ≤ x ≤ b, and f(x) = 0 otherwise.

In this instance the mean, the variance, and the moment generating function are particularly easy to compute. To check your understanding you should compute them.

3 Normal

The normal density is given by

f(x) = (1/(√(2π)σ)) exp(−(x − µ)²/(2σ²)) for −∞ < x < ∞.

It has a mean of µ and a variance of σ². The moment generating function is given by

M_X(t) = exp(µt + σ²t²/2).

This is perhaps THE most important probability density function known. It has many important properties.

Theorem. If X is a normal random variable with mean µ and standard deviation σ, then the transformed random variable Z = (X − µ)/σ is a STANDARD normal random variable (i.e. it has mean 0 and standard deviation 1).

Example. An expert witness in a paternity suit testifies that the length in days of a pregnancy (time from impregnation to the delivery of the child) is approximately normally distributed with parameters µ = 270 and σ² = 100. The defendant in the suit is able to prove that he was out of the country for a period that began 290 days before the birth of the child and ended 240 days before the birth. If the defendant was in fact the father of the child, what is the probability that the mother could have had the very long or very short pregnancy indicated by the testimony? The probability we seek is

P(X < 240 or X > 290).

Since these events are disjoint we have

P(X < 240 or X > 290) = P(X < 240) + P(X > 290) = P(Z < −3) + P(Z > 2) = .0013 + .0228 = .0241.
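Standard normal probabilities do not have a closed form, but they can be computed without a table via the error function, since Φ(z) = (1 + erf(z/√2))/2. A sketch of the paternity computation using the standard library's math.erf:

```python
import math

def phi(z):
    """Standard normal CDF, written in terms of the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, sigma = 270, 10   # pregnancy length: mean 270 days, sd 10 days

# P(X < 240) + P(X > 290), standardizing each bound.
p = phi((240 - mu) / sigma) + (1 - phi((290 - mu) / sigma))
print(round(p, 4))   # about 0.0241
```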
Example. Most galaxies take the form of a flattened disk, with the majority of the light coming from a thin fundamental plane. The degree of flattening differs from galaxy to galaxy. Let X denote the perpendicular distance from the center of the Milky Way to a gaseous mass. If X is normally distributed with mean zero and standard deviation 100 parsecs (1 parsec ≈ 19.2 trillion miles), find the probability that a gaseous mass is more than 250 parsecs from the center. The probability we seek is given by

P(X > 250) = P(Z > 2.5) = 1 − .9938 = .0062.

At what distance do only 20% of the masses lie further than this distance? This is a bit of reverse thinking. The number we seek is a, and it satisfies the property that

P(X > a) = .2.

We can find a by using the standard normal table; that is, we translate it as follows:

P(X > a) = P(Z > (a − 0)/100) = .2.

Using the standard normal tables we find that

(a − 0)/100 ≈ .84, so a ≈ 84 parsecs.

We remark that in this instance the moment generating function is given by M_X(t) = exp(µt + σ²t²/2) = exp(100²t²/2).

We have yet to explain why we have diverged from discrete random variables. Continuous random variables are needed in part to compute discrete probabilities for large sample sizes. This is illustrated with the following example:

Example. Consider a class with 500 students. For this particular class the pass rate for each student is p = .4, and we assume that the students are independent and identically distributed (i.e. they have roughly the same study patterns and intelligence). Let X be the number of students who pass. We see that X is a binomial random variable, since each outcome is either a pass or a fail (a Bernoulli trial). What is the probability that exactly 300 pass?

P(X = 300) = C(500, 300)(.4)^{300}(.6)^{200}.

However this number is impossible for most calculators to calculate! It is so close to zero that for all practical purposes it is zero. What is the probability that between 100 and 300 students pass?
This is given by:

P(100 < X < 300) = Σ_{x=100}^{300} C(500, x)(.4)^x(.6)^{500−x}.
If there is no hope of computing one term of this series, what hope is there of computing 200? Fortunately the binomial can be approximated by a normal random variable with mean np and variance np(1 − p), thanks to the theorem below. Here np = 200 and √(np(1 − p)) = √120 ≈ 10.95, so we compute the probability as follows:

P(100 < X < 300) ≈ P(−9.13 < Z < 9.13) ≈ 1.

What is the probability that more than 300 pass? This is approximated by

P(X > 300) ≈ P(Z > 9.13) ≈ 0.

Theorem. Let X be a binomial random variable with parameters n and p. For large n (there are a few guidelines for what counts as large) the random variable X can be approximated by a normal random variable with mean np and variance np(1 − p).

There are two things to note about this theorem. The first is what "large n" means, and the second is that a continuity correction can be used.

The following fact is one that we will use later on; it is related to hypothesis testing, but can be used in several ways. Let X be a normal random variable with mean µ and variance σ²; then

P(−σ < X − µ < σ) = .68,
P(−2σ < X − µ < 2σ) = .95,
P(−3σ < X − µ < 3σ) = .997.

This is often referred to as the 68%, 95%, 99.7% rule.

Example. The time until death from a specific disease is normally distributed. It is found that the mean is given by µ = 18 days and the standard deviation is given by 1.5 days. Find a range for which 95% of all individuals will live once infected. We have

P(−2σ < X − µ < 2σ) = .95,

so

P(−2(1.5) < X − 18 < 2(1.5)) = .95, i.e. P(15 < X < 21) = .95.

Thus we find that 95% of all individuals will live between 15 and 21 days. Later we will see that this is an example of what is called a confidence interval.
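Returning to the normal approximation of the binomial above: modern software can in fact compute the exact sum the notes call hopeless, which makes a nice check on the approximation. A sketch using math.comb for the exact tail and the erf-based normal CDF for the approximation:

```python
import math

n, p = 500, 0.4

def binom_pmf(k):
    """Exact binomial probability P(X = k); math.comb handles the huge coefficient."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

# Exact P(100 < X < 300): sum the pmf over k = 101, ..., 299.
exact = sum(binom_pmf(k) for k in range(101, 300))

# Normal approximation with mean np and variance np(1 - p).
mu = n * p
sigma = math.sqrt(n * p * (1 - p))
phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
approx = phi((300 - mu) / sigma) - phi((100 - mu) / sigma)

print(exact, approx)   # both are essentially 1
```

Note that no continuity correction is applied here; with z-scores of about ±9 it would not change anything visible.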
4 Chebyshev's inequality

We would like a rule similar to the 68%, 95%, 99.7% rule for random variables that may not be normally distributed. The best that we can do is what is called Chebyshev's inequality:

P(|X − µ| > kσ) ≤ 1/k².

5 Gamma

The gamma distribution is a two-parameter family of continuous probability distributions. For different values of the parameters we associate a different distribution. There are two equivalent (via a u-substitution) forms of the gamma distribution. The density of the form we use is given by

f(x) = (1/(Γ(α)β^α)) x^{α−1} e^{−x/β} for x > 0.

The function Γ(α) is called the gamma function (not a distribution) and is given by

Γ(α) = ∫_0^∞ e^{−t} t^{α−1} dt.

Upon integrating by parts one can see that it has the property that

Γ(α) = (α − 1)Γ(α − 1).

Thus for positive integer values of α this function is exactly the factorial function: Γ(n) = (n − 1)!. We also note the important calculation that Γ(1/2) = √π. The gamma distribution has a mean, variance, and moment generating function given by

µ = αβ, σ² = αβ², and M_X(t) = (1 − βt)^{−α} for t < 1/β.

When α = 1 we obtain the exponential distribution, which we now discuss in more detail.

6 Exponential

The exponential distribution is obtained from the gamma distribution with α = 1. It thus has density

f(x) = (1/β) e^{−x/β} for x > 0.

The exponential distribution has a mean, variance, and moment generating function given by

µ = β, σ² = β², and M_X(t) = (1 − βt)^{−1}.
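The gamma-function identities above can be confirmed directly with the standard library's math.gamma. A quick sketch checking the recursion, the factorial connection, and the half-integer value:

```python
import math

# Recursion from integration by parts: Gamma(a) = (a - 1) * Gamma(a - 1).
a = 3.7
print(math.gamma(a), (a - 1) * math.gamma(a - 1))

# Positive integers: Gamma(n) = (n - 1)!, e.g. Gamma(6) = 5! = 120.
print(math.gamma(6), math.factorial(5))

# Half-integer value: Gamma(1/2) = sqrt(pi).
print(math.gamma(0.5), math.sqrt(math.pi))
```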
Example. The time until failure X of a light bulb, in days, is exponentially distributed with β = 30. The store offers a 10 day warranty on the product. Find the proportion of light bulbs that can be returned. This is given by

P(X < 10) = ∫_0^{10} (1/30) e^{−x/30} dx = 1 − e^{−1/3} ≈ .28.

The mean time until failure is 30 days.

There is a fundamental connection to the Poisson process, as you might guess since both functions involve e. The relationship is:

Theorem. Consider a Poisson process with parameter λ. Let W be the random variable giving the time of the occurrence of the first event. Then W is an exponential random variable with parameter β = 1/λ.

Proof. F(w) = P(W ≤ w) = 1 − P(W > w). Let Y be the number of occurrences of the event in the interval [0, w]. Then Y is a Poisson random variable with parameter λw, so we have

P(W > w) = P(Y = 0) = e^{−λw}(λw)^0/0! = e^{−λw}.

Hence F(w) = 1 − e^{−λw}, the exponential CDF with β = 1/λ.

Example. The average number of lightning strikes on transformers during a severe thunderstorm season in a given area is 2 per week. If the number of lightning strikes follows a Poisson process, find the probability that during the next storm season one must wait at most one week in order to see the first transformer strike. We have λ = 2, so β = 1/2, and we find that

P(X ≤ 1) = ∫_0^1 2e^{−2x} dx = 1 − e^{−2} ≈ .86.

7 Chi-squared distribution

We briefly mention the chi-squared distribution, denoted χ²_γ, since it will be used in future sections. The χ²_γ distribution is defined in terms of a transformation, the topic we will take up at the end of this section and again in the next chapter; it is one of the most important concepts in probability theory. The transformation is as follows: let X be a gamma random variable with β = 2 and α = γ/2 > 0. The parameter γ determines the distribution, and as γ → ∞, X converges to a normal random variable. Table 4 of Appendix A allows us to work with this probability distribution.
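The theorem above connecting the exponential distribution to the Poisson process can be checked by simulation: draw many exponential waiting times with rate λ = 2 and count how often the first strike arrives within a week. A seeded sketch using random.expovariate:

```python
import math
import random

random.seed(42)

lam = 2.0            # average strikes per week
trials = 100_000

# The waiting time to the first event of a Poisson(lam) process is
# exponential with mean 1/lam; count how often it is at most 1 week.
hits = sum(random.expovariate(lam) <= 1.0 for _ in range(trials))

estimate = hits / trials
exact = 1 - math.exp(-2)   # 1 - e^{-2} from the example
print(estimate, exact)     # both near 0.865
```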
8 Weibull

The Weibull distribution is given by the density function

f(x) = αβ x^{β−1} e^{−αx^β} for x > 0,

with parameters α > 0 and β > 0. The Weibull distribution is most commonly used in applications to reliability. However it has several other applications, and is obtained from the gamma distribution. The mean and variance are given by

µ = α^{−1/β} Γ(1 + 1/β) and σ² = α^{−2/β} Γ(1 + 2/β) − µ².

Example. The length of time in hours that a rechargeable battery will hold its charge is a random variable distributed according to a Weibull distribution with α = .01 and β = 2. We first find that the density function is given by

f(x) = (x/50) e^{−x²/100}.

What is the probability that the battery keeps its charge for at least 10 hours? This is given by

P(X > 10) = ∫_{10}^∞ (x/50) e^{−x²/100} dx = e^{−1} ≈ .37.

What is the mean length of time a battery will hold its charge?

µ = (.01)^{−1/2} Γ(1 + 1/2) = 5√π.

9 Transformations of random variables

In this section I will discuss the CDF method of transformations; in the book the equivalent PDF method is given. The idea is this: if X is a random variable, then so are transformations of it such as Y = X² or Y = X + 1. If Y = g(X), one would like to know the density of Y. For a definite answer one needs a specific function and a specific density for X. These next few examples should be illuminating.

Example. Let X be a standard normal random variable and determine the density of Y = X². We start with the CDF of Y:

F_Y(a) = P(Y < a) = P(X² < a) = P(−√a < X < √a).
The point being here that we do not necessarily know how to calculate P(Y < a) directly; however, we can calculate P(−√a < X < √a) since we know the density of X, and we do so as follows:

P(−√a < X < √a) = ∫_{−√a}^{√a} (1/√(2π)) exp(−x²/2) dx = F_X(√a) − F_X(−√a).

Recall the connection between a CDF and a PDF: F'(z) = f(z). Thus we have

F_Y(a) = F_X(√a) − F_X(−√a).

This gives us the pdf of Y:

f_Y(a) = d/da (F_X(√a) − F_X(−√a)) = f_X(√a)/(2√a) + f_X(−√a)/(2√a) = (1/√(2aπ)) exp(−a/2).

This formula is not good for all values of a, only those for which 0 < a < ∞. Do you see why?

Example. Let X be an exponential random variable with parameter 1, and Y = ln(X). Find the density of Y. The same steps (and words) apply, so we start with the CDF of Y:

F_Y(a) = P(Y < a) = P(ln(X) < a) = P(X < e^a).

The point being here that we do not necessarily know how to calculate P(Y < a); however, we can calculate P(X < e^a) since we know the density of X, and we do so as follows:

P(X < e^a) = F_X(e^a).

Recall the connection between a CDF and a PDF: F'(z) = f(z). Thus we have

F_Y(a) = F_X(e^a).

The density is given by

f_Y(a) = f_X(e^a) e^a = e^{−e^a} e^a for −∞ < a < ∞.
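The first transformation example (Y = X² for standard normal X) can be sanity-checked by Monte Carlo: P(Y < 1) should equal P(−1 < X < 1) ≈ .683, which the derived density must integrate to. A seeded sketch with random.gauss:

```python
import math
import random

random.seed(0)

trials = 100_000
ys = [random.gauss(0, 1) ** 2 for _ in range(trials)]   # samples of Y = X^2

frac = sum(y < 1 for y in ys) / trials
exact = math.erf(1 / math.sqrt(2))   # P(-1 < X < 1) for standard normal X
print(frac, exact)                   # both near 0.683
```

(The derived density is in fact the chi-squared density with one degree of freedom, foreshadowing the χ² section above.)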
Example. Let X be uniform on [0, 1], and Y = −ln(X). Find the density of Y. The same steps (and words) apply, so we start with the CDF of Y:

F_Y(a) = P(Y < a) = P(−ln(X) < a) = P(X > e^{−a}) = 1 − P(X < e^{−a}).

We can calculate P(X < e^{−a}) since we know the density of X, and we do so as follows:

P(X < e^{−a}) = F_X(e^{−a}).

Recall the connection between a CDF and a PDF: F'(z) = f(z). Thus we have

F_Y(a) = 1 − F_X(e^{−a}).

The density is given by

f_Y(a) = f_X(e^{−a}) e^{−a} = e^{−a} for 0 < a < ∞.

Note that this is just the exponential distribution!

As we conclude this chapter we will again be asked to calculate probabilities involving transformations, as they occur naturally. In addition we will also be asked to determine the distribution of one of the most famous transformations of random variables: (x₁ + ... + x_n)/n.
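The last example is exactly the inverse-transform method for generating exponential random variables: feed uniform samples through −ln(u). A seeded sketch checking that the resulting sample behaves like an exponential with β = 1 (mean 1, and P(Y < 1) = 1 − e^{−1}):

```python
import math
import random

random.seed(1)

trials = 100_000
# Use 1 - random.random(), which lies in (0, 1], so the log is always defined.
ys = [-math.log(1.0 - random.random()) for _ in range(trials)]

mean = sum(ys) / trials                 # exponential with beta = 1 has mean 1
frac = sum(y < 1 for y in ys) / trials  # should be near 1 - e^{-1}
print(mean, frac, 1 - math.exp(-1))
```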
More informationTest Problems for Probability Theory ,
1 Test Problems for Probability Theory 01-06-16, 010-1-14 1. Write down the following probability density functions and compute their moment generating functions. (a) Binomial distribution with mean 30
More informationMoments. Raw moment: February 25, 2014 Normalized / Standardized moment:
Moments Lecture 10: Central Limit Theorem and CDFs Sta230 / Mth 230 Colin Rundel Raw moment: Central moment: µ n = EX n ) µ n = E[X µ) 2 ] February 25, 2014 Normalized / Standardized moment: µ n σ n Sta230
More informationPARAMETER ESTIMATION: BAYESIAN APPROACH. These notes summarize the lectures on Bayesian parameter estimation.
PARAMETER ESTIMATION: BAYESIAN APPROACH. These notes summarize the lectures on Bayesian parameter estimation.. Beta Distribution We ll start by learning about the Beta distribution, since we end up using
More informationORF 245 Fundamentals of Statistics Chapter 4 Great Expectations
ORF 245 Fundamentals of Statistics Chapter 4 Great Expectations Robert Vanderbei Fall 2014 Slides last edited on October 20, 2014 http://www.princeton.edu/ rvdb Definition The expectation of a random variable
More informationProbability Distributions Columns (a) through (d)
Discrete Probability Distributions Columns (a) through (d) Probability Mass Distribution Description Notes Notation or Density Function --------------------(PMF or PDF)-------------------- (a) (b) (c)
More informationNorthwestern University Department of Electrical Engineering and Computer Science
Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability
More informationThings to remember when learning probability distributions:
SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions
More informationGEOMETRIC -discrete A discrete random variable R counts number of times needed before an event occurs
STATISTICS 4 Summary Notes. Geometric and Exponential Distributions GEOMETRIC -discrete A discrete random variable R counts number of times needed before an event occurs P(X = x) = ( p) x p x =,, 3,...
More informationThis exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text.
TEST #3 STA 5326 December 4, 214 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. (You will have access to
More informationMidterm Examination. STA 205: Probability and Measure Theory. Wednesday, 2009 Mar 18, 2:50-4:05 pm
Midterm Examination STA 205: Probability and Measure Theory Wednesday, 2009 Mar 18, 2:50-4:05 pm This is a closed-book examination. You may use a single sheet of prepared notes, if you wish, but you may
More informationApplied Statistics and Probability for Engineers. Sixth Edition. Chapter 4 Continuous Random Variables and Probability Distributions.
Applied Statistics and Probability for Engineers Sixth Edition Douglas C. Montgomery George C. Runger Chapter 4 Continuous Random Variables and Probability Distributions 4 Continuous CHAPTER OUTLINE Random
More informationChapter 4 Continuous Random Variables and Probability Distributions
Applied Statistics and Probability for Engineers Sixth Edition Douglas C. Montgomery George C. Runger Chapter 4 Continuous Random Variables and Probability Distributions 4 Continuous CHAPTER OUTLINE 4-1
More informationE[X n ]= dn dt n M X(t). ). What is the mgf? Solution. Found this the other day in the Kernel matching exercise: 1 M X (t) =
Chapter 7 Generating functions Definition 7.. Let X be a random variable. The moment generating function is given by M X (t) =E[e tx ], provided that the expectation exists for t in some neighborhood of
More informationProbability Methods in Civil Engineering Prof. Rajib Maity Department of Civil Engineering Indian Institute of Technology, Kharagpur
Probability Methods in Civil Engineering Prof. Rajib Maity Department of Civil Engineering Indian Institute of Technology, Kharagpur Lecture No. # 12 Probability Distribution of Continuous RVs (Contd.)
More information1 Review of Probability
1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x
More informationMATH 56A: STOCHASTIC PROCESSES CHAPTER 6
MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 6. Renewal Mathematically, renewal refers to a continuous time stochastic process with states,, 2,. N t {,, 2, 3, } so that you only have jumps from x to x + and
More informationMATH Notebook 5 Fall 2018/2019
MATH442601 2 Notebook 5 Fall 2018/2019 prepared by Professor Jenny Baglivo c Copyright 2004-2019 by Jenny A. Baglivo. All Rights Reserved. 5 MATH442601 2 Notebook 5 3 5.1 Sequences of IID Random Variables.............................
More informationMath Bootcamp 2012 Miscellaneous
Math Bootcamp 202 Miscellaneous Factorial, combination and permutation The factorial of a positive integer n denoted by n!, is the product of all positive integers less than or equal to n. Define 0! =.
More informationTHE QUEEN S UNIVERSITY OF BELFAST
THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M
More informationFinal Exam # 3. Sta 230: Probability. December 16, 2012
Final Exam # 3 Sta 230: Probability December 16, 2012 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use the extra sheets
More informationChapter 4: CONTINUOUS RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS
Chapter 4: CONTINUOUS RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 4: Gamma Distribution Weibull Distribution Lognormal Distribution Sections 4-9 through 4-11 Another exponential distribution example
More informationSTA 4321/5325 Solution to Homework 5 March 3, 2017
STA 4/55 Solution to Homework 5 March, 7. Suppose X is a RV with E(X and V (X 4. Find E(X +. By the formula, V (X E(X E (X E(X V (X + E (X. Therefore, in the current setting, E(X V (X + E (X 4 + 4 8. Therefore,
More informationStat 5101 Notes: Brand Name Distributions
Stat 5101 Notes: Brand Name Distributions Charles J. Geyer September 5, 2012 Contents 1 Discrete Uniform Distribution 2 2 General Discrete Uniform Distribution 2 3 Uniform Distribution 3 4 General Uniform
More informationStat 100a, Introduction to Probability.
Stat 100a, Introduction to Probability. Outline for the day: 1. Geometric random variables. 2. Negative binomial random variables. 3. Moment generating functions. 4. Poisson random variables. 5. Continuous
More informationGuidelines for Solving Probability Problems
Guidelines for Solving Probability Problems CS 1538: Introduction to Simulation 1 Steps for Problem Solving Suggested steps for approaching a problem: 1. Identify the distribution What distribution does
More informationMath Spring Practice for the final Exam.
Math 4 - Spring 8 - Practice for the final Exam.. Let X, Y, Z be three independnet random variables uniformly distributed on [, ]. Let W := X + Y. Compute P(W t) for t. Honors: Compute the CDF function
More information