
Probabilistic Models

Example #1
A production lot of 10,000 parts is tested for defects. It is expected that a defective part occurs once in every 1,000 parts. A sample of 500 is tested, with 2 defective parts found. Should you conclude that this is a bad lot? Is the number of defective parts within the tolerance?

To understand and analyze this test, we need to have the right model for the events. We need to identify an event and its probability.

Basic probability properties:

    P(event) = 0 if the event cannot occur,
    P(event) = 1 if the event must occur,
    0 < P(E) < 1 otherwise.

If A is the set of all possible events, then P(A) = 1.

If distinct events are disjoint, then P(A_1 or A_2 or A_3) = P(A_1) + P(A_2) + P(A_3), and summing over all possible events, Σ_j P(A_j) = 1.

[Figure: disjoint events A_1, A_2, A_3 inside the event space A]

If all events are equally likely, then P(A_j) = 1/n for n events.

For example we can consider:

    A_1 = the first part is defective
    A_2 = the second part is defective

    A_3 = only the third part is defective

Then A_1 and A_3 are disjoint, so P(A_1 or A_3) = P(A_1) + P(A_3).

Note: if A_1 and A_2 are not disjoint, then P(A_1 or A_2) ≠ P(A_1) + P(A_2).

[Figure: overlapping events A_1 and A_2, with A_3 disjoint, inside the event space A]

In this case we have

    P(A_1 or A_2) = P(A_1) + P(A_2) - P(A_1 and A_2)

In our example: the event space is all possible outcomes of the test - large and complicated! So we would like to break it down into smaller tests: testing each part individually. Defective / not defective is similar to a coin flip:

    Fair coin:   P(H) = 1/2, P(T) = 1/2
    Biased coin: P(T) = p, P(H) = 1 - p = q

It is useful to designate H, T (success or failure) by 1 and 0. Each test is then a Bernoulli trial with parameter p: P(1) = p. Now we can calculate the probability of events involving more than one test of a part, by noting that the tests of the parts are assumed to be independent.

Conditional Probability

P(A | B) = probability that event A occurs given that B occurs:

    P(A | B) = P(A, B) / P(B)

where P(A, B) is the probability that both A and B occur, and P(B) is the probability that B occurs.

Simple example showing dependence: flip a coin three times, and find P(observe 2 heads | second is a head). If we list all possible outcomes (for a fair coin) we have:

    HHH HHT THH THT HTH HTT TTH TTT

The first four events are those with H as the second outcome. Then

    P(observe 2 heads | second is a head) = P(2 out of 3 | second is head) = 2/4 = 1/2.

Note: P(2 out of 3) is the unconditional probability, i.e. the probability of 2 successes in 3 Bernoulli trials:

    P(2 out of 3) = C(3, 2) (1/2)² (1/2) = 3/8

We can also calculate P(observe 2 heads | second is a head) using

    P(2 out of 3 and second is H) / P(second is H) = (2/8) / (1/2) = 1/2

Simple example showing independence: successive outcomes of the coin flip (Bernoulli trial) are independent, so

    P(X_1 = H | X_2 = T) = P(X_1 = H)

Then, since

    P(X_1 = H | X_2 = T) = P(X_1 = H, X_2 = T) / P(X_2 = T)

it must be that

    P(X_1 = H, X_2 = T) = P(X_1 = H) P(X_2 = T)

In general the joint probability density for two independent random variables X_1, X_2 is written as

    f(x_1, x_2) = f(x_1) f(x_2)

that is, the density for the joint distribution is just the product of the individual densities.

The statement of a problem can make a big difference:

Example
Given a room of n people, what is the probability that at least 2 have the same birthday? We have to find P(X ≥ 2), where X is the number of people sharing a birthday. Stated in this way it is difficult, because we have to consider all possible combinations.
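The conditional probability in the coin example can be checked by brute-force enumeration of the sample space. A minimal Python sketch (not part of the original notes; function and variable names are my own):

```python
from itertools import product

# Enumerate all 8 equally likely outcomes of three fair coin flips.
outcomes = list(product("HT", repeat=3))

# Event A: exactly 2 heads; event B: the second flip is a head.
A = [o for o in outcomes if o.count("H") == 2]
B = [o for o in outcomes if o[1] == "H"]
A_and_B = [o for o in A if o in B]

p_B = len(B) / len(outcomes)              # P(B) = 4/8
p_A_and_B = len(A_and_B) / len(outcomes)  # P(A, B) = 2/8
p_A_given_B = p_A_and_B / p_B             # P(A | B) = (2/8)/(1/2)

print(p_A_given_B)  # 0.5
```

The unconditional probability len(A)/len(outcomes) comes out to 3/8, matching the binomial calculation in the text.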

But we can also state it via the complement, P(X ≥ 2) = 1 - P(X = 1), where

    P(X = 1) = P(2nd person differs from the 1st) × P(3rd differs from the 2nd and 1st) × ⋯ × P(nth differs from all the rest)
             = (1 - 1/365)(1 - 2/365) ⋯ (1 - (n-1)/365)

so that P(X ≥ 2) = 1 - (1 - 1/365)(1 - 2/365) ⋯ (1 - (n-1)/365).

Recall our problem. Define X_j = outcome of the j-th test, with P(X_j = 1) = p, where p is the probability of a defective part. Here "success" is the event of finding a defective part. Then, if the individual tests are independent:

    P(X_1 = 1 and X_2 = 0) = P(X_1 = 1) P(X_2 = 0)

Again, the treatment of "and" and "or" in probabilities of events can be viewed from the set point of view:

    P(X_1 = 1 or X_2 = 0) = P(X_1 = 1) + P(X_2 = 0) - P(X_1 = 1 and X_2 = 0)

Let's write out the probabilities for the first two tests:

    P(X_1 = 1) = p
    P(X_2 = 0) = 1 - p
    P(X_1 = 1 and X_2 = 0) = p(1 - p)    (using independence)
    P(X_1 = 1 or X_2 = 0) = p + (1 - p) - p(1 - p) = 1 - p(1 - p)

Note: we also have to be careful about specifying order:

    P(X_1 = 1 and X_2 = 0) ≠ P(1 success and 1 failure)
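The complement product for the birthday problem is easy to evaluate numerically. A small sketch (assuming 365 equally likely birthdays; not from the original notes):

```python
def prob_shared_birthday(n, days=365):
    """P(at least two of n people share a birthday) = 1 - prod_k (1 - k/days)."""
    p_all_distinct = 1.0
    for k in range(1, n):
        p_all_distinct *= 1 - k / days
    return 1 - p_all_distinct

# The well-known threshold: with 23 people the probability already exceeds 1/2.
print(round(prob_shared_birthday(23), 4))  # 0.5073
```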

    P(1 success and 1 failure) = P(X_1 = 1, X_2 = 0) + P(X_1 = 0, X_2 = 1) = 2p(1 - p)

So if we ask for the probability of one defective part in the first n trials, we need to take into account the number of ways that could occur: (1,0,0,...,0), (0,1,0,...,0), and so on. Similarly, if we ask for P(two defective parts in n trials): (1,1,0,...,0), (1,0,1,...,0), and so on.

For any one of these sequences of n trials with j defectives (successes), the probability of that particular sequence is p^j (1-p)^(n-j). Since the order does not affect this probability, and the trials are identical, we count the number of such sequences, i.e. the number of ways to choose j from n:

    C(n, j) = n! / (j!(n-j)!)

So the probability of j defectives in n trials is:

    P(j out of n) = C(n, j) p^j (1-p)^(n-j)

This is called the binomial distribution B(n, p) with parameters n and p, where n is the number of Bernoulli trials and p is the success probability for any one trial.

Note: to use the binomial distribution we must have a sequence of tests which are:

1. Identical Bernoulli trials (only two possible outcomes)
2. Independent

So if Y is the number of defectives found in a batch of size n,

    P(Y = j) = P(all possible sequences with j successes) = C(n, j) p^j (1-p)^(n-j)

Back to our example: we have a test of 500 parts with 2 defectives, and we have to determine whether that is enough to decide that we have a bad lot of parts. Then not only do we have to calculate the probability of this event, but we must also give a tolerance for the number of defectives observed. Tolerance is typically defined in terms of a 90%, 95%, or 99% chance of observing some event, under the assumption
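The binomial pmf can be written directly from this formula. A minimal Python sketch (names are mine, not from the notes):

```python
from math import comb

def binom_pmf(j, n, p):
    """P(j successes in n independent Bernoulli(p) trials): C(n,j) p^j (1-p)^(n-j)."""
    return comb(n, j) * p**j * (1 - p)**(n - j)

# Sanity check against the coin example: P(2 heads out of 3) = 3/8.
print(binom_pmf(2, 3, 0.5))  # 0.375
```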

that the lot is good. First we calculate the probability of the observed event, P(Y = 2), for Y ~ B(500, 0.001):

    P(Y = 2) = C(500, 2) (0.001)² (0.999)^498 ≈ 0.076

How do we compare this to the tolerance? If the lot is good, then we would expect that the probability of observing k or more defectives is

    P(Y ≥ k) = Σ_{j=k}^{n} C(n, j) p^j (1-p)^(n-j)

Graphically, the probability mass function of the discrete binomial variable Y looks like:

[Figure: probability mass function P(Y = j) versus j]

So P(Y ≥ k) adds up the mass for all values j ≥ k. Here we see:

    P(Y ≥ 1) = 1 - P(Y = 0) ≈ 0.394
    P(Y ≥ 2) = 1 - [P(Y = 0) + P(Y = 1)] ≈ 0.090
    P(Y ≥ 3) = 1 - [P(Y = 0) + P(Y = 1) + P(Y = 2)] ≈ 0.014

So if we define a 90% tolerance - usually called a confidence interval - then this defines a rare event as an event that occurs with 10% probability. Similarly an event which occurs with 5% probability, defined as rare, corresponds to defining a 95% confidence interval. Thus for a 90% confidence interval, we would conclude that our observation of 2 defectives is a rare event for a good batch.
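These tail probabilities for Y ~ B(500, 0.001) can be reproduced by summing the pmf. A sketch (illustrative only):

```python
from math import comb

n, p = 500, 0.001  # sample size and assumed defect rate for a good lot

def pmf(j):
    """Binomial mass function P(Y = j) for Y ~ B(n, p)."""
    return comb(n, j) * p**j * (1 - p)**(n - j)

def tail(k):
    """P(Y >= k) = 1 - sum_{j < k} P(Y = j)."""
    return 1 - sum(pmf(j) for j in range(k))

for k in (1, 2, 3):
    print(k, round(tail(k), 3))  # ≈ 0.394, 0.090, 0.014
```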

For a 95% confidence interval, we would conclude that observing 2 defectives is not rare. Thus the choice of the confidence interval influences how many false positives or false negatives we obtain in our testing. The choice depends on our tolerance for either one.

So far our calculation of probabilities and confidence intervals is based on discrete probability mass functions. We can generalize to other approximations by noting that we are calculating probabilities for Y, a sum of identically distributed, independent random variables X_j (each 0 or 1, with P(X_j = 1) = p):

    Y = Σ X_j = the number of successes, where the outcome of each trial is independent.

There are many results for sums of random variables, the most famous being the Central Limit Theorem. It lets us approximate the distribution of the sum with a normal distribution:

    Y ≈ N(μ, σ²) where μ = E[Y] and σ² = Var[Y]

Let's first review expected value. Expected value is by definition the weighted average, where the weight is given by the probability mass function P(X = x_i) = f(x_i) for the discrete values x_i of the random variable X:

    E[X] = Σ_i x_i f(x_i)
    E[g(X)] = Σ_i g(x_i) f(x_i)    for g a function

For X a Bernoulli random variable, f(1) = p and f(0) = 1 - p, so E[X] = 1·p + 0·(1-p) = p.

For a binomial random variable,

    P(Y = j) = C(n, j) p^j (1-p)^(n-j)
    E[Y] = Σ_{j=0}^{n} j C(n, j) p^j (1-p)^(n-j)

This can be calculated by rearranging terms in the sum. We also note that since Y = Σ_{k=1}^{n} X_k, i.e. Y is the sum of the outcomes of n Bernoulli trials,

    E[Y] = E[Σ_{k=1}^{n} X_k]

Since the X_k are identical, and we can commute the sum with the expectation,

    E[Σ_{k=1}^{n} X_k] = Σ_{k=1}^{n} E[X_k] = n E[X_k] = np

Similarly we can use this idea to calculate the variance:

    Var[X] = E[(X - E[X])²] = E[X² - 2X E[X] + (E[X])²] = E[X²] - (E[X])²

(since E[X] is just a number, not a random variable). For the Bernoulli trials:

    E[X] = p
    Var[X] = E[X²] - p²
    E[X²] = 1²·p + 0²·(1-p) = p
    Var[X] = p - p² = p(1-p)

So E[Y] = Σ E[X_k], since Y = Σ X_k. Now we can calculate:

    Var[Y] = Var[Σ_{k=1}^{n} X_k] = E[(Σ_{k=1}^{n} X_k - E[Σ_{k=1}^{n} X_k])²]

Recall that E[Σ_{k=1}^{n} X_k] = n E[X_k] = np, so we can write the variance as:

    Var[Y] = E[(Σ_{k=1}^{n} (X_k - p))²]
           = E[Σ_k (X_k - p)² + 2 Σ_{k<j} (X_k - p)(X_j - p)]
           = Σ_k E[(X_k - p)²] + 2 Σ_{k<j} E[(X_k - p)(X_j - p)]

Note that

    E[(X_k - p)(X_j - p)] = Σ_{x_k, x_j} (x_k - p)(x_j - p) f(x_j, x_k)
                          = Σ_{x_k, x_j} (x_k - p)(x_j - p) f(x_j) f(x_k)
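The identities E[Y] = np and Var[Y] = np(1-p) can be verified numerically straight from the definition of expected value. A sketch (parameter values chosen arbitrarily for illustration):

```python
from math import comb

def pmf(j, n, p):
    """Binomial mass function P(Y = j)."""
    return comb(n, j) * p**j * (1 - p)**(n - j)

n, p = 20, 0.3  # arbitrary illustrative parameters
mean = sum(j * pmf(j, n, p) for j in range(n + 1))
var = sum(j**2 * pmf(j, n, p) for j in range(n + 1)) - mean**2

print(mean, var)  # ≈ 6.0 and ≈ 4.2, i.e. np and np(1-p)
```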

since X_j, X_k are independent (Bernoulli trials). So

    E[(X_k - p)(X_j - p)] = E[X_k - p] E[X_j - p] = 0,    since E[X_k] = p.

So

    Var[Y] = n E[(X_k - p)²] = n Var[X_k] = np(1-p)

Thus the sum of n independent random variables X_i has expected value n E[X_i] and variance n Var[X_i]. In addition, the Central Limit Theorem gives an approximation of the density of a sum of i.i.d. random variables:

    lim_{n→∞} (Σ_{i=1}^{n} X_i - n E[X_i]) / sqrt(n Var[X_i]) ~ N(0, 1)

This says that the standardized sum of n i.i.d. random variables X_i approaches a normal distribution. A normal distribution with mean μ and variance σ² has probability density function of the form

    p(y) = (1 / sqrt(2πσ²)) e^(-(y-μ)² / (2σ²)),    -∞ < y < +∞

Note: this is a continuous random variable. You can verify that:

    ∫_{-∞}^{+∞} p(y) dy = 1
    E[Y] = ∫_{-∞}^{+∞} y p(y) dy = μ
    Var[Y] = ∫_{-∞}^{+∞} (y-μ)² p(y) dy = σ²

By the definition of E[Y]:

    E[Y - μ] = 0
    E[cY] = c E[Y] = cμ
    Var[cY] = E[(cY - cμ)²] = c² Var[Y]

So in the limit above, we subtract off the mean and divide by the standard deviation, leaving us with a random variable with mean 0 and variance 1. The proof that the density tends towards a normal distribution is not covered here, but the implications are significant: the Central Limit Theorem (CLT) says that we can take any random variables with bounded mean and variance (discrete or continuous), and as n → ∞ the density of their sum will be normal.
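The three integral identities for the normal density can be checked by numerical integration. A rough midpoint-rule sketch (μ and σ chosen arbitrarily; not from the notes):

```python
from math import exp, pi, sqrt

mu, sigma = 2.0, 1.5  # arbitrary illustrative parameters

def p(y):
    """Normal density with mean mu and variance sigma**2."""
    return exp(-(y - mu)**2 / (2 * sigma**2)) / sqrt(2 * pi * sigma**2)

# Midpoint rule on [mu - 10 sigma, mu + 10 sigma]; the mass outside is negligible.
a, b, m = mu - 10 * sigma, mu + 10 * sigma, 20000
h = (b - a) / m
xs = [a + (i + 0.5) * h for i in range(m)]
total = sum(p(x) for x in xs) * h
mean = sum(x * p(x) for x in xs) * h
var = sum((x - mu)**2 * p(x) for x in xs) * h

print(round(total, 4), round(mean, 4), round(var, 4))  # 1.0 2.0 2.25
```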

Then we can approximate the probability of observations for the sums by using the normal distribution. For example, in our previous example we considered the probability P(Σ_{i=1}^{n} X_i > k) for some k. The CLT says we can consider instead the standardized variable

    Z = (Σ X_i - n E[X_i]) / sqrt(n Var[X_i])

[Figure: standard normal density p(z), centered at 0]

Then

    P(Σ X_i > k) = P( (Σ X_i - n E[X_i]) / sqrt(n Var[X_i]) > (k - n E[X_i]) / sqrt(n Var[X_i]) ) = P(Z > z)

Notice that in this case the comparison is between Σ X_i, a sum of discrete random variables which take only positive values, and Z, a continuous random variable ranging over all reals. So we would expect that this approximation may not be valid for all values of Y = Σ X_i for finite n. The probability P(Z > z) can then be identified using the density p(z).

Typically we call the range of likely variation a confidence interval, which is then defined in terms of values of Z. The confidence interval can be one-sided or two-sided, depending on the application.

We can compare our previous results to the approximation with the normal distribution: for the lot example, P(Y ≥ 1) ≈ 0.394 (binomial), while the normal approximation gives P(Z > (1 - np)/sqrt(np(1-p))) = P(Z > 0.707) ≈ 0.24.

[Figure: binomial mass function compared with the approximating normal density]
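For the lot example the normal approximation is indeed shaky, since np = 0.5 is tiny. A sketch comparing the exact binomial tail with the CLT approximation (using the error function for the normal tail; not part of the original notes):

```python
from math import comb, erf, sqrt

n, p = 500, 0.001
mu, sigma = n * p, sqrt(n * p * (1 - p))

def binom_tail(k):
    """Exact P(Y >= k) for Y ~ B(n, p)."""
    return 1 - sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k))

def normal_tail(z):
    """P(Z > z) for standard normal Z, via the error function."""
    return 0.5 * (1 - erf(z / sqrt(2)))

k = 1
z = (k - mu) / sigma
print(round(binom_tail(k), 3), round(normal_tail(z), 3))  # exact vs. approximation
```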

[Figure: standard normal density with the tail area P(Z > z*) shaded]

Example #2
Suppose a bus can arrive at any time between 11:00 am and 11:15 am, with equal probability. If you arrive at the bus stop at 11:00 am, with what probability will you wait 10 minutes or more for the bus? With what probability will you wait a total of 300 minutes over the whole month (assuming you arrive at 11:00 am every day)? For which value of total minutes would you question the validity of the bus schedule?

Let X = the waiting time for the bus in one day. Then

    P(X > 10) = 1 - P(X ≤ 10) = 1 - ∫_0^10 f(x) dx = 1 - 10/15 = 1/3

where f(x) is the uniform probability density

    f(x) = 1/15 for 0 ≤ x ≤ 15,    0 otherwise.

Let Y = total amount of time waiting = Σ_{i=1}^{30} X_i. Then

    P(Y > 300) = P(Σ X_i > 300) = P( (Σ X_i - n E[X_i]) / sqrt(n Var[X_i]) > (300 - n E[X_i]) / sqrt(n Var[X_i]) )

where

    E[X_i] = ∫_0^15 x/15 dx = 15/2
    Var[X_i] = ∫_0^15 (x - 15/2)²/15 dx = (1/15)[x³/3 - 15x²/2 + 225x/4]_0^15 = 18.75

Then

    P(Y > 300) = P(Σ X_i > 300) = P(Z > (300 - 30·7.5)/sqrt(30·18.75)) = P(Z > 3.162) < 0.01

If we choose a 90% or 95% confidence level, the corresponding one-sided cutoffs are

    P(Z > 1.28) ≈ 0.10    (90% level)
    P(Z > 1.65) ≈ 0.05    (95% level)

Note that z = 3.162 is much larger than these cutoffs. So we could conclude that we have observed a rare event, given the assumption about the bus arrivals, and therefore question this assumption.

Example #3
Test a drug on 100 patients, with probability p of benefit from the drug. How many patients would we expect to observe receiving benefits from the drug in order to accept its assumed effectiveness? In this case we have to choose the confidence level and determine the number of patients that satisfies it. If Y is the number of patients receiving a benefit, this says we want to find y such that

    P(Y ≥ y) = 0.95

Then we need to identify E[Y] and Var[Y], and in particular we would like to identify Y as a sum of random variables. Here Y = Σ X_i, where X_i is the Bernoulli trial for each individual patient. For an assumed effectiveness p = 0.9, E[X_i] = p gives E[Y] = 100p = 90 and Var[Y] = 100p(1-p) = 9, so

    P( (Y - 90)/3 ≥ -1.65 ) ≈ 0.95    ⟹    y ≈ 90 - 1.65·3 ≈ 85

So, in order to accept the drug's effectiveness of 0.9, we would want to see more than 85 patients receiving benefits from the drug.

Least Squares - Linear Regression

Data: view it as a random variable, or as a function plus a random variable at each data point.
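The standardization in the bus example reduces to a few lines of arithmetic. A sketch (the erf-based tail function is my own helper, not from the notes):

```python
from math import erf, sqrt

E_X = 15 / 2        # mean of Uniform(0, 15): 7.5 minutes
Var_X = 15**2 / 12  # variance of Uniform(0, 15): 18.75
n = 30              # days in the month

def normal_tail(z):
    """P(Z > z) for standard normal Z."""
    return 0.5 * (1 - erf(z / sqrt(2)))

z = (300 - n * E_X) / sqrt(n * Var_X)
print(round(z, 3))            # 3.162
print(normal_tail(z) < 0.01)  # True: a rare event at any usual confidence level
```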

Let Y_i denote the data points:

    Y_i = f(t_i) + ε_i

where ε_i is the error (usual assumption: ε_i ~ N(0, σ)). What is σ? It depends on f(t). Suppose the data points fit a linear function f(t) = a + bt.

[Figure: data points Y_i scattered about the line a + bt, plotted against t]

In general, we would like to minimize the errors ε_i. In fact, we will minimize the variance about a + bt (like the mean):

    F(a, b) = Σ (y_i - (a + b t_i))²

    ∂F/∂a = Σ 2(y_i - (a + b t_i))(-1) = 0
    ∂F/∂b = Σ 2(y_i - (a + b t_i))(-t_i) = 0

    ⟹    Σ y_i = n a + b Σ t_i
         Σ t_i y_i = a Σ t_i + b Σ t_i²

Solve for a and b:

    a = (Σ y_i - b Σ t_i) / n
    b = (n Σ t_i y_i - Σ y_i Σ t_i) / (n Σ t_i² - (Σ t_i)²)
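The closed-form solution of the normal equations translates directly into code. A sketch with a made-up exact-fit check (not part of the original notes):

```python
def fit_line(t, y):
    """Least-squares line y ≈ a + b t, from the closed-form normal equations."""
    n = len(t)
    St, Sy = sum(t), sum(y)
    Stt = sum(ti * ti for ti in t)
    Sty = sum(ti * yi for ti, yi in zip(t, y))
    b = (n * Sty - Sy * St) / (n * Stt - St**2)
    a = (Sy - b * St) / n
    return a, b

# Sanity check: points lying exactly on y = 1 + 2t should be recovered exactly.
a, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(a, b)  # 1.0 2.0
```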

Note ŷ_i = a + b t_i is the estimate of y_i. Then

    R² = variation of estimate / variation of actual data = Σ (ŷ_i - ȳ)² / Σ (y_i - ȳ)²

We can apply the least-squares fit to the following data:

    x_i = …    y_i = …    Σ x_i² = 3671    Σ x_i y_i = …

Then a = …, b = 3.065, and R² = …

Example (from Larsen & Marx)
Crickets make their chirping sound by sliding one wing cover very rapidly back and forth over the other. Biologists have long been aware that there is a linear relationship between temperature and the frequency with which a cricket chirps, although the slope and y-intercept of the relationship vary from species to species. Listed below are 15 frequency-temperature observations recorded for the striped ground cricket, Nemobius fasciatus fasciatus. Plot these data and find the equation of the least-squares line y = a + bx. Suppose a cricket of this species is observed to chirp 18 times per second. What would be the estimated temperature?

[Table: observation number, chirps per second (x_i), temperature y_i (°F), for 15 observations]

Note that X and Y do not have to be related linearly in order to use linear regression.

Example
If y ≈ A e^(Bx), then we can take logarithms of both sides to obtain a linear problem:

    ln y = ln(A e^(Bx)) = ln A + Bx

So with W = ln y we have W = ln A + Bx, and we can apply the linear regression formulae to W and x:

    ln A = (Σ W_i - B Σ x_i) / n
    B = (n Σ W_i x_i - Σ W_i Σ x_i) / (n Σ x_i² - (Σ x_i)²)

Example (from Larsen & Marx)
Mistletoe is a plant that grows parasitically in the upper branches of large trees. Left unchecked, it can seriously stunt a tree's growth. Recently an experiment was done to test a theory that older trees are less susceptible to mistletoe growth than younger trees. A number of shoots were cut from 3-, 4-, 9-, 15-, and 40-year-old Ponderosa pines. These were then side-grafted to 3-year-old nursery stock and planted in a preserve. Each tree was inoculated with mistletoe seeds. Five years later, a count was made of the number of mistletoe plants in each stand of trees. (A stand consisted of approximately ten trees; there were three stands of each of the four youngest age groups and two stands of the oldest.) The results are shown below:

[Table: age of trees x (years), number of mistletoe plants y]

If we try to approximate the mistletoe data using y = a + bx, we get a = …, b = …, and R² = …

For y = a x^b, we use z = ln y, v = ln x, α = ln a, so that

    z = ln a + b ln x = α + bv

Then we get α = 4.9, b = …, and r² = … Note: here r² was calculated using the linear formulation

    r² = Σ (ẑ_i - z̄)² / Σ (z_i - z̄)²

where ẑ_i = α + b v_i and ŷ_i = e^α x_i^b.

For y = a e^(bx), let w = ln y, so that w = ln a + bx.
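The log-transform trick is mechanical to apply. A sketch fitting y = A e^(Bx) on synthetic data (values chosen so the fit is exact; not from the notes):

```python
from math import exp, log

def fit_exponential(x, y):
    """Fit y ≈ A e^(B x) by ordinary least squares on W = ln y."""
    w = [log(yi) for yi in y]
    n = len(x)
    Sx, Sw = sum(x), sum(w)
    Sxx = sum(xi * xi for xi in x)
    Sxw = sum(xi * wi for xi, wi in zip(x, w))
    B = (n * Sxw - Sw * Sx) / (n * Sxx - Sx**2)
    lnA = (Sw - B * Sx) / n
    return exp(lnA), B

# Synthetic data generated from y = 2 e^(0.5 x): the fit should recover A and B.
x = [0.0, 1.0, 2.0, 3.0]
y = [2 * exp(0.5 * xi) for xi in x]
A, B = fit_exponential(x, y)
print(round(A, 6), round(B, 6))  # 2.0 0.5
```

Note that least squares on ln y weights the errors multiplicatively, which is why the notes compute r² in the transformed (linear) variables.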

Then we get ln a = 3.56, b = …, and

    r² = Σ (ŵ_i - w̄)² / Σ (w_i - w̄)²

Again r² was calculated using the linear formulation, where ŵ_i = ln a + b x_i and ŷ_i = e^(ln a + b x_i).


Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya BBM 205 Discrete Mathematics Hacettepe University http://web.cs.hacettepe.edu.tr/ bbm205 Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya Resources: Kenneth Rosen, Discrete

More information

Deep Learning for Computer Vision

Deep Learning for Computer Vision Deep Learning for Computer Vision Lecture 3: Probability, Bayes Theorem, and Bayes Classification Peter Belhumeur Computer Science Columbia University Probability Should you play this game? Game: A fair

More information

More on Distribution Function

More on Distribution Function More on Distribution Function The distribution of a random variable X can be determined directly from its cumulative distribution function F X. Theorem: Let X be any random variable, with cumulative distribution

More information

Notes for Math 324, Part 17

Notes for Math 324, Part 17 126 Notes for Math 324, Part 17 Chapter 17 Common discrete distributions 17.1 Binomial Consider an experiment consisting by a series of trials. The only possible outcomes of the trials are success and

More information

The Bernoulli distribution has only two outcomes, Y, success or failure, with values one or zero. The probability of success is p.

The Bernoulli distribution has only two outcomes, Y, success or failure, with values one or zero. The probability of success is p. The Bernoulli distribution has onl two outcomes, Y, success or failure, with values one or zero. The probabilit of success is p. The probabilit distribution f() is denoted as B (p), the formula is: f ()

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

STOR Lecture 16. Properties of Expectation - I

STOR Lecture 16. Properties of Expectation - I STOR 435.001 Lecture 16 Properties of Expectation - I Jan Hannig UNC Chapel Hill 1 / 22 Motivation Recall we found joint distributions to be pretty complicated objects. Need various tools from combinatorics

More information

Recap of Basic Probability Theory

Recap of Basic Probability Theory 02407 Stochastic Processes Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk

More information

Discrete Structures for Computer Science

Discrete Structures for Computer Science Discrete Structures for Computer Science William Garrison bill@cs.pitt.edu 6311 Sennott Square Lecture #24: Probability Theory Based on materials developed by Dr. Adam Lee Not all events are equally likely

More information

Recap of Basic Probability Theory

Recap of Basic Probability Theory 02407 Stochastic Processes? Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk

More information

1 Review of Probability

1 Review of Probability 1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x

More information

Lecture 3 - Axioms of Probability

Lecture 3 - Axioms of Probability Lecture 3 - Axioms of Probability Sta102 / BME102 January 25, 2016 Colin Rundel Axioms of Probability What does it mean to say that: The probability of flipping a coin and getting heads is 1/2? 3 What

More information

1 Basic continuous random variable problems

1 Basic continuous random variable problems Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and

More information

Week 12-13: Discrete Probability

Week 12-13: Discrete Probability Week 12-13: Discrete Probability November 21, 2018 1 Probability Space There are many problems about chances or possibilities, called probability in mathematics. When we roll two dice there are possible

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

EE 178 Lecture Notes 0 Course Introduction. About EE178. About Probability. Course Goals. Course Topics. Lecture Notes EE 178

EE 178 Lecture Notes 0 Course Introduction. About EE178. About Probability. Course Goals. Course Topics. Lecture Notes EE 178 EE 178 Lecture Notes 0 Course Introduction About EE178 About Probability Course Goals Course Topics Lecture Notes EE 178: Course Introduction Page 0 1 EE 178 EE 178 provides an introduction to probabilistic

More information

Brief Review of Probability

Brief Review of Probability Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions

More information

Bernoulli Trials, Binomial and Cumulative Distributions

Bernoulli Trials, Binomial and Cumulative Distributions Bernoulli Trials, Binomial and Cumulative Distributions Sec 4.4-4.6 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 9-3339 Cathy Poliak,

More information

Expected Value. Lecture A Tiefenbruck MWF 9-9:50am Center 212 Lecture B Jones MWF 2-2:50pm Center 214 Lecture C Tiefenbruck MWF 11-11:50am Center 212

Expected Value. Lecture A Tiefenbruck MWF 9-9:50am Center 212 Lecture B Jones MWF 2-2:50pm Center 214 Lecture C Tiefenbruck MWF 11-11:50am Center 212 Expected Value Lecture A Tiefenbruck MWF 9-9:50am Center 212 Lecture B Jones MWF 2-2:50pm Center 214 Lecture C Tiefenbruck MWF 11-11:50am Center 212 http://cseweb.ucsd.edu/classes/wi16/cse21-abc/ March

More information

Lecture 3: Random variables, distributions, and transformations

Lecture 3: Random variables, distributions, and transformations Lecture 3: Random variables, distributions, and transformations Definition 1.4.1. A random variable X is a function from S into a subset of R such that for any Borel set B R {X B} = {ω S : X(ω) B} is an

More information

Discrete Random Variable Practice

Discrete Random Variable Practice IB Math High Level Year Discrete Probability Distributions - MarkScheme Discrete Random Variable Practice. A biased die with four faces is used in a game. A player pays 0 counters to roll the die. The

More information

Prof. Thistleton MAT 505 Introduction to Probability Lecture 18

Prof. Thistleton MAT 505 Introduction to Probability Lecture 18 Prof. Thistleton MAT 505 Introduction to Probability Lecture Sections from Text and MIT Video Lecture: 6., 6.4, 7.5 http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-04-probabilisticsystems-analysis-and-applied-probability-fall-200/video-lectures/lecture-6-discrete-randomvariable-examples-joint-pmfs/

More information

Homework 4 Solution, due July 23

Homework 4 Solution, due July 23 Homework 4 Solution, due July 23 Random Variables Problem 1. Let X be the random number on a die: from 1 to. (i) What is the distribution of X? (ii) Calculate EX. (iii) Calculate EX 2. (iv) Calculate Var

More information

Dept. of Linguistics, Indiana University Fall 2015

Dept. of Linguistics, Indiana University Fall 2015 L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 34 To start out the course, we need to know something about statistics and This is only an introduction; for a fuller understanding, you would

More information

7 Random samples and sampling distributions

7 Random samples and sampling distributions 7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces

More information

CISC 1100/1400 Structures of Comp. Sci./Discrete Structures Chapter 7 Probability. Outline. Terminology and background. Arthur G.

CISC 1100/1400 Structures of Comp. Sci./Discrete Structures Chapter 7 Probability. Outline. Terminology and background. Arthur G. CISC 1100/1400 Structures of Comp. Sci./Discrete Structures Chapter 7 Probability Arthur G. Werschulz Fordham University Department of Computer and Information Sciences Copyright Arthur G. Werschulz, 2017.

More information

Mean, Median and Mode. Lecture 3 - Axioms of Probability. Where do they come from? Graphically. We start with a set of 21 numbers, Sta102 / BME102

Mean, Median and Mode. Lecture 3 - Axioms of Probability. Where do they come from? Graphically. We start with a set of 21 numbers, Sta102 / BME102 Mean, Median and Mode Lecture 3 - Axioms of Probability Sta102 / BME102 Colin Rundel September 1, 2014 We start with a set of 21 numbers, ## [1] -2.2-1.6-1.0-0.5-0.4-0.3-0.2 0.1 0.1 0.2 0.4 ## [12] 0.4

More information

Week 9 The Central Limit Theorem and Estimation Concepts

Week 9 The Central Limit Theorem and Estimation Concepts Week 9 and Estimation Concepts Week 9 and Estimation Concepts Week 9 Objectives 1 The Law of Large Numbers and the concept of consistency of averages are introduced. The condition of existence of the population

More information

Joint Distribution of Two or More Random Variables

Joint Distribution of Two or More Random Variables Joint Distribution of Two or More Random Variables Sometimes more than one measurement in the form of random variable is taken on each member of the sample space. In cases like this there will be a few

More information

1 INFO Sep 05

1 INFO Sep 05 Events A 1,...A n are said to be mutually independent if for all subsets S {1,..., n}, p( i S A i ) = p(a i ). (For example, flip a coin N times, then the events {A i = i th flip is heads} are mutually

More information

Chapter 5. Chapter 5 sections

Chapter 5. Chapter 5 sections 1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

Chapter 2: The Random Variable

Chapter 2: The Random Variable Chapter : The Random Variable The outcome of a random eperiment need not be a number, for eample tossing a coin or selecting a color ball from a bo. However we are usually interested not in the outcome

More information

Discrete Distributions

Discrete Distributions A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose

More information

9/6/2016. Section 5.1 Probability. Equally Likely Model. The Division Rule: P(A)=#(A)/#(S) Some Popular Randomizers.

9/6/2016. Section 5.1 Probability. Equally Likely Model. The Division Rule: P(A)=#(A)/#(S) Some Popular Randomizers. Chapter 5: Probability and Discrete Probability Distribution Learn. Probability Binomial Distribution Poisson Distribution Some Popular Randomizers Rolling dice Spinning a wheel Flipping a coin Drawing

More information

S n = x + X 1 + X X n.

S n = x + X 1 + X X n. 0 Lecture 0 0. Gambler Ruin Problem Let X be a payoff if a coin toss game such that P(X = ) = P(X = ) = /2. Suppose you start with x dollars and play the game n times. Let X,X 2,...,X n be payoffs in each

More information

Probability: Why do we care? Lecture 2: Probability and Distributions. Classical Definition. What is Probability?

Probability: Why do we care? Lecture 2: Probability and Distributions. Classical Definition. What is Probability? Probability: Why do we care? Lecture 2: Probability and Distributions Sandy Eckel seckel@jhsph.edu 22 April 2008 Probability helps us by: Allowing us to translate scientific questions into mathematical

More information

Stat 134 Fall 2011: Notes on generating functions

Stat 134 Fall 2011: Notes on generating functions Stat 3 Fall 0: Notes on generating functions Michael Lugo October, 0 Definitions Given a random variable X which always takes on a positive integer value, we define the probability generating function

More information

Kousha Etessami. U. of Edinburgh, UK. Kousha Etessami (U. of Edinburgh, UK) Discrete Mathematics (Chapter 7) 1 / 13

Kousha Etessami. U. of Edinburgh, UK. Kousha Etessami (U. of Edinburgh, UK) Discrete Mathematics (Chapter 7) 1 / 13 Discrete Mathematics & Mathematical Reasoning Chapter 7 (continued): Markov and Chebyshev s Inequalities; and Examples in probability: the birthday problem Kousha Etessami U. of Edinburgh, UK Kousha Etessami

More information

Common Discrete Distributions

Common Discrete Distributions Common Discrete Distributions Statistics 104 Autumn 2004 Taken from Statistics 110 Lecture Notes Copyright c 2004 by Mark E. Irwin Common Discrete Distributions There are a wide range of popular discrete

More information

continuous random variables

continuous random variables continuous random variables continuous random variables Discrete random variable: takes values in a finite or countable set, e.g. X {1,2,..., 6} with equal probability X is positive integer i with probability

More information

CS4705. Probability Review and Naïve Bayes. Slides from Dragomir Radev

CS4705. Probability Review and Naïve Bayes. Slides from Dragomir Radev CS4705 Probability Review and Naïve Bayes Slides from Dragomir Radev Classification using a Generative Approach Previously on NLP discriminative models P C D here is a line with all the social media posts

More information

p. 4-1 Random Variables

p. 4-1 Random Variables Random Variables A Motivating Example Experiment: Sample k students without replacement from the population of all n students (labeled as 1, 2,, n, respectively) in our class. = {all combinations} = {{i

More information

NLP: Probability. 1 Basics. Dan Garrette December 27, E : event space (sample space)

NLP: Probability. 1 Basics. Dan Garrette December 27, E : event space (sample space) NLP: Probability Dan Garrette dhg@cs.utexas.edu December 27, 2013 1 Basics E : event space (sample space) We will be dealing with sets of discrete events. Example 1: Coin Trial: flipping a coin Two possible

More information

STA 2023 EXAM-2 Practice Problems. Ven Mudunuru. From Chapters 4, 5, & Partly 6. With SOLUTIONS

STA 2023 EXAM-2 Practice Problems. Ven Mudunuru. From Chapters 4, 5, & Partly 6. With SOLUTIONS STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6 With SOLUTIONS Mudunuru, Venkateswara Rao STA 2023 Spring 2016 1 1. A committee of 5 persons is to be formed from 6 men and 4 women. What

More information

Guidelines for Solving Probability Problems

Guidelines for Solving Probability Problems Guidelines for Solving Probability Problems CS 1538: Introduction to Simulation 1 Steps for Problem Solving Suggested steps for approaching a problem: 1. Identify the distribution What distribution does

More information

Lecture Notes 1 Basic Probability. Elements of Probability. Conditional probability. Sequential Calculation of Probability

Lecture Notes 1 Basic Probability. Elements of Probability. Conditional probability. Sequential Calculation of Probability Lecture Notes 1 Basic Probability Set Theory Elements of Probability Conditional probability Sequential Calculation of Probability Total Probability and Bayes Rule Independence Counting EE 178/278A: Basic

More information

This does not cover everything on the final. Look at the posted practice problems for other topics.

This does not cover everything on the final. Look at the posted practice problems for other topics. Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry

More information

STAT 516 Midterm Exam 2 Friday, March 7, 2008

STAT 516 Midterm Exam 2 Friday, March 7, 2008 STAT 516 Midterm Exam 2 Friday, March 7, 2008 Name Purdue student ID (10 digits) 1. The testing booklet contains 8 questions. 2. Permitted Texas Instruments calculators: BA-35 BA II Plus BA II Plus Professional

More information

1 Basic continuous random variable problems

1 Basic continuous random variable problems Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and

More information

Sociology 6Z03 Review II

Sociology 6Z03 Review II Sociology 6Z03 Review II John Fox McMaster University Fall 2016 John Fox (McMaster University) Sociology 6Z03 Review II Fall 2016 1 / 35 Outline: Review II Probability Part I Sampling Distributions Probability

More information

STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6. With SOLUTIONS

STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6. With SOLUTIONS STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6 With SOLUTIONS Mudunuru Venkateswara Rao, Ph.D. STA 2023 Fall 2016 Venkat Mu ALL THE CONTENT IN THESE SOLUTIONS PRESENTED IN BLUE AND BLACK

More information

Tom Salisbury

Tom Salisbury MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom

More information

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions 1999 Prentice-Hall, Inc. Chap. 4-1 Chapter Topics Basic Probability Concepts: Sample

More information

Analysis of Engineering and Scientific Data. Semester

Analysis of Engineering and Scientific Data. Semester Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each

More information