
Class 27: Review Problems for Final Exam
18.05, Spring 2017

This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry out, computations.

Problem 1. (Counting)
(a) There are several ways to think about this. Here is one. The letters are p, r, o, b, b, a, i, i, l, t, y. We use the following steps to create a sequence of these letters.
Step 1: Choose a position for the letter p: 11 ways to do this.
Step 2: Choose a position for the letter r: 10 ways to do this.
Step 3: Choose a position for the letter o: 9 ways to do this.
Step 4: Choose two positions for the two b's: C(8, 2) ways to do this.
Step 5: Choose a position for the letter a: 6 ways to do this.
Step 6: Choose two positions for the two i's: C(5, 2) ways to do this.
Step 7: Choose a position for the letter l: 3 ways to do this.
Step 8: Choose a position for the letter t: 2 ways to do this.
Step 9: Choose a position for the letter y: 1 way to do this.
Multiplying these all together we get
11 · 10 · 9 · C(8, 2) · 6 · C(5, 2) · 3 · 2 · 1 = 11!/(2! 2!).

(b) Here are two ways to do this problem.
Method 1. Since every arrangement has equal probability of being chosen, we simply have to count the number that start with the letter b. After putting a b in position 1 there are 10 letters, p, r, o, b, a, i, i, l, t, y, to place in the last 10 positions. We count this in the same manner as part (a). That is:
Choose the position for p: 10 ways. Choose the positions for r, o, b, a: 9 · 8 · 7 · 6 ways. Choose two positions for the two i's: C(5, 2) ways. Choose the position for l: 3 ways. Choose the position for t: 2 ways. Choose the position for y: 1 way.
Multiplying these together we get 10!/2! arrangements that start with the letter b. Therefore the probability that a random arrangement starts with b is
(10!/2!) / (11!/(2! 2!)) = 2/11.

Method 2. Suppose we build the arrangement by picking a letter for the first position, then the second position, etc. Since there are 11 letters, two of which are b's, we have a 2/11 chance of picking a b for the first letter.
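
As a quick sanity check of Problem 1 (not part of the original solutions), here is a short R snippet; it assumes the word being arranged is "probability", with 11 letters including two b's and two i's.

    # Count of distinct arrangements: 11!/(2! 2!)
    factorial(11) / (factorial(2) * factorial(2))
    # Part (b) by simulation: probability a random arrangement starts with b
    letters_vec <- strsplit("probability", "")[[1]]
    mean(replicate(1e5, sample(letters_vec)[1] == "b"))   # close to 2/11 = 0.1818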

Problem 2. (Probability)
We are given P(E ∪ F) = 3/4. Since E^c ∩ F^c = (E ∪ F)^c, we have P(E^c ∩ F^c) = 1 − P(E ∪ F) = 1/4.

Problem 3. (Counting)
Let H_i be the event that the i-th hand has exactly one king. We have the conditional probabilities
P(H_1) = C(4,1) C(48,12) / C(52,13);   P(H_2 | H_1) = C(3,1) C(36,12) / C(39,13);
P(H_3 | H_1 ∩ H_2) = C(2,1) C(24,12) / C(26,13);   P(H_4 | H_1 ∩ H_2 ∩ H_3) = 1.
Therefore
P(H_1 ∩ H_2 ∩ H_3 ∩ H_4) = P(H_4 | H_1 ∩ H_2 ∩ H_3) · P(H_3 | H_1 ∩ H_2) · P(H_2 | H_1) · P(H_1)
= [C(4,1) C(48,12) C(3,1) C(36,12) C(2,1) C(24,12)] / [C(52,13) C(39,13) C(26,13)].

Problem 4. (Conditional probability)
(a) Sample space = Ω = {(1, 1), (1, 2), (1, 3), ..., (6, 6)} = {(i, j) | i, j = 1, 2, 3, 4, 5, 6}. (Each outcome is equally likely, with probability 1/36.)
A = {(1, 3), (2, 2), (3, 1)},
B = {(3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6), (1, 3), (2, 3), (4, 3), (5, 3), (6, 3)}.
P(A | B) = P(A ∩ B)/P(B) = (2/36)/(11/36) = 2/11.
(b) P(A) = 3/36 ≠ P(A | B), so they are not independent.
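
The product in Problem 3 above is easy to evaluate numerically. A minimal R check, assuming the usual setup of a 52-card deck with 4 kings dealt into four 13-card hands:

    pH1 <- choose(4, 1) * choose(48, 12) / choose(52, 13)
    pH2 <- choose(3, 1) * choose(36, 12) / choose(39, 13)   # given H1
    pH3 <- choose(2, 1) * choose(24, 12) / choose(26, 13)   # given H1 and H2
    pH1 * pH2 * pH3 * 1                                     # about 0.105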

Problem 5. (Bayes' formula)
For a given problem let C be the event the student gets the problem correct and K the event the student knows the answer. The question asks for P(K | C). We'll compute this using Bayes' rule:
P(K | C) = P(C | K) P(K) / P(C).
We're given P(C | K) = 1 and P(K) = 0.6. By the law of total probability,
P(C) = P(C | K) P(K) + P(C | K^c) P(K^c) = 0.7.
Therefore P(K | C) = 0.6/0.7 ≈ 0.857 = 85.7%.

Problem 6. (Bayes' formula)
Here is the game tree; R_1 means red on the first draw, etc. The branch probabilities are:
first draw: R_1 7/10, B_1 3/10;
second draw: after R_1: R_2 6/9, B_2 3/9; after B_1: R_2 7/10, B_2 3/10;
third draw: after R_1R_2: R_3 5/8, B_3 3/8; after R_1B_2: R_3 6/9, B_3 3/9; after B_1R_2: R_3 6/9, B_3 3/9; after B_1B_2: R_3 7/10, B_3 3/10.
Summing the probabilities of all the B_3 nodes we get
P(B_3) = (7/10)(6/9)(3/8) + (7/10)(3/9)(3/9) + (3/10)(7/10)(3/9) + (3/10)(3/10)(3/10) ≈ 0.35.

Problem 7. (Expected value and variance)

X:     −2     −1     0      1      2
p(x):  1/15   2/15   3/15   4/15   5/15

We compute
E[X] = −2(1/15) − 1(2/15) + 0(3/15) + 1(4/15) + 2(5/15) = 2/3.
Thus
Var(X) = E((X − 2/3)²) = E(X²) − (2/3)² = 2 − 4/9 = 14/9.

Problem 8. (Expected value and variance)
The density is f(x) = 2x on [0, 1]. We first compute
E[X] = ∫_0^1 x · 2x dx = 2/3,
E[X²] = ∫_0^1 x² · 2x dx = 1/2,
E[X⁴] = ∫_0^1 x⁴ · 2x dx = 1/3.
Thus,
Var(X) = E[X²] − (E[X])² = 1/2 − 4/9 = 1/18
and
Var(X²) = E[X⁴] − (E[X²])² = 1/3 − 1/4 = 1/12.
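
For Problem 6 above, a small simulation is a convenient check. This sketch (not from the original) assumes the urn scheme implied by the branch probabilities: 7 red and 3 blue balls to start, a drawn red ball is set aside, a drawn blue ball is returned.

    third_is_blue <- function() {
      red <- 7; blue <- 3
      last <- ""
      for (i in 1:3) {
        last <- sample(c("R", "B"), 1, prob = c(red, blue))
        if (last == "R") red <- red - 1   # only red balls are removed (assumption)
      }
      last == "B"
    }
    mean(replicate(1e5, third_is_blue()))  # should be close to 0.35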

Problem 9. (Expected value and variance)
Use Var(X) = E(X²) − E(X)². With Var(X) = 3 and E(X) = 2 this gives 3 = E(X²) − 4, so E(X²) = 7.

Problem 10. (Expected value and variance)
Make a table:

X:      0       1
prob:   1 − p   p
X²:     0       1

From the table, E(X) = 0 · (1 − p) + 1 · p = p. Since X and X² have the same table, E(X²) = E(X) = p. Therefore,
Var(X) = p − p² = p(1 − p).

Problem 11. (Expected value)
Let X be the number of people who get their own hat. Following the hint: let X_j represent whether person j gets their own hat, i.e. X_j = 1 if person j gets their hat and 0 if not. If there are n people then
X = Σ_j X_j, so E(X) = Σ_j E(X_j).
Since person j is equally likely to get any hat, we have P(X_j = 1) = 1/n. Thus X_j ~ Bernoulli(1/n), so E(X_j) = 1/n and E(X) = n · (1/n) = 1.

Problem 12. (Expected value and variance)
(a) There are a number of ways to present this. X/3 ~ binomial(25, 1/6), so
P(X = 3k) = C(25, k) (1/6)^k (5/6)^(25−k), for k = 0, 1, 2, ..., 25.
(b) X/3 ~ binomial(25, 1/6). Recall that the mean and variance of binomial(n, p) are np and np(1 − p). So,
E(X) = 3np = 3 · 25 · (1/6) = 75/6 = 12.5, and Var(X) = 9np(1 − p) = 9 · 25 · (1/6)(5/6) = 125/4.
(c) E(X + Y) = E(X) + E(Y) = 150/6 = 25, and E(2X) = 2E(X) = 150/6 = 25.
Var(X + Y) = Var(X) + Var(Y) = 250/4, while Var(2X) = 4 Var(X) = 500/4.
The means of X + Y and 2X are the same, but Var(2X) > Var(X + Y). This makes sense because in X + Y sometimes X and Y will be on opposite sides of the mean, so the distances to the mean will tend to cancel; however, in 2X the distance to the mean is always doubled.
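
The binomial facts used in Problem 12 are easy to confirm numerically. A brief R check (not part of the original), using the reconstructed parameters: 25 trials with success probability 1/6, and X equal to 3 times the number of successes.

    k <- 0:25
    sum(3 * k * dbinom(k, 25, 1/6))                  # E(X) = 12.5
    sum((3 * k)^2 * dbinom(k, 25, 1/6)) - 12.5^2     # Var(X) = 31.25 = 125/4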

Problem 13. (Continuous random variables)
First we find the value of a:
∫_0^1 f(x) dx = 1  ⇒  ∫_0^1 (x + a x²) dx = 1/2 + a/3 = 1  ⇒  a = 3/2.
The CDF is F_X(b) = P(X ≤ b). We break this into cases:
(i) b < 0: F_X(b) = 0.
(ii) 0 ≤ b ≤ 1: F_X(b) = ∫_0^b (x + (3/2) x²) dx = b²/2 + b³/2.
(iii) 1 < b: F_X(b) = 1.
Using F_X we get
P(0.5 < X < 1) = F_X(1) − F_X(0.5) = 1 − 3/16 = 13/16.

Problem 14. (Exponential distribution)
(a) We compute
P(X ≥ 5) = 1 − P(X < 5) = 1 − ∫_0^5 λ e^(−λx) dx = 1 − (1 − e^(−5λ)) = e^(−5λ).
(b) We want P(X ≥ 15 | X ≥ 10). First observe that P(X ≥ 15, X ≥ 10) = P(X ≥ 15). From computations similar to those in (a), we know
P(X ≥ 15) = e^(−15λ) and P(X ≥ 10) = e^(−10λ).
From the definition of conditional probability,
P(X ≥ 15 | X ≥ 10) = P(X ≥ 15, X ≥ 10)/P(X ≥ 10) = P(X ≥ 15)/P(X ≥ 10) = e^(−5λ).
Note: this is an illustration of the memorylessness property of the exponential distribution.
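
A short R check of the last two problems (not part of the original). It assumes the reconstructed density f(x) = x + (3/2)x² on [0, 1] for Problem 13, and it uses an arbitrary rate λ = 0.1 to illustrate the memorylessness identity in Problem 14.

    f <- function(x) x + 1.5 * x^2
    integrate(f, 0, 1)$value                     # total probability 1, so a = 3/2 is right
    integrate(f, 0.5, 1)$value                   # P(0.5 < X < 1) = 13/16 = 0.8125
    (1 - pexp(15, 0.1)) / (1 - pexp(10, 0.1))    # P(X >= 15 | X >= 10)
    1 - pexp(5, 0.1)                             # equals P(X >= 5) = e^(-0.5)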

Problem 15. (a) Note, Y follows what is called a log-normal distribution.
F_Y(a) = P(Y ≤ a) = P(e^Z ≤ a) = P(Z ≤ ln(a)) = Φ(ln(a)).
Differentiating using the chain rule:
f_Y(a) = d/da F_Y(a) = d/da Φ(ln(a)) = (1/a) φ(ln(a)) = (1/(√(2π) a)) e^(−(ln(a))²/2).
(b) (i) We want to find q_0.33 such that P(Z ≤ q_0.33) = 0.33. That is, we want
Φ(q_0.33) = 0.33  ⇔  q_0.33 = Φ^(−1)(0.33).
(ii) We want q_0.9 such that
F_Y(q_0.9) = 0.9  ⇔  Φ(ln(q_0.9)) = 0.9  ⇔  q_0.9 = e^(Φ^(−1)(0.9)).
(iii) As in (ii), q_0.5 = e^(Φ^(−1)(0.5)) = e^0 = 1.

Problem 16. (a) We did this in class. Let φ(z) and Φ(z) be the PDF and CDF of Z.
F_Y(y) = P(Y ≤ y) = P(aZ + b ≤ y) = P(Z ≤ (y − b)/a) = Φ((y − b)/a).
Differentiating:
f_Y(y) = d/dy F_Y(y) = d/dy Φ((y − b)/a) = (1/a) φ((y − b)/a) = (1/(a√(2π))) e^(−(y−b)²/(2a²)).
Since this is the density for N(b, a²) we have shown Y ~ N(b, a²).
(b) By part (a), Y ~ N(µ, σ²) ⇒ Y = σZ + µ. But this implies (Y − µ)/σ = Z ~ N(0, 1). QED

Problem 17. (Quantiles)
The density for this distribution is f(x) = λ e^(−λx). We know (or can compute) that the distribution function is F(a) = 1 − e^(−λa). The median is the value of a such that F(a) = 0.5. Thus,
1 − e^(−λa) = 0.5  ⇒  0.5 = e^(−λa)  ⇒  log(0.5) = −λa  ⇒  a = log(2)/λ.
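
The quantiles in Problems 15 and 17 can be read off in R (a sketch, not part of the original; the rate 2 in the last two lines is an arbitrary illustration):

    qnorm(0.33)           # q_0.33 for Z, about -0.44
    exp(qnorm(0.9))       # q_0.9 for Y = e^Z, about 3.60
    exp(qnorm(0.5))       # q_0.5 for Y, exactly e^0 = 1
    qexp(0.5, rate = 2)   # median of an exponential with rate 2
    log(2) / 2            # agrees with log(2)/lambda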

Problem 18. (Correlation)
As usual let X_i = the number of heads on the i-th flip, i.e. 0 or 1. Let X = X_1 + X_2 + X_3, the sum of the first 3 flips, and Y = X_3 + X_4 + X_5, the sum of the last 3. Using the algebraic properties of covariance we have
Cov(X, Y) = Cov(X_1 + X_2 + X_3, X_3 + X_4 + X_5)
= Cov(X_1, X_3) + Cov(X_1, X_4) + Cov(X_1, X_5) + Cov(X_2, X_3) + Cov(X_2, X_4) + Cov(X_2, X_5) + Cov(X_3, X_3) + Cov(X_3, X_4) + Cov(X_3, X_5).
Because the X_i are independent, the only non-zero term in the above sum is Cov(X_3, X_3) = Var(X_3) = 1/4. Therefore, Cov(X, Y) = 1/4.
We get the correlation by dividing by the standard deviations. Since X is the sum of 3 independent Bernoulli(0.5), we have Var(X) = 3/4, and likewise Var(Y) = 3/4. So
Cor(X, Y) = Cov(X, Y)/(σ_X σ_Y) = (1/4)/(3/4) = 1/3.

Problem 19. (Joint distributions)
(a) Here we have two continuous random variables X and Y with joint probability density function
f(x, y) = (12/5) x y (1 + y) for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and f(x, y) = 0 otherwise.
So
P(1/4 ≤ X ≤ 1/2, 1/3 ≤ Y ≤ 2/3) = ∫_{1/4}^{1/2} ∫_{1/3}^{2/3} f(x, y) dy dx = 41/720.
(b) F(a, b) = ∫_0^a ∫_0^b f(x, y) dy dx = (3/5) a² b² + (2/5) a² b³ for 0 ≤ a ≤ 1 and 0 ≤ b ≤ 1.
(c) We find the marginal CDF F_X(a) by setting b in F(a, b) to the top of its range, i.e. b = 1. So F_X(a) = F(a, 1) = a².
(d) For 0 ≤ x ≤ 1, we have
f_X(x) = ∫_0^1 f(x, y) dy = 2x.
This is consistent with (c) because d/dx (x²) = 2x.
(e) We first compute f_Y(y) for 0 ≤ y ≤ 1 as
f_Y(y) = ∫_0^1 f(x, y) dx = (6/5) y (y + 1).
Since f(x, y) = f_X(x) f_Y(y), we conclude that X and Y are independent.
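
The integrals in Problem 19 can be checked with nested calls to integrate() in R. This sketch (not from the original) assumes the reconstructed density f(x, y) = (12/5) x y (1 + y) on the unit square.

    f <- function(x, y) 12/5 * x * y * (1 + y)
    inner <- function(x, ylo, yhi)               # integral over y, for a vector of x values
      sapply(x, function(xx) integrate(function(y) f(xx, y), ylo, yhi)$value)
    integrate(function(x) inner(x, 0, 1), 0, 1)$value          # total mass, should be 1
    integrate(function(x) inner(x, 1/3, 2/3), 1/4, 1/2)$value  # part (a): 41/720, about 0.0569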

Problem 20. (Joint distributions)
(a) The marginal probability P_Y(1) = 1/2 forces P(X = 0, Y = 1) = P(X = 2, Y = 1) = 0. Now each column has one empty entry. This can be computed by making the column add up to the given marginal probability. The completed table is:

Y\X     0      1      2      P_Y
−1      1/6    1/6    1/6    1/2
 1      0      1/2    0      1/2
P_X     1/6    2/3    1/6    1

(b) No, X and Y are not independent. For example, P(X = 1, Y = 1) = 1/2 ≠ P(X = 1) P(Y = 1).

Problem 21. Standardize:
P(Σ_i X_i < 30) = P( (X̄ − µ)/(σ/√n) < (30/n − µ)/(σ/√n) )
≈ P( Z < (3/10 − 1/5)/(1/30) )   (by the central limit theorem)
= P(Z < 3) = 0.9987   (from the table).

Problem 22. (Central limit theorem)
Let X_j be the IQ of a randomly selected person. We are given E(X_j) = 100 and σ_{X_j} = 15. Let X̄ be the average of the IQs of 100 randomly selected people. Then E(X̄) = 100 and σ_{X̄} = 15/√100 = 1.5. The problem asks for P(X̄ > 115). Standardizing we get P(X̄ > 115) ≈ P(Z > 10). This is effectively 0.
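
Both tail probabilities above are one-liners in R (a check, not part of the original; it uses the reconstructed values for the IQ problem: mean 100, standard deviation 15, sample size 100).

    pnorm(3)                                          # Problem 21: P(Z < 3) = 0.9987
    1 - pnorm(115, mean = 100, sd = 15 / sqrt(100))   # Problem 22: P(Xbar > 115), essentially 0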

Problem 23. (NHST, chi-square)
We will use a chi-square test for homogeneity. Remember we need to use all the data! For hypotheses we have:
H_0: the re-offense rate is the same for both groups. H_A: the rates are different.
The table of counts has one row for "re-offend" and one for "don't re-offend", and for each of the control group and the experimental group a column of observed counts and a column of expected counts. The expected counts are computed as follows. Under H_0 the re-offense rates are the same, say θ. To find the expected counts we find the MLE of θ using the combined data:
ˆθ = (total re-offend)/(total subjects).
Then, for example, the expected number of re-offenders in the control group is ˆθ times the number of subjects in the control group. The other expected counts are computed in the same way. The chi-square test statistic is
X² = Σ (observed − expected)²/expected.
Finally, we need the degrees of freedom: df = 1, because this is a two-by-two table and (2 − 1)(2 − 1) = 1. (Or: because we can freely fill in the count in one cell and still be consistent with the marginal counts used to compute the expected counts.)
Comparing X² with the χ² distribution with df = 1 gives p < 0.01.
Conclusion: we reject H_0 in favor of H_A. The experimental intervention appears to be effective.

Problem 24. (Confidence intervals)
We compute the data mean x̄ = 65 and the sample variance s². The number of degrees of freedom is 9. We look up the critical value t_{9, 0.025} = 2.26 in the t-table. The 95% confidence interval is
[ x̄ − t_{9, 0.025} s/√10,  x̄ + t_{9, 0.025} s/√10 ] = [60.21, 69.79].

Problem 25. (Confidence intervals)
Suppose we have taken data x_1, ..., x_n with mean x̄. The 95% confidence interval for the mean is x̄ ± z_{0.025} σ/√n. This has width 2 z_{0.025} σ/√n. Setting the width equal to 1 and substituting the values z_{0.025} = 1.96 and σ = 5 we get
2 · 1.96 · 5/√n = 1  ⇒  √n = 19.6  ⇒  n = (19.6)² ≈ 384.
If we use our rule of thumb that z_{0.025} ≈ 2, we have √n = 20, so n = 400.

Problem 26. (Confidence intervals)
The 90% confidence interval for the proportion is x̄ ± z_{0.05}/(2√n). Since z_{0.05} = 1.64 and n = 400, our confidence interval is
x̄ ± 1.64/(2√400) = x̄ ± 0.041.
If this is entirely above 0.5 we need x̄ − 0.041 > 0.5, i.e. x̄ > 0.541. Let T be the number out of 400 who prefer A. We have x̄ = T/400 > 0.541, so T > 216.4.
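
The critical values and the sample-size arithmetic used in Problems 24-26 are quick to reproduce in R (a sketch, not part of the original; the width-1 target and σ = 5 in Problem 25 are as reconstructed above):

    qt(0.975, df = 9)        # t_{9, 0.025} = 2.26 (Problem 24)
    (2 * 1.96 * 5 / 1)^2     # Problem 25: n = (2 z sigma / width)^2 = 384.16
    qnorm(0.95)              # z_{0.05}, about 1.64 (Problem 26)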

Problem 27. (Confidence intervals)
A 95% confidence level means about 5% = 1/20 of the intervals will be wrong, so out of 40 intervals you'd expect about 2 to be wrong.
With a probability p = 0.05 of being wrong, the number wrong follows a Binomial(40, p) distribution. This has expected value 2 and standard deviation √(40 · 0.05 · 0.95) = 1.38. So 10 wrong is (10 − 2)/1.38 = 5.8 standard deviations from the mean. This would be surprising.

Problem 28. (Confidence intervals)
We have n data points with sample variance s². If we fix a hypothesis for σ², we know
(n − 1) s²/σ² ~ χ²_{n−1}.
We used R to find the critical values (or use the χ² table):
c_{0.025} = qchisq(0.975, n − 1),  c_{0.975} = qchisq(0.025, n − 1).
The 95% confidence interval for σ² is
[ (n − 1) s²/c_{0.025},  (n − 1) s²/c_{0.975} ].
We can take square roots to find the 95% confidence interval for σ.

Problem 29. (Bootstrap confidence interval)
(a) Step 1. We have the point estimate of p: ˆp = 0.333.
Step 2. Use the computer to generate many (say 200) samples the same size as the data. (These are called the bootstrap samples.)
Step 3. For each sample compute p* = 1/x̄* and δ* = p* − ˆp.
Step 4. Sort the δ* and find the critical values δ*_{0.95} and δ*_{0.05}. (Remember δ*_{0.95} is the 5th percentile, etc.)
Step 5. The 90% bootstrap confidence interval for p is [ˆp − δ*_{0.05}, ˆp − δ*_{0.95}].
(b) It's tricky to keep the sides straight here, so we work slowly and carefully. With 200 bootstrap samples the 5th and 95th percentiles for x̄* are the 10th and 190th entries of the sorted list. (Here again there is some ambiguity about which entries to use; we will accept the 11th or the 191st entries, or some interpolation between these entries.) Because p* = 1/x̄* is a decreasing function of x̄*, the 5th percentile of p* is 1 over the 95th percentile of x̄*, and vice versa. Subtracting ˆp then gives the 5th and 95th percentiles for δ* = p* − ˆp. These are also the δ*_{0.95} and δ*_{0.05} critical values, so the 90% confidence interval for p is [ˆp − δ*_{0.05}, ˆp − δ*_{0.95}].
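
For Problem 28, the interval is easy to wrap up as a small R function (a sketch, not part of the original; the s² = 36 and n = 7 in the last line are hypothetical stand-ins, not values from the original problem):

    ci_var <- function(s2, n, conf = 0.95) {
      alpha <- 1 - conf
      c_upper <- qchisq(1 - alpha / 2, df = n - 1)   # e.g. qchisq(0.975, n - 1)
      c_lower <- qchisq(alpha / 2,     df = n - 1)
      c((n - 1) * s2 / c_upper, (n - 1) * s2 / c_lower)
    }
    sqrt(ci_var(s2 = 36, n = 7))   # hypothetical s^2 = 36 with n = 7: 95% CI for sigma itself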

Problem 30. (Bootstrap confidence interval)
(a) The steps are the same as in the previous problem, except that the bootstrap samples are generated in a different way: they are resamples of the original data.
Step 1. We have the point estimate of the median: ˆq_0.5 = 3.3.
Step 2. Use the computer to generate many (say 200) size-n resamples of the original data.
Step 3. For each resample compute the median q*_0.5 and δ* = q*_0.5 − ˆq_0.5.
Step 4. Sort the δ* and find the critical values δ*_{0.95} and δ*_{0.05}. (Remember δ*_{0.95} is the 5th percentile, etc.)
Step 5. The 90% bootstrap confidence interval for q_0.5 is [ˆq_0.5 − δ*_{0.05}, ˆq_0.5 − δ*_{0.95}].
(b) This is very similar to the previous problem. We proceed slowly and carefully to get terms on the correct side of the inequalities.
The 5th and 95th percentiles for q*_0.5 are 2.89 and 3.7, so the 5th and 95th percentiles for δ* = q*_0.5 − ˆq_0.5 are [2.89 − 3.3, 3.7 − 3.3] = [−0.41, 0.4]. These are also the δ*_{0.95} and δ*_{0.05} critical values, so the 90% CI for q_0.5 is [3.3 − 0.4, 3.3 + 0.41] = [2.9, 3.71].

Problem 31. The model is y_i = a + b x_i + ε_i, where ε_i is random error. We assume the errors are independent with mean 0 and the same variance for each i (homoscedastic). The total squared error is
E = Σ (y_i − a − b x_i)² = (2 − a − b)² + (1 − a − 2b)² + (3 − a − 3b)².
The least squares fit is given by the values of a and b which minimize E. We solve for them by setting the partial derivatives of E with respect to a and b to 0. In R we found that a = 1, b = 0.5.
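
The last two problems are easy to reproduce in R. This sketch is not part of the original; the data vector for the median bootstrap is a hypothetical stand-in with median 3.3, and the regression uses the points (1, 2), (2, 1), (3, 3) reconstructed above.

    # Problem 30: empirical bootstrap CI for the median
    set.seed(2)
    x <- c(2.2, 2.9, 3.1, 3.3, 3.5, 3.7, 4.1)        # hypothetical data, median 3.3
    q_hat <- median(x)
    delta <- replicate(200, median(sample(x, replace = TRUE)) - q_hat)
    d <- quantile(delta, c(0.05, 0.95))
    c(q_hat - d[2], q_hat - d[1])                    # 90% bootstrap CI for the median

    # Problem 31: least squares fit
    coef(lm(y ~ x, data = data.frame(x = 1:3, y = c(2, 1, 3))))   # intercept 1, slope 0.5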
