IE 230 Probability & Statistics in Engineering I. Closed book and notes. 120 minutes.


Closed book and notes. 120 minutes. Two summary tables from the concise notes are attached: discrete distributions and continuous distributions. Eight pages. Score ____

Final Exam, Fall 1999. Cover Sheet, Page 0 of 8. Schmeiser

1. True or false. (For each: 2 points if correct, 1 point if left blank.)

(a) T F  When choosing a random sample without replacement from a finite population, it is possible that some member of the population is chosen twice.
(b) T F  For any random variables X_1 and X_2, the following result is true: E(X_1 + X_2) = E(X_1) + E(X_2).
(c) T F  A point estimator is often a single number, but it can be an interval.
(d) T F  A point estimator is said to be "maximum likelihood" if it is equal to the largest value in the sample.
(e) T F  Chebyshev's Inequality guarantees that the sum of many random variables is (at least approximately) normally distributed.
(f) T F  The standard deviation of a point estimator is called its standard error.
(g) T F  If Θ̂ is an unbiased estimator, then the mean squared error of Θ̂ is equal to the variance of Θ̂.
(h) T F  Mean squared error is one, but not the only, measure of a confidence interval's quality.
(i) T F  Let X̄ denote the sample mean of a random sample of size n. The standard error of X̄ decreases as the sample size n increases.
(j) T F  The ith observation from a sample of size n is called the ith order statistic.
(k) T F  A statistic is a function of the observed sample values.
(l) T F  If (X, Y) has a bivariate normal distribution, then P(X < 0, Y < 0) = 0.
(m) T F  If X and Y are independent, then corr(X, Y) = 0.
(n) T F  Let X have a binomial distribution with n = 25 and p = .2. Without the continuity correction, the normal approximation to the binomial would yield P(X = 3) = 0.

2. Let X denote a normally distributed random variable with mean µ and standard deviation σ. The pth quantile of X is the constant x_p that satisfies P(X ≤ x_p) = p. When µ = 0 and σ = 1, the corresponding value is denoted by z_p.

(a) Sketch the density function of X. Label both axes. Scale the horizontal axis.

(b) Circle the largest value.    x_.01    x_.50    x_.99

(c) Circle the true statement.    z_p = (x_p − µ)/σ    z_p = µ + σ x_p    z_p = x_p    z_p = x_p − 0.135

3. Throughout this course we have followed a consistent notational convention that indicates the nature of various quantities. Describe each expression below by writing "random variable", "event", "constant", or "undefined" on the blank lines.

(a) X
(b) Y < X
(c) Var(Y < X)
(d) E(X)
(e) θ
(f) σ
(g) P(X = 3)
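The standardizing identity z_p = (x_p − µ)/σ can be checked numerically with Python's standard library; the values of µ and σ below are arbitrary illustration choices, not from the exam.

```python
from statistics import NormalDist

# Hypothetical parameters for X; Z is the standard normal.
mu, sigma = 10.0, 2.0
X = NormalDist(mu, sigma)
Z = NormalDist(0.0, 1.0)

for p in (0.01, 0.50, 0.99):
    x_p = X.inv_cdf(p)  # pth quantile of X
    z_p = Z.inv_cdf(p)  # pth quantile of the standard normal
    # Standardizing the quantile of X recovers z_p.
    assert abs((x_p - mu) / sigma - z_p) < 1e-9
```

Because `inv_cdf` is increasing in p, this also confirms that x_.99 is the largest of the three quantiles.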

4. Suppose that you invest $4000 in IBM (ticker: IBM), a large hardware and software company, and $6000 in Symix Systems (ticker: SYMX), a small software company. Let V = $4000 X + $6000 Y be the portfolio value one year from now. That is, let X denote the relative change in IBM stock price and Y the relative price change for SYMX. Assume that µ_X = 1.2, µ_Y = 1.3, σ_X = 0.2, σ_Y = 0.4, and corr(X, Y) = 0.6.

(a) Are X and Y independent?    Yes    No

(b) Find the covariance of X and Y. (Recall that the correlation is the covariance divided by the product of the standard deviations.)

(c) Find E(V). (Recall that the expected value of a sum is the sum of expected values.)

(d) Find Var(V). (Recall that the variance of a sum is the sum of the covariances.)
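Parts (b)-(d) follow from cov(X, Y) = corr(X, Y) σ_X σ_Y, E(aX + bY) = aE(X) + bE(Y), and Var(aX + bY) = a²Var(X) + b²Var(Y) + 2ab cov(X, Y). A numerical sketch with the stated parameter values (a worked check, not the official key):

```python
# Portfolio V = a*X + b*Y with the moments given in the problem.
a, b = 4000.0, 6000.0
mu_x, mu_y = 1.2, 1.3
sd_x, sd_y = 0.2, 0.4
corr_xy = 0.6

cov_xy = corr_xy * sd_x * sd_y   # covariance recovered from the correlation
mean_v = a * mu_x + b * mu_y     # linearity of expectation
var_v = a**2 * sd_x**2 + b**2 * sd_y**2 + 2 * a * b * cov_xy
```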

5. We want to estimate p, the probability of success for a sequence of Bernoulli trials. We repeatedly (and independently) observe the geometric random variable X, the number of trials to obtain a success; let X_1, X_2, ..., X_n denote n such observations. (If you wish to consider a specific example, consider the "penny-tipping" exercise, with each student tipping a penny until it lands heads up. Let X_i be the number of tips required by student i, for i = 1, 2, ..., 130. We want to estimate p, the probability that the penny lands heads up on any one trial.)

(a) What are the possible values of X_1?

(b) Write the mass function of X_1 in terms of p.

(c) Write the likelihood function of the sample of size n for p = .5.

(d) Exactly one of the following statistics is the maximum-likelihood estimator of p. Circle that estimator. (You do not need to derive the MLE. Rather, simply eliminate the choices that are not reasonable. Here X̄ denotes the sample average.)

    p̂ = X̄      p̂ = X̄⁻¹      p̂ = Π_{i=1}^n X_i
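For intuition, the reciprocal-of-the-sample-mean estimator p̂ = 1/X̄ can be checked numerically against the geometric log-likelihood. The simulation below uses made-up values p = 0.5 and n = 130, echoing the penny example; it is a sanity check, not a derivation.

```python
import math
import random

random.seed(1)
p_true, n = 0.5, 130

def geometric_draw(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    x = 1
    while random.random() >= p:  # failure: keep tipping the penny
        x += 1
    return x

sample = [geometric_draw(p_true) for _ in range(n)]
xbar = sum(sample) / n

def loglik(p):
    # log L(p) = n log p + (sum(x_i) - n) log(1 - p)
    return n * math.log(p) + (sum(sample) - n) * math.log(1 - p)

p_hat = 1 / xbar
# The closed-form estimate should beat nearby values of p.
assert loglik(p_hat) >= loglik(p_hat - 0.01)
assert loglik(p_hat) >= loglik(p_hat + 0.01)
```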

6. (Montgomery and Runger, 6-3.) In the transmission of digital information, the probability that a bit has high, moderate, or low distortion is 0.01, 0.04, and 0.95, respectively. Suppose that three bits are transmitted and that the amount of distortion of each bit is independent of the other bits.

(a) Consider a fourth outcome: that a bit has no distortion. What is the probability that the first bit has no distortion?

(b) What is the probability that exactly two bits have high distortion and one has moderate distortion?

(c) What is the expected number of bits having low distortion?

(d) Given that the first bit has low distortion, what is the probability that the second bit has high distortion?
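Parts (b) and (c) use the multinomial mass function and the binomial mean; a quick sketch with the stated probabilities:

```python
from math import factorial

p_high, p_mod, p_low = 0.01, 0.04, 0.95
# The three given probabilities already sum to 1, so P(no distortion) = 0.
assert abs(p_high + p_mod + p_low - 1.0) < 1e-12

# P(exactly 2 high and 1 moderate among 3 bits):
# 3!/(2! 1! 0!) * p_high^2 * p_mod^1 * p_low^0
prob = factorial(3) // (factorial(2) * factorial(1)) * p_high**2 * p_mod

# Each bit is low-distortion with probability p_low, so the count of
# low-distortion bits is binomial(3, p_low) with mean 3 * p_low.
expected_low = 3 * p_low
```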

7. (Montgomery and Runger, 3-106.) Suppose that a lot of washers is large enough that it can be assumed that the sampling is done with replacement. Assume that 60% of the washers exceed a target thickness. Let A_i denote the event that washer i exceeds the target thickness.

(a) What is the probability that a randomly selected washer does not exceed its target thickness?

(b) Write the event (not its probability) that washers 1, 2, and 3 all exceed the target thickness.

(c) What is the minimum number of washers, n, that need to be selected so that the probability that all the washers are thinner than the target is less than 0.10?

(d) Circle the correct answer. The first sentence of this problem (where "with replacement" is assumed) affects the answer to Part(s):

    (a)    (b)    (c)    (a, b)    (a, c)    (b, c)    (a, b, c)
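Part (c) asks for the smallest n with P(all n washers thinner than target) = 0.4^n < 0.10. A brute-force sketch, assuming independence across washers (which is what sampling with replacement buys):

```python
# 60% exceed the target, so each washer is thin with probability 0.4.
p_thin = 1 - 0.60

n = 1
while p_thin**n >= 0.10:  # keep adding washers until the event is rare enough
    n += 1
# 0.4**2 = 0.16 is still >= 0.10, but 0.4**3 = 0.064 < 0.10, so n = 3.
```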

8. Let X denote the result of rolling a six-sided die; that is, X is the number of dots facing up. Assume that all six sides are equally likely.

(a) Write the mass function of X. (Be complete.)

(b) Find E(X²).

(c) Find the conditional mass function of X given that X < 3. (Be complete.)

(d) Find E(X | X < 2).
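The moments and the conditional pmf of a fair die can be computed exactly; a sketch using exact rational arithmetic:

```python
from fractions import Fraction

# pmf of a fair six-sided die, kept exact with Fraction.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())                # 7/2
second_moment = sum(x * x * p for x, p in pmf.items())   # 91/6

# Conditional pmf given X < 3: restrict to {1, 2} and renormalize
# by P(X < 3) = 1/3.
p_lt3 = sum(p for x, p in pmf.items() if x < 3)
cond = {x: p / p_lt3 for x, p in pmf.items() if x < 3}   # each gets 1/2
```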

9. (Montgomery and Runger, 5-7.) Suppose that the log-ons to a computer network follow a Poisson process with an average of three log-ons per minute.

(a) What is the mean time between log-ons?

(b) What is the probability that exactly two log-ons occur during a particular two-minute time interval?

(c) Determine the time t (in minutes) so that the probability that at least one log-on occurs before t minutes is 0.95.

(d) Let Y have an Erlang distribution with shape parameter r = 3 and λ = 3. In the context of this problem, what might be the physical meaning of Y?
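Parts (a)-(c) use the exponential/Poisson duality of a Poisson process with rate λ = 3 per minute: gaps between log-ons are exponential(λ), and the count in an interval of length T is Poisson(λT). A numerical sketch:

```python
import math

lam = 3.0  # log-ons per minute

# (a) Mean time between log-ons: exponential mean 1/lambda.
mean_gap = 1 / lam

# (b) Count in a 2-minute window is Poisson with mean lam*2 = 6.
p_two = math.exp(-6) * 6**2 / math.factorial(2)

# (c) Solve P(at least one log-on before t) = 1 - exp(-lam*t) = 0.95.
t = -math.log(1 - 0.95) / lam
```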

Discrete Distributions: Summary Table

General discrete
  range: x_1, x_2, ..., x_n
  pmf:   f(x) = f_X(x) = P(X = x)
  mean:  µ = µ_X = E(X) = Σ_{i=1}^n x_i f(x_i)
  var:   σ² = σ²_X = V(X) = Σ_{i=1}^n (x_i − µ)² f(x_i) = E(X²) − µ²_X

Discrete uniform
  range: x_1, x_2, ..., x_n
  pmf:   f(x) = 1/n
  mean:  µ = Σ_{i=1}^n x_i / n
  var:   σ² = Σ_{i=1}^n x_i²/n − [Σ_{i=1}^n x_i/n]²

Equal-space uniform
  range: x = a, a+c, ..., b
  pmf:   f(x) = 1/n, where n = (b − a + c)/c
  mean:  µ = (a + b)/2
  var:   σ² = c²(n² − 1)/12

Binomial ("# successes in n Bernoulli trials")
  range: x = 0, 1, ..., n
  pmf:   f(x) = C(n, x) p^x (1 − p)^(n−x)
  mean:  µ = np
  var:   σ² = np(1 − p)

Geometric ("# Bernoulli trials until 1st success")
  range: x = 1, 2, ...
  pmf:   f(x) = p(1 − p)^(x−1)
  mean:  µ = 1/p
  var:   σ² = (1 − p)/p²

Negative binomial ("# Bernoulli trials until rth success")
  range: x = r, r+1, ...
  pmf:   f(x) = C(x−1, r−1) p^r (1 − p)^(x−r)
  mean:  µ = r/p
  var:   σ² = r(1 − p)/p²

Hypergeometric ("# successes in a sample of size n from a population of size N containing K successes"; sampling without replacement)
  range: x = (n − (N − K))⁺, ..., min{K, n}
  pmf:   f(x) = C(K, x) C(N − K, n − x) / C(N, n)
  mean:  µ = np, where p = K/N
  var:   σ² = np(1 − p)(N − n)/(N − 1)

Poisson ("# of counts in a Poisson-process interval")
  range: x = 0, 1, ...
  pmf:   f(x) = e^(−λ) λ^x / x!
  mean:  µ = λ
  var:   σ² = λ
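The mean and variance columns can be spot-checked by summing the pmf directly. The sketch below verifies the binomial and geometric rows with arbitrary test parameters (n = 6, p = 0.3, chosen only for illustration):

```python
import math

n, p = 6, 0.3

# Binomial row: mean np, variance np(1 - p).
pmf = [math.comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)]
mean = sum(x * f for x, f in enumerate(pmf))
var = sum(x * x * f for x, f in enumerate(pmf)) - mean**2
assert abs(mean - n * p) < 1e-9
assert abs(var - n * p * (1 - p)) < 1e-9

# Geometric row: mean 1/p; the infinite sum is truncated at x = 500,
# where the remaining tail is negligible.
xs = range(1, 501)
g = [p * (1 - p) ** (x - 1) for x in xs]
g_mean = sum(x * f for x, f in zip(xs, g))
assert abs(g_mean - 1 / p) < 1e-6
```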

Continuous Distributions: Summary Table

General continuous
  range: (−∞, ∞)
  cdf:   F(x) = F_X(x) = P(X ≤ x)
  pdf:   f(x) = f_X(x) = dF(y)/dy |_{y=x}
  mean:  µ = µ_X = E(X) = ∫ x f(x) dx
  var:   σ² = σ²_X = V(X) = ∫ (x − µ)² f(x) dx = E(X²) − µ²_X

Continuous uniform
  range: [a, b]
  cdf:   F(x) = (x − a)/(b − a)
  pdf:   f(x) = 1/(b − a)
  mean:  µ = (a + b)/2
  var:   σ² = (b − a)²/12

Normal (or Gaussian) ("sum of random variables")
  range: (−∞, ∞)
  cdf:   Table II
  pdf:   f(x) = (1/(σ√(2π))) e^(−(x−µ)²/(2σ²))
  mean:  µ
  var:   σ²

Exponential ("time to Poisson count")
  range: [0, ∞)
  cdf:   F(x) = 1 − e^(−λx)
  pdf:   f(x) = λ e^(−λx)
  mean:  µ = 1/λ
  var:   σ² = 1/λ²

Erlang ("time to rth Poisson count")
  range: [0, ∞)
  cdf:   F(x) = Σ_{k=r}^∞ e^(−λx) (λx)^k / k!
  pdf:   f(x) = λ^r x^(r−1) e^(−λx) / (r − 1)!
  mean:  µ = r/λ
  var:   σ² = r/λ²

Gamma ("lifetime")
  range: [0, ∞)
  cdf:   numerical
  pdf:   f(x) = λ^r x^(r−1) e^(−λx) / Γ(r)
  mean:  µ = r/λ
  var:   σ² = r/λ²

Weibull ("lifetime")
  range: [0, ∞)
  cdf:   F(x) = 1 − e^(−(x/δ)^β)
  pdf:   f(x) = (β/δ)(x/δ)^(β−1) e^(−(x/δ)^β)
  mean:  µ = δ Γ(1 + 1/β)
  var:   σ² = δ² Γ(1 + 2/β) − µ²
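The mean column can be spot-checked by numerically integrating x f(x). The sketch below verifies the Weibull row, µ = δ Γ(1 + 1/β), with arbitrary test parameters (β = 1.5, δ = 2, chosen only for illustration):

```python
import math

beta, delta = 1.5, 2.0

def pdf(x):
    """Weibull density from the summary table."""
    return (beta / delta) * (x / delta) ** (beta - 1) * math.exp(-((x / delta) ** beta))

# Left Riemann sum of x * f(x) on (0, 20]; the tail beyond 20 is negligible
# for these parameters.
dx = 1e-4
mean_num = sum(i * dx * pdf(i * dx) * dx for i in range(1, 200000))

mean_formula = delta * math.gamma(1 + 1 / beta)
assert abs(mean_num - mean_formula) < 1e-3
```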