SOR201 Solutions to Examples 3
1. (a) An outcome is an unordered sample \{i_1, \dots, i_n\}, a subset of \{1, \dots, N\}, where the i_j's are all different. The random variable X (the largest number in the sample) can take the values n, n+1, \dots, N.

If X = x, the other numbers selected have values in the range 1, \dots, x-1: that is, (n-1) numbers have been selected from the set \{1, \dots, x-1\} and 0 numbers from the set \{x+1, \dots, N\}. This can be done in \binom{x-1}{n-1} ways. The total number of possible selections is \binom{N}{n}. Hence

    P(X = x) = \binom{x-1}{n-1} / \binom{N}{n},    x = n, \dots, N.

(b) If the largest number in the sample is \le y, then all numbers in the sample are \le y, and vice versa; also y \ge n (y need not be an integer, but it suffices to consider y to be one of the values n, n+1, \dots, N). Then

    P(X \le y) = P(all numbers in sample \le y)
               = (no. of ways of selecting n numbers from \{1, \dots, y\} without replacement)
                 / (no. of ways of selecting n numbers from \{1, \dots, N\} without replacement)
               = \binom{y}{n} / \binom{N}{n},    y = n, n+1, \dots, N.

Hence

    P(X = x) = P(X \le x) - P(X \le x-1)
             = [ \binom{x}{n} - \binom{x-1}{n} ] / \binom{N}{n}
             = \binom{x-1}{n-1} / \binom{N}{n},    x = n, \dots, N

(using the identity \binom{y}{r} + \binom{y}{r+1} = \binom{y+1}{r+1} with y = x-1, r = n-1).

2. (i) (a) Y = X^2, so X = \pm\sqrt{Y}. Hence

    P(Y = y) = P(X = \sqrt{y}) + P(X = -\sqrt{y}) = p(\sqrt{y}) + p(-\sqrt{y}),    y = 1, 4, 9,

and P(Y = 0) = p(0).

(b) W = |X|, i.e. W = X when X \ge 0 and W = -X when X < 0. Hence

    P(W = w) = p(w) + p(-w),    w = 1, 2, 3,    and    P(W = 0) = p(0).

(c) Z = sgn(X), i.e. Z = 1 when X > 0, Z = 0 when X = 0, Z = -1 when X < 0. Hence

    P(Z = 1) = \sum_{x > 0} p(x),    P(Z = 0) = p(0),    P(Z = -1) = \sum_{x < 0} p(x).
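As a quick numerical check (not part of the original solutions), the formula P(X = x) = \binom{x-1}{n-1}/\binom{N}{n} for the largest value in a sample drawn without replacement can be verified by brute-force enumeration for small N and n; the function name below is illustrative.

```python
from fractions import Fraction
from itertools import combinations
from math import comb

def pmf_largest(N, n):
    """Exact pmf of the largest value in an unordered sample of size n drawn
    without replacement from {1, ..., N}, found by listing every sample."""
    samples = list(combinations(range(1, N + 1), n))
    return {x: Fraction(sum(1 for s in samples if max(s) == x), len(samples))
            for x in range(n, N + 1)}

N, n = 8, 3
enumerated = pmf_largest(N, n)
formula = {x: Fraction(comb(x - 1, n - 1), comb(N, n)) for x in range(n, N + 1)}
assert enumerated == formula  # agrees exactly for every x
```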
(ii) Let g(a) = E[(X - a)^2] = E[X^2 - 2aX + a^2] = E(X^2) - 2a E(X) + a^2. Then

    dg/da = -2 E(X) + 2a = 0    when a = E(X),
    d^2 g/da^2 = 2 > 0,

so E[(X - a)^2] is minimised when a = E(X).
[Note: E(|X - b|) is minimised when b = median of X.]

3. (i) (a) We have

    P(X > m) = \sum_{x=m+1}^{\infty} P(X = x) = pq^m + pq^{m+1} + pq^{m+2} + \cdots
             = pq^m [1 + q + q^2 + \cdots] = pq^m / (1 - q) = q^m,

since p + q = 1.

(b) No-memory property: since (X > m+n) \cap (X > m) = (X > m+n),

    P(X > m+n | X > m) = P((X > m+n) \cap (X > m)) / P(X > m)
                       = P(X > m+n) / P(X > m)
                       = q^{m+n} / q^m            [using the result in part (a)]
                       = q^n = P(X > n).

(ii) We can write out E(X) as follows:

    E(X) = \sum_{x=0}^{\infty} x P(X = x) = \sum_{x=1}^{\infty} x P(X = x)
         = P(X=1)
         + P(X=2) + P(X=2)
         + P(X=3) + P(X=3) + P(X=3)
         + \cdots

Since the series is convergent and the terms are positive, we can rearrange the order of the terms; summing vertically (each column sums to a tail probability), we obtain

    E(X) = \sum_{x=0}^{\infty} P(X > x).

For the geometric distribution in part (i): P(X > 0) = 1 and P(X > x) = q^x (x a positive integer), so

    E(X) = 1 + q + q^2 + q^3 + \cdots = 1/p.
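The tail-sum formula E(X) = \sum_{x \ge 0} P(X > x) is easy to check numerically for the geometric distribution; the following is a minimal sketch (the infinite series are truncated at a large cutoff, and the value of p is arbitrary).

```python
# geometric distribution P(X = x) = p * q**(x-1), x = 1, 2, ...
p = 0.3
q = 1 - p
CUTOFF = 2000  # truncation point for the infinite series

# direct expectation: E(X) = sum over x of x * P(X = x)
mean_direct = sum(x * p * q ** (x - 1) for x in range(1, CUTOFF))

# tail-sum formula: E(X) = sum over x >= 0 of P(X > x), with P(X > x) = q**x
tail_sum = sum(q ** x for x in range(CUTOFF))

assert abs(mean_direct - tail_sum) < 1e-9
assert abs(tail_sum - 1 / p) < 1e-9  # both agree with the known mean 1/p
```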
4. We are given that X and Y are independent random variables with

    P(X = k) = P(Y = k) = pq^k,    k = 0, 1, 2, \dots;    p + q = 1.

(a) We have

    P(X = Y) = \sum_{k=0}^{\infty} P(X = k, Y = k)
             = \sum_{k=0}^{\infty} P(X = k) P(Y = k)    [since X and Y are independent]
             = \sum_{k=0}^{\infty} (pq^k)^2 = p^2 \sum_{k=0}^{\infty} (q^2)^k
             = p^2 / (1 - q^2)                          [since 0 < q^2 < 1]
             = p / (1 + q)                              [since p = 1 - q].

(b) We have

    P(X + Y = n) = \sum_{x=0}^{n} P(X = x, Y = n - x)
                 = \sum_{x=0}^{n} P(X = x) P(Y = n - x)    [independence]
                 = \sum_{x=0}^{n} pq^x \cdot pq^{n-x}
                 = (n + 1) p^2 q^n.    (*)

Then

    P(X = x | X + Y = n) = P(X = x and X + Y = n) / P(X + Y = n)
                         = P(X = x, Y = n - x) / P(X + Y = n)
                         = pq^x \cdot pq^{n-x} / [(n+1) p^2 q^n]    [using independence and (*)]
                         = 1/(n+1),    x = 0, 1, \dots, n,

i.e. the discrete uniform distribution on \{0, 1, \dots, n\}.

(c) U = min(X, Y) takes the values 0, 1, 2, \dots. The event (U = u) can be decomposed into mutually exclusive events thus:

    (U = u) = (X = u, Y = u) \cup (X = u, Y = u+1) \cup (X = u, Y = u+2) \cup \cdots
                             \cup (X = u+1, Y = u) \cup (X = u+2, Y = u) \cup \cdots    (**)
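Parts (a) and (b) can be checked with exact rational arithmetic; this is a sketch only (the choice p = 1/4 and the cutoff for the infinite sum are arbitrary).

```python
from fractions import Fraction

p = Fraction(1, 4)  # any 0 < p < 1 works; Fraction keeps the arithmetic exact
q = 1 - p

def pmf(k):
    """P(X = k) = P(Y = k) = p * q**k, k = 0, 1, 2, ..."""
    return p * q ** k

# (a) P(X = Y), truncating the infinite sum; should be very close to p / (1 + q)
CUTOFF = 300
match = sum(pmf(k) ** 2 for k in range(CUTOFF))
assert abs(float(match - p / (1 + q))) < 1e-12

# (b) P(X + Y = n) = (n + 1) p^2 q^n, and given X + Y = n, X is uniform on {0,...,n}
n = 7
joint = [pmf(x) * pmf(n - x) for x in range(n + 1)]
total = sum(joint)
assert total == (n + 1) * p ** 2 * q ** n
assert all(j / total == Fraction(1, n + 1) for j in joint)
```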
Hence, invoking the most general ("countably additive") form of Axiom 3, we have:

    P(U = u) = P(X = u, Y = u) + \sum_{y=u+1}^{\infty} P(X = u, Y = y) + \sum_{x=u+1}^{\infty} P(Y = u, X = x)
             = (pq^u)^2 + pq^u \sum_{y=u+1}^{\infty} pq^y + pq^u \sum_{x=u+1}^{\infty} pq^x
             = p^2 q^{2u} + 2 p^2 q^{2u+1} \sum_{i=0}^{\infty} q^i
             = p^2 q^{2u} + 2 p^2 q^{2u+1} / (1 - q)
             = p^2 q^{2u} + 2 p q^{2u+1}
             = p q^{2u} (p + 2q)
             = p q^{2u} (1 + q),    u = 0, 1, 2, \dots    [since p + 2q = (p + q) + q = 1 + q].

5. (a) [Table: the equally likely sample points of the experiment are listed, together with the corresponding values of X, Y and Z.]
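The closed form P(U = u) = p q^{2u}(1 + q) for the minimum of two independent geometric variables can be checked against a direct double sum over the joint distribution; a minimal sketch (truncating the ranges of X and Y, with an arbitrary p):

```python
p = 0.3
q = 1 - p
CUTOFF = 200  # truncation point for the infinite ranges of X and Y

def pmf(k):
    """P(X = k) = P(Y = k) = p * q**k, k = 0, 1, 2, ..."""
    return p * q ** k

# compare P(min(X, Y) = u) computed from the joint distribution with the
# closed form p * q**(2u) * (1 + q) derived above
for u in range(6):
    direct = sum(pmf(x) * pmf(y)
                 for x in range(CUTOFF) for y in range(CUTOFF)
                 if min(x, y) == u)
    closed_form = p * q ** (2 * u) * (1 + q)
    assert abs(direct - closed_form) < 1e-10
```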
(b) Let p(x, y, z) = P(X = x, Y = y, Z = z). From the listing of the sample space in (a), we deduce the nonzero values of p(x, y, z); all other probabilities of the form p(x, y, z), 0 \le x, 0 \le y, 0 \le z, are zero. [Check: \sum_{x,y,z} p(x, y, z) = 1.]

The joint probability function of (X, Y) is given by

    P(X = x, Y = y) = \sum_{z} P(X = x, Y = y, Z = z) = \sum_{z} p(x, y, z).

If this function is tabulated in a two-way table, the row and column totals give the probability function values for the random variables Y and X respectively, since

    P(X = x) = \sum_{y} P(X = x, Y = y) = \sum_{y} \sum_{z} p(x, y, z)

and

    P(Y = y) = \sum_{x} P(X = x, Y = y) = \sum_{x} \sum_{z} p(x, y, z).

[Two-way table of P(X = x, Y = y), with marginal totals.]

Similarly, the joint probability function of (Y, Z) and the probability functions of Y and Z are obtained.

[Two-way table of P(Y = y, Z = z), with marginal totals.]
(c) From the tables in (b):

    E(Y) = \sum_y y P(Y = y),    E(Y^2) = \sum_y y^2 P(Y = y),    Var(Y) = E(Y^2) - \{E(Y)\}^2,

and similarly E(Z), E(Z^2) and Var(Z). Also

    E(YZ) = \sum_y \sum_z yz P(Y = y, Z = z),

whence

    Cov(Y, Z) = E(YZ) - E(Y) E(Z)    and    \rho(Y, Z) = Cov(Y, Z) / \sqrt{Var(Y) Var(Z)}.

(d) We have

    P(X = x | Y = 1) = P(X = x, Y = 1) / P(Y = 1),    0 \le x,

and

    P(Y = y, Z = z | X = 1) = P(X = 1, Y = y, Z = z) / P(X = 1),    0 \le y, 0 \le z;

the nonzero values follow from the tables in (b), all other probabilities being zero.

(e) The random variable E(Y | X) takes the values E(Y | X = x), x = 0, 1, \dots, where

    E(Y | X = x) = \sum_y y P(Y = y | X = x) = \sum_y y P(X = x, Y = y) / P(X = x).

The probabilities attached to these values are \{P(X = x)\}. Thus

    E[E(Y | X)] = \sum_x E(Y | X = x) P(X = x) = E(Y),

as required.
6. Let X = time to freedom (hours) and Y = number of the door originally chosen (1, 2 or 3). Then

    E(X) = E(X | Y = 1) P(Y = 1) + E(X | Y = 2) P(Y = 2) + E(X | Y = 3) P(Y = 3).

For a door which returns the prisoner to the cell, E(X | Y = y) = (travel time for door y) + E(X), since the process then starts afresh; for the door leading to freedom, E(X | Y = y) is just the travel time. Substituting the given travel times and probabilities yields a linear equation of the form E(X) = c + 0.8 E(X), which is solved for E(X).

7. (a) [Plot: the probability function P(X = x), x = 1, \dots, 2N, symmetric about N + 1/2.]

The symmetry means that P(X = x) = P(X = 2N - x + 1), x = 1, \dots, 2N. Consider

    E(X - N - 1/2) = \sum_{x=1}^{2N} (x - N - 1/2) P(X = x).

Now

    1st term = (1/2 - N) P(X = 1),
    2nd term = (3/2 - N) P(X = 2),  etc.;
    2nd last term = (N - 3/2) P(X = 2N - 1) = (N - 3/2) P(X = 2),
    last term = (N - 1/2) P(X = 2N) = (N - 1/2) P(X = 1).

Summing the 1st and last terms gives 0; the 2nd and 2nd last terms give 0; etc. Hence

    E(X - N - 1/2) = 0,    i.e.    E(X) = N + 1/2.

Since P(X \le N) = P(X \ge N + 1) = 1/2, the median of X is N + 1/2 (actually any value between N and N + 1 can be considered the median).
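The pairing argument for part 7(a) uses only the symmetry, not any particular distribution; a quick exact check with an arbitrary symmetric pmf on \{1, \dots, 2N\} (the weights below are made up, only the symmetry matters):

```python
from fractions import Fraction

N = 5
# an arbitrary pmf on {1, ..., 2N} with the symmetry P(X = x) = P(X = 2N - x + 1)
weights = [1, 4, 2, 7, 3, 3, 7, 2, 4, 1]
total = sum(weights)
pmf = {x: Fraction(weights[x - 1], total) for x in range(1, 2 * N + 1)}

assert all(pmf[x] == pmf[2 * N - x + 1] for x in pmf)  # the symmetry holds

mean = sum(x * px for x, px in pmf.items())
assert mean == Fraction(2 * N + 1, 2)  # E(X) = N + 1/2
assert sum(pmf[x] for x in range(1, N + 1)) == Fraction(1, 2)  # median straddles N, N+1
```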
(b) [Plot: the probability function P(Y = y), y = 0, \dots, 2M, symmetric about M.]

Here P(Y = y) = P(Y = 2M - y), y = 0, 1, \dots, 2M. By a similar procedure to that in part (a), we can show that E(Y - M) = 0 and hence E(Y) = M. Since P(Y < M) = P(Y > M), the median of Y is M.

8. We have:

    P(X_1 = X_2) = P( \bigcup_{k=0}^{n} [X_1 = k, X_2 = k] )
                 = \sum_{k=0}^{n} P(X_1 = k, X_2 = k)      [m.e. events]
                 = \sum_{k=0}^{n} P(X_1 = k) P(X_2 = k)    [independence].

But P(X_2 = k) = P(X_2 = n - k), since P(H) = P(T) = 1/2. Hence

    P(X_1 = X_2) = \sum_{k=0}^{n} P(X_1 = k) P(X_2 = n - k)
                 = \sum_{k=0}^{n} P(X_1 = k, X_2 = n - k)          [independence]
                 = P( \bigcup_{k=0}^{n} [X_1 = k, X_2 = n - k] )   [m.e. events]
                 = P(X_1 + X_2 = n).

Because the two people toss independently, the experiment can be regarded as a single Bernoulli process with 2n trials and p = 1/2, X = X_1 + X_2 being the total number of heads obtained. So the required probability is

    P(X = n) = \binom{2n}{n} (1/2)^n (1/2)^n = \binom{2n}{n} / 2^{2n}    [binomial distribution].
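The identity P(X_1 = X_2) = \binom{2n}{n}/2^{2n} can be confirmed by enumerating all 2^{2n} equally likely outcomes for small n; the function name below is illustrative.

```python
from fractions import Fraction
from itertools import product
from math import comb

def prob_equal_heads(n):
    """P(X1 = X2) when two people each toss n fair coins, by full enumeration
    of all 2^(2n) equally likely outcomes."""
    outcomes = product((0, 1), repeat=2 * n)  # first n tosses vs last n tosses
    hits = sum(1 for o in outcomes if sum(o[:n]) == sum(o[n:]))
    return Fraction(hits, 4 ** n)

for n in range(1, 7):
    assert prob_equal_heads(n) == Fraction(comb(2 * n, n), 4 ** n)
```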
9. (a) [Table: the given joint distribution of (X, Y, Z).]

(b) From the given distribution:

    E(X) = \sum_x x P(X = x),    E(X^2) = \sum_x x^2 P(X = x),    Var(X) = E(X^2) - \{E(X)\}^2,

and

    E(XY) = \sum_x \sum_y xy P(X = x, Y = y),    E(XZ) = \sum_x \sum_z xz P(X = x, Z = z),

whence

    Cov(X, Y) = E(XY) - E(X) E(Y),    \rho(X, Y) = Cov(X, Y) / \sqrt{Var(X) Var(Y)},

and similarly Cov(X, Z) and \rho(X, Z).

(c) We have

    P(X = x | Y = 1, Z = 1) = P(X = x, Y = 1, Z = 1) / P(Y = 1, Z = 1),    0 \le x;

the nonzero values follow from the joint distribution, all other probabilities being zero.

(d) The random variable E(Z | X) takes the values E(Z | X = x), where

    E(Z | X = x) = \sum_z z P(Z = z | X = x) = \sum_z z P(X = x, Z = z) / P(X = x).
The probabilities attached to these values are \{P(X = x)\}, so

    E[E(Z | X)] = \sum_x E(Z | X = x) P(X = x) = E(Z).

10. Let X_k = return when critical value k is used, and S = value on first roll. The probability distribution of S (the sum of the two dice) is:

    j       :   2     3     4     5     6     7     8     9     10    11    12
    P(S = j):  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

Now

    E(X_k) = \sum_j E(X_k | S = j) P(S = j),

where

    E(X_k | S = j) = 0         if j = 7,
                   = E(X_k)    if j < k and j \ne 7    (because we start again),
                   = j         if j \ge k and j \ne 7.

Hence

    E(X_6) = [10/36] E(X_6) + [ (6)(5/36) + (8)(5/36) + (9)(4/36) + (10)(3/36) + (11)(2/36) + (12)(1/36) ]
           = (10/36) E(X_6) + 170/36,

i.e. 26 E(X_6) = 170, so E(X_6) = 85/13 \approx 6.54.

Since a first roll of 7 scores nothing, E(X_7) = E(X_8).

    E(X_8) = [15/36] E(X_8) + [ (8)(5/36) + (9)(4/36) + (10)(3/36) + (11)(2/36) + (12)(1/36) ]
           = (15/36) E(X_8) + 140/36,

i.e. 21 E(X_8) = 140, so E(X_8) = 20/3 \approx 6.67.

    E(X_9) = [20/36] E(X_9) + [ (9)(4/36) + (10)(3/36) + (11)(2/36) + (12)(1/36) ]
           = (20/36) E(X_9) + 100/36,

i.e. 16 E(X_9) = 100, so E(X_9) = 25/4 = 6.25.

Hence E(X_k) is maximised at k = 8.
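The fixed-point equations above can be solved for every critical value k in a few lines; this is a sketch with an illustrative function name. It confirms the individual expectations and that k = 8 attains the maximum (shared with k = 7, since a first roll of 7 never scores).

```python
from fractions import Fraction

# pmf of S, the sum of two fair dice
pmf_S = {j: Fraction(6 - abs(j - 7), 36) for j in range(2, 13)}

def expected_return(k):
    """Solve E = P(re-roll) * E + sum over stopping values, where a 7 scores 0
    and ends the game, any j >= k (j != 7) scores j, and anything else re-rolls."""
    stop = sum(j * pj for j, pj in pmf_S.items() if j >= k and j != 7)
    restart = sum(pj for j, pj in pmf_S.items() if j < k and j != 7)
    return stop / (1 - restart)

values = {k: expected_return(k) for k in range(2, 13)}
assert values[6] == Fraction(85, 13)
assert values[8] == Fraction(20, 3)
assert values[9] == Fraction(25, 4)
# k = 7 ties with k = 8, because a first roll of 7 never scores
assert values[7] == values[8] == max(values.values())
```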