JOINT PROBABILITY DISTRIBUTIONS

MTH/STA 56

The study of random variables in the previous chapters was restricted to the idea that a random variable associates a single real number with each possible outcome of an experiment. The probability distribution of such a one-dimensional random variable is then defined from the probability measure of the experiment. We may equally well define rules that associate two numbers, three numbers, or, in general, n numbers with the possible outcomes of an experiment. These rules define two-dimensional, three-dimensional, or, in the general case, n-dimensional random variables (random vectors). Again, the probability distribution of an n-dimensional random vector derives directly from the probability measure of the experiment.

As an example, consider an experiment of tossing a fair coin and a balanced die together once. Let Y1 be a binary variable that takes on the value 1 if the coin turns up heads and the value 0 otherwise, and let Y2 be the number of dots appearing on the die. The sample space associated with the experiment is

S = {(y1, y2) | y1 = 0, 1 and y2 = 1, 2, 3, 4, 5, 6}.

Assume that all sample points are equally likely. Then the random variables Y1 and Y2 associate two numbers with each possible outcome of the experiment. Letting y1 and y2 denote the realized values of Y1 and Y2, respectively, we have

  Sample Point   (y1, y2)      Sample Point   (y1, y2)
  (Head, 1)      (1, 1)        (Head, 4)      (1, 4)
  (Tail, 1)      (0, 1)        (Tail, 4)      (0, 4)
  (Head, 2)      (1, 2)        (Head, 5)      (1, 5)
  (Tail, 2)      (0, 2)        (Tail, 5)      (0, 5)
  (Head, 3)      (1, 3)        (Head, 6)      (1, 6)
  (Tail, 3)      (0, 3)        (Tail, 6)      (0, 6)

Accordingly, the bivariate probability distribution for Y1 and Y2 is given by

p(y1, y2) = P{Y1 = y1, Y2 = y2} = 1/12   for y1 = 0, 1 and y2 = 1, 2, 3, 4, 5, 6.

It is also clear that, for instance,

P{Y1 = 1, 3 ≤ Y2 ≤ 5} = p(1, 3) + p(1, 4) + p(1, 5) = 3/12 = 1/4.

It should be noted here that the vector (Y1, Y2) thus defined can be thought of as a rule that associates an ordered pair (y1, y2) of realized values with each sample point of the sample space S, and it is referred to as a two-dimensional random vector. Since (Y1, Y2) is a two-dimensional random vector, we can visualize the possible observed values (y1, y2) as points in a two-dimensional space, and an event is a collection of points, or a region, in that space. In general, an n-dimensional random vector may be defined as follows.

Definition 1. Let S be the sample space of an experiment. An n-dimensional random vector (Y1, Y2, ..., Yn) is a rule (or, equivalently, an ordered collection of n rules) that associates an n-tuple of real numbers with each sample point of S. We equivalently say that Y1, Y2, ..., Yn are jointly distributed random variables.

If (Y1, Y2, ..., Yn) is an n-dimensional random vector, we can visualize the possible observed values (y1, y2, ..., yn) as points in an n-dimensional space. For an n-dimensional random vector, events are collections of points, or regions, in the n-dimensional space.
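
To make the coin-and-die illustration above concrete, here is a minimal computational sketch (not part of the original notes). The variable names are ours, and the event {Y1 = 1, 3 ≤ Y2 ≤ 5} follows the reconstruction given above.

```python
# Enumerate the coin-die sample space, build the joint pmf p(y1, y2),
# and check the two computations claimed above.
from fractions import Fraction
from itertools import product

outcomes = list(product(["Head", "Tail"], [1, 2, 3, 4, 5, 6]))  # 12 equally likely points
prob = Fraction(1, len(outcomes))

# Y1 = 1 if the coin shows a head, 0 otherwise; Y2 = number of dots on the die.
pmf = {}
for coin, die in outcomes:
    y1 = 1 if coin == "Head" else 0
    y2 = die
    pmf[(y1, y2)] = pmf.get((y1, y2), 0) + prob

assert sum(pmf.values()) == 1                            # probabilities sum to one
assert all(p == Fraction(1, 12) for p in pmf.values())   # p(y1, y2) = 1/12 everywhere

# P{Y1 = 1, 3 <= Y2 <= 5} = p(1,3) + p(1,4) + p(1,5) = 3/12
event = sum(p for (y1, y2), p in pmf.items() if y1 == 1 and 3 <= y2 <= 5)
print(event)  # 1/4
```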

I. Joint Probability Distributions for Discrete Random Variables

Based upon the discussion in the preceding paragraphs, it is natural to define the bivariate probability distribution for discrete random variables Y1 and Y2 as follows.

Definition 2. The function p(y1, y2) is a bivariate probability distribution for discrete random variables Y1 and Y2 if the following properties hold:

(1) p(y1, y2) ≥ 0 for all real numbers y1 and y2.
(2) Σ_{y1} Σ_{y2} p(y1, y2) = 1, where the sum ranges over all values (y1, y2) that are assigned nonzero probabilities.
(3) p(y1, y2) = P{Y1 = y1, Y2 = y2}.
(4) For any region A in the two-dimensional space, P{(Y1, Y2) ∈ A} = Σ_{(y1, y2) ∈ A} p(y1, y2).

Below are examples of bivariate probability distributions for two discrete random variables.

Example 1. Suppose that the random variables Y1 and Y2 have the joint probability distribution

p(y1, y2) = (1/2)^{y1 + y2}   for y1 = 1, 2, 3, ... and y2 = 1, 2, 3, ...,

and p(y1, y2) = 0 elsewhere. Clearly,

Σ_{y1=1}^{∞} Σ_{y2=1}^{∞} (1/2)^{y1 + y2} = [ Σ_{y1=1}^{∞} (1/2)^{y1} ] [ Σ_{y2=1}^{∞} (1/2)^{y2} ] = (1)(1) = 1,

since each of the two sums on the right-hand side is a geometric series with ratio 1/2.

Example 2. A local supermarket has three checkout counters. Two customers, say A and B, arrive at the counters at different times, when the counters are serving no other customers. Each customer chooses a counter at random, independently of the other. Let Y1 be the number of customers who choose counter 1, and Y2 the number of customers who choose counter 2. All possible arrivals of the two customers are listed in the following table:

  Counter 1   Counter 2   Counter 3   (y1, y2)
  A           B                       (1, 1)
  A                       B           (1, 0)
              A           B           (0, 1)
  B           A                       (1, 1)
  B                       A           (1, 0)
              B           A           (0, 1)
  A B                                 (2, 0)
              A B                     (0, 2)
                          A B         (0, 0)

Since the nine arrangements are equally likely, the bivariate probability distribution for Y1 and Y2 is given by

  p(y1, y2)   y2 = 0   y2 = 1   y2 = 2
  y1 = 0       1/9      2/9      1/9
  y1 = 1       2/9      2/9       0
  y1 = 2       1/9       0        0

Example 3. Consider an experiment of drawing two marbles at random from an urn which contains 3 blue marbles, 2 red marbles, and 3 green marbles. Let Y1 be the number of blue marbles drawn, and Y2 the number of red marbles drawn. Then (Y1, Y2) takes on values (y1, y2) with y1 = 0, 1, 2, y2 = 0, 1, 2, and y1 + y2 ≤ 2. The total number of equally likely ways of drawing any two marbles from the eight is C(8, 2) = 28, and the number of ways of drawing y1 blue marbles, y2 red marbles, and 2 − y1 − y2 green marbles is C(3, y1) C(2, y2) C(3, 2 − y1 − y2). Hence, the bivariate probability distribution for Y1 and Y2 is given by

p(y1, y2) = C(3, y1) C(2, y2) C(3, 2 − y1 − y2) / 28

for y1 = 0, 1, 2, y2 = 0, 1, 2, and y1 + y2 ≤ 2; that is, for (y1, y2) = (0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (2, 0). More specifically, p(y1, y2) is given by

  p(y1, y2)   y2 = 0   y2 = 1   y2 = 2
  y1 = 0       3/28     6/28     1/28
  y1 = 1       9/28     6/28      0
  y1 = 2       3/28      0        0

Hence, the probability that at least one green marble will be drawn is

P{Y1 + Y2 ≤ 1} = p(0, 0) + p(0, 1) + p(1, 0) = 3/28 + 6/28 + 9/28 = 9/14.
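
The following sketch checks Example 3 numerically. The urn composition (3 blue, 2 red, 3 green) is reconstructed from context rather than read directly from the damaged original, so treat those counts as an assumption.

```python
# Joint pmf for the marble example: y1 blue and y2 red among the two marbles drawn.
from fractions import Fraction
from math import comb

blue, red, green, draws = 3, 2, 3, 2        # assumed composition, 8 marbles in total
total = comb(blue + red + green, draws)     # C(8, 2) = 28

def p(y1, y2):
    """p(y1, y2) = C(3, y1) C(2, y2) C(3, 2 - y1 - y2) / 28."""
    y3 = draws - y1 - y2                    # number of green marbles drawn
    if y1 < 0 or y2 < 0 or y3 < 0:
        return Fraction(0)
    return Fraction(comb(blue, y1) * comb(red, y2) * comb(green, y3), total)

support = [(y1, y2) for y1 in range(3) for y2 in range(3) if y1 + y2 <= 2]
assert sum(p(y1, y2) for y1, y2 in support) == 1

# P{at least one green} = P{Y1 + Y2 <= 1} = p(0,0) + p(0,1) + p(1,0)
print(sum(p(y1, y2) for y1, y2 in support if y1 + y2 <= 1))  # 9/14
```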

When (Y1, Y2, ..., Yn) is a discrete n-dimensional random vector, that is, when Y1, Y2, ..., Yn are n discrete (one-dimensional) random variables, the corresponding joint probability function, or joint probability distribution, of Y1, Y2, ..., Yn may be specified by

p(y1, y2, ..., yn) = P{Y1 = y1, Y2 = y2, ..., Yn = yn},

which gives the probability of occurrence of individual points in the n-dimensional space. As in the one-dimensional case, the probability of any event is computed by summing the probabilities of the individual points that belong to the event of interest. Evidently, the joint probability function of Y1, Y2, ..., Yn must satisfy the properties in the following definition.

Definition 3. The function p(y1, y2, ..., yn) is a joint probability distribution for discrete random variables Y1, Y2, ..., Yn if the following conditions hold:

(1) p(y1, y2, ..., yn) ≥ 0 for all real numbers y1, y2, ..., yn.
(2) Σ_{y1} Σ_{y2} ... Σ_{yn} p(y1, y2, ..., yn) = 1, where the sum ranges over all values (y1, y2, ..., yn) that are assigned nonzero probabilities.
(3) p(y1, y2, ..., yn) = P{Y1 = y1, Y2 = y2, ..., Yn = yn}.
(4) For any region A in the n-dimensional space, P{(Y1, Y2, ..., Yn) ∈ A} = Σ_{(y1, y2, ..., yn) ∈ A} p(y1, y2, ..., yn).

II. Joint Distribution Functions

Remember that a one-dimensional continuous random variable and its univariate probability density function are defined through its univariate distribution function. In a similar fashion, an n-dimensional random vector and its joint probability density function can be derived from its joint distribution function. With this in mind, it is necessary to define the joint distribution function of an n-dimensional random vector. To do so, let us first define the bivariate distribution function (which works for both the discrete and the continuous case) and, subsequently, the bivariate probability density function for the continuous case only. This bivariate distribution function can then be readily extended to the joint distribution function of an n-dimensional random vector.

In analogy with the distribution function F(y) = P{Y ≤ y} of a one-dimensional random variable Y, the bivariate distribution function of any two random variables Y1 and Y2 (discrete or continuous) is defined as follows.

Definition 4. For any two random variables Y1 and Y2, the bivariate distribution function F(y1, y2) is given by

F(y1, y2) = P{Y1 ≤ y1, Y2 ≤ y2}   for −∞ < y1 < ∞ and −∞ < y2 < ∞.

Likewise, the joint distribution function of random variables Y1, Y2, ..., Yn (discrete or continuous) is defined in a similar form as follows.

Definition 5. The joint distribution function F(y1, y2, ..., yn) of random variables Y1, Y2, ..., Yn is given by

F(y1, y2, ..., yn) = P{Y1 ≤ y1, Y2 ≤ y2, ..., Yn ≤ yn}   for −∞ < yi < ∞ (i = 1, 2, ..., n).

It is relatively easier to evaluate the bivariate distribution function in the discrete case than in the continuous case. Typical evaluation of such distribution functions in the discrete case relies upon counting the possible discrete pairs (y1, y2) of realized values of the random vector (Y1, Y2), as demonstrated in the example below.

Example 4. Refer to the coin-die-tossing example at the beginning of this chapter; we have

F(1, 3) = P{Y1 ≤ 1, Y2 ≤ 3} = p(0, 1) + p(0, 2) + p(0, 3) + p(1, 1) + p(1, 2) + p(1, 3) = 6/12 = 1/2.

Refer to Example 2; we obtain

F(−1, 2) = P{Y1 ≤ −1, Y2 ≤ 2} = P(∅) = 0

and

F(1.5, 2) = P{Y1 ≤ 1.5, Y2 ≤ 2} = p(0, 0) + p(0, 1) + p(0, 2) + p(1, 0) + p(1, 1) + p(1, 2) = 8/9.

Refer to Example 3; we get

F(2.5, 1.6) = P{Y1 ≤ 2.5, Y2 ≤ 1.6} = p(0, 0) + p(0, 1) + p(1, 0) + p(1, 1) + p(2, 0) + p(2, 1) = 27/28.

As with the distribution function of a one-dimensional random variable, it is possible to show that the joint distribution function F(y1, y2, ..., yn) is nondecreasing and continuous at least from the right with respect to every variable. Since the events

{Y1 ≤ −∞} ∩ {Y2 ≤ y2} ∩ ... ∩ {Yn ≤ yn},
{Y1 ≤ y1} ∩ {Y2 ≤ −∞} ∩ ... ∩ {Yn ≤ yn},
...,
{Y1 ≤ y1} ∩ {Y2 ≤ y2} ∩ ... ∩ {Yn ≤ −∞}

are impossible events (the arguments −∞ are understood as limits), it is clear that

F(−∞, y2, ..., yn) = F(y1, −∞, ..., yn) = ... = F(y1, y2, ..., −∞) = 0.

Moreover, since {−∞ < Y1 < ∞} ∩ {−∞ < Y2 < ∞} ∩ ... ∩ {−∞ < Yn < ∞} = S, where S is the sample space, the joint distribution function satisfies

F(∞, ∞, ..., ∞) = P{−∞ < Y1 < ∞, −∞ < Y2 < ∞, ..., −∞ < Yn < ∞} = P(S) = 1.

For a two-dimensional random vector (Y1, Y2), it should also be noticed that, for a1 ≤ a2 and b1 ≤ b2,

P{a1 < Y1 ≤ a2, b1 < Y2 ≤ b2} = P{Y1 ≤ a2, Y2 ≤ b2} − P{Y1 ≤ a1, Y2 ≤ b2} − P{Y1 ≤ a2, Y2 ≤ b1} + P{Y1 ≤ a1, Y2 ≤ b1}
                               = F(a2, b2) − F(a1, b2) − F(a2, b1) + F(a1, b1),

as can be seen by sketching the rectangle (a1, a2] × (b1, b2] in the y1y2-plane.

A good demonstration of the above result is laid out in the following example.

Example 5. Refer to Example 4 and the coin-die-tossing example. It is clear that the event {0 < Y1 ≤ 1, 2 < Y2 ≤ 3} contains only one possible pair, namely (1, 3), and so

P{0 < Y1 ≤ 1, 2 < Y2 ≤ 3} = p(1, 3) = 1/12.

On the other hand, it follows from the above formula that

P{0 < Y1 ≤ 1, 2 < Y2 ≤ 3} = F(1, 3) − F(0, 3) − F(1, 2) + F(0, 2) = 6/12 − 3/12 − 4/12 + 2/12 = 1/12,

where

F(1, 3) = p(0, 1) + p(0, 2) + p(0, 3) + p(1, 1) + p(1, 2) + p(1, 3) = 6/12,
F(0, 3) = p(0, 1) + p(0, 2) + p(0, 3) = 3/12,
F(1, 2) = p(0, 1) + p(0, 2) + p(1, 1) + p(1, 2) = 4/12,
F(0, 2) = p(0, 1) + p(0, 2) = 2/12.

In contrast with the one-dimensional case, in order that a function F(y1, y2) be the distribution function of a two-dimensional random vector, it is not sufficient that the function be continuous from the right, nondecreasing with respect to each of the variables, and satisfy the conditions

F(−∞, y2) = F(y1, −∞) = 0 and F(∞, ∞) = 1.

To see this, consider the function

F(y1, y2) = 0 for y1 + y2 < 0,   F(y1, y2) = 1 for y1 + y2 ≥ 0;

that is, F(y1, y2) takes on the value 1 at the points on and above the line y2 = −y1, and the value 0 at the points below the line. This function is nondecreasing and continuous from the right with respect to y1 and y2, and it satisfies F(−∞, y2) = F(y1, −∞) = 0 and F(∞, ∞) = 1. However, it does not satisfy the equality

P{a1 < Y1 ≤ a2, b1 < Y2 ≤ b2} = F(a2, b2) − F(a1, b2) − F(a2, b1) + F(a1, b1).

For instance,

P{−1 < Y1 ≤ 1, −1 < Y2 ≤ 1} = F(1, 1) − F(−1, 1) − F(1, −1) + F(−1, −1) = 1 − 1 − 1 + 0 = −1 < 0,

so no random vector can have this F as its distribution function.

Remark. A real-valued function F(y1, y2) is the distribution function of a two-dimensional random vector if and only if the following conditions hold:

(1) F(y1, y2) is nondecreasing and continuous at least from the right with respect to both arguments y1 and y2.
(2) F(−∞, −∞) = F(y1, −∞) = F(−∞, y2) = 0 and F(∞, ∞) = 1.
(3) If a1 ≤ a2 and b1 ≤ b2, then P{a1 < Y1 ≤ a2, b1 < Y2 ≤ b2} = F(a2, b2) − F(a1, b2) − F(a2, b1) + F(a1, b1) ≥ 0.

We shall mainly consider multi-dimensional random vectors of the discrete or continuous type.

Definition 6. A two-dimensional random vector (Y1, Y2) is said to be discrete if, with probability 1, it takes on pairs of values belonging to a set A of pairs that is at most countable, and every pair (a, b) in A is taken on with positive probability P{Y1 = a, Y2 = b}. We call these pairs of values jump points, and their probabilities jumps. The joint distribution function F(y1, y2) of two discrete random variables Y1 and Y2 is given by

F(y1, y2) = Σ_{t1 ≤ y1} Σ_{t2 ≤ y2} p(t1, t2),

where the joint probability function p(y1, y2) is defined by p(y1, y2) = P{Y1 = y1, Y2 = y2}. In the discrete case, the joint distribution function of random variables Y1, Y2, ..., Yn is likewise given by

F(y1, y2, ..., yn) = Σ_{t1 ≤ y1} Σ_{t2 ≤ y2} ... Σ_{tn ≤ yn} p(t1, t2, ..., tn),

where the joint probability function p(y1, y2, ..., yn) is defined by p(y1, y2, ..., yn) = P{Y1 = y1, Y2 = y2, ..., Yn = yn}.

III. Joint Probability Density Functions for Continuous Random Variables

As in the univariate continuous case, two random variables Y1 and Y2 are said to be jointly continuous if their joint distribution function F(y1, y2) is continuous in both arguments. We now formally define the notion of a two-dimensional random vector of the continuous type.

Definition 7. A two-dimensional random vector (Y1, Y2) is said to be continuous if there exists a nonnegative function f(y1, y2) such that

F(y1, y2) = ∫_{−∞}^{y2} ∫_{−∞}^{y1} f(t1, t2) dt1 dt2

for all pairs (y1, y2) of real numbers, where F(y1, y2) is the joint distribution function of Y1 and Y2. The function f(y1, y2) is called the joint probability density function, or bivariate probability density function, of the continuous random variables Y1 and Y2.

Consider a continuous two-dimensional random vector (Y1, Y2). For small increments Δy1 and Δy2, the bivariate probability density function f(y1, y2) is proportional to the probability that the random vector falls in a small rectangle containing the argument (y1, y2); that is,

P{y1 ≤ Y1 ≤ y1 + Δy1, y2 ≤ Y2 ≤ y2 + Δy2} ≈ f(y1, y2) Δy1 Δy2.

Let A be any event (a region in the two-dimensional space). To evaluate the probability of A, we simply integrate the density function over the region defined by A. At the continuity points of f, we may write

f(y1, y2) = lim_{Δy1 → 0, Δy2 → 0} P{y1 ≤ Y1 ≤ y1 + Δy1, y2 ≤ Y2 ≤ y2 + Δy2} / (Δy1 Δy2).

If the joint density function f(y1, y2) is continuous at the point (y1, y2), then

f(y1, y2) = ∂²F(y1, y2) / (∂y1 ∂y2).

The bivariate probability density function of continuous random variables Y1 and Y2 satisfies the following conditions.

Theorem 1. If the function f(y1, y2) is a bivariate probability density function for continuous random variables Y1 and Y2, then the following properties hold:

(1) f(y1, y2) ≥ 0 for all real numbers y1 and y2.
(2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(y1, y2) dy1 dy2 = 1.
(3) P{(Y1, Y2) ∈ A} = ∫∫_A f(y1, y2) dy1 dy2 for any region A in the y1y2-plane.

When (Y1, Y2, ..., Yn) is a continuous n-dimensional random vector, that is, when Y1, Y2, ..., Yn are n continuous (one-dimensional) random variables, the corresponding joint probability density function of Y1, Y2, ..., Yn may be denoted by f(y1, y2, ..., yn); for small increments it satisfies

P{y1 ≤ Y1 ≤ y1 + Δy1, y2 ≤ Y2 ≤ y2 + Δy2, ..., yn ≤ Yn ≤ yn + Δyn} ≈ f(y1, y2, ..., yn) Δy1 Δy2 ... Δyn.

We evaluate the probability of any event A (a region in the n-dimensional space) by integrating the density function over the region defined by A. At the continuity points of f, we write

f(y1, y2, ..., yn) = lim_{Δy1 → 0, ..., Δyn → 0} P{y1 ≤ Y1 ≤ y1 + Δy1, ..., yn ≤ Yn ≤ yn + Δyn} / (Δy1 Δy2 ... Δyn).

If the joint density function f(y1, y2, ..., yn) of the n-dimensional continuous random vector is continuous at the point (y1, y2, ..., yn), then

f(y1, y2, ..., yn) = ∂ⁿF(y1, y2, ..., yn) / (∂y1 ∂y2 ... ∂yn).

In summary, the joint density function of Y1, Y2, ..., Yn must satisfy the properties in the following definition.

Definition 8. The function f(y1, y2, ..., yn) is a joint probability density function for continuous random variables Y1, Y2, ..., Yn if and only if

(1) f(y1, y2, ..., yn) ≥ 0 for all real numbers y1, y2, ..., yn;
(2) ∫ ... ∫ f(y1, y2, ..., yn) dy1 dy2 ... dyn = 1, the integral extending over the whole n-dimensional space;
(3) for any region A in the n-dimensional space, P{(Y1, Y2, ..., Yn) ∈ A} = ∫ ... ∫_A f(y1, y2, ..., yn) dy1 dy2 ... dyn.
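
Before turning to the worked examples, here is a minimal numerical sketch of properties (2) and (3) of Definition 8 in the bivariate case. The density f(y1, y2) = 6 e^{−2y1 − 3y2} on the positive quadrant is a hypothetical choice made only for this illustration; it does not appear in these notes.

```python
# Numerical check of a (hypothetical) bivariate density with scipy.
import numpy as np
from scipy.integrate import dblquad

# dblquad integrates the first argument (inner variable) first, so f(y2, y1).
f = lambda y2, y1: 6.0 * np.exp(-2.0 * y1 - 3.0 * y2)

# Property (2): the density integrates to 1 over its support.
total, _ = dblquad(f, 0, np.inf, 0, np.inf)
print(round(total, 6))            # 1.0

# Property (3): P{(Y1, Y2) in A} for the rectangle A = (0, 1] x (0, 2].
prob, _ = dblquad(f, 0, 1, 0, 2)
print(round(prob, 6))             # (1 - e^{-2})(1 - e^{-6}) ~ 0.8625
```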

Presented below are examples of bivariate probability density functions of continuous random variables.

Example 6. Let Y1 and Y2 denote the proportions of time, out of one workday, that employees I and II, respectively, actually spend on performing their assigned tasks. Suppose the joint probability density function is given by

f(y1, y2) = y1 + y2   for 0 ≤ y1 ≤ 1 and 0 ≤ y2 ≤ 1,

and f(y1, y2) = 0 elsewhere. Then

∫∫ f(y1, y2) dy1 dy2 = ∫_0^1 ∫_0^1 (y1 + y2) dy1 dy2 = ∫_0^1 (1/2 + y2) dy2 = 1/2 + 1/2 = 1,

and

P{Y1 < 1/2, Y2 > 1/2} = ∫_{1/2}^1 ∫_0^{1/2} (y1 + y2) dy1 dy2 = ∫_{1/2}^1 [y1²/2 + y2 y1]_{y1=0}^{y1=1/2} dy2 = ∫_{1/2}^1 (1/8 + y2/2) dy2 = 1/16 + 3/16 = 1/4.

Example 7. Let Y1 and Y2 have the joint probability density function given by

f(y1, y2) = c y1 y2   for 0 < y1 < 1 and 0 < y2 < 1,

and f(y1, y2) = 0 elsewhere. The value of c can be obtained from

1 = ∫_0^1 ∫_0^1 c y1 y2 dy1 dy2 = c [ ∫_0^1 y1 dy1 ][ ∫_0^1 y2 dy2 ] = c (1/2)(1/2) = c/4.

Hence, c = 4. Also, since the density is zero whenever Y2 ≥ 1, it follows that

P{0 < Y1 < 1/2, 1/4 < Y2 < 5} = P{0 < Y1 < 1/2, 1/4 < Y2 < 1} = ∫_{1/4}^1 ∫_0^{1/2} 4 y1 y2 dy1 dy2 = [y1²]_0^{1/2} [y2²]_{1/4}^1 = (1/4)(15/16) = 15/64.

It should be noted that this probability is the volume under the surface f(y1, y2) = 4 y1 y2 and above the rectangular set {(y1, y2) | 0 < y1 < 1/2, 1/4 < y2 < 5} in the y1y2-plane.

Example 8. Consider random variables Y1 and Y2 having the joint probability density function given by

f(y1, y2) = y1 y2 e^{−y1−y2}   for 0 < y1 < ∞ and 0 < y2 < ∞,

and f(y1, y2) = 0 elsewhere. This is a legitimate joint density function because

∫_0^∞ ∫_0^∞ y1 y2 e^{−y1−y2} dy1 dy2 = [ ∫_0^∞ y1 e^{−y1} dy1 ][ ∫_0^∞ y2 e^{−y2} dy2 ]
  = [ lim_{a→∞} (1 − e^{−a} − a e^{−a}) ][ lim_{b→∞} (1 − e^{−b} − b e^{−b}) ] = (1)(1) = 1.

Also,

P{1 < Y1 < 2, 1 < Y2 < 2} = ∫_1^2 ∫_1^2 y1 y2 e^{−y1−y2} dy1 dy2 = [ ∫_1^2 y1 e^{−y1} dy1 ][ ∫_1^2 y2 e^{−y2} dy2 ]
  = [−(1 + y1) e^{−y1}]_1^2 [−(1 + y2) e^{−y2}]_1^2 = (2e^{−1} − 3e^{−2})² = 4e^{−2} − 12e^{−3} + 9e^{−4}.

Note that this probability is the volume under the surface f(y1, y2) = y1 y2 e^{−y1−y2} and above the rectangular set {(y1, y2) | 1 < y1 < 2, 1 < y2 < 2} in the y1y2-plane.
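
A quick numerical check of Example 8 follows; recall that the rectangle (1, 2] × (1, 2] is our reconstruction of the damaged original, so the specific bounds should be read as an assumption.

```python
# Check Example 8: total mass is 1, and the rectangle probability matches the closed form.
import numpy as np
from scipy.integrate import dblquad

f = lambda y2, y1: y1 * y2 * np.exp(-y1 - y2)

total, _ = dblquad(f, 0, np.inf, 0, np.inf)
prob, _ = dblquad(f, 1, 2, 1, 2)

closed_form = (2 * np.exp(-1) - 3 * np.exp(-2)) ** 2   # = 4e^{-2} - 12e^{-3} + 9e^{-4}
print(round(total, 6))                          # 1.0
print(round(prob, 6), round(closed_form, 6))    # both ~ 0.108737
```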

Example 9. Let Y1 and Y2 have the joint probability density function given by

f(y1, y2) = c y1   for 0 ≤ y2 ≤ y1 ≤ 1,

and f(y1, y2) = 0 elsewhere. (a) Find the value of c. (b) Evaluate P{Y1 ≤ 3/4, Y2 > 1/2}.

Solution. (a) Since f(y1, y2) is a joint probability density function for Y1 and Y2, it must satisfy

1 = ∫_0^1 ∫_0^{y1} c y1 dy2 dy1 = ∫_0^1 c y1 [y2]_{y2=0}^{y2=y1} dy1 = ∫_0^1 c y1² dy1 = c [y1³/3]_0^1 = c/3.

Hence, c = 3.

(b) We can then evaluate

P{Y1 ≤ 3/4, Y2 > 1/2} = ∫_{1/2}^{3/4} ∫_{1/2}^{y1} 3 y1 dy2 dy1 = ∫_{1/2}^{3/4} 3 y1 [y2]_{y2=1/2}^{y2=y1} dy1 = ∫_{1/2}^{3/4} (3 y1² − (3/2) y1) dy1
  = [y1³ − (3/4) y1²]_{1/2}^{3/4} = (27/64 − 27/64) − (1/8 − 3/16) = 1/16.

This probability is the volume under the surface f(y1, y2) = 3 y1 and above the triangular set {(y1, y2) | 1/2 < y2 ≤ y1 ≤ 3/4} in the y1y2-plane.

Example 10. Let Y1 and Y2 have the joint probability density function given by

f(y1, y2) = y1 y2 / 4   for 0 ≤ y1 ≤ 2 and 0 ≤ y2 ≤ 2,

and f(y1, y2) = 0 elsewhere. Then

P{1/2 < Y1 ≤ 1, 1/2 < Y2 ≤ 1} = ∫_{1/2}^1 ∫_{1/2}^1 (y1 y2 / 4) dy1 dy2 = (1/4)[y1²/2]_{1/2}^1 [y2²/2]_{1/2}^1 = (1/4)(3/8)(3/8) = 9/256.

On the other hand,

F(1/2, 1/2) = P{Y1 ≤ 1/2, Y2 ≤ 1/2} = ∫_0^{1/2} ∫_0^{1/2} (y1 y2 / 4) dy1 dy2 = (1/4)(1/8)(1/8) = 1/256,
F(1, 1/2) = P{Y1 ≤ 1, Y2 ≤ 1/2} = ∫_0^{1/2} ∫_0^1 (y1 y2 / 4) dy1 dy2 = (1/4)(1/2)(1/8) = 1/64,
F(1/2, 1) = P{Y1 ≤ 1/2, Y2 ≤ 1} = ∫_0^1 ∫_0^{1/2} (y1 y2 / 4) dy1 dy2 = (1/4)(1/8)(1/2) = 1/64,
F(1, 1) = P{Y1 ≤ 1, Y2 ≤ 1} = ∫_0^1 ∫_0^1 (y1 y2 / 4) dy1 dy2 = (1/4)(1/2)(1/2) = 1/16.

Hence,

P{1/2 < Y1 ≤ 1, 1/2 < Y2 ≤ 1} = F(1, 1) − F(1/2, 1) − F(1, 1/2) + F(1/2, 1/2) = 16/256 − 4/256 − 4/256 + 1/256 = 9/256,

which agrees with the direct computation above.
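
The following sketch repeats the two computations of Example 10 numerically. Both the density constant 1/4 and the rectangle (1/2, 1] × (1/2, 1] are reconstructions of the damaged original, so treat them as assumptions.

```python
# Example 10: direct integration versus the rectangle identity applied to the CDF.
from scipy.integrate import dblquad

f = lambda y2, y1: y1 * y2 / 4.0
F = lambda y1, y2: (y1 ** 2) * (y2 ** 2) / 16.0           # valid for 0 <= y1, y2 <= 2

direct, _ = dblquad(f, 0.5, 1.0, 0.5, 1.0)                 # P{1/2 < Y1 <= 1, 1/2 < Y2 <= 1}
via_cdf = F(1, 1) - F(0.5, 1) - F(1, 0.5) + F(0.5, 0.5)

print(direct, via_cdf, 9 / 256)   # all three agree: 0.03515625
```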

Example 11. Let f(y1, y2) be defined as in Example 8. Then, for 0 < y1 < ∞ and 0 < y2 < ∞,

F(y1, y2) = ∫_0^{y2} ∫_0^{y1} t1 t2 e^{−t1−t2} dt1 dt2 = [ ∫_0^{y1} t1 e^{−t1} dt1 ][ ∫_0^{y2} t2 e^{−t2} dt2 ]
  = [−(1 + t1) e^{−t1}]_{t1=0}^{t1=y1} [−(1 + t2) e^{−t2}]_{t2=0}^{t2=y2} = (1 − e^{−y1} − y1 e^{−y1})(1 − e^{−y2} − y2 e^{−y2}).

Hence,

F(y1, y2) = (1 − e^{−y1} − y1 e^{−y1})(1 − e^{−y2} − y2 e^{−y2})   for 0 < y1 < ∞ and 0 < y2 < ∞,

and F(y1, y2) = 0 elsewhere.

Example 12. For the joint density function defined in Example 7 and for 0 < y1 < 1 and 0 < y2 < 1, the joint distribution function is given by

F(y1, y2) = ∫_0^{y2} ∫_0^{y1} 4 t1 t2 dt1 dt2 = 4 [t1²/2]_0^{y1} [t2²/2]_0^{y2} = y1² y2².

Hence,

F(y1, y2) = 0          for y1 ≤ 0 or y2 ≤ 0,
          = y1² y2²    for 0 < y1 < 1 and 0 < y2 < 1,
          = y2²        for y1 ≥ 1 and 0 < y2 < 1,
          = y1²        for 0 < y1 < 1 and y2 ≥ 1,
          = 1          for y1 ≥ 1 and y2 ≥ 1.

Example 13. Given a joint density function

f(y1, y2) = (n − 1)(n − 2) / (1 + y1 + y2)^n   for y1 ≥ 0, y2 ≥ 0, n > 2,

and f(y1, y2) = 0 elsewhere, the joint distribution function is given by

F(y1, y2) = 1 − (1 + y1)^{−(n−2)} − (1 + y2)^{−(n−2)} + (1 + y1 + y2)^{−(n−2)}   for y1 ≥ 0, y2 ≥ 0, n > 2,

and F(y1, y2) = 0 elsewhere, since, for y1 ≥ 0 and y2 ≥ 0, we have

F(y1, y2) = ∫_0^{y2} ∫_0^{y1} (n − 1)(n − 2)(1 + t1 + t2)^{−n} dt1 dt2
          = ∫_0^{y2} (n − 2)[(1 + t2)^{−(n−1)} − (1 + y1 + t2)^{−(n−1)}] dt2
          = [−(1 + t2)^{−(n−2)} + (1 + y1 + t2)^{−(n−2)}]_{t2=0}^{t2=y2}
          = 1 − (1 + y1)^{−(n−2)} − (1 + y2)^{−(n−2)} + (1 + y1 + y2)^{−(n−2)}.
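
Example 13 can be checked symbolically for any particular admissible n; the sketch below does so for n = 3 using sympy. It is a verification aid, not part of the original notes.

```python
# Symbolic verification of Example 13 for n = 3 (> 2).
import sympy as sp

y1, y2, t1, t2 = sp.symbols("y1 y2 t1 t2", positive=True)
n = 3

f = (n - 1) * (n - 2) / (1 + t1 + t2) ** n
F = sp.integrate(sp.integrate(f, (t1, 0, y1)), (t2, 0, y2))

target = 1 - (1 + y1) ** (2 - n) - (1 + y2) ** (2 - n) + (1 + y1 + y2) ** (2 - n)
print(sp.simplify(F - target))   # 0
```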
