3.0 PROBABILITY, RANDOM VARIABLES AND RANDOM PROCESSES

3.1 Introduction

In this chapter we will review the concepts of probability, random variables and random processes. We begin by reviewing some of the definitions of probability. We then define random variables and density functions, and review some of the operations on random variables. We conclude by defining random processes and discussing some properties of random processes that we will need in our Kalman filter formulations.

3.2 Probability

3.2.1 Background

Suppose we roll a fair die. We say that the probability of a one appearing (the face of the die with one dot) is one-sixth. We arrived at this conclusion by considering that the die has six sides and that there is an equal chance that any one side will be facing up when the die lands. This approach to determining probability is termed the relative frequency approach and is suitable for defining probabilities associated with simple experiments. With this approach, we determine probabilities from the underlying experiment or process and deductive reasoning. For example, in the die example, we know that if we roll a die one of six faces will show. We also assume that it is equally probable that any of the six faces will show. Given this information we thus deduce that the probability of any one face showing is one-sixth.

For more complex experiments, where the relative frequency approach becomes more difficult to apply, we can use the axiomatic approach to determine probabilities. The axiomatic approach is based on three probability axioms and uses them, and set theory, to determine probabilities associated with certain experiment outcomes.

There is a third approach to determining probabilities that is termed the experimental approach. In this approach probabilities are determined by conducting many experiment trials and analyzing the results. An example of the experimental approach is the method by which automotive fatality statistics are derived.

To proceed further we must define some terms. We will use the die example to do so.
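The experimental approach is easy to illustrate numerically. The sketch below, in Python, estimates the probability of rolling a one as a relative frequency over many simulated rolls of a fair die; the function name, the seed, and the choice of 100,000 rolls are illustrative assumptions, not part of the original notes.

```python
import random

def estimate_face_probability(face, num_rolls, seed=12345):
    """Experimental approach: estimate P(face) as the relative
    frequency of `face` over many simulated rolls of a fair die."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    hits = sum(1 for _ in range(num_rolls) if rng.randint(1, 6) == face)
    return hits / num_rolls

p_one = estimate_face_probability(1, 100_000)
```

With 100,000 rolls the estimate typically lands within a few thousandths of the deduced value of one-sixth, which is why the experimental approach is normally reserved for situations where deductive reasoning is impractical.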
In the die example, the rolling of the die is termed an experiment and the appearance of a particular face is an outcome of the experiment. Thus, the die rolling experiment consists of six possible outcomes. If the experiment were the flipping of a coin there would be two outcomes: heads and tails. If we were to consider an experiment where we rolled two dice we could define either 36 outcomes or 11 outcomes, depending upon how we decided to phrase the problem and what probabilities we were trying to find. In the first case the outcomes would be: the face with one dot on die one and the face with one dot on die two; the face with one dot on die one and the face with two dots on die two; and so forth. In the second case the outcomes would be: the sum of the dots on the two dice equals two; the sum of the dots on the two dice equals three; and so forth.

M. Budge, Jr., Jan 2018

3.2.2 Set Theory and the Axioms of Probability

When we discuss probabilities we talk about probabilities associated with events. An event is a collection of outcomes. Thus, for the die problem, example events could be: {the face with one dot} or {the face with one dot, the face with two dots}. In the first case the event consists of one outcome and in the second case the event consists of two outcomes. This idea of using events to represent collections of outcomes leads to the use of set theory, since set theory gives us a means of collecting objects. The analogy here is that events are sets and outcomes are set elements.

There are two events that play special roles in probability theory. These are the certain event and the null event. The certain event contains all possible experiment outcomes and is analogous to the universe in set theory. It is denoted by $S$. The null event contains no outcomes and is analogous to the empty set. It is denoted by $\varnothing$.

In summary, we will use set theory to represent events, and we associate probabilities with events. Thus, for the die experiment we can define an event

$A = \{\text{face with one dot or face with three dots or face with five dots}\} = \{f_1, f_3, f_5\}$.   (3-1)

The probability of the event occurring is

$P(A) = 1/2$.   (3-2)

We note that we could use set theory to write

$A = \{f_1\} \cup \{f_3\} \cup \{f_5\}$,   (3-3)

which would allow us to write

$P(A) = P(\{f_1\}) + P(\{f_3\}) + P(\{f_5\})$.   (3-4)

This is where the axiomatic approach to probability becomes useful. It allows us to use set theory derivations to compute complex probabilities from simple probabilities.

The axioms of probability are as follows:
1. $P(S) = 1$,
2. $P(A) \ge 0$,
3. If $A$ and $B$ are mutually exclusive then $P(A \cup B) = P(A) + P(B)$.

Recall that two sets (events) are mutually exclusive if $A \cap B = \varnothing$. Stated another way, two events are mutually exclusive if the appearance of one
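Because events are sets of outcomes, the event calculus above maps directly onto set operations. The following sketch (illustrative; the outcome labels follow the $f_1, \ldots, f_6$ notation of Equation 3-1) computes event probabilities by summing outcome probabilities, which is axiom 3 applied repeatedly to the mutually exclusive single-outcome events.

```python
from fractions import Fraction

# The certain event S for the die experiment: all six faces, equally likely.
S = {"f1", "f2", "f3", "f4", "f5", "f6"}
P_outcome = {outcome: Fraction(1, 6) for outcome in S}

def prob(event):
    """P(event) as the sum of its outcome probabilities (axiom 3,
    since distinct outcomes are mutually exclusive)."""
    return sum(P_outcome[outcome] for outcome in event)

A = {"f1", "f3", "f5"}   # the event of Equation 3-1
B = {"f2"}               # disjoint from A

p_A = prob(A)            # Equation 3-2: 1/2
p_certain = prob(S)      # axiom 1: the certain event has probability 1
p_union = prob(A | B)    # A and B mutually exclusive, so P(A) + P(B)
```

Exact rational arithmetic via `Fraction` keeps the axiom checks free of floating-point noise.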

precludes the appearance of the other. Thus if $A = \{f_1\}$ and $B = \{f_2\}$ then $A$ and $B$ are mutually exclusive because if the roll of the die results in the face with one dot showing, the face with two dots can't be showing, or is precluded from occurring.

As indicated earlier, the use of the above axioms and set theory allows us to derive many other probability relations. Two of the more important ones are

$P(\varnothing) = 0$   (3-5)

and

$P(A \cup B) = P(A) + P(B) - P(A \cap B)$.   (3-6)

Another important probability relation concerns the intersection of two events. This relation is usually stated as a definition since it cannot be derived from set theory and the axioms of probability. It states that if two events are independent then

$P(A \cap B) = P(A)P(B)$.   (3-7)

We say that two events are independent if the occurrence of one event doesn't give any information about, or influence the occurrence of, the other event. For example, if we consider the experiment of rolling a pair of dice, the event consisting of a one for die one is independent of the event consisting of a three for die two. This is because we assume that what we roll on the first die will not influence what we roll on the second die. It should be noted that two events can't be both mutually exclusive and independent, unless one of the events is the null event. For those who are interested in the finer points of probability theory and independent events, it should be noted that one can't really discuss independent events on a single experiment. They can only be defined on combined experiments. The reader is referred to Papoulis (Papoulis 1991) or some other text on probability theory for further discussions of this area.

3.3 Random Variables

3.3.1 Background

In the discussions of the previous section we referred to outcomes of the die experiment as "face with one dot," etc. If we were discussing the die experiment less formally, we would refer to "the rolling of a one." What we have done is assign numbers to the experiment outcomes. When we assign numbers to experiment outcomes we define a random variable for the experiment.
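Returning to the independence relation of Equation 3-7, it can be checked for the two-dice example by direct enumeration of the 36 equally likely outcomes. This is a small illustrative sketch; the event choices mirror the ones used in the text.

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling two dice.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event_holds):
    """P(event) by counting the outcomes for which the event holds."""
    favorable = sum(1 for o in outcomes if event_holds(o))
    return Fraction(favorable, len(outcomes))

in_A = lambda o: o[0] == 1   # a one on die one
in_B = lambda o: o[1] == 3   # a three on die two

p_A = prob(in_A)
p_B = prob(in_B)
p_AB = prob(lambda o: in_A(o) and in_B(o))   # P(A intersect B)
```

Here `p_AB` equals `p_A * p_B`, confirming Equation 3-7 for these events; a pair of mutually exclusive, non-null events would instead give a joint probability of zero, which cannot equal the product of two nonzero probabilities.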
Aside from notational convenience, there is another reason for defining random variables for experiments: our mathematics background. Specifically, we have developed a wealth of mathematical tools for manipulating numbers, and we have devoted a great deal of time to relating physical processes to numbers. Unfortunately, we don't have a wealth of mathematical tools for sets, and we have only a limited ability to relate sets and elements to physical processes. For this reason, we use random variables to take probability theory out of the

context of set theory and put it in the context of mathematics that we know how to work with.

3.3.2 Definition of a Random Variable

We formally define a random variable as follows: For every experimental outcome, $\zeta$, we assign a number, $x(\zeta)$. With this we have defined a function $x = x(\zeta)$, which is a mapping from a set of $\zeta$'s to a set of numbers. This function is termed a random variable. In all cases that are of concern to us, we define $x(\zeta)$ as the numerical value of the outcome; i.e., $x(\zeta) = x$. An example of this is the die problem we just discussed. Another example is the voltage across a resistor or the range to a target. It should be noted that we are not required to use such a convention (Papoulis 1991). However, it is convenient.

In the future we will not explicitly indicate the dependency of the random variable on the experiment outcome. That is, we will replace $x(\zeta)$ by $x$. We do this for two reasons. One is convenience and the other is to force us to stop explicitly thinking about sets and think in terms of numbers. This is somewhat subtle but is essential to preventing confusion when studying random variable theory.

There are two basic types of random variables: discrete and continuous. Discrete random variables can have only a finite number of values and continuous random variables can take on a continuum of values (i.e., there are an infinite number of them). An example of a discrete random variable is the random variable associated with the die experiment. An example of a continuous random variable is the voltage across a resistor.

3.3.3 Distribution and Density Functions

We define the probability

$P\{x \le x\} = F_x(x)$   (3-8)

as the distribution function of the random variable $x$. In the above, $\{x \le x\}$ is the event consisting of outcomes, $\zeta$, such that the random variable associated with those outcomes, $x(\zeta)$, is less than, or equal to, the number $x$. The distribution function of the random variable $x$ is the probability associated with this event. This is a long-winded word description that we don't use. In its place we say that the distribution function of $x$ is the probability that $x$ is less than, or equal to, $x$. When we make the latter statement we should realize that it is a shorthand notation for the longer, more formal and correct definition. In most cases we drop the subscript $x$ and denote the distribution function as $F(x)$. We define the density function of the random variable, $x$, as

$f_x(x) = \dfrac{dF_x(x)}{dx} = f(x)$.   (3-9)

We can relate the density function back to the distribution function, and probability, by

$P\{x \le x\} = F(x) = \int_{-\infty}^{x} f(\lambda)\,d\lambda$.   (3-10)

The probability $P\{x_1 < x \le x_2\}$ is given by

$P\{x_1 < x \le x_2\} = \int_{x_1}^{x_2} f(\lambda)\,d\lambda$.   (3-11)

For discrete random variables, $f(x)$ consists of a series of impulse functions (Dirac delta functions) located at the values that the random variable can take on. The weight on each impulse is equal to the probability associated with the random variable. By this we mean that the weight is the probability associated with the event containing all outcomes for which $x = x_k$. In equation form,

$f(x) = \sum_k P\{x = x_k\}\,\delta(x - x_k)$.   (3-12)

As an example, the density function associated with the die experiment is

$f(x) = \sum_{k=1}^{6} \tfrac{1}{6}\,\delta(x - k)$.   (3-13)

The density function of a continuous random variable is a continuous function of $x$. Random variables are usually referenced by the density function associated with them. Thus a Gaussian random variable is a random variable whose density function is the Gaussian function,

$f(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\eta)^2 / 2\sigma^2}$.   (3-14)

We can derive two very important properties of the density function from Equations 3-10 and 3-11. Specifically:

$f(x) \ge 0 \;\; \forall x$ and $\int_{-\infty}^{\infty} f(x)\,dx = 1$.

3.3.4 Mean, Variance and Moments of Random Variables

Rather than work directly with random variables and their density functions we often work with functions of random variables. We do this for
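The two density-function properties above are easy to check numerically for the Gaussian density of Equation 3-14. The sketch below uses a simple midpoint-rule integrator; the integrator and its step count are illustrative choices, not from the text.

```python
import math

def gaussian_pdf(x, eta=0.0, sigma=1.0):
    """The Gaussian density of Equation 3-14 with mean eta and
    standard deviation sigma."""
    return math.exp(-(x - eta) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# The density is non-negative and its total area is 1 (Equation 3-10 as x -> infinity;
# [-10, 10] captures essentially all of the unit Gaussian's area).
area = integrate(gaussian_pdf, -10.0, 10.0)

# P{-1 < x <= 1} via Equation 3-11: about 0.683 for a zero-mean, unit-variance Gaussian.
p_interval = integrate(gaussian_pdf, -1.0, 1.0)
```

The interval probability computed this way is exactly the "area under the density" interpretation of Equation 3-11.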

much the same reason we transitioned from set theory and probabilities to random variables and density functions: our ability (or inability) to perform the mathematics. In particular, we don't have the mathematical tools to work directly with random variables and/or density functions. Actually, we do have the tools but, as you will recall from random variable theory, they are primitive and difficult to use.

The particular functions that we are interested in are termed the various moments of the random variable, $x$. The first of these is the first moment of $x$, which is also called the mean. It is defined as

$\eta = E\{x\} = \int_{-\infty}^{\infty} x f(x)\,dx$.   (3-15)

With this, the reader can show that a Gaussian random variable with the density function of Equation 3-14 has a mean of $\eta$.

Two important moments of $x$ are its second moment and its second central moment. The first is termed the mean squared value of $x$ and the second is termed the variance of $x$. The square root of the variance is termed the standard deviation of $x$. The mean squared value and variance are given by

$E\{x^2\} = \int_{-\infty}^{\infty} x^2 f(x)\,dx$   (3-16)

and

$\sigma^2 = E\{(x-\eta)^2\} = \int_{-\infty}^{\infty} (x-\eta)^2 f(x)\,dx$,   (3-17)

respectively.

3.3.5 Multiple Random Variables

For some experiments we can define several related or unrelated random variables. For example, if two dice are used in the experiment, separate random variables, $x$ and $y$, can be assigned to each die. For the case of two random variables we define joint distribution and joint density functions as

$F(x, y) = P\{x \le x, y \le y\}$   (3-18)

and

$f(x, y) = \dfrac{\partial^2 F(x, y)}{\partial x\,\partial y}$,   (3-19)

respectively. An example of the density function for two jointly Gaussian random variables is

$f(x, y) = \dfrac{1}{2\pi\sigma_x\sigma_y\sqrt{1-r^2}} \exp\!\left\{ \dfrac{-1}{2(1-r^2)} \left[ \dfrac{(x-\eta_x)^2}{\sigma_x^2} - \dfrac{2r(x-\eta_x)(y-\eta_y)}{\sigma_x\sigma_y} + \dfrac{(y-\eta_y)^2}{\sigma_y^2} \right] \right\}$,   (3-20)
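For the discrete die random variable, the moments defined above can be computed exactly by replacing the integrals of Equations 3-15 through 3-17 with sums over the impulse weights of Equation 3-13. This is an illustrative sketch using exact rational arithmetic:

```python
from fractions import Fraction

# Die random variable: values 1..6, each with weight 1/6 (Equation 3-13).
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

mean = sum(x * p for x, p in pmf.items())                    # first moment, Eq. 3-15
mean_square = sum(x ** 2 * p for x, p in pmf.items())        # second moment, Eq. 3-16
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())  # second central moment, Eq. 3-17
std_dev = float(variance) ** 0.5                             # standard deviation
```

Note that the variance equals the mean squared value minus the square of the mean, a relation the reader can verify directly from Equations 3-15 through 3-17.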

where the definitions of the $\eta$'s and $\sigma$'s should be obvious and $r$ is the correlation coefficient defined by

$r = \dfrac{C_{xy}}{\sigma_x \sigma_y}$,   (3-21)

where $C_{xy}$ is the covariance between the random variables $x$ and $y$ and is defined below.

If two random variables, $x$ and $y$, are independent we can write their joint density as

$f(x, y) = f(x) f(y)$.   (3-22)

The reverse also holds: if Equation 3-22 is valid then $x$ and $y$ are independent.

We can relate the marginal densities of $x$ and $y$ to their joint densities by the equations

$f(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$   (3-23)

and

$f(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$.   (3-24)

As with single random variables, we use joint moments of joint random variables in lieu of using the actual random variables, marginal densities and joint densities. Again, the reason for this is our depth of mathematical tools. The two specific moments we use for two random variables are their first joint moment and their first joint central moment. The former we term the correlation and the latter we term the covariance. The correlation is given by

$R_{xy} = E\{xy\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f(x, y)\,dx\,dy$   (3-25)

and the covariance is given by

$C_{xy} = E\{(x-\eta_x)(y-\eta_y)\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x-\eta_x)(y-\eta_y)\, f(x, y)\,dx\,dy$.   (3-26)

Some properties and terms associated with correlation and covariance are as follows. If $R_{xy} = 0$ then $x$ and $y$ are orthogonal. If $C_{xy} = 0$ then $x$ and $y$ are uncorrelated. If $x$ and/or $y$ are zero mean then $R_{xy} = C_{xy}$. If $x$ and $y$ are independent, they are also uncorrelated. The reverse is not true.
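Covariance and the correlation coefficient (Equations 3-26 and 3-21) can be illustrated with sample estimates. In this sketch, $y$ is built from $x$ plus independent noise, so the true correlation coefficient is $1/\sqrt{2} \approx 0.707$; the construction, seed and sample size are illustrative assumptions.

```python
import random

rng = random.Random(7)
n = 200_000
x = [rng.gauss(0.0, 1.0) for _ in range(n)]
y = [xi + rng.gauss(0.0, 1.0) for xi in x]   # correlated with x by construction

eta_x = sum(x) / n
eta_y = sum(y) / n
# Sample version of the covariance C_xy of Equation 3-26.
c_xy = sum((a - eta_x) * (b - eta_y) for a, b in zip(x, y)) / n
sigma_x = (sum((a - eta_x) ** 2 for a in x) / n) ** 0.5
sigma_y = (sum((b - eta_y) ** 2 for b in y) / n) ** 0.5
# Correlation coefficient of Equation 3-21.
r = c_xy / (sigma_x * sigma_y)
```

Replacing `y` with an independent noise sequence would drive `c_xy`, and hence `r`, toward zero: independent random variables are uncorrelated, as stated above.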

However, if $x$ and $y$ are uncorrelated and jointly Gaussian, they are also independent.

3.4 Random Processes

3.4.1 Introduction

In the previous section we discussed the concept of random variables, wherein we assigned a number to each experiment outcome. In some experiments it is more appropriate to assign time functions to experiment outcomes. Examples of this might be where the experiment outcome is the selection of a generator or the observation of the voltage at the output of a receiver. In the first case, we might assign sinusoidal time functions (see Figure 3-1) to each experiment outcome. The sinusoids might be of a different amplitude, frequency and relative phase for each experiment outcome (generator selection). For the second case we might assign time functions such as those of Figure 3-2.

Figure 3-1. Two samples of a random process (generator outputs)

Figure 3-2. Two samples of random processes (receiver outputs)

If we assign a time function, $x(t, \zeta)$, to each experiment outcome, $\zeta$, the collection of time functions is termed a random process or a stochastic process. As we did with random variables, we will not explicitly denote the dependency on $\zeta$, and will write $x(t)$ in place of $x(t, \zeta)$.

We note that $x(t)$ can represent four things:
1. A family of time functions, for $t$ and $\zeta$ variable. This is a stochastic process.
2. A single time function, for $t$ variable and $\zeta$ fixed.
3. A random variable, for $t$ fixed and $\zeta$ variable.
4. A number, for $t$ and $\zeta$ fixed.

We use representations 2 and 3 to help us devise ways to deal with random processes. Specifically, we will use random variable theory and time-function analysis techniques. Indeed, since $x(t)$ is a random variable for a specific value of $t$, we can use density functions to characterize it. Thus, for a time $t$,

$F(x, t) = P\{x(t) \le x\}$   (3-27)

is the distribution function of $x(t)$ and its density is

$f(x, t) = \dfrac{\partial F(x, t)}{\partial x}$.   (3-28)

We include the variable $t$ in the expressions for the distribution and density functions to denote the fact that the characteristics (i.e., density function) of the

random variable $x(t)$ can change with time. $F(x, t)$ and $f(x, t)$ are termed the first-order distribution and density of the random process.

If we consider two instants of time, $t_a$ and $t_b$, we can describe the characteristics of the random variables at those times in terms of their first-order densities and the joint density of the random variables at the two times; specifically, $f(x_a, x_b; t_a, t_b)$. The joint density is termed the second-order density of the random process.

3.4.2 Autocorrelation, Autocovariance, Crosscorrelation and Crosscovariance

As with random variables, we find it easier to use the moments of the random process rather than the random process or its various densities. To this end, we define the mean of a random process as

$\eta(t) = E\{x(t)\} = \int_{-\infty}^{\infty} x f(x, t)\,dx$.   (3-29)

Note that the mean is, in general, a function of time. We define the autocorrelation and autocovariance of a random process as

$R_x(t_a, t_b) = E\{x(t_a)\,x(t_b)\}$   (3-30)

and

$C_x(t_a, t_b) = E\{[x(t_a) - \eta(t_a)][x(t_b) - \eta(t_b)]\}$,   (3-31)

respectively. The autocorrelation of a random process is equivalent to the correlation between two random variables and the autocovariance of a random process is equivalent to the covariance between two random variables. Note that the autocorrelation and autocovariance describe the relation between two random variables that result by considering samples of a single random process at two times. The fact that we are considering two random variables leads to the use of correlation and covariance. We use the prefix "auto" to denote the fact that the random variables are derived from a single random process.

If we have two random processes, $x(t)$ and $y(t)$, we define their crosscorrelation and crosscovariance as

$R_{xy}(t_a, t_b) = E\{x(t_a)\,y(t_b)\}$   (3-32)

and

$C_{xy}(t_a, t_b) = E\{[x(t_a) - \eta_x(t_a)][y(t_b) - \eta_y(t_b)]\}$,   (3-33)

respectively. We use the prefix "cross" to denote that the two random variables at the two times are now derived from two different random processes.

At this point we want to briefly delineate some properties of random processes, autocorrelations, autocovariances, crosscorrelations and crosscovariances.
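The ensemble nature of Equation 3-30 can be illustrated with a random-phase sinusoid, a standard textbook process (this particular process, the seed and the sample counts are illustrative choices, not from the text). Averaging $x(t_a)x(t_b)$ across many sample functions approximates the autocorrelation, which for this process is $\cos(t_a - t_b)/2$, a function of the time difference only:

```python
import math
import random

rng = random.Random(11)

def sample_function(t_values):
    """One sample function x(t) = cos(t + theta) of a random process whose
    only randomness is a uniformly distributed phase theta."""
    theta = rng.uniform(0.0, 2.0 * math.pi)
    return [math.cos(t + theta) for t in t_values]

t_a, t_b = 0.0, 0.5
ensemble = [sample_function([t_a, t_b]) for _ in range(200_000)]

# Ensemble estimate of R_x(t_a, t_b) = E{x(t_a) x(t_b)} (Equation 3-30).
R_est = sum(xa * xb for xa, xb in ensemble) / len(ensemble)
R_theory = math.cos(t_a - t_b) / 2.0
```

Since this process is zero mean, the same estimate also approximates the autocovariance of Equation 3-31.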

First we note that if $x(t)$ is zero mean then $C_x(t_a, t_b) = R_x(t_a, t_b)\ \forall\, t_a, t_b$. Also, if $x(t)$ and/or $y(t)$ are zero mean then $C_{xy}(t_a, t_b) = R_{xy}(t_a, t_b)\ \forall\, t_a, t_b$.

If $t_a = t_b = t$ then $C_x(t, t) = \sigma_x^2(t)$, the variance of $x(t)$ (evaluated at time $t$).

If $C_{xy}(t_a, t_b) = 0\ \forall\, t_a, t_b$ we say that the random processes are uncorrelated. If the crosscorrelation of two random processes is zero for all times then the random processes are orthogonal.

If $C_x(t_a, t_b) = 0\ \forall\, t_a \ne t_b$ we say that the random process is white (we will discuss this again shortly).

If the random variables $x(t_a)$ and $y(t_b)$ are independent for all $t_a$ and $t_b$ then the processes $x(t)$ and $y(t)$ are independent. If two processes are independent they are also uncorrelated. The reverse is not normally true. However, if the processes are jointly Gaussian and uncorrelated, they are also independent. Two processes are jointly Gaussian if all orders (first, second, etc.) of their individual density functions are Gaussian functions and if all orders of their joint density functions are Gaussian.

We say that a stochastic process, $x(t)$, is strict sense stationary if the statistics of $x(t)$ are independent of a shift in the time origin. The implications of this are that the first-order density of $x(t)$ is independent of $t$ and all higher-order density functions of $x(t)$ depend only upon relative time. The statements in the above sentence can be written in equation form as:

$f(x, t) = f(x)$,
$f(x_a, x_b; t_a, t_b) = f(x_a, x_b; t_a - t_b)$,
$f(x_a, x_b, x_c; t_a, t_b, t_c) = f(x_a, x_b, x_c; t_a - t_b, t_a - t_c)$,

and so forth. As a result of the above we can write the first and second moments as

$\eta(t) = E\{x(t)\} = \int_{-\infty}^{\infty} x f(x, t)\,dx = \int_{-\infty}^{\infty} x f(x)\,dx = \eta$,   (3-34)

$R(t_a, t_b) = R(t_a - t_b) = R(\tau)$   (3-35)

and

$C(t_a, t_b) = C(t_a - t_b) = C(\tau)$.   (3-36)

We say that two random processes, $x(t)$ and $y(t)$, are jointly strict sense stationary if each is strict sense stationary and if their joint statistics are independent of a shift in the time origin. If two random processes are jointly strict sense stationary we can write

$R_{xy}(t_a, t_b) = R_{xy}(t_a - t_b) = R_{xy}(\tau)$   (3-37)

and

$C_{xy}(t_a, t_b) = C_{xy}(t_a - t_b) = C_{xy}(\tau)$.   (3-38)

The requirement for strict sense stationarity is usually more restrictive than necessary. Thus, we define a weaker condition by saying that a random process is wide sense stationary if Equations 3-34, 3-35 and 3-36 are valid for that random process. We say that two processes, $x(t)$ and $y(t)$, are jointly wide sense stationary if they are each wide sense stationary and Equations 3-37 and 3-38 are valid for the two random processes.

As a final definition, we define white noise, or a white random process. A random process, $x(t)$, is white if

$C_x(t_a, t_b) = E\{[x(t_a) - \eta(t_a)][x(t_b) - \eta(t_b)]\} = q(t_a)\,\delta(t_a - t_b)$,   (3-39)

where $\delta(t)$ is the Dirac delta function.

The above discussions have considered continuous-time systems. The concepts in the previous discussions can be extended to discrete-time systems by replacing the time variables by multiples of some sample period, $T$. That is, $t = kT$, $t_a = k_a T$, $t_b = k_b T = kT + mT$, and so forth, where $k$, $k_a$, $k_b$ and $m$ are integers. When we study sampled data systems we often drop the $T$ variable and simply use the integers as arguments of the various variables. With this we can write the various means, autocorrelations, etc. of Equations 3-28 through 3-38 as

$\eta_x(k) = E\{x(k)\} = \int_{-\infty}^{\infty} x f(x, k)\,dx$,   (3-40)

$R_x(k_a, k_b) = E\{x(k_a)\,x(k_b)\}$,   (3-41)

$C_x(k_a, k_b) = E\{[x(k_a) - \eta_x(k_a)][x(k_b) - \eta_x(k_b)]\}$,   (3-42)

$R_{xy}(k_a, k_b) = E\{x(k_a)\,y(k_b)\}$   (3-43)

and

$C_{xy}(k_a, k_b) = E\{[x(k_a) - \eta_x(k_a)][y(k_b) - \eta_y(k_b)]\}$,   (3-44)

and similarly for the remainder. For white, discrete-time random processes we write

$C_x(k + m, k) = E\{[x(k+m) - \eta_x(k+m)][x(k) - \eta_x(k)]\} = q(k)\,\delta_s(m)$,   (3-45)

where $\delta_s(m)$ is the Kronecker delta function and is defined as

$\delta_s(m) = \begin{cases} 1 & \text{if } m = 0 \\ 0 & \text{if } m \ne 0 \end{cases}$.   (3-46)

3.5 Vector Random Processes
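A discrete-time white sequence makes Equations 3-45 and 3-46 concrete: the sample autocovariance should be near the variance at lag zero and near zero at every other lag. The generator, seed and lags below are illustrative choices.

```python
import random

rng = random.Random(19)
n = 100_000
sigma = 2.0
w = [rng.gauss(0.0, sigma) for _ in range(n)]   # white Gaussian sequence

def sample_autocov(seq, m):
    """Sample autocovariance C(m) of a wide-sense-stationary sequence
    (Equation 3-42 with k_a - k_b = m)."""
    mean = sum(seq) / len(seq)
    return sum((a - mean) * (b - mean) for a, b in zip(seq, seq[m:])) / (len(seq) - m)

c0 = sample_autocov(w, 0)   # near q = sigma**2 = 4: the m = 0 term of Eq. 3-45
c3 = sample_autocov(w, 3)   # near 0, since delta_s(3) = 0
```

A non-white sequence, e.g. one produced by filtering `w`, would instead show nonzero autocovariance at nonzero lags.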

When we work with Kalman filters we will generally work with state variables. As such, we will be concerned with vector random processes, that is, collections of random processes that we represent using vectors. Because of this we need to define vector forms of the mean, autocorrelation, autocovariance, etc. equations that we defined in the previous pages.

To start, we represent the mean of a continuous-time, vector random process as

$\boldsymbol{\eta}(t) = E\{\mathbf{x}(t)\} = \begin{bmatrix} E\{x_1(t)\} \\ E\{x_2(t)\} \\ \vdots \\ E\{x_n(t)\} \end{bmatrix} = \begin{bmatrix} \eta_1(t) \\ \eta_2(t) \\ \vdots \\ \eta_n(t) \end{bmatrix}$.   (3-47)

In a similar fashion we can write

$\mathbf{R}_x(t_a, t_b) = E\{\mathbf{x}(t_a)\,\mathbf{x}^T(t_b)\} = \begin{bmatrix} R_{x_1 x_1}(t_a, t_b) & R_{x_1 x_2}(t_a, t_b) & \cdots & R_{x_1 x_n}(t_a, t_b) \\ R_{x_2 x_1}(t_a, t_b) & R_{x_2 x_2}(t_a, t_b) & \cdots & R_{x_2 x_n}(t_a, t_b) \\ \vdots & \vdots & \ddots & \vdots \\ R_{x_n x_1}(t_a, t_b) & R_{x_n x_2}(t_a, t_b) & \cdots & R_{x_n x_n}(t_a, t_b) \end{bmatrix}$.   (3-48)

The $T$ superscript notation in Equation 3-48 denotes the transpose operation. It will be noted that the last matrix of Equation 3-48 contains both autocorrelation terms (on the diagonal) and crosscorrelation terms (off the diagonal). As a result, we term this matrix the autocorrelation matrix.

We will often need to discuss the crosscorrelation between two, different, vector random processes. This is again a matrix, which we will term the crosscorrelation matrix. This matrix has the form

$\mathbf{R}_{xy}(t_a, t_b) = E\{\mathbf{x}(t_a)\,\mathbf{y}^T(t_b)\} = \begin{bmatrix} R_{x_1 y_1}(t_a, t_b) & R_{x_1 y_2}(t_a, t_b) & \cdots & R_{x_1 y_m}(t_a, t_b) \\ R_{x_2 y_1}(t_a, t_b) & R_{x_2 y_2}(t_a, t_b) & \cdots & R_{x_2 y_m}(t_a, t_b) \\ \vdots & \vdots & \ddots & \vdots \\ R_{x_n y_1}(t_a, t_b) & R_{x_n y_2}(t_a, t_b) & \cdots & R_{x_n y_m}(t_a, t_b) \end{bmatrix}$.   (3-49)

It will be noted that this equation contains only crosscorrelation terms. It will also be noted that, while the autocorrelation matrix is always square, the crosscorrelation matrix is an $n \times m$ matrix and is most often not square.

We define the autocovariance matrix by the equation

$\mathbf{C}_x(t_a, t_b) = E\{[\mathbf{x}(t_a) - \boldsymbol{\eta}(t_a)][\mathbf{x}(t_b) - \boldsymbol{\eta}(t_b)]^T\}$.   (3-50)

If we let $t_a = t_b = t$ in $\mathbf{C}_x$, we obtain the covariance matrix

$\mathbf{P}(t) = \mathbf{C}_x(t, t) = E\{[\mathbf{x}(t) - \boldsymbol{\eta}(t)][\mathbf{x}(t) - \boldsymbol{\eta}(t)]^T\}$.   (3-51)

Finally, we write the crosscovariance matrix as

$\mathbf{C}_{xy}(t_a, t_b) = E\{[\mathbf{x}(t_a) - \boldsymbol{\eta}_x(t_a)][\mathbf{y}(t_b) - \boldsymbol{\eta}_y(t_b)]^T\}$.   (3-52)

We can define the sampled data versions of Equations 3-47 through 3-52 by replacing the time variable, $t$, with the stage variable, $k$.

Some notes on vector random processes:
- Two vector random processes are uncorrelated if, and only if, $\mathbf{C}_{xy}(t_a, t_b) = \mathbf{0}\ \forall\, t_a, t_b$.
- A continuous-time vector random process is white if, and only if, $\mathbf{C}_x(t_a, t_b) = \mathbf{P}(t_a)\,\delta(t_a - t_b)$.
- A discrete-time vector random process is white if, and only if, $\mathbf{C}_x(k + m, k) = \mathbf{P}(k)\,\delta_s(m)$.
- A vector random process is Gaussian if all the elements of the vector are jointly Gaussian.
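The covariance matrix of Equation 3-51 can be estimated from samples of a two-element vector process. In this sketch (the particular construction of $x_2$ from $x_1$ is an illustrative assumption), the resulting matrix is symmetric, with variances on the diagonal and the covariance between the elements off the diagonal:

```python
import random

rng = random.Random(23)
n = 100_000

# Samples of a 2-vector random process at a single time instant: x2 is
# built from x1, so the off-diagonal covariance terms are nonzero.
samples = []
for _ in range(n):
    x1 = rng.gauss(0.0, 1.0)
    x2 = 0.5 * x1 + rng.gauss(0.0, 1.0)
    samples.append((x1, x2))

means = [sum(s[i] for s in samples) / n for i in range(2)]

# Sample version of P = E{(x - eta)(x - eta)^T}, Equation 3-51.
P = [[sum((s[i] - means[i]) * (s[j] - means[j]) for s in samples) / n
      for j in range(2)] for i in range(2)]
```

For this construction the theoretical matrix is [[1.0, 0.5], [0.5, 1.25]], and `P[0][1] == P[1][0]` holds exactly, reflecting the symmetry of the covariance matrix.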

3.6 Problems

1. Derive the relations given by Equations 3-5 and 3-6.
2. Use Equations 3-10 and 3-11 to show that $f(x) \ge 0\ \forall x$ and $\int_{-\infty}^{\infty} f(x)\,dx = 1$.
3. Show that the Gaussian density of Equation 3-14 has a mean of $\eta$.
4. Show that $R_{xy} = C_{xy}$ if $\eta_x = 0$ and/or $\eta_y = 0$.
5. Show that two independent random variables are also uncorrelated.
6. Show that two uncorrelated, jointly Gaussian random variables are also independent.
7. Show that two independent random processes are also uncorrelated.
8. Show that two uncorrelated, jointly Gaussian random processes are also independent.
9. Show that $\mathbf{R}_{xy}(k_a, k_b) = \mathbf{R}_{yx}^T(k_b, k_a)$.
10. Show that $\mathbf{P}(k)$ is a non-negative definite matrix.


More information

Parameterized Joint Densities with Gaussian and Gaussian Mixture Marginals

Parameterized Joint Densities with Gaussian and Gaussian Mixture Marginals Parameterized Joint Densities with Gaussian and Gaussian Miture Marginals Feli Sawo, Dietrich Brunn, and Uwe D. Hanebeck Intelligent Sensor-Actuator-Sstems Laborator Institute of Computer Science and Engineering

More information

Review of Probability. CS1538: Introduction to Simulations

Review of Probability. CS1538: Introduction to Simulations Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed

More information

A brief review of basics of probabilities

A brief review of basics of probabilities brief review of basics of probabilities Milos Hauskrecht milos@pitt.edu 5329 Sennott Square robability theory Studies and describes random processes and their outcomes Random processes may result in multiple

More information

Introduction to Probability and Stochastic Processes I

Introduction to Probability and Stochastic Processes I Introduction to Probability and Stochastic Processes I Lecture 3 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark Slides

More information

Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi)

Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi) Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi) Our immediate goal is to formulate an LLN and a CLT which can be applied to establish sufficient conditions for the consistency

More information

E X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl.

E X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl. E X A M Course code: Course name: Number of pages incl. front page: 6 MA430-G Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours Resources allowed: Notes: Pocket calculator,

More information

PROBABILITY THEORY. Prof. S. J. Soni. Assistant Professor Computer Engg. Department SPCE, Visnagar

PROBABILITY THEORY. Prof. S. J. Soni. Assistant Professor Computer Engg. Department SPCE, Visnagar PROBABILITY THEORY By Prof. S. J. Soni Assistant Professor Computer Engg. Department SPCE, Visnagar Introduction Signals whose values at any instant t are determined by their analytical or graphical description

More information

Properties of Probability

Properties of Probability Econ 325 Notes on Probability 1 By Hiro Kasahara Properties of Probability In statistics, we consider random experiments, experiments for which the outcome is random, i.e., cannot be predicted with certainty.

More information

Mathematical Statistics. Gregg Waterman Oregon Institute of Technology

Mathematical Statistics. Gregg Waterman Oregon Institute of Technology Mathematical Statistics Gregg Waterman Oregon Institute of Technolog c Gregg Waterman This work is licensed under the Creative Commons Attribution. International license. The essence of the license is

More information

Lecture - 30 Stationary Processes

Lecture - 30 Stationary Processes Probability and Random Variables Prof. M. Chakraborty Department of Electronics and Electrical Communication Engineering Indian Institute of Technology, Kharagpur Lecture - 30 Stationary Processes So,

More information

Probability Space. J. McNames Portland State University ECE 538/638 Stochastic Signals Ver

Probability Space. J. McNames Portland State University ECE 538/638 Stochastic Signals Ver Stochastic Signals Overview Definitions Second order statistics Stationarity and ergodicity Random signal variability Power spectral density Linear systems with stationary inputs Random signal memory Correlation

More information

So Called Vacuum Fluctuations as Correlation Functions. Robert D. Klauber August 23,

So Called Vacuum Fluctuations as Correlation Functions. Robert D. Klauber August 23, So Called Vacuum Fluctuations as Correlation Functions Robert D. Klauber August, 6 www.quantumfieldtheor.info Refs: Introduction to Quantum Effects in Gravit, Muhanov, V., and Winitzi, S. (Cambridge, 7

More information

Chapter 6 - Random Processes

Chapter 6 - Random Processes EE385 Class Notes //04 John Stensby Chapter 6 - Random Processes Recall that a random variable X is a mapping between the sample space S and the extended real line R +. That is, X : S R +. A random process

More information

Section 3.1. ; X = (0, 1]. (i) f : R R R, f (x, y) = x y

Section 3.1. ; X = (0, 1]. (i) f : R R R, f (x, y) = x y Paul J. Bruillard MATH 0.970 Problem Set 6 An Introduction to Abstract Mathematics R. Bond and W. Keane Section 3.1: 3b,c,e,i, 4bd, 6, 9, 15, 16, 18c,e, 19a, 0, 1b Section 3.: 1f,i, e, 6, 1e,f,h, 13e,

More information

Some Concepts of Probability (Review) Volker Tresp Summer 2018

Some Concepts of Probability (Review) Volker Tresp Summer 2018 Some Concepts of Probability (Review) Volker Tresp Summer 2018 1 Definition There are different way to define what a probability stands for Mathematically, the most rigorous definition is based on Kolmogorov

More information

Brief Review of Probability

Brief Review of Probability Brief Review of Probability Nuno Vasconcelos (Ken Kreutz-Delgado) ECE Department, UCSD Probability Probability theory is a mathematical language to deal with processes or experiments that are non-deterministic

More information

A review of probability theory

A review of probability theory 1 A review of probability theory In this book we will study dynamical systems driven by noise. Noise is something that changes randomly with time, and quantities that do this are called stochastic processes.

More information

Chapter 14. From Randomness to Probability. Copyright 2012, 2008, 2005 Pearson Education, Inc.

Chapter 14. From Randomness to Probability. Copyright 2012, 2008, 2005 Pearson Education, Inc. Chapter 14 From Randomness to Probability Copyright 2012, 2008, 2005 Pearson Education, Inc. Dealing with Random Phenomena A random phenomenon is a situation in which we know what outcomes could happen,

More information

Lecture 4: Exact ODE s

Lecture 4: Exact ODE s Lecture 4: Exact ODE s Dr. Michael Doughert Januar 23, 203 Exact equations are first-order ODE s of a particular form, and whose methods of solution rel upon basic facts concerning partial derivatives,

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

Chapter 2 Random Processes

Chapter 2 Random Processes Chapter 2 Random Processes 21 Introduction We saw in Section 111 on page 10 that many systems are best studied using the concept of random variables where the outcome of a random experiment was associated

More information

Stochastic Processes

Stochastic Processes Elements of Lecture II Hamid R. Rabiee with thanks to Ali Jalali Overview Reading Assignment Chapter 9 of textbook Further Resources MIT Open Course Ware S. Karlin and H. M. Taylor, A First Course in Stochastic

More information

ODIF FOR MORPHOLOGICAL FILTERS. Sari Peltonen and Pauli Kuosmanen

ODIF FOR MORPHOLOGICAL FILTERS. Sari Peltonen and Pauli Kuosmanen ODIF FOR MORPHOLOGICAL FILTERS Sari Peltonen Pauli Kuosmanen Signal Processing Laborator, Digital Media Institute, Tampere Universit of Technolog P.O. Box 553, FIN-331 TAMPERE, FINLAND e-mail: sari@cs.tut.fi

More information

8 Laws of large numbers

8 Laws of large numbers 8 Laws of large numbers 8.1 Introduction We first start with the idea of standardizing a random variable. Let X be a random variable with mean µ and variance σ 2. Then Z = (X µ)/σ will be a random variable

More information

Single Maths B: Introduction to Probability

Single Maths B: Introduction to Probability Single Maths B: Introduction to Probability Overview Lecturer Email Office Homework Webpage Dr Jonathan Cumming j.a.cumming@durham.ac.uk CM233 None! http://maths.dur.ac.uk/stats/people/jac/singleb/ 1 Introduction

More information

9.2. Cartesian Components of Vectors. Introduction. Prerequisites. Learning Outcomes

9.2. Cartesian Components of Vectors. Introduction. Prerequisites. Learning Outcomes Cartesian Components of Vectors 9.2 Introduction It is useful to be able to describe vectors with reference to specific coordinate sstems, such as the Cartesian coordinate sstem. So, in this Section, we

More information

Probability and Independence Terri Bittner, Ph.D.

Probability and Independence Terri Bittner, Ph.D. Probability and Independence Terri Bittner, Ph.D. The concept of independence is often confusing for students. This brief paper will cover the basics, and will explain the difference between independent

More information

Notes on Mathematics Groups

Notes on Mathematics Groups EPGY Singapore Quantum Mechanics: 2007 Notes on Mathematics Groups A group, G, is defined is a set of elements G and a binary operation on G; one of the elements of G has particularly special properties

More information

STAT 248: EDA & Stationarity Handout 3

STAT 248: EDA & Stationarity Handout 3 STAT 248: EDA & Stationarity Handout 3 GSI: Gido van de Ven September 17th, 2010 1 Introduction Today s section we will deal with the following topics: the mean function, the auto- and crosscovariance

More information

Question Paper Code : AEC11T03

Question Paper Code : AEC11T03 Hall Ticket No Question Paper Code : AEC11T03 VARDHAMAN COLLEGE OF ENGINEERING (AUTONOMOUS) Affiliated to JNTUH, Hyderabad Four Year B Tech III Semester Tutorial Question Bank 2013-14 (Regulations: VCE-R11)

More information

Notes on Probability

Notes on Probability Notes on Probability Mark Schmidt January 7, 2017 1 Probabilites Consider an event A that may or may not happen. For example, if we roll a dice then we may or may not roll a 6. We use the notation p(a)

More information

2 Ordinary Differential Equations: Initial Value Problems

2 Ordinary Differential Equations: Initial Value Problems Ordinar Differential Equations: Initial Value Problems Read sections 9., (9. for information), 9.3, 9.3., 9.3. (up to p. 396), 9.3.6. Review questions 9.3, 9.4, 9.8, 9.9, 9.4 9.6.. Two Examples.. Foxes

More information

Fundamentals of Probability CE 311S

Fundamentals of Probability CE 311S Fundamentals of Probability CE 311S OUTLINE Review Elementary set theory Probability fundamentals: outcomes, sample spaces, events Outline ELEMENTARY SET THEORY Basic probability concepts can be cast in

More information

EE4601 Communication Systems

EE4601 Communication Systems EE4601 Communication Systems Week 2 Review of Probability, Important Distributions 0 c 2011, Georgia Institute of Technology (lect2 1) Conditional Probability Consider a sample space that consists of two

More information

Probability Theory Refresher

Probability Theory Refresher Machine Learning WS24 Module IN264 Sheet 2 Page Machine Learning Worksheet 2 Probabilit Theor Refresher Basic Probabilit Problem : A secret government agenc has developed a scanner which determines whether

More information

BASICS OF PROBABILITY

BASICS OF PROBABILITY October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,

More information

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com 1 School of Oriental and African Studies September 2015 Department of Economics Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com Gujarati D. Basic Econometrics, Appendix

More information

Fundamentals of Noise

Fundamentals of Noise Fundamentals of Noise V.Vasudevan, Department of Electrical Engineering, Indian Institute of Technology Madras Noise in resistors Random voltage fluctuations across a resistor Mean square value in a frequency

More information

6. Vector Random Variables

6. Vector Random Variables 6. Vector Random Variables In the previous chapter we presented methods for dealing with two random variables. In this chapter we etend these methods to the case of n random variables in the following

More information

Sample Spaces, Random Variables

Sample Spaces, Random Variables Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted

More information

IV. Covariance Analysis

IV. Covariance Analysis IV. Covariance Analysis Autocovariance Remember that when a stochastic process has time values that are interdependent, then we can characterize that interdependency by computing the autocovariance function.

More information

P [(E and F )] P [F ]

P [(E and F )] P [F ] CONDITIONAL PROBABILITY AND INDEPENDENCE WORKSHEET MTH 1210 This worksheet supplements our textbook material on the concepts of conditional probability and independence. The exercises at the end of each

More information

4.3 Exercises. local maximum or minimum. The second derivative is. e 1 x 2x 1. f x x 2 e 1 x 1 x 2 e 1 x 2x x 4

4.3 Exercises. local maximum or minimum. The second derivative is. e 1 x 2x 1. f x x 2 e 1 x 1 x 2 e 1 x 2x x 4 SECTION 4.3 HOW DERIVATIVES AFFECT THE SHAPE OF A GRAPH 297 local maimum or minimum. The second derivative is f 2 e 2 e 2 4 e 2 4 Since e and 4, we have f when and when 2 f. So the curve is concave downward

More information

VISION TRACKING PREDICTION

VISION TRACKING PREDICTION VISION RACKING PREDICION Eri Cuevas,2, Daniel Zaldivar,2, and Raul Rojas Institut für Informati, Freie Universität Berlin, ausstr 9, D-495 Berlin, German el 0049-30-83852485 2 División de Electrónica Computación,

More information

Probability: Review. Pieter Abbeel UC Berkeley EECS. Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics

Probability: Review. Pieter Abbeel UC Berkeley EECS. Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics robabilit: Review ieter Abbeel UC Berkele EECS Man slides adapted from Thrun Burgard and Fo robabilistic Robotics Wh probabilit in robotics? Often state of robot and state of its environment are unknown

More information

Econ 424 Time Series Concepts

Econ 424 Time Series Concepts Econ 424 Time Series Concepts Eric Zivot January 20 2015 Time Series Processes Stochastic (Random) Process { 1 2 +1 } = { } = sequence of random variables indexed by time Observed time series of length

More information

Joint Probability Distributions and Random Samples (Devore Chapter Five)

Joint Probability Distributions and Random Samples (Devore Chapter Five) Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete

More information

Lecture 4: Probability and Discrete Random Variables

Lecture 4: Probability and Discrete Random Variables Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 4: Probability and Discrete Random Variables Wednesday, January 21, 2009 Lecturer: Atri Rudra Scribe: Anonymous 1

More information

Lecture 4 Propagation of errors

Lecture 4 Propagation of errors Introduction Lecture 4 Propagation of errors Example: we measure the current (I and resistance (R of a resistor. Ohm's law: V = IR If we know the uncertainties (e.g. standard deviations in I and R, what

More information

Multiple Random Variables

Multiple Random Variables Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x

More information

ENSC327 Communications Systems 19: Random Processes. Jie Liang School of Engineering Science Simon Fraser University

ENSC327 Communications Systems 19: Random Processes. Jie Liang School of Engineering Science Simon Fraser University ENSC327 Communications Systems 19: Random Processes Jie Liang School of Engineering Science Simon Fraser University 1 Outline Random processes Stationary random processes Autocorrelation of random processes

More information

Symmetry Arguments and the Role They Play in Using Gauss Law

Symmetry Arguments and the Role They Play in Using Gauss Law Smmetr Arguments and the Role The la in Using Gauss Law K. M. Westerberg (9/2005) Smmetr plas a ver important role in science in general, and phsics in particular. Arguments based on smmetr can often simplif

More information

Multivariate Distributions (Hogg Chapter Two)

Multivariate Distributions (Hogg Chapter Two) Multivariate Distributions (Hogg Chapter Two) STAT 45-1: Mathematical Statistics I Fall Semester 15 Contents 1 Multivariate Distributions 1 11 Random Vectors 111 Two Discrete Random Variables 11 Two Continuous

More information

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019 Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial

More information

16.584: Random (Stochastic) Processes

16.584: Random (Stochastic) Processes 1 16.584: Random (Stochastic) Processes X(t): X : RV : Continuous function of the independent variable t (time, space etc.) Random process : Collection of X(t, ζ) : Indexed on another independent variable

More information

Probability Theory Review Reading Assignments

Probability Theory Review Reading Assignments Probability Theory Review Reading Assignments R. Duda, P. Hart, and D. Stork, Pattern Classification, John-Wiley, 2nd edition, 2001 (appendix A.4, hard-copy). "Everything I need to know about Probability"

More information

ECE Homework Set 3

ECE Homework Set 3 ECE 450 1 Homework Set 3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3

More information

Experimental Uncertainty Review. Abstract. References. Measurement Uncertainties and Uncertainty Propagation

Experimental Uncertainty Review. Abstract. References. Measurement Uncertainties and Uncertainty Propagation Experimental Uncertaint Review Abstract This is intended as a brief summar of the basic elements of uncertaint analsis, and a hand reference for laborator use. It provides some elementar "rules-of-thumb"

More information

University of Regina. Lecture Notes. Michael Kozdron

University of Regina. Lecture Notes. Michael Kozdron University of Regina Statistics 252 Mathematical Statistics Lecture Notes Winter 2005 Michael Kozdron kozdron@math.uregina.ca www.math.uregina.ca/ kozdron Contents 1 The Basic Idea of Statistics: Estimating

More information

Statistics and Econometrics I

Statistics and Econometrics I Statistics and Econometrics I Random Variables Shiu-Sheng Chen Department of Economics National Taiwan University October 5, 2016 Shiu-Sheng Chen (NTU Econ) Statistics and Econometrics I October 5, 2016

More information

PROBABILITY AND RANDOM VARIABLES

PROBABILITY AND RANDOM VARIABLES CHAPTER PROBABILIT AND RANDOM VARIABLES. Introduction.. Basics of Probability.3.. Terminology.3.. Probability of an event.5.3 Random Variables.3.3. Distribution function.3.3. Probability density function.7.3.3

More information

P 1.5 X 4.5 / X 2 and (iii) The smallest value of n for

P 1.5 X 4.5 / X 2 and (iii) The smallest value of n for DHANALAKSHMI COLLEGE OF ENEINEERING, CHENNAI DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING MA645 PROBABILITY AND RANDOM PROCESS UNIT I : RANDOM VARIABLES PART B (6 MARKS). A random variable X

More information

Properties of the Autocorrelation Function

Properties of the Autocorrelation Function Properties of the Autocorrelation Function I The autocorrelation function of a (real-valued) random process satisfies the following properties: 1. R X (t, t) 0 2. R X (t, u) =R X (u, t) (symmetry) 3. R

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

CS 246 Review of Proof Techniques and Probability 01/14/19

CS 246 Review of Proof Techniques and Probability 01/14/19 Note: This document has been adapted from a similar review session for CS224W (Autumn 2018). It was originally compiled by Jessica Su, with minor edits by Jayadev Bhaskaran. 1 Proof techniques Here we

More information

ECE531: Principles of Detection and Estimation Course Introduction

ECE531: Principles of Detection and Estimation Course Introduction ECE531: Principles of Detection and Estimation Course Introduction D. Richard Brown III WPI 22-January-2009 WPI D. Richard Brown III 22-January-2009 1 / 37 Lecture 1 Major Topics 1. Web page. 2. Syllabus

More information

Stochastic Processes. A stochastic process is a function of two variables:

Stochastic Processes. A stochastic process is a function of two variables: Stochastic Processes Stochastic: from Greek stochastikos, proceeding by guesswork, literally, skillful in aiming. A stochastic process is simply a collection of random variables labelled by some parameter:

More information

Chapter 3 Single Random Variables and Probability Distributions (Part 1)

Chapter 3 Single Random Variables and Probability Distributions (Part 1) Chapter 3 Single Random Variables and Probability Distributions (Part 1) Contents What is a Random Variable? Probability Distribution Functions Cumulative Distribution Function Probability Density Function

More information

Introduction to Information Entropy Adapted from Papoulis (1991)

Introduction to Information Entropy Adapted from Papoulis (1991) Introduction to Information Entropy Adapted from Papoulis (1991) Federico Lombardo Papoulis, A., Probability, Random Variables and Stochastic Processes, 3rd edition, McGraw ill, 1991. 1 1. INTRODUCTION

More information