SIMG-716 Linear Imaging Mathematics I, Handout 05

1 1-D STOCHASTIC FUNCTIONS: NOISE

noise = function whose amplitude is derived from a random or stochastic process (i.e., not deterministic)

Deterministic: f at x is specified completely by some parameters (width, amplitude, etc.)

Stochastic: the function n at x is selected from a distribution that describes the probability of occurrence; only statistical averages of the signal amplitude are specified.

Example of a discrete stochastic process: the number of individual photons measured by a detector in a time interval Δt must be an integer; the numbers counted during disjoint intervals of equal length exhibit a degree of scatter about some mean value.

Example of a continuous stochastic process: the spatial distribution of photon arrivals [x_n, y_n] from an optical system.

Notation: n[x] instead of f[x]; "n" is used to indicate the noisy character of the function. There is potential confusion because n also specifies an integer coordinate, e.g., f[n].

Complex-valued noise is constructed by adding two random variables at each coordinate after weighting one by i:

n[x] = n_1[x] + i\, n_2[x]

1.1 PROBABILITY DISTRIBUTION

P_n = probability of the integer value n for a discrete distribution

p[n] = probability density of a real-valued amplitude n derived from a continuous distribution

The probability that the continuous variable n lies within a specific interval is the integral of p[n] over that interval, e.g.,

P(a \le n \le b) = \int_a^b p[n]\, dn
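A minimal numerical sketch (Python/NumPy; the Gaussian component distribution and the sample count are illustrative assumptions, not specified in the handout) of complex-valued noise n[x] = n_1[x] + i n_2[x], and of estimating P(a ≤ n ≤ b) as the fraction of samples falling in the interval:

```python
import numpy as np

rng = np.random.default_rng(0)
num_samples = 100_000                    # assumed sample count

n1 = rng.normal(0.0, 1.0, num_samples)   # real part n1[x] (assumed Gaussian)
n2 = rng.normal(0.0, 1.0, num_samples)   # imaginary part n2[x]
n = n1 + 1j * n2                         # complex-valued noise n[x] = n1[x] + i n2[x]

# P(a <= n1 <= b) estimated as the fraction of samples in [a, b],
# approximating the integral of p[n] over the interval
a, b = -1.0, 1.0
print(np.mean((n1 >= a) & (n1 <= b)))    # ~0.683 for a zero-mean, unit-variance Gaussian
```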

1.2 THE MOMENTS of a PROBABILITY DISTRIBUTION

The distribution of probability about the origin of coordinates is determined by the moments of p[n]. The k-th moment is the area of the product of p[n] and a weight factor n^k:

m_k\{p[n]\} \equiv \int_{-\infty}^{+\infty} n^k\, p[n]\, dn

The zeroth moment is evaluated by setting k = 0; it is the projection of the unit constant onto p[n], which must be the unit area of p[n]:

m_0\{p[n]\} = \int_{-\infty}^{+\infty} n^0\, p[n]\, dn = \int_{-\infty}^{+\infty} p[n]\, dn = \langle n^0 \rangle = 1

The first moment is the projection of n^1 onto p[n]; the weight factor amplifies contributions from large values of n. It is the mean value of the amplitude n selected from the probability distribution p[n]. Notations: ⟨n⟩, n̂, and μ.

m_1\{p[n]\} = \int_{-\infty}^{+\infty} n^1\, p[n]\, dn = \langle n^1 \rangle = \langle n \rangle

⟨n⟩ is the "center of mass" (balance point) of p[n]. (Strictly, the value of n that divides the area of p[n] into two equal parts is the median; it coincides with the mean for symmetric densities.)

The second moment is the projection of n^2 onto p[n]:

m_2\{p[n]\} = \int_{-\infty}^{+\infty} n^2\, p[n]\, dn = \langle n^2 \rangle

1.3 Central Moments

Expected values of (n − ⟨n⟩)^k measure the weighted variation of p[n] about the mean:

\mu_k\{p[n]\} \equiv \int_{-\infty}^{+\infty} (n - \langle n \rangle)^k\, p[n]\, dn

The first central moment is μ_1 = 0. The second central moment of p[n] is the variance σ², which measures the spread of the noise probability about the mean value:

\mu_2\{p[n]\} \equiv \int_{-\infty}^{+\infty} (n - \langle n \rangle)^2\, p[n]\, dn \equiv \sigma^2

Expand (n − ⟨n⟩)² and evaluate the three resulting integrals to obtain an important relationship among the mean, the variance, and ⟨n²⟩:

\sigma^2 = \langle n^2 \rangle - \langle n \rangle^2
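A quick numerical check of σ² = ⟨n²⟩ − ⟨n⟩² using sample moments (the distribution and its parameters are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = rng.normal(3.0, 2.0, 500_000)   # assumed: mean 3, sigma 2

m1 = np.mean(n)                     # first moment  <n>
m2 = np.mean(n**2)                  # second moment <n^2>
mu2 = np.mean((n - m1)**2)          # second central moment (variance)

print(mu2, m2 - m1**2)              # both ~4.0 = sigma^2
```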

1.4 DISCRETE PROBABILITY LAWS

Discrete probability laws model processes that have discrete (and often binary) outcomes; they are particularly useful when constructing models of such imaging processes as photon absorption by a sensor.

The simplest type of discrete probability law applies to events that have only two possible outcomes:

success or failure
true or false
on or off
head or tail

An individual realization of a binary event is a Bernoulli trial. Collective realizations of many such events are described by the binomial and Poisson distributions.

1.4.1 BERNOULLI TRIALS

Consider a flux of photons onto an imperfect absorbing material. Individual photons may be absorbed or not; testing the absorption of a photon is a Bernoulli trial. A successful absorption is indicated by 1, and a failure by 0.

The statistics of a string of Bernoulli trials are specified by the probability of success (outcome 1), denoted by p, with 0 ≤ p ≤ 1. The probability of failure (outcome 0) is 1 − p, often denoted by q.

The relative probabilities for a particular absorber are determined from a physical model of the interaction or from the observed results of a large number of Bernoulli trials. Images of independent Bernoulli trials for different values of p are shown in the figure.

[Figure: Examples of 64 Bernoulli trials: (a) p = 0.2, (b) p = 0.5.]
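A minimal simulation sketch of 64 independent Bernoulli trials, mirroring the figure (the 8×8 layout and random seed are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
p = 0.2                                         # probability of success (outcome 1)
trials = (rng.random((8, 8)) < p).astype(int)   # 64 independent Bernoulli trials

print(trials)          # binary "image" of successes (1) and failures (0)
print(trials.mean())   # observed fraction of successes, ~p
```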

1.4.2 MULTIPLE BERNOULLI TRIALS: THE BINOMIAL PROBABILITY LAW

Consider N Bernoulli trials where the probability of outcome 1 is p and the probability of outcome 0 is q = 1 − p. The number of possible distinct outcomes is 2^N, and the relative likelihood of the outcomes is determined by p.

The probability of the specific sequence 1010⋯10 (alternating 1 and 0, assuming N is even) is

p(1010 \cdots 10) = p \cdot q \cdot p \cdot q \cdots p \cdot q = p^{N/2} q^{N/2}

with equal numbers of successes and failures. The probability of a different sequence where the first N/2 outcomes are 1 and the remaining N/2 outcomes are 0 is the same:

p(11 \cdots 1\, 00 \cdots 0) = p \cdot p \cdots p \cdot q \cdot q \cdots q = p^{N/2} q^{N/2}

The two distinct outcomes have the same number of successes and failures, and therefore have identical histograms.

Now consider an outcome of N trials with exactly n successes. The probability of a specific such sequence is

p_1 p_2 p_3 \cdots p_n \cdot q_1 q_2 q_3 \cdots q_{N-n} = p^n q^{N-n}

In many applications the order of arrangement is not significant; only the total number of successes n matters. Compute the number of possible combinations of n successes in N trials; it is straightforward to show that this number is the binomial coefficient

\binom{N}{n} = \frac{N!}{(N-n)!\, n!}

so the probability of n successes in N trials is

P_n = \frac{N!}{n!\,(N-n)!}\, p^n (1-p)^{N-n} = \binom{N}{n} p^n (1-p)^{N-n}

Consider a coin flip with two outcomes, H and T. For N = 4, the number of combinations with n = 2 heads is 4!/(2!\,2!) = 6:

HHTT, HTHT, THHT, THTH, HTTH, and TTHH

If H and T are equally likely (p = q = 0.5), the probability of two heads in four flips is P_2 = 6\,(0.5)^2 (0.5)^2 = 0.375.

The number of realizations of no heads in four flips is 4!/(0!\,4!) = 1, so the probability of no heads in four flips is P_0 = 1 \cdot (0.5)^0 (0.5)^4 = 0.0625.

The binomial law also applies to cases where p ≠ q. Flipping an unfair coin with p = 0.75, the probability that four flips would produce two heads is P_2 = 6\,(0.75)^2 (0.25)^2 ≈ 0.211, which is less than the probability (0.375) of two heads in four flips of a fair coin.
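These coin-flip values can be reproduced directly from the binomial law; a sketch using only the Python standard library (the helper binomial_pmf is ours, not part of the handout):

```python
from math import comb

def binomial_pmf(n, N, p):
    """Probability of exactly n successes in N Bernoulli trials."""
    return comb(N, n) * p**n * (1 - p)**(N - n)

print(binomial_pmf(2, 4, 0.5))    # 0.375   (two heads in four flips, fair coin)
print(binomial_pmf(0, 4, 0.5))    # 0.0625  (no heads in four flips, fair coin)
print(binomial_pmf(2, 4, 0.75))   # ~0.211  (two heads in four flips, p = 0.75)
```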

The mean and variance of the number of successes n, each with individual probability p, in N trials are obtained by substitution:

\langle n \rangle = Np, \qquad \sigma^2 = Np(1-p)

The shape of the histogram is approximately Gaussian for large N. The Gaussian distribution is continuous while the binomial distribution is discrete; this observation suggests that samples obtained from a large number N of independent Bernoulli trials with probability p may be approximately generated by thresholding values generated by a Gaussian distribution.
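A sketch verifying ⟨n⟩ = Np and σ² = Np(1 − p) by direct simulation (the number of realizations is an assumption, chosen to match the following figure):

```python
import numpy as np

rng = np.random.default_rng(3)
N, p = 100, 0.5
n = rng.binomial(N, p, 8192)      # 8192 realizations of N Bernoulli trials each

print(n.mean(), N * p)            # ~50 = Np
print(n.var(), N * p * (1 - p))   # ~25 = Np(1 - p); histogram of n is nearly Gaussian
```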

[Figure: Realizations of Bernoulli trials at 8192 samples with p = 0.5 and the resulting histograms: (a) N = 1 trial per sample, two (nearly) equally likely outcomes; (b) N = 2; (c) N = 10; and (d) N = 100. The histogram approaches a Gaussian function for large N.]

1.4.3 POISSON PROBABILITY LAW

The Poisson law is an approximation to the binomial law for large numbers of rarely occurring events, i.e., N ≫ 1 and p → 0. The mean number of events is ⟨n⟩ = Np, denoted by λ.

The form of the Poisson law is obtained by substituting p = λ/N into the binomial law in the limit N → ∞:

p_n = \lim_{N \to \infty} \left\{ \binom{N}{n} \left( \frac{\lambda}{N} \right)^n \left( 1 - \frac{\lambda}{N} \right)^{N-n} \right\}

Take the natural logarithm of both sides to obtain:

\log_e [p_n] = \lim_{N \to \infty} \log_e \left[ \binom{N}{n} \left( \frac{\lambda}{N} \right)^n \right] + \lim_{N \to \infty} (N - n)\, \log_e \left( 1 - \frac{\lambda}{N} \right)
            = \lim_{N \to \infty} \log_e \left[ \frac{N!}{(N-n)!\, n!} \left( \frac{\lambda}{N} \right)^n \right] + \lim_{N \to \infty} \frac{\log_e \left( 1 - \lambda/N \right)}{(N-n)^{-1}}

Use the fact that n is small compared to N to evaluate the first additive term:

\lim_{N \to \infty} \log_e \left[ \frac{N(N-1)(N-2) \cdots (N-n+1)}{n!} \left( \frac{\lambda}{N} \right)^n \right] \simeq \log_e \left[ \frac{\lambda^n}{n!} \right]

The second term is evaluated by recognizing it as the ratio of two terms that both approach zero in the limit; apply l'Hôpital's rule:

\lim_{N \to \infty} \frac{\log_e (1 - \lambda/N)}{(N-n)^{-1}} = \lim_{N \to \infty} \frac{\frac{d}{dN} \log_e (1 - \lambda/N)}{\frac{d}{dN} (N-n)^{-1}} = \lim_{N \to \infty} \frac{\lambda/N^2}{1 - \lambda/N} \cdot \frac{(N-n)^2}{-1} = -\lambda

Collect the terms:

\log_e [p_n] = \log_e \left[ \frac{\lambda^n}{n!}\, e^{-\lambda} \right] \implies p_n = \frac{\lambda^n}{n!}\, e^{-\lambda}

The Poisson distribution is thus a particular limiting case of the binomial distribution. The mean, variance, and third central moment of the Poisson distribution are identically λ.
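A numerical check of the Poisson limit (the values of λ and N are illustrative assumptions): for N ≫ 1 and p = λ/N, the binomial probabilities approach λ^n e^{−λ}/n!:

```python
from math import comb, exp, factorial

lam = 5.0
N = 10_000
p = lam / N                       # rare events: large N, small p

for n in range(8):
    binom = comb(N, n) * p**n * (1 - p)**(N - n)
    poisson = lam**n / factorial(n) * exp(-lam)
    print(n, binom, poisson)      # the two columns agree closely
```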

[Figure: Comparison of binomial and Poisson random variables, N = 100: (a) binomial, p = 0.75, ⟨n⟩ = 75.05, σ² = 18.68; (b) Poisson, λ = 75, ⟨n⟩ = 74.86, σ² = 74.05; (c) binomial, p = 0.25, ⟨n⟩ = 24.93, σ² = 18.77; (d) Poisson, λ = 25, ⟨n⟩ = 25.01, σ² = 24.85; (e) binomial, p = 0.05, ⟨n⟩ = 5.00, σ² = 4.71; (f) Poisson, λ = 5, ⟨n⟩ = 4.97, σ² = 4.97.]

1.5 CONTINUOUS PROBABILITY DISTRIBUTIONS

1.5.1 UNIFORM DISTRIBUTION

The uniform distribution generates the most intuitive type of noise: the amplitude n is equally likely to occur within any finite interval of equal size inside the allowed range.

p_{Uniform}[n] = \frac{1}{b}\, RECT\left[ \frac{n - \langle n \rangle}{b} \right]

b is the width of the interval of allowed values of n, and ⟨n⟩ is the mean value. The multiplicative scale factor b^{-1} ensures unit area. The variance of the uniform distribution is

\sigma^2 = \frac{b^2}{12}

[Figure: Uniformly distributed random variable on the interval [0, 1) with μ = 0.5 and σ² = 1/12: (a) sample, (b) histogram.]
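A sketch of uniformly distributed noise on [0, 1) (so b = 1), checking ⟨n⟩ = 0.5 and σ² = 1/12 ≈ 0.0833:

```python
import numpy as np

rng = np.random.default_rng(4)
n = rng.random(100_000)       # uniform on [0, 1), so b = 1, <n> = 0.5

print(n.mean(), n.var())      # ~0.5 and ~1/12 = 0.0833
```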

1.5.2 NORMAL DISTRIBUTION

The familiar symmetric "bell curve" of probability, and the most widely applicable of all probability laws. ⟨n⟩ is the most likely amplitude (the peak of the probability density), and the probability that the amplitude will differ from the mean progressively decreases as the value moves away from ⟨n⟩. The probability density function is a Gaussian function with a width parameter b proportional to the standard deviation σ of the probability distribution:

p_{Normal}[n] = \frac{1}{\sqrt{2\pi}\, \sigma}\, e^{-\frac{(n - \langle n \rangle)^2}{2\sigma^2}}

The leading factor (\sqrt{2\pi}\, \sigma)^{-1} ensures that the area of the probability density function is unity.

[Figure: Samples of a random variable generated by a normal distribution with ⟨n⟩ = 0, σ² = 1: (a) samples, (b) histogram.]

Central-Limit Theorem: a cascade of stochastic processes derived from a (nearly) arbitrary set of probability density functions generates a normal distribution. The central-limit theorem ensures that the probability law of the outputs is generally Gaussian, to a good approximation; a numerical sketch follows below.
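A sketch of the central-limit theorem (the uniform component distribution and the sizes K and M are illustrative assumptions): a sum of K independent uniform variables has mean K/2, variance K/12, and a nearly Gaussian histogram:

```python
import numpy as np

rng = np.random.default_rng(5)
K, M = 50, 100_000
s = rng.random((M, K)).sum(axis=1)   # each sample is a sum of K uniform variables

print(s.mean(), K / 2)               # ~25
print(s.var(), K / 12)               # ~4.17; histogram of s is nearly Gaussian
```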

1.5.3 RAYLEIGH DISTRIBUTION

The Rayleigh distribution arises in imaging applications that involve Fourier transforms of distributions of complex-valued random variables, e.g., the description of Fraunhofer diffraction from a random scatterer and computer-generated holography. It is the distribution of the magnitude where the real and imaginary parts are random variables selected from the same Gaussian distribution.

The probability density function is characterized by a single parameter a:

p_{Rayleigh}[n] = \frac{n}{a^2}\, e^{-\frac{n^2}{2a^2}}\, STEP[n]

The STEP function ensures that the allowed amplitudes n are nonnegative. The mean ⟨n⟩ and variance σ² of the Rayleigh distribution are functions of the parameter a:

\langle n \rangle = \sqrt{\frac{\pi}{2}}\, a \simeq 1.25\, a

\sigma^2 = \left( 2 - \frac{\pi}{2} \right) a^2 \simeq 0.429\, a^2, \qquad \sigma \simeq 0.655\, a

[Figure: Rayleigh-distributed random variable generated from Gaussian-distributed random variables in quadrature, each with ⟨n⟩ = 0, σ² = 1: (a) sample, (b) histogram. The resulting sample mean and variance are ⟨n⟩ = 1.270, σ² = 0.424.]
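A sketch mirroring the figure (a = 1 assumed): a Rayleigh-distributed magnitude generated from two Gaussian-distributed random variables in quadrature:

```python
import numpy as np

rng = np.random.default_rng(6)
a = 1.0
n1 = rng.normal(0.0, a, 100_000)   # real part: <n> = 0, sigma = a
n2 = rng.normal(0.0, a, 100_000)   # imaginary part: same distribution
n = np.abs(n1 + 1j * n2)           # Rayleigh-distributed magnitude

print(n.mean(), np.sqrt(np.pi / 2) * a)    # ~1.25 a
print(n.var(), (2 - np.pi / 2) * a**2)     # ~0.429 a^2
```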