NOISE-BENEFIT FORBIDDEN INTERVAL THEOREMS FOR THRESHOLD SIGNAL DETECTORS BASED ON CROSS CORRELATION


Sanya Mitaim and Bart Kosko

Department of Electrical and Computer Engineering, Faculty of Engineering, Thammasat University, Pathumthani, Thailand
Department of Electrical Engineering, Signal and Image Processing Institute, University of Southern California, Los Angeles, California, USA

ABSTRACT

We show that the main forbidden interval theorems of stochastic resonance hold for a correlation performance measure. Earlier theorems held only for performance measures based on mutual information or the probability of detection error. Forbidden interval theorems ensure that a threshold signal detector benefits from deliberately added noise if the average noise does not lie in an interval that depends on the threshold value. We first show that this result also holds for all finite-variance noise and for all forms of infinite-variance stable noise. A second forbidden-interval theorem gives necessary and sufficient conditions for a local noise benefit in a bipolar signal system when the noise comes from a location-scale family. A third theorem gives a general condition for a local noise benefit for arbitrary signals with finite second moments and for location-scale noise. This result also extends forbidden intervals to forbidden bands of parameters. A fourth theorem gives necessary and sufficient conditions for a local noise benefit when both the independent signal and noise are normal. A final theorem derives necessary and sufficient conditions for forbidden bands when using arrays of threshold detectors for arbitrary signals and location-scale noise.

Keywords: cross correlation, forbidden interval theorem, noise benefit, stable noise, stochastic resonance, threshold systems.

I. FORBIDDEN INTERVAL THEOREMS FOR CROSS CORRELATION

We show for the first time that forbidden interval theorems (FITs) hold for a cross-correlation performance measure.
These correlation noise benefits also extend to arrays of threshold neurons. Earlier results found similar noise benefits only for mutual information [1], [2], [3], [4], [5], [6] or probability of error [7]. We have found experimental evidence of both correlation-based and bit-error-based noise benefits for carbon nanotube detectors [8], [9] when testing a FIT prediction of a mutual-entropy-based noise benefit in a threshold detector. Other SR results have used a correlation performance measure to demonstrate a noise benefit [10], [11], [12] but not as a FIT. Moskowitz has recently extended FITs to algebraic information theory [13]. A forbidden interval theorem states a sufficient or necessary condition for a nonlinear signal system to benefit from added noise so long as the average noise does not fall in an interval of parameter values. A FIT acts as a type of screening device for a nonlinear system because it tells the user whether the system can have a noise benefit at all. This screening effect also holds for the related necessary and sufficient inequalities that ensure a noise benefit based on maximum-likelihood or Neyman-Pearson signal detection [14]. Adaptive algorithms or other schemes can then find the optimal noise level in systems that possess noise benefits. The first FITs applied only to nonlinear signal systems that used mutual information as the performance measure [1], [3]. Corollary 1 in [7] was a necessary-condition FIT based not on mutual information but on the probability of detection error. All FITs give rigorous conditions for a noise benefit or so-called stochastic resonance or SR [1], [2], [3], [7], [10], [14], [15], [16], [17], [18], [19], [20], [21], [22], [23]. This paper extends correlation-based SR to threshold systems and threshold arrays that obey quantitative FITs. These threshold systems model threshold-like behavior in a wide range of physical and biological systems [8], [24], [25], [26], [27], [28].
Theorem 1 below gives a direct correlation FIT dual of our earlier mutual-information FIT for threshold signals. It holds for all possible additive noise that has a finite variance. It further holds for all infinite-variance noise from the general stable family of probability density functions that includes Cauchy and Gaussian noise as special cases. Non-Gaussian stable noise does not have a mean but it does have a location parameter

that acts like the mean and that equals the median if the stable noise is symmetric. The FIT holds in the stable case for noise whose location parameter does not lie in the forbidden interval. This first correlation FIT produces total SR in the sense that added noise achieves the correlation maximum for the threshold system. Figure 1 shows a simulation instance of Theorem 1. We converted the binary yin-yang image to a bipolar image with amplitude A and used it as input to the threshold system (1). The threshold θ is 1 while the bipolar signal amplitude A is 0.7. This gives the FIT of (θ − A, θ + A) = (0.3, 1.7). The uniform noise mean µ_N is 0. So there is a noise benefit because µ_N = 0 ∉ (0.3, 1.7). The second panel uses uniform noise with mean µ_N = 1. So there is no noise benefit because the noise mean falls in the FIT (0.3, 1.7). Theorem 2 gives a new type of correlation FIT for partial SR or a local noise benefit as in [7], [3], [4]. The FIT gives necessary and sufficient conditions for a positive correlation derivative with respect to the standard deviation σ_N of the added noise N: ∂C/∂σ_N > 0. The signal system is a more complex bipolar signal system but the FIT applies only to noise that comes from a location-scale family. Location-scale family noise includes many common types of noise such as uniform, Gaussian, and alpha-stable noise. It does not include Poisson noise. The next section describes the stochastic threshold signal system and the basic properties of the cross-correlation performance measure. Section III states and proves the correlation FIT of Theorem 1 for finite-variance noise and for infinite-variance stable noise. Section IV sets up and proves the local correlation FIT of Theorem 2 for location-scale noise and bipolar signals. The final sections extend the correlation FITs to different noise and signal types and to arrays of detectors. II.
CROSS CORRELATION IN A STOCHASTIC THRESHOLD DETECTOR WITH SUBTHRESHOLD BIPOLAR INPUT SIGNALS

This section describes the stochastic threshold signal system and introduces the performance measure of cross correlation. Consider the standard discrete-time threshold signal detector [], [3], [], [5], [7], [8], [], [9], [3]

Y = sgn(S + N − θ): Y = 1 if S + N ≥ θ and Y = −1 if S + N < θ. (1)

Here θ > 0 is the system's threshold. We assume that the additive noise N is white with the probability density function (pdf) f_N(n). This white-noise assumption does not limit the FIT analysis. The same system analysis applies to other types of noise with correlation functions R_N(τ) because these threshold systems are feedforward systems and thus have no dynamics. We consider bipolar input Bernoulli signal S with arbitrary success probability p such that 0 < p < 1 and with amplitude A > 0. Thus the signal's pdf has the form

f_S(s) = p δ(s + A) + (1 − p) δ(s − A) (2)

where δ denotes the Dirac delta function. We assume subthreshold input signals for this threshold system: A < θ. The output Y of the threshold system exactly matches the input signal S when A > θ and so there is no noise benefit. Let the symbol 1 denote the input signal S = A and output signal Y = 1. The symbol −1 denotes the input signal S = −A and output signal Y = −1. Then the conditional probabilities P_{Y|S}(y|s) are

P_{Y|S}(−1|−1) = Pr{S + N < θ | S = −A} (3)
= Pr{N < θ + A} (4)
= ∫_{−∞}^{θ+A} f_N(n) dn = F_N(θ + A) (5)
P_{Y|S}(1|−1) = 1 − P_{Y|S}(−1|−1) = 1 − F_N(θ + A) (6)
P_{Y|S}(−1|1) = Pr{S + N < θ | S = A} (7)
= ∫_{−∞}^{θ−A} f_N(n) dn = F_N(θ − A) (8)
P_{Y|S}(1|1) = 1 − P_{Y|S}(−1|1) = 1 − F_N(θ − A) (9)

where F_N(z) is the cumulative distribution function (cdf) of N. The marginal density P_Y is

P_Y(y) = Σ_s P_{Y|S}(y|s) P_S(s) (10)

by the theorem on total probability. Other researchers have derived the conditional probabilities P_{Y|S}(y|s) of the threshold system with Gaussian noise with bipolar inputs [5] and Gaussian inputs [3]. We neither restrict the noise density to be Gaussian nor require that the density have finite variance even if the density has a bell-curve shape. We use cross correlation to measure the noise benefit or the SR effect.
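The conditional probabilities above are easy to verify by simulation. Below is a minimal sketch (not from the paper) that simulates the threshold detector (1) with Gaussian noise and compares the empirical conditionals against the closed forms (5) and (8); all parameter values are illustrative.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Illustrative parameter values (not from the paper's figures).
theta, A, p = 1.0, 0.7, 0.5      # threshold, amplitude, p = P(S = -A)
mu_N, sigma_N = 0.0, 0.5         # Gaussian noise mean and standard deviation
n = 400_000

def F_N(z):                      # Gaussian noise cdf F_N(z)
    return 0.5 * (1.0 + erf((z - mu_N) / (sigma_N * sqrt(2.0))))

S = np.where(rng.random(n) < p, -A, A)        # bipolar Bernoulli signal (2)
N = rng.normal(mu_N, sigma_N, n)              # white noise
Y = np.where(S + N >= theta, 1.0, -1.0)       # threshold detector (1)

# Empirical conditionals vs. the closed forms (5) and (8):
P_m1_given_m1 = np.mean(Y[S == -A] == -1)     # should approach F_N(theta + A)
P_m1_given_p1 = np.mean(Y[S == A] == -1)      # should approach F_N(theta - A)
```

The empirical frequencies converge to F_N(θ + A) and F_N(θ − A) at the usual Monte Carlo rate.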
The cross correlation of the input random variable S and output random variable Y is the expectation of the product of the two variables S and Y:

Fig. 1. Forbidden-interval noise benefit in a subthreshold signal system (1) that uses the bipolar yin-yang image as its input. The threshold is θ = 1. The bipolar signal has amplitude A = 0.7. Top panels: The noise is uniform with mean µ_N = 0. So the mean does not lie in the forbidden interval (θ − A, θ + A) = (0.3, 1.7). Thus the correlation system benefits from added noise and the correlation curve shows the signature nonmonotonic inverted-U of a global SR noise benefit. The optimal noise is approximately σ* = 0.98. Bottom panels: The noise is uniform with mean µ_N = 1. So the mean lies in the forbidden interval (θ − A, θ + A) = (0.3, 1.7). Thus there is no noise benefit.

C = E[SY] = Σ_{s∈S} Σ_{y∈Y} s y P_{SY}(s, y) (11)
= Σ_{s∈S} Σ_{y∈Y} s y P_{Y|S}(y|s) P_S(s). (12)

We can also use the respective cross covariance

K = E[(S − µ_S)(Y − µ_Y)] = C − µ_S µ_Y (13)

where µ_S = E[S] = Σ_{s∈S} s P_S(s) and µ_Y = E[Y] = Σ_{y∈Y} y P_Y(y). The cross correlation C and cross covariance K can be any real number. We next prove a lemma that allows a direct proof of Theorem 1. The lemma states that the signal S and output Y always have a lower-bounded correlation: C ≥ µ_S µ_Y. S and Y are uncorrelated if and only if C = µ_S µ_Y. And independence implies uncorrelatedness. Then Theorem 1 states that the noise mean µ_N = E[N] does not lie in the forbidden subthreshold interval (θ − A, θ + A) if and only if S and Y are asymptotically independent (and thus C → µ_S µ_Y) as the noise standard deviation σ_N → 0 for finite-variance noise (or as the noise dispersion γ → 0 for alpha-stable noise with infinite variance). The lemma further shows that C > µ_S µ_Y when the forbidden interval (θ − A, θ + A) has positive noise probability. The case C = µ_S µ_Y holds just when the

forbidden interval has zero noise probability. So the idea behind the proof of Theorem 1 is that what goes down must go up [1], [2]: Increasing the noise variance or dispersion must necessarily increase the correlation C at some point. That increase is precisely an SR noise benefit. So the entire proof of Theorem 1 rests on showing that the forbidden interval has asymptotically zero noise probability as the noise variance or dispersion shrinks to zero.

Lemma. The threshold system (1) has an input-output cross correlation C that is at least µ_S µ_Y: C ≥ µ_S µ_Y. Further: C > µ_S µ_Y if λ > 0 and C = µ_S µ_Y if λ = 0 where λ is the noise probability of the forbidden interval (θ − A, θ + A):

λ = ∫_{θ−A}^{θ+A} f_N(n) dn. (14)

Proof. The cross correlation has the form

C = Σ_{s∈S} Σ_{y∈Y} s y P_{Y|S}(y|s) P_S(s). (15)

Note that C = µ_S µ_Y if S and Y are uncorrelated where

µ_S µ_Y = Σ_{s∈S} Σ_{y∈Y} s y P_S(s) P_Y(y). (16)

Thus (5)-(9) imply that

P_{Y|S}(−1|−1) − P_{Y|S}(−1|1) = ∫_{θ−A}^{θ+A} f_N(n) dn = λ (17)
P_{Y|S}(1|1) − P_{Y|S}(1|−1) = ∫_{θ−A}^{θ+A} f_N(n) dn = λ. (18)

The two-symbol alphabet set S and the theorem on total probability give

P_Y(y) = Σ_s P_{Y|S}(y|s) P_S(s) (19)
= P_{Y|S}(y|−1) P_S(−1) + P_{Y|S}(y|1) P_S(1) (20)
= P_{Y|S}(y|−1) P_S(−1) + P_{Y|S}(y|1)(1 − P_S(−1)) (21)
= (P_{Y|S}(y|−1) − P_{Y|S}(y|1)) P_S(−1) + P_{Y|S}(y|1) (22)
= P_{Y|S}(y|1)(1 − P_S(−1)) + P_{Y|S}(y|−1) P_S(−1) (23)
= P_{Y|S}(y|1) − (P_{Y|S}(y|1) − P_{Y|S}(y|−1)) P_S(−1). (24)

It follows from (17)-(18) that

P_Y(−1) = P_{Y|S}(−1|1) + λ P_S(−1) ≥ P_{Y|S}(−1|1) (25)
P_Y(−1) = P_{Y|S}(−1|−1) − λ P_S(1) ≤ P_{Y|S}(−1|−1). (26)

We have similarly that

P_Y(1) = P_{Y|S}(1|−1) + λ P_S(1) ≥ P_{Y|S}(1|−1) (27)
P_Y(1) = P_{Y|S}(1|1) − λ P_S(−1) ≤ P_{Y|S}(1|1). (28)

Thus

P_{Y|S}(y|s) ≥ P_Y(y) if sy = A (29)
P_{Y|S}(y|s) ≤ P_Y(y) if sy = −A (30)

since either y = 1 or y = −1. Then (29)-(30) imply that

s y P_S(s) P_{Y|S}(y|s) ≥ s y P_S(s) P_Y(y) (31)

for all s ∈ {−A, A} and y ∈ {−1, 1}. So summing gives C ≥ µ_S µ_Y. Suppose last that the noise pdf f_N(n) has nonzero measure in (θ − A, θ + A). Then λ > 0 and the inequalities (25)-(28) become strict inequalities:

P_Y(−1) > P_{Y|S}(−1|1) (32)
P_Y(−1) < P_{Y|S}(−1|−1) (33)
P_Y(1) > P_{Y|S}(1|−1) (34)
P_Y(1) < P_{Y|S}(1|1). (35)

Then

C > µ_S µ_Y and K > 0. (36)

Note that λ = 0 implies that f_N(n) has zero mass in the interval (θ − A, θ + A). Then

C = µ_S µ_Y and K = 0. (37)

Q.E.D.

III.
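The lemma can be checked numerically from the closed-form conditionals (5)-(9). The sketch below (illustrative values, uniform noise; not from the paper) computes C, µ_S µ_Y, and λ and confirms the strict inequality when λ > 0 and equality when λ = 0.

```python
import numpy as np

# Illustrative values: theta = 1, A = 0.7, so the forbidden interval is (0.3, 1.7).
theta, A, p = 1.0, 0.7, 0.4

def F_N(x, mu, sigma):
    """cdf of uniform noise with mean mu and standard deviation sigma"""
    a, b = mu - np.sqrt(3.0)*sigma, mu + np.sqrt(3.0)*sigma
    return float(np.clip((x - a) / (b - a), 0.0, 1.0))

def lemma_quantities(mu, sigma):
    """return (C, muS*muY, lambda) for the threshold system (1)"""
    lam = F_N(theta + A, mu, sigma) - F_N(theta - A, mu, sigma)  # lambda in (14)
    PS = {-A: p, A: 1.0 - p}
    PYS = {(-1, -A): F_N(theta + A, mu, sigma),                  # (5)
           (-1,  A): F_N(theta - A, mu, sigma)}                  # (8)
    for s in (-A, A):
        PYS[(1, s)] = 1.0 - PYS[(-1, s)]
    C = sum(s*y*PYS[(y, s)]*PS[s] for s in PS for y in (-1, 1))
    muS = sum(s*PS[s] for s in PS)
    muY = sum(y*sum(PYS[(y, s)]*PS[s] for s in PS) for y in (-1, 1))
    return C, muS*muY, lam

C1, m1, lam1 = lemma_quantities(0.0, 1.0)   # interval has positive noise mass
C2, m2, lam2 = lemma_quantities(3.0, 0.1)   # interval has zero noise mass
```

With positive mass in the interval the correlation strictly exceeds µ_S µ_Y; with zero mass the two coincide, exactly as the lemma states.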
THE FIRST CORRELATION FORBIDDEN INTERVAL THEOREM: THRESHOLD SYSTEMS WITH BIPOLAR SUBTHRESHOLD SIGNALS

This section states and proves the first necessary and sufficient condition for SR based on correlation in a threshold system. The theorem holds for all noise with finite second moments and all alpha-stable impulsive noise. Stable noise has infinite variance if α < 2 but such noise still has finite lower-order moments up to order α if α < 2. Gaussian noise is stable with α = 2. Cauchy noise is stable with α = 1. A general alpha-stable pdf f has characteristic function or Fourier transform φ [32], [33], [34], [35], [36]:

φ(ω) = exp{i a ω − γ |ω|^α (1 + i β sgn(ω) Γ)} (38)

where

Γ = tan(απ/2) for α ≠ 1 and Γ = (2/π) ln |ω| for α = 1 (39)

and i = √−1, 0 < α ≤ 2, −1 ≤ β ≤ 1, and γ > 0. The parameter α is the characteristic exponent. Again the variance of an alpha-stable density does not exist if

α < 2. The location parameter a acts like the mean of the density when α > 1. β is a skewness parameter. The density is symmetric about a when β = 0. Then α controls the tail thickness. The bell curve has thicker tails as α falls. The theorem below still holds even when β ≠ 0. The dispersion parameter γ = κ^α acts like a variance because it controls the width of a symmetric alpha-stable bell curve where κ is the scale parameter. There are few known closed forms of the α-stable densities for symmetric bell curves (when β = 0) [34]. Numerical integration of φ gives the probability densities f(n). Figure 2 gives examples of alpha-stable pdfs and their white-noise realizations. Figure 2(c) shows that non-Gaussian bell curves can have infinite variance and yet have a finite dispersion. Below we state the first noise-benefit theorem for the threshold system (1) for any finite-variance noise and alpha-stable noise. Note that when the noise standard deviation σ_N shrinks to zero (or dispersion γ → 0) then the pdf f_N(n) → δ(n − µ_N) where µ_N is the noise mean (or f_N(n) → δ(n − a) for alpha-stable noise with location a) for delta pulse δ. The lemma allows the proof of this noise-benefit theorem to turn on showing that S and Y are asymptotically independent (and thus asymptotically uncorrelated) as the noise probability of the forbidden interval shrinks to zero. Figure 1 illustrates the sufficient and necessary conditions in Theorem 1 below for uniform noise added to a binary yin-yang image. The first panel shows sufficiency: There is a correlation noise benefit because the noise mean of 0 does not fall in the forbidden interval (0.3, 1.7). The second panel shows necessity: There is no correlation noise benefit because the lower-intensity noise has mean 1 and thus the mean falls in the forbidden interval (0.3, 1.7). The theorem's necessity result shows that the correlation maximum occurs when there is no noise.
It does not rule out possible local noise fluctuations where a local increase in the noise intensity can produce a local increase in correlation.

Theorem 1: Suppose that the threshold signal system (1) has noise pdf f_N(n) and that the input signal S is subthreshold (A < θ). Suppose that the noise mean µ_N = E[N] does not lie in the signal-threshold interval (θ − A, θ + A) if N has finite variance. Suppose that a ∉ (θ − A, θ + A) for the location parameter a of an alpha-stable noise density with characteristic function (38). Then the threshold system (1) exhibits the nonmonotone SR effect in the sense that C → µ_S µ_Y as σ_N → 0 or γ → 0. Conversely: There is no noise benefit in C if µ_N ∈ (θ − A, θ + A) or a ∈ (θ − A, θ + A).

Proof. Sufficiency. Assume 0 < P_S(s) < 1 to avoid triviality when P_S(s) = 0 or 1. The lemma implies that we need show only that S and Y are asymptotically independent and so C → µ_S µ_Y as σ_N → 0 (or as γ → 0). So we need to show only that P_{SY}(s, y) → P_S(s) P_Y(y) or P_{Y|S}(y|s) → P_Y(y) as σ_N → 0 (or as γ → 0) for all signal symbols s ∈ S and y ∈ Y. Thus the result follows (similar to the proofs in [1], [2] for mutual information) if we can show that

λ = ∫_{θ−A}^{θ+A} f_N(n) dn → 0 as σ_N → 0 or γ → 0. (40)

Case 1: Finite-variance noise. Let the mean of the noise be µ_N = E[N] and the variance be σ_N² = E[(N − µ_N)²]. Then µ_N ∉ (θ − A, θ + A) by hypothesis. Now suppose that µ_N < θ − A. Pick ε = (θ − A − µ_N)/2 > 0. So θ − A − ε = θ − A − ε + µ_N − µ_N = µ_N + (θ − A − µ_N) − ε = µ_N + 2ε − ε = µ_N + ε. Then

λ = ∫_{θ−A}^{θ+A} f_N(n) dn (41)
≤ ∫_{θ−A}^{∞} f_N(n) dn (42)
≤ ∫_{θ−A−ε}^{∞} f_N(n) dn (43)
= ∫_{µ_N+ε}^{∞} f_N(n) dn (44)
= Pr{N ≥ µ_N + ε} (45)
≤ Pr{|N − µ_N| ≥ ε} (46)
≤ σ_N²/ε² by Chebyshev's inequality (47)
→ 0 as σ_N → 0. (48)

Suppose next that µ_N > θ + A. Then pick ε = (µ_N − θ − A)/2 > 0 and so θ + A + ε = θ + A + ε + µ_N − µ_N = µ_N − (µ_N − θ − A) + ε = µ_N − 2ε + ε = µ_N − ε. Then

λ ≤ ∫_{−∞}^{θ+A} f_N(n) dn (49)
≤ ∫_{−∞}^{θ+A+ε} f_N(n) dn (50)
= ∫_{−∞}^{µ_N−ε} f_N(n) dn (51)
= Pr{N ≤ µ_N − ε} (52)
≤ Pr{|N − µ_N| ≥ ε} (53)
≤ σ_N²/ε² by Chebyshev's inequality (54)
→ 0 as σ_N → 0. (55)

Fig. 2. Samples of symmetric alpha-stable probability densities and their white-noise realizations. (a) Standard symmetric alpha-stable density functions with zero location (a = 0) and unit dispersion (γ = 1) for α = 2, 1.7, 1.5, and 1. The densities are bell curves that have thicker tails as α decreases and thus that model increasingly impulsive noise as α decreases. The case α = 2 gives a Gaussian density with variance two (or unit dispersion). The parameter α = 1 gives the Cauchy density with infinite variance. (b) White-noise samples of alpha-stable random variables n_t with zero location and unit dispersion. The plots show realizations when α = 2, 1.7, 1.5, and 1. Note the scale differences on the y axes. The alpha-stable variable becomes more impulsive as the parameter α falls and thus as the bell curves get thicker. The algorithm in [37], [38] generated these realizations. (c) Alpha-stable density functions for α = 1.7 with dispersions γ = 0.5, 1, and 2. (d) Samples of finite and infinite alpha-stable noise n_t for α = 1.7 with finite dispersions γ = 0.5, 1, and 2.

Case 2: Alpha-stable noise. The characteristic function φ(ω) of the alpha-stable density f_N(n) has the exponential form (38). This reduces to a simple complex exponential in the zero-dispersion limit:

lim_{γ→0} φ(ω) = exp{i a ω} (56)

for each α, each skewness β, and each location a. So Fourier transformation gives the corresponding density function in the limiting case (γ → 0) as a translated delta function:

lim_{γ→0} f_N(n) = δ(n − a). (57)

Then

λ = ∫_{θ−A}^{θ+A} f_N(n) dn (58)
→ ∫_{θ−A}^{θ+A} δ(n − a) dn (59)
= 0 because a ∉ (θ − A, θ + A). (60)

Then P_{Y|S}(y|s) → P_Y(y) as γ → 0. Thus Cases 1 and 2 both imply that S and Y are asymptotically independent and so they are asymptotically uncorrelated: C → µ_S µ_Y as σ_N → 0 for finite-variance noise or as γ → 0 for alpha-stable noise.

Necessity. We show that the cross correlation C attains its maximum

as σ_N → 0 (or γ → 0) if µ_N ∈ (θ − A, θ + A) (or if a ∈ (θ − A, θ + A)). Assume 0 < P_S(s) < 1 to avoid triviality when P_S(s) = 0 or 1.

Case 1: Finite-variance noise. We now show that P_{Y|S}(y|s) is either 0 or 1 as σ_N → 0. Let the noise mean be µ_N = E[N] and the variance be σ_N² = E[(N − µ_N)²]. Then again µ_N ∈ (θ − A, θ + A) by hypothesis. Consider P_{Y|S}(−1|−1). Pick ε = (θ + A − µ_N)/2 > 0. So θ + A − ε = θ + A − ε + µ_N − µ_N = µ_N + (θ + A − µ_N) − ε = µ_N + 2ε − ε = µ_N + ε. Then

1 − P_{Y|S}(−1|−1) = 1 − F_N(θ + A) (61)
= ∫_{θ+A}^{∞} f_N(n) dn (62)
≤ ∫_{θ+A−ε}^{∞} f_N(n) dn (63)
= ∫_{µ_N+ε}^{∞} f_N(n) dn (64)
= Pr{N ≥ µ_N + ε} (65)
= Pr{N − µ_N ≥ ε} (66)
≤ Pr{|N − µ_N| ≥ ε} (67)
≤ σ_N²/ε² by Chebyshev's inequality (68)
→ 0 as σ_N → 0. (69)

So P_{Y|S}(−1|−1) → 1 and P_{Y|S}(1|−1) → 0. Similarly for P_{Y|S}(1|1): Pick ε = (µ_N − θ + A)/2 > 0. So θ − A + ε = θ − A + ε + µ_N − µ_N = µ_N − (µ_N − θ + A) + ε = µ_N − 2ε + ε = µ_N − ε. Then

1 − P_{Y|S}(1|1) = F_N(θ − A) (70)
= ∫_{−∞}^{θ−A} f_N(n) dn (71)
≤ ∫_{−∞}^{θ−A+ε} f_N(n) dn (72)
= ∫_{−∞}^{µ_N−ε} f_N(n) dn (73)
= Pr{N ≤ µ_N − ε} (74)
= Pr{N − µ_N ≤ −ε} (75)
≤ Pr{|N − µ_N| ≥ ε} (76)
≤ σ_N²/ε² by Chebyshev's inequality (77)
→ 0 as σ_N → 0. (78)

So P_{Y|S}(1|1) → 1 and P_{Y|S}(−1|1) → 0.

Case 2: Alpha-stable noise. Again we have

lim_{γ→0} f_N(n) = δ(n − a). (79)

Then

P_{Y|S}(−1|−1) = ∫_{−∞}^{θ+A} f_N(n) dn (80)
→ ∫_{−∞}^{θ+A} δ(n − a) dn = 1 as γ → 0. (81)

Similarly:

P_{Y|S}(1|1) = ∫_{θ−A}^{∞} f_N(n) dn (82)
→ ∫_{θ−A}^{∞} δ(n − a) dn = 1 as γ → 0. (83)

The four conditional probabilities for both the finite-variance and infinite-variance cases imply that the cross correlation C → A as σ_N → 0 (or γ → 0). Q.E.D.

IV. A FORBIDDEN INTERVAL THEOREM FOR LOCATION-SCALE NOISE AND BIPOLAR SIGNALS

This section derives necessary and sufficient conditions for the local noise benefit ∂C/∂σ_N > 0. Theorem 2 shows that this takes the form of a correlation FIT and that the conditions depend on the system parameters. The signal system now is a threshold system with bipolar signals and additive white noise N that has pdf f_N(n) that belongs to a location-scale family:

f_N(n) = (1/σ_N) f_Ñ((n − µ_N)/σ_N) (84)

with mean µ_N and variance σ_N². Thus the cdf is

F_N(n) = F_Ñ((n − µ_N)/σ_N) (85)

where F_Ñ is the cdf of the standardized random variable Ñ = (N − µ_N)/σ_N. Note that we can replace the mean µ_N with the location parameter a and the standard deviation σ_N with the scale parameter κ = γ^{1/α} for the alpha-stable noise. Thus the alpha-stable family belongs to the location-scale family for fixed α and β.
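The alpha-stable location-scale family is easy to sample from. The sketch below implements the symmetric-case (β = 0) Chambers-Mallows-Stuck method of the kind cited for Figure 2 [37], [38]; α = 2 should reproduce the Gaussian case with variance two and α = 1 the Cauchy case. The function name and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def sym_alpha_stable(alpha, gamma=1.0, size=1):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable noise
    (beta = 0) with dispersion gamma; kappa = gamma**(1/alpha) is the scale."""
    V = rng.uniform(-np.pi/2, np.pi/2, size)   # uniform phase
    W = rng.exponential(1.0, size)             # unit exponential
    if alpha == 1.0:                           # Cauchy special case
        X = np.tan(V)
    else:
        X = (np.sin(alpha*V) / np.cos(V)**(1.0/alpha)
             * (np.cos(V - alpha*V) / W)**((1.0 - alpha)/alpha))
    return gamma**(1.0/alpha) * X

g = sym_alpha_stable(2.0, size=200_000)   # alpha = 2: Gaussian, variance two
c = sym_alpha_stable(1.0, size=200_000)   # alpha = 1: Cauchy, infinite variance
```

The α = 2 samples have variance near two (unit dispersion), which matches the Gaussian special case described in the Figure 2 caption, while the Cauchy samples have a median near the zero location.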
We can rewrite the pdf of any alpha-stable random variable Y in terms of the pdf of the standard random variable X = (Y − a)/κ:

f(y; a, κ, α, β) = (1/κ) f((y − a)/κ; 0, 1, α, β) (86)

for fixed α and β. So the pdf f(y; a, κ, α, β) of an alpha-stable random variable Y with location a and scale κ has the form

f(y; a, κ, α, β) = (1/2π) ∫ exp{−i t y + i a t − |κ t|^α (1 + i β sgn(t) Γ)} dt (87)

Let x = (y − a)/κ and τ = κt. Then

f(y; a, κ, α, β) = (1/2π) ∫ exp{−i t (xκ + a) + i a t − |κ t|^α (1 + i β sgn(t) Γ)} dt (88)
= (1/(2πκ)) ∫ exp{−i τ x − |τ|^α (1 + i β sgn(τ) Γ)} dτ (89)
= (1/κ) f(x; 0, 1, α, β) = (1/κ) f((y − a)/κ; 0, 1, α, β). (90)

The location-scale structure of the pdf and (5)-(9) give the conditional pdf f_{Y|S}(y|s) as

f_{Y|S}(y|s) = F_Ñ((θ − s − µ_N)/σ_N) if y = −1 and 1 − F_Ñ((θ − s − µ_N)/σ_N) if y = 1 (91)
= F_Ñ((θ − s − µ_N)/σ_N) δ(y + 1) + (1 − F_Ñ((θ − s − µ_N)/σ_N)) δ(y − 1). (92)

Then the input-output cross correlation measure C = E[SY] of the threshold system (1) with bipolar signal S with pdf f_S(s) = p δ(s + A) + (1 − p) δ(s − A) and location-scale noise has the form

C = µ_S + 2 p A F_Ñ((θ + A − µ_N)/σ_N) − 2 (1 − p) A F_Ñ((θ − A − µ_N)/σ_N). (93)

Suppose the cross correlation (93) is differentiable with respect to σ_N. Theorem 2 below states the necessary and sufficient condition for the partial noise benefit ∂C/∂σ_N > 0 for bipolar signal S and location-scale noise.

Theorem 2: Suppose the signal S is bipolar with pdf f_S(s) = p δ(s + A) + (1 − p) δ(s − A). Suppose the location-scale noise N has mean µ_N and variance σ_N² with pdf f_N(n) = (1/σ_N) f_Ñ((n − µ_N)/σ_N). Necessity: The threshold system (1) does not have a local noise benefit (∂C/∂σ_N < 0) if µ_N ∈ (θ − A, θ + A). Sufficiency: The threshold system (1) has a local noise benefit ∂C/∂σ_N > 0 if µ_N ∉ (θ − A, θ + A) and if the system parameters satisfy inequalities (i) or (ii):

(i) µ_N > θ + A and [p (θ + A − µ_N) f_Ñ((θ + A − µ_N)/σ_N)] / [(1 − p)(θ − A − µ_N) f_Ñ((θ − A − µ_N)/σ_N)] > 1 (94)

or

(ii) µ_N < θ − A and [p (θ + A − µ_N) f_Ñ((θ + A − µ_N)/σ_N)] / [(1 − p)(θ − A − µ_N) f_Ñ((θ − A − µ_N)/σ_N)] < 1. (95)

Proof:

C = E[SY] (96)
= ∫∫ s y f_{S,Y}(s, y) ds dy (97)
= ∫ s f_S(s) ∫ y f_{Y|S}(y|s) dy ds (98)
= ∫ s f_S(s) (−F_Ñ((θ − s − µ_N)/σ_N) + (1 − F_Ñ((θ − s − µ_N)/σ_N))) ds (99)
= ∫ s f_S(s) (1 − 2 F_Ñ((θ − s − µ_N)/σ_N)) ds (100)
= ∫ s f_S(s) ds − 2 ∫ s f_S(s) F_Ñ((θ − s − µ_N)/σ_N) ds (101)
= µ_S − 2 ∫ s f_S(s) F_Ñ((θ − s − µ_N)/σ_N) ds. (102)

The input-output cross correlation measure for the threshold system with bipolar signal S with pdf f_S(s) = p δ(s + A) + (1 − p) δ(s − A) and location-scale noise follows from (102) as

C = µ_S − 2 ∫ s (p δ(s + A) + (1 − p) δ(s − A)) F_Ñ((θ − s − µ_N)/σ_N) ds (103)
= µ_S + 2 p A F_Ñ((θ + A − µ_N)/σ_N) − 2 (1 − p) A F_Ñ((θ − A − µ_N)/σ_N). (104)

We first show that the local noise benefit ∂C/∂σ_N > 0 holds if and only if the following inequality holds:

(1 − p)(θ − A − µ_N) f_Ñ((θ − A − µ_N)/σ_N) > p (θ + A − µ_N) f_Ñ((θ + A − µ_N)/σ_N). (105)

Suppose the cdf F_Ñ is absolutely continuous and thus differentiable [39]: dF_Ñ(z)/dz = f_Ñ(z). Then the partial

derivative ∂C/∂σ_N has the form

∂C/∂σ_N = −(2 p A (θ + A − µ_N)/σ_N²) f_Ñ((θ + A − µ_N)/σ_N) + (2 (1 − p) A (θ − A − µ_N)/σ_N²) f_Ñ((θ − A − µ_N)/σ_N). (106)

Then set ∂C/∂σ_N > 0 and (106) becomes (105). Inequality (105) requires checking the four cases that correspond to whether θ − A − µ_N is positive or negative and whether θ + A − µ_N is positive or negative.

Case 1: θ − A − µ_N < 0 and θ + A − µ_N < 0. The conditions imply that µ_N > θ + A. Thus we have a local noise benefit ∂C/∂σ_N > 0 iff µ_N > θ + A and the signal-noise-system parameters in (105) satisfy

[p (θ + A − µ_N) f_Ñ((θ + A − µ_N)/σ_N)] / [(1 − p)(θ − A − µ_N) f_Ñ((θ − A − µ_N)/σ_N)] > 1. (107)

Case 2: θ − A − µ_N > 0 and θ + A − µ_N > 0. The conditions imply that µ_N < θ − A. Thus we have a local noise benefit ∂C/∂σ_N > 0 iff µ_N < θ − A and the signal-noise-system parameters in (105) satisfy

[p (θ + A − µ_N) f_Ñ((θ + A − µ_N)/σ_N)] / [(1 − p)(θ − A − µ_N) f_Ñ((θ − A − µ_N)/σ_N)] < 1. (108)

Thus Cases 1 and 2 give the sufficient conditions (i) and (ii) for a local noise benefit.

Case 3: θ − A − µ_N < 0 and θ + A − µ_N > 0. The conditions imply that θ − A < µ_N < θ + A. The left-hand side of (105) is negative while the right-hand side of (105) is positive and thus (105) does not hold. So there is no noise benefit. The condition θ − A < µ_N < θ + A results in a negative partial derivative:

∂C/∂σ_N = −(2 p A (θ + A − µ_N)/σ_N²) f_Ñ((θ + A − µ_N)/σ_N) + (2 (1 − p) A (θ − A − µ_N)/σ_N²) f_Ñ((θ − A − µ_N)/σ_N) (109)
< 0. (110)

Thus µ_N ∉ (θ − A, θ + A) is a necessary condition for a local noise benefit.

Case 4: θ − A − µ_N > 0 and θ + A − µ_N < 0. The conditions imply that µ_N < θ − A and µ_N > θ + A. This case is logically impossible since A > 0. Q.E.D.

The proof also shows that the noise benefit is a local maximum when equality replaces the inequalities in (94) or in (95).

V. LOCAL NOISE BENEFITS FOR ARBITRARY SIGNAL AND LOCATION-SCALE NOISE

We next consider the threshold system (1) when the signal S has arbitrary pdf f_S(s) and the noise comes from the location-scale family f_N(n) = (1/σ_N) f_Ñ((n − µ_N)/σ_N). Thus the conditional pdf of Y given a signal value s is

f_{Y|S}(y|s) = F_Ñ((θ − s − µ_N)/σ_N) if y = −1 and 1 − F_Ñ((θ − s − µ_N)/σ_N) if y = 1 (111)
= F_Ñ((θ − s − µ_N)/σ_N) δ(y + 1) + (1 − F_Ñ((θ − s − µ_N)/σ_N)) δ(y − 1) (112)

where F_Ñ is the cdf of the standardized random variable Ñ = (N − µ_N)/σ_N.
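The derivative (106) and the equivalent inequality (105) of Theorem 2 can be sanity-checked against a finite difference of the closed-form correlation (104). A sketch with standard Gaussian f_Ñ and illustrative parameter values (not from the paper):

```python
from math import erf, exp, pi, sqrt

theta, A, p = 1.0, 0.7, 0.5      # illustrative system parameters

def Phi(z): return 0.5*(1.0 + erf(z/sqrt(2.0)))   # standard normal cdf
def phi(z): return exp(-z*z/2.0)/sqrt(2.0*pi)     # standard normal pdf

def C(sigma, mu):
    """closed-form cross correlation (104) with Gaussian f_N-tilde"""
    muS = p*(-A) + (1.0 - p)*A
    return (muS + 2.0*p*A*Phi((theta + A - mu)/sigma)
                - 2.0*(1.0 - p)*A*Phi((theta - A - mu)/sigma))

def local_benefit(sigma, mu):
    """sign test of the derivative (106) via the inequality (105)"""
    zp, zm = (theta + A - mu)/sigma, (theta - A - mu)/sigma
    return (1.0 - p)*(theta - A - mu)*phi(zm) > p*(theta + A - mu)*phi(zp)

h = 1e-5
def dC(sigma, mu):               # central finite difference of (104)
    return (C(sigma + h, mu) - C(sigma - h, mu))/(2.0*h)
```

With µ_N = 2 > θ + A = 1.7 the finite difference is positive (local benefit, case (i)); with µ_N = 1 inside the forbidden interval (0.3, 1.7) it is negative, matching the necessity part of Theorem 2.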
The input-output cross correlation measure C for the threshold system with arbitrary signal pdf f_S(s) and location-scale noise with pdf f_N(n) = (1/σ_N) f_Ñ((n − µ_N)/σ_N) has the form (102):

C = µ_S − 2 ∫ s f_S(s) F_Ñ((θ − s − µ_N)/σ_N) ds. (113)

Theorem 3 below states a necessary and sufficient condition for a local noise benefit ∂C/∂σ_N > 0 for an arbitrary signal S and location-scale noise.

Theorem 3: Suppose the input signal S has pdf f_S(s). Suppose the location-scale noise N has mean µ_N and variance σ_N² with pdf f_N(n) = (1/σ_N) f_Ñ((n − µ_N)/σ_N). Then the threshold system (1) has the local noise benefit ∂C/∂σ_N > 0 if and only if

r̃_S² < (θ − µ_N) µ̃_S (114)

where

µ̃_S = ∫ s k_S f_S(s) f_Ñ((θ − s − µ_N)/σ_N) ds (115)
r̃_S² = ∫ s² k_S f_S(s) f_Ñ((θ − s − µ_N)/σ_N) ds (116)

and the normalizer k_S > 0 when the product f_S(s) f_Ñ((θ − s − µ_N)/σ_N) has nonzero support:

k_S ∫ f_S(s) f_Ñ((θ − s − µ_N)/σ_N) ds = 1. (117)

Proof: Suppose C is differentiable with respect to the noise standard deviation σ_N. Then

∂C/∂σ_N = −2 ∫ s f_S(s) ∂F_Ñ((θ − s − µ_N)/σ_N)/∂σ_N ds. (118)

The partial derivative of the cdf F_Ñ has the form

∂F_Ñ((θ − s − µ_N)/σ_N)/∂σ_N = −((θ − s − µ_N)/σ_N²) f_Ñ((θ − s − µ_N)/σ_N) (119)

since F_Ñ((θ − s − µ_N)/σ_N) = ∫_{−∞}^{(θ−s−µ_N)/σ_N} f_Ñ(z) dz. Then

∂C/∂σ_N = 2 ∫ s ((θ − s − µ_N)/σ_N²) f_S(s) f_Ñ((θ − s − µ_N)/σ_N) ds (120)
= 2 ∫ s ((θ − µ_N)/σ_N²) f_S(s) f_Ñ((θ − s − µ_N)/σ_N) ds − 2 ∫ s² (1/σ_N²) f_S(s) f_Ñ((θ − s − µ_N)/σ_N) ds. (121)

Consider the product f_S(s) f_Ñ((θ − s − µ_N)/σ_N). Suppose the supports of both pdfs overlap. We can define f̃_S(θ; s) = k_S f_S(s) f_Ñ((θ − s − µ_N)/σ_N) as a pdf where k_S(θ) > 0 is the normalizer such that k_S ∫ f_S(s) f_Ñ((θ − s − µ_N)/σ_N) ds = 1. So the partial derivative has the form

∂C/∂σ_N = (2 (θ − µ_N)/(k_S σ_N²)) ∫ s k_S f_S(s) f_Ñ((θ − s − µ_N)/σ_N) ds − (2/(k_S σ_N²)) ∫ s² k_S f_S(s) f_Ñ((θ − s − µ_N)/σ_N) ds (122)
= (2 (θ − µ_N)/(k_S σ_N²)) ∫ s f̃_S(θ; s) ds − (2/(k_S σ_N²)) ∫ s² f̃_S(θ; s) ds (123)
= (2 (θ − µ_N)/(k_S σ_N²)) µ̃_S − (2/(k_S σ_N²)) r̃_S² (124)
= (2/(k_S σ_N²)) ((θ − µ_N) µ̃_S − r̃_S²). (125)

A local noise benefit occurs when ∂C/∂σ_N > 0. This holds in (125) if and only if

r̃_S² < (θ − µ_N) µ̃_S (126)

since k_S > 0 and σ_N² > 0. Q.E.D.

The necessary and sufficient condition (114) characterizes noise benefits in threshold systems with arbitrary input signals and location-scale noise. But the first moment µ̃_S and the second moment r̃_S² depend on θ − µ_N. So specific forms of (114) require knowledge of µ̃_S and r̃_S².

VI. FORBIDDEN INTERVAL THEOREM FOR GAUSSIAN SIGNAL AND NOISE

Suppose the threshold system (1) has Gaussian input signal S with pdf f_S(s) = N(µ_S, σ_S²) and Gaussian noise N with pdf f_N(n) = N(µ_N, σ_N²). Then the input-output cross correlation measure C in (113) becomes

C = µ_S − 2 ∫ s f_S(s) Φ((θ − s − µ_N)/σ_N) ds (127)

where Φ(z) is the standard normal cdf. Theorem 4 states a forbidden interval theorem for a local noise benefit in a threshold system with Gaussian signal and Gaussian noise.

Theorem 4: Suppose the signal S is Gaussian with mean µ_S and variance σ_S²: S ~ N(µ_S, σ_S²), and the noise is Gaussian with mean µ_N and variance σ_N²: N ~ N(µ_N, σ_N²). Then the threshold system (1) has a local noise benefit ∂C/∂σ_N > 0 if and only if µ_N ∉ (θ − a₂, θ − a₁) where

a₁ = [µ_S (1 − σ_N²/σ_S²) − √(µ_S² (1 − σ_N²/σ_S²)² + 4 (σ_S² + σ_N² + µ_S² σ_N²/σ_S²))]/2 (128)
a₂ = [µ_S (1 − σ_N²/σ_S²) + √(µ_S² (1 − σ_N²/σ_S²)² + 4 (σ_S² + σ_N² + µ_S² σ_N²/σ_S²))]/2. (129)

Proof: There is a local noise benefit if and only if ∂C/∂σ_N > 0.
The partial derivative (125) is

∂C/∂σ_N = (2/(k_S σ_N²)) ((θ − µ_N) µ̃_S − r̃_S²) (130)

where the first moment µ̃_S and second moment r̃_S² for Gaussian signal and noise are (similar to the parameters of the posterior density in Bayes theorem [40]):

µ̃_S = (µ_S σ_N² + (θ − µ_N) σ_S²)/(σ_S² + σ_N²) (131)
σ̃_S² = σ_S² σ_N²/(σ_S² + σ_N²) (132)

and

r̃_S² = σ̃_S² + µ̃_S². (133)

The normalizer is

k_S = (√(2π(σ_S² + σ_N²))/σ_N) exp{(θ − µ_N − µ_S)²/(2(σ_S² + σ_N²))}

since k_S⁻¹ = ∫ f_S(s) φ((θ − s − µ_N)/σ_N) ds = (σ_N/√(2π(σ_S² + σ_N²))) exp{−(θ − µ_N − µ_S)²/(2(σ_S² + σ_N²))} for the standard normal pdf φ. Substitute (131)-(133) into (114) to obtain

σ̃_S² + µ̃_S² < (θ − µ_N) µ̃_S (134)

or

((µ_S σ_N² + (θ − µ_N) σ_S²)/(σ_S² + σ_N²))² + σ_S² σ_N²/(σ_S² + σ_N²) < (θ − µ_N)(µ_S σ_N² + (θ − µ_N) σ_S²)/(σ_S² + σ_N²). (135)

This holds if and only if

σ_S² σ_N² (σ_S² + σ_N²) + µ_S² σ_N⁴ + 2 (θ − µ_N) µ_S σ_S² σ_N² + (θ − µ_N)² σ_S⁴ < (θ − µ_N) µ_S (σ_S² σ_N² + σ_N⁴) + (θ − µ_N)² σ_S⁴ + (θ − µ_N)² σ_S² σ_N². (136)

This holds in turn if and only if

(θ − µ_N)² + µ_S (σ_N²/σ_S² − 1)(θ − µ_N) − (σ_S² + σ_N² + µ_S² σ_N²/σ_S²) > 0. (137)

But (137) is quadratic in (θ − µ_N). So we can replace the inequality in (137) with an equality and find its roots from the quadratic formula:

θ − µ_N = [µ_S (1 − σ_N²/σ_S²) ± √(µ_S² (1 − σ_N²/σ_S²)² + 4 (σ_S² + σ_N² + µ_S² σ_N²/σ_S²))]/2 (138)
= a₁(µ_S, σ_S², σ_N²), a₂(µ_S, σ_S², σ_N²) (139)

where the roots a₁ and a₂ depend on µ_S, σ_S², and σ_N² and a₁(µ_S, σ_S², σ_N²) < 0 < a₂(µ_S, σ_S², σ_N²). Then a local noise benefit ∂C/∂σ_N > 0 holds if and only if

((θ − µ_N) − a₁(µ_S, σ_S², σ_N²))((θ − µ_N) − a₂(µ_S, σ_S², σ_N²)) > 0. (140)

This is equivalent to

µ_N < θ − a₂(µ_S, σ_S², σ_N²) (141)
or µ_N > θ − a₁(µ_S, σ_S², σ_N²). (142)

Q.E.D.

Note that the correlation-based forbidden interval changes as the noise variance σ_N² changes for the given signal's mean µ_S and variance σ_S². This also applies to forbidden intervals for all signal and noise pdfs.

VII. NOISE BENEFITS FOR ARRAYS OF THRESHOLD DETECTORS

This section derives correlation-based noise benefits for an array of Q threshold detectors. The i.i.d. input sequence S_i has arbitrary pdf with finite mean and variance and the i.i.d. noise N_i has location-scale pdf with finite mean and variance. The input signal S_i and noise N_i are independent. Each threshold detector has input-output relationship

Y_i = sgn(S_i + N_i − θ) (143)

for i = 1, ..., Q. This resembles the case when we collect Q samples of output signals of a threshold detector (1) from the input signal sequence S_i. But the structure of the array of threshold detectors here differs from the parallel summing array of noisy threshold elements in [3] and the array of signal quantizers used for maximum-likelihood detection and Neyman-Pearson detection in [14].
Those systems use an array of threshold detectors to process a single signal sample S with independent additive noise N_i and then sum the results as an output Y. Figure 3 shows examples of noise benefits for the array of thresholds on the 512 × 512 baboon image with uniform noise. The histogram of the baboon image approximates the pdf f_S(s) of the input signal. We obtain the first and second moments µ̃_S and r̃_S² from the histogram and the noise pdf. Then we solve (148) to obtain the forbidden bands in Figures 3(b) and 3(c). Figures 3(b) and 3(c) show how well the Gaussian approximation applies to the baboon images with uniform noise. Figure 4 shows the regions where the conditions in Theorem 3 hold. We use the sample cross-correlation measure Ĉ of two Q-dimensional random vectors S = [S_1 ... S_Q]ᵗ and Y = [Y_1 ... Y_Q]ᵗ as a performance measure of the array:

Ĉ = (1/Q) Σ_{i=1}^{Q} S_i Y_i. (144)

Thus Ĉ is random with mean

µ_Ĉ = E[Ĉ] = C (145)
= µ_S − 2 ∫ s f_S(s) F_Ñ((θ − s − µ_N)/σ_N) ds (146)

and variance

σ_Ĉ² = (1/Q) [σ_S² + 4 µ_S ∫ s f_S(s) F_Ñ((θ − s − µ_N)/σ_N) ds − 4 (∫ s f_S(s) F_Ñ((θ − s − µ_N)/σ_N) ds)²]. (147)
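The mean and variance formulas (145)-(147) for the sample cross correlation can be checked by Monte Carlo. A sketch with Gaussian signal and Gaussian location-scale noise (illustrative values, not the baboon-image statistics):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

# Illustrative values: Gaussian signal and Gaussian location-scale noise.
theta, muS, sigS, muN, sigN, Q = 0.8, 0.0, 0.5, 0.0, 0.4, 100_000

S = rng.normal(muS, sigS, Q)
N = rng.normal(muN, sigN, Q)
Y = np.where(S + N >= theta, 1.0, -1.0)       # array of detectors (143)
C_hat = float(np.mean(S * Y))                 # sample cross correlation (144)

def Phi(z): return 0.5*(1.0 + erf(z/sqrt(2.0)))

# Quadrature for the integral in (146)-(147).
s = np.linspace(muS - 8*sigS, muS + 8*sigS, 4001)
fS = np.exp(-(s - muS)**2/(2*sigS**2))/(sigS*sqrt(2*np.pi))
g = s*fS*np.vectorize(Phi)((theta - s - muN)/sigN)
I = float(np.sum(0.5*(g[1:] + g[:-1])*np.diff(s)))   # trapezoid rule

C_theory = muS - 2.0*I                         # mean of C_hat, (145)-(146)
var_theory = (sigS**2 + 4*muS*I - 4*I**2)/Q    # variance (147)
```

The Monte Carlo estimate Ĉ lands within a few standard deviations √(σ_Ĉ²) of the quadrature value of C.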

Fig. 3. Forbidden-interval noise benefit in an array of threshold detectors (143) that uses the baboon image as its input. (a) Histogram of the gray-scale baboon image. We scale the pixel values to be in the interval [−1, 1] and use these values as input signals to the array of threshold detectors. (b)-(c) Forbidden intervals (bands) using the Gaussian approximation of Theorem 5 for θ = 0.8 and θ = 0. Each interval depends on the signal and noise statistics. Solid lines are intervals from the statistics of the baboon image with uniform noise. Dashed lines are estimated intervals using the Gaussian approximation of both signal and noise. (d)-(g) Quantized (thresholded) baboon images for increasing noise intensities when the detectors have threshold θ = 0.8. The noise is uniform with zero mean. Thus µ_N ∉ (θ − a₂, θ − a₁) and this gives an SR noise benefit. (h)-(k) The detectors have threshold θ = 0 and the noise is uniform with zero mean. Thus µ_N ∈ (θ − a₂, θ − a₁) and there is no noise benefit. (l) Cross correlation Ĉ when θ = 0.8: This gives an SR noise benefit. (m) Cross correlation Ĉ when θ = 0: There is no noise benefit.
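The Gaussian-approximation bands in panels (b)-(c) come from the roots of the quadratic (137) in Theorem 4. A sketch that computes the band edges θ − a₂ and θ − a₁ (the moment values below are illustrative, not the baboon statistics):

```python
from math import sqrt

def gaussian_fit_band(muS, sigS, sigN, theta):
    """Edges (theta - a2, theta - a1) of the Gaussian forbidden band in
    Theorem 4, from the roots (138) of the quadratic (137)."""
    b = muS*(1.0 - sigN**2/sigS**2)
    disc = sqrt(b*b + 4.0*(sigS**2 + sigN**2 + muS**2*sigN**2/sigS**2))
    a1, a2 = (b - disc)/2.0, (b + disc)/2.0     # a1 < 0 < a2
    return theta - a2, theta - a1

lo, hi = gaussian_fit_band(muS=0.2, sigS=1.0, sigN=0.8, theta=1.0)
```

Since a₁ < 0 < a₂ the band always contains the threshold θ itself, and the band moves as σ_N changes, which is why the figure plots the band against the noise standard deviation.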

Fig. 4. Cross correlations C in (a)-(b) and derivatives ∂C/∂σ in (c)-(d) of an array of threshold detectors for the baboon image with uniform noise. (a) C when θ = 0.8. (b) C when θ = 0. (c) ∂C/∂σ when θ = 0.8. (d) ∂C/∂σ when θ = 0. The forbidden intervals (bands) for noise benefits in an array of threshold detectors in Figure 3 derive from the contours of ∂µ_Ĉ/∂σ.

So on average we have a local noise benefit if ∂µ_Ĉ/∂σ > 0. Thus the condition for an average local noise benefit for the sample cross correlation Ĉ has the same form as the condition in Theorem 3. We state this condition as Theorem 5.

Theorem 5: Suppose the signal S has an arbitrary pdf with finite mean µ_S and finite variance σ²_S. Suppose the noise N also has an arbitrary pdf with finite mean µ and finite variance σ². Then the Q-array of threshold systems has an average local noise benefit ∂µ_Ĉ/∂σ > 0 if and only if

    r_S < (θ − µ) µ_S                                                  (48)

where µ_S and r_S have the same form as in (5)-(6).

VIII. CONCLUSION

We have found necessary and sufficient conditions for five correlation forbidden interval theorems. All theorems find a correlation-based noise benefit for one of three types of forbidden interval: (θ − A, θ + A) with bipolar input signals, r_S < (θ − µ) µ_S with an arbitrary signal and location-scale noise, or (θ − a(µ_S, σ_S, σ), θ + a(µ_S, σ_S, σ)) with a Gaussian input signal and Gaussian noise for the stochastic threshold signal detector. Earlier FITs applied only to two-valued signals and not to continuous signals. Correlation noise benefits may hold for other combinations of random signals and noise. Adaptive algorithms should help find the optimal noise level in all such cases.

REFERENCES

[1] B. Kosko and S. Mitaim. Stochastic Resonance in Noisy Threshold Neurons. Neural Networks, 16(5-6):755-761, June-July 2003.

[2] B. Kosko and S. Mitaim. Robust Stochastic Resonance for Simple Threshold Neurons. Physical Review E, 70(3):031911, September 2004.
[3] S. Mitaim and B. Kosko. Adaptive Stochastic Resonance in Noisy Neurons Based on Mutual Information. IEEE Transactions on Neural Networks, 15(6):1526-1540, November 2004.
[4] A. Patel and B. Kosko. Stochastic Resonance in Noisy Spiking Retinal and Sensory Neuron Models. Neural Networks, 18:467-478, August 2005.
[5] A. Patel and B. Kosko. Stochastic Resonance in Continuous and Spiking Neuron Models With Lévy Noise. IEEE Transactions on Neural Networks, 19(12):1993-2008, December 2008.
[6] M. M. Wilde and B. Kosko. Quantum Forbidden-Interval Theorems for Stochastic Resonance. Journal of Physics A: Mathematical and Theoretical, 42(46):465309, 2009.
[7] A. Patel and B. Kosko. Error-Probability Noise Benefits in Threshold Neural Signal Detection. Neural Networks, 22(4):697-706, July 2009.
[8] B. Kosko, I. Lee, S. Mitaim, A. Patel, and M. M. Wilde. Applications of Forbidden Interval Theorems in Stochastic Resonance. In V. In, P. Longhini, and A. Palacios, editors, Applications of Nonlinear Dynamics: Model and Design of Complex Systems, pages 71-89, 2009.
[9] I. Lee, X. Liu, C. Zhou, and B. Kosko. Noise-Enhanced Detection of Subthreshold Signals With Carbon Nanotubes. IEEE Transactions on Nanotechnology, 5(6):613-627, November 2006.
[10] J. J. Collins, C. C. Chow, and T. T. Imhoff. Stochastic Resonance without Tuning. Nature, 376:236-238, July 1995.
[11] B. Kosko and S. Mitaim. Robust Stochastic Resonance: Signal Detection and Adaptation in Impulsive Noise. Physical Review E, 64(5):051110, October 2001.
[12] M. D. McDonnell and D. Abbott. A Characterization of Suprathreshold Stochastic Resonance in an Array of Comparators by Correlation Coefficient. Fluctuation and Noise Letters, 2(3), 2002.
[13] I. S. Moskowitz, P. N. Safier, and P. Cotae. Algebraic Information Theory and Kosko's Forbidden Interval Theorem. In Proceedings of Modelling, Identification and Control: Advances in Computer Science. Acta Press, April 2013.
[14] A. Patel and B. Kosko. Noise Benefits in Quantizer-Array Correlation Detection and Watermark Decoding. IEEE Transactions on Signal Processing, 59(2):488-505, February 2011.
[15] A. R. Bulsara and A. Zador. Threshold Detection of Wideband Signals: A Noise-Induced Maximum in the Mutual Information. Physical Review E, 54(3):R2185-R2188, September 1996.
[16] B. Franzke and B. Kosko. Noise Can Speed Convergence in Markov Chains. Physical Review E, 84(4):041112, October 2011.
[17] L. Gammaitoni. Stochastic Resonance in Multi-Threshold Systems. Physics Letters A, 208:315-322, December 1995.
[18] P. Jung. Stochastic Resonance and Optimal Design of Threshold Detectors. Physics Letters A, 207:93-104, October 1995.
[19] B. Kosko. Noise. Viking/Penguin, 2006.
[20] M. McDonnell, N. Stocks, C. Pearce, and D. Abbott. Stochastic Resonance: From Suprathreshold Stochastic Resonance to Stochastic Signal Quantization. Cambridge University Press, 2008.
[21] S. Mitaim and B. Kosko. Adaptive Stochastic Resonance. Proceedings of the IEEE: Special Issue on Intelligent Signal Processing, 86(11):2152-2183, November 1998.
[22] O. Osoba, S. Mitaim, and B. Kosko. The Noisy Expectation-Maximization Algorithm. Fluctuation and Noise Letters, 12(3):1350012, September 2013.
[23] A. Patel and B. Kosko. Optimal Mean-Square Noise Benefits in Quantizer-Array Linear Estimation. IEEE Signal Processing Letters, 17(12):1005-1009, December 2010.
[24] C. Y. Cai, L.-P. Yang, and C. P. Sun. Threshold for Nonthermal Stabilization of Open Quantum Systems. Physical Review A, 89:012128, January 2014.
[25] L. Gammaitoni. Stochastic Resonance and the Dithering Effect in Threshold Physical Systems. Physical Review E, 52(5):4691-4698, November 1995.
[26] Z. Gingl, L. B. Kiss, and F. Moss. Non-Dynamical Stochastic Resonance: Theory and Experiments with White and Arbitrarily Coloured Noise. Europhysics Letters, 29(3):191-196, January 1995.
[27] P. Jung. Threshold Devices: Fractal Noise and Neural Talk. Physical Review E, 50(4):2513-2522, October 1994.
[28] K. Szczepaniec and B. Dybiec. Non-Gaussian, Non-Dynamical Stochastic Resonance. The European Physical Journal B, 86:468, 2013.
[29] J. J. Hopfield. Neural Networks and Physical Systems with Emergent Collective Computational Abilities. Proceedings of the National Academy of Sciences, 79:2554-2558, 1982.
[30] B. Kosko. Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine Intelligence. Prentice Hall, 1991.
[31] N. G. Stocks. Information Transmission in Parallel Threshold Arrays. Physical Review E, 63(4):041114, 2001.
[32] V. Akgiray and C. G. Lamoureux. Estimation of Stable-Law Parameters: A Comparative Study. Journal of Business and Economic Statistics, 7:85-93, January 1989.
[33] H. Bergström. On Some Expansions of Stable Distribution Functions. Arkiv för Matematik, 2:375-378, 1952.
[34] K. Górska and K. A. Penson. Lévy Stable Two-Sided Distributions: Exact and Explicit Densities for Asymmetric Case. Physical Review E, 83(6):061125, June 2011.
[35] M. Grigoriu. Applied Non-Gaussian Processes. Prentice Hall, 1995.
[36] C. L. Nikias and M. Shao. Signal Processing with Alpha-Stable Distributions and Applications. John Wiley & Sons, 1995.
[37] J. M. Chambers, C. L. Mallows, and B. W. Stuck. A Method for Simulating Stable Random Variables. Journal of the American Statistical Association, 71(354):340-344, 1976.
[38] P. Tsakalides and C. L. Nikias. The Robust Covariation-Based MUSIC (ROC-MUSIC) Algorithm for Bearing Estimation in Impulsive Noise Environments. IEEE Transactions on Signal Processing, 44(7):1623-1633, July 1996.
[39] P. Billingsley. Probability and Measure. John Wiley & Sons, anniversary edition, 2012.
[40] R. V. Hogg, J. W. McKean, and A. T. Craig. Introduction to Mathematical Statistics. Prentice Hall, 7th edition, 2013.


Lecture 4: Exponential family of distributions and generalized linear model (GLM) (Draft: version 0.9.2) Lectures on Machine Learning (Fall 2017) Hyeong In Choi Seoul National University Lecture 4: Exponential family of distributions and generalized linear model (GLM) (Draft: version 0.9.2) Topics to be covered:

More information

Independent Component Analysis and Its Applications. By Qing Xue, 10/15/2004

Independent Component Analysis and Its Applications. By Qing Xue, 10/15/2004 Independent Component Analysis and Its Applications By Qing Xue, 10/15/2004 Outline Motivation of ICA Applications of ICA Principles of ICA estimation Algorithms for ICA Extensions of basic ICA framework

More information

Review (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology

Review (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology Review (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology M. Soleymani Fall 2012 Some slides have been adopted from Prof. H.R. Rabiee s and also Prof. R. Gutierrez-Osuna

More information

STAT 7032 Probability. Wlodek Bryc

STAT 7032 Probability. Wlodek Bryc STAT 7032 Probability Wlodek Bryc Revised for Spring 2019 Printed: January 14, 2019 File: Grad-Prob-2019.TEX Department of Mathematical Sciences, University of Cincinnati, Cincinnati, OH 45221 E-mail address:

More information

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows. Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage

More information

HANDBOOK OF APPLICABLE MATHEMATICS

HANDBOOK OF APPLICABLE MATHEMATICS HANDBOOK OF APPLICABLE MATHEMATICS Chief Editor: Walter Ledermann Volume II: Probability Emlyn Lloyd University oflancaster A Wiley-Interscience Publication JOHN WILEY & SONS Chichester - New York - Brisbane

More information

Financial Econometrics

Financial Econometrics Financial Econometrics Nonlinear time series analysis Gerald P. Dwyer Trinity College, Dublin January 2016 Outline 1 Nonlinearity Does nonlinearity matter? Nonlinear models Tests for nonlinearity Forecasting

More information

Continuous Random Variables

Continuous Random Variables 1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables

More information

Statistics: Learning models from data

Statistics: Learning models from data DS-GA 1002 Lecture notes 5 October 19, 2015 Statistics: Learning models from data Learning models from data that are assumed to be generated probabilistically from a certain unknown distribution is a crucial

More information

EECS564 Estimation, Filtering, and Detection Exam 2 Week of April 20, 2015

EECS564 Estimation, Filtering, and Detection Exam 2 Week of April 20, 2015 EECS564 Estimation, Filtering, and Detection Exam Week of April 0, 015 This is an open book takehome exam. You have 48 hours to complete the exam. All work on the exam should be your own. problems have

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

STAT 7032 Probability Spring Wlodek Bryc

STAT 7032 Probability Spring Wlodek Bryc STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,

More information

Least-Squares Performance of Analog Product Codes

Least-Squares Performance of Analog Product Codes Copyright 004 IEEE Published in the Proceedings of the Asilomar Conference on Signals, Systems and Computers, 7-0 ovember 004, Pacific Grove, California, USA Least-Squares Performance of Analog Product

More information

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.

More information

Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons

Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons Efficient coding of natural images with a population of noisy Linear-Nonlinear neurons Yan Karklin and Eero P. Simoncelli NYU Overview Efficient coding is a well-known objective for the evaluation and

More information

State-Space Methods for Inferring Spike Trains from Calcium Imaging

State-Space Methods for Inferring Spike Trains from Calcium Imaging State-Space Methods for Inferring Spike Trains from Calcium Imaging Joshua Vogelstein Johns Hopkins April 23, 2009 Joshua Vogelstein (Johns Hopkins) State-Space Calcium Imaging April 23, 2009 1 / 78 Outline

More information

Basics on Probability. Jingrui He 09/11/2007

Basics on Probability. Jingrui He 09/11/2007 Basics on Probability Jingrui He 09/11/2007 Coin Flips You flip a coin Head with probability 0.5 You flip 100 coins How many heads would you expect Coin Flips cont. You flip a coin Head with probability

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 14: Continuous random variables Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/

More information

Brief Review on Estimation Theory

Brief Review on Estimation Theory Brief Review on Estimation Theory K. Abed-Meraim ENST PARIS, Signal and Image Processing Dept. abed@tsi.enst.fr This presentation is essentially based on the course BASTA by E. Moulines Brief review on

More information

Iterative Markov Chain Monte Carlo Computation of Reference Priors and Minimax Risk

Iterative Markov Chain Monte Carlo Computation of Reference Priors and Minimax Risk Iterative Markov Chain Monte Carlo Computation of Reference Priors and Minimax Risk John Lafferty School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 lafferty@cs.cmu.edu Abstract

More information

Probability Background

Probability Background Probability Background Namrata Vaswani, Iowa State University August 24, 2015 Probability recap 1: EE 322 notes Quick test of concepts: Given random variables X 1, X 2,... X n. Compute the PDF of the second

More information

Chapter 5. Statistical Models in Simulations 5.1. Prof. Dr. Mesut Güneş Ch. 5 Statistical Models in Simulations

Chapter 5. Statistical Models in Simulations 5.1. Prof. Dr. Mesut Güneş Ch. 5 Statistical Models in Simulations Chapter 5 Statistical Models in Simulations 5.1 Contents Basic Probability Theory Concepts Discrete Distributions Continuous Distributions Poisson Process Empirical Distributions Useful Statistical Models

More information

Kartik Audhkhasi, Osonde Osoba, Bart Kosko

Kartik Audhkhasi, Osonde Osoba, Bart Kosko To appear: 2013 International Joint Conference on Neural Networks (IJCNN-2013 Noise Benefits in Backpropagation and Deep Bidirectional Pre-training Kartik Audhkhasi, Osonde Osoba, Bart Kosko Abstract We

More information

3. Probability and Statistics

3. Probability and Statistics FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important

More information

3. Review of Probability and Statistics

3. Review of Probability and Statistics 3. Review of Probability and Statistics ECE 830, Spring 2014 Probabilistic models will be used throughout the course to represent noise, errors, and uncertainty in signal processing problems. This lecture

More information

Part IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015

Part IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015 Part IA Probability Theorems Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.

More information

Lessons in Estimation Theory for Signal Processing, Communications, and Control

Lessons in Estimation Theory for Signal Processing, Communications, and Control Lessons in Estimation Theory for Signal Processing, Communications, and Control Jerry M. Mendel Department of Electrical Engineering University of Southern California Los Angeles, California PRENTICE HALL

More information