Spectrum Sensing in Cognitive Radios using Distributed Sequential Detection


Jithin K. S.
Research Supervisor: Prof. Vinod Sharma
External Examiner: Prof. Shankar Prakriya, IIT Delhi
ECE Dept., Indian Institute of Science, Bangalore
February 4, 2013

Outline
- Introduction
- Parametric Distributed Sequential Tests
- Nonparametric Distributed Sequential Tests
- Multihypothesis scenario
- Conclusions

Introduction

CR: radio systems that perform Radio Environment Analysis, identify spectral holes and operate in those holes.
- Licensed (Primary) and Unlicensed (Secondary) users
- Spectral holes in various dimensions
- Aided by evolving SDR technology

Spectrum Sensing

Identify spectral holes and quickly detect the onset of primary transmission.

Hypothesis-testing formulation:
  H_0 : primary user transmission is absent
  H_1 : primary user transmission is present

Primary techniques: matched filter, cyclostationary detector, energy detector.

Challenges: shadowing, fading and low SNR; sensing frequency and duration.
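The energy detector mentioned above can be sketched in a few lines. The model here (noise f_0 = N(0, σ²), a fixed sample size, and the Gaussian approximation to the chi-square statistic for setting the false-alarm threshold) is an illustrative assumption, not the exact detector used later in the talk.

```python
import numpy as np
from math import sqrt
from statistics import NormalDist

def energy_threshold(n, noise_var=1.0, pfa=0.05):
    """Threshold for T = sum(x_k^2): under H0, T/noise_var ~ chi^2_n,
    approximated for large n by N(n, 2n)."""
    z = NormalDist().inv_cdf(1.0 - pfa)
    return noise_var * (n + sqrt(2.0 * n) * z)

def energy_detector(x, noise_var=1.0, pfa=0.05):
    """Fixed-sample-size energy detector: decide H1 iff the received energy
    exceeds the threshold set for the target false-alarm probability."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2)) > energy_threshold(len(x), noise_var, pfa)

rng = np.random.default_rng(1)
n = 500
print(energy_detector(rng.normal(0.0, 1.0, n)))        # noise only: usually False (P_FA = 5%)
print(energy_detector(rng.normal(0.0, 1.0, n) + 0.7))  # noise + weak primary signal: typically True
```

Because the decision depends only on received power, no knowledge of the primary waveform is needed, which is why the energy detector reappears later as the local observation when SNRs are unknown.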

Cooperative setup

Mitigates multipath fading, shadowing and the hidden-node problem. Improves the error probabilities (P_FA and P_MD) and the expected detection delay (E_DD). A distributed statistical inference problem.

[Figure: received power levels (dBm) showing the PU transmit power, loss due to path loss, loss due to multipath fading and shadowing, the sensitivity level and the noise floor; cooperation lowers the detection threshold relative to the threshold without cooperation.]

Problems addressed in this thesis

- Distributed detection: centralized / decentralized
- Sampling at CR: synchronous / asynchronous
- Decision mechanism: fixed sample size / sequential

Problems addressed in this thesis (contd.)

Focus on reducing the number of samples in the cooperative setup under reliability constraints (decentralized sequential hypothesis testing):
- Fast detection of idle TV bands
- Quickest detection of unoccupied spectrum in slotted primary user transmission systems
- Imprecise estimates of the channel gains and noise power (parameter uncertainty: composite sequential testing)
- Channel gain statistics unknown (universal sequential testing)
- Detect the spectrum and identify the primary user (decentralized multihypothesis sequential testing)

Related Work

Sequential tests: Wald '47, Irle '84. Decentralized sequential tests: Mei '08, Fellouris et al. '12, Yilmaz et al. '12. Cooperative spectrum sensing and censored detection: Akyildiz et al. '11. Universal hypothesis tests: Levitan et al. '02, Jacob et al. '08, Unnikrishnan et al. '11. Multihypothesis sequential tests: Dragalin et al. '99, Tartakovsky '00.

Model

[Figure: the primary signal S_k reaches CR node l through channel gain h_l; with receiver noise N_{k,l} the observation is X_{k,l}, which node l processes into Y_{k,l} and transmits to the secondary fusion centre.]

General problem: N = stopping time, δ_N = decision rule,

  min_{N, δ_N} E[N | H_0]  and  min_{N, δ_N} E[N | H_1],
  subject to P_FA ≤ α_1 and P_MD ≤ α_2.

Parametric Distributed Sequential Tests

At the l-th CR:
  H_0 : X_{k,l} ~ f_{0,l};  H_1 : X_{k,l} ~ f_{1,l}.

Two cases: (i) f_{0,l} and f_{1,l} are fully known; (ii) f_{0,l} and f_{1,l} are not fully known: parametric family.

DualSPRT Algorithm

1. Node l runs a Sequential Probability Ratio Test (SPRT):

     W_{k,l} = W_{k-1,l} + log[ f_{1,l}(X_{k,l}) / f_{0,l}(X_{k,l}) ],  W_{0,l} = 0.

2. Node l transmits

     Y_{k,l} = b_1 1{W_{k,l} ≥ γ_1} + b_0 1{W_{k,l} ≤ -γ_0}.

   Noisy MAC at the FC (Z_k i.i.d. MAC noise):

     Y_k = Σ_{l=1}^L Y_{k,l} + Z_k.

3. The FC runs an SPRT, where g_μ is the p.d.f. of μ + Z_k:

     F_k = F_{k-1} + log[ g_{μ_1}(Y_k) / g_{μ_0}(Y_k) ],  F_0 = 0.

4. The FC decides at N = inf{k : F_k ∉ (-β_0, β_1)}: H_1 if F_N ≥ β_1, H_0 if F_N ≤ -β_0.
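The four steps above can be sketched as a simulation. The Gaussian local model (f_{0,l} = N(0,1), f_{1,l} = N(θ_l,1)), the Gaussian MAC noise, and all threshold values are illustrative assumptions, not the thesis's exact parameters.

```python
import numpy as np

def dual_sprt(X, theta, gamma1=10.0, gamma0=10.0, b1=1.0, b0=-1.0,
              mu1=1.0, mu0=-1.0, beta1=50.0, beta0=50.0,
              mac_std=1.0, rng=None):
    """One run of DualSPRT. X[k, l] is the observation of node l at time k.
    Returns (decision, stopping_time); decision is None if no boundary is hit."""
    rng = rng or np.random.default_rng()
    W = np.zeros(X.shape[1])                 # local SPRT sums W_{k,l}
    F = 0.0                                  # fusion-centre SPRT sum F_k
    for k in range(X.shape[0]):
        # Step 1: local SPRT increments (Gaussian mean-shift LLR).
        W += theta * X[k] - theta**2 / 2.0
        # Step 2: nodes whose SPRT has crossed transmit b1 / b0 over the MAC.
        Yl = b1 * (W >= gamma1) + b0 * (W <= -gamma0)
        Y = Yl.sum() + rng.normal(0.0, mac_std)
        # Step 3: FC SPRT increment with g_mu = N(mu, mac_std^2).
        F += (mu1 - mu0) / mac_std**2 * (Y - (mu1 + mu0) / 2.0)
        # Step 4: FC decision on leaving (-beta0, beta1).
        if F >= beta1:
            return 1, k + 1
        if F <= -beta0:
            return 0, k + 1
    return None, X.shape[0]

rng = np.random.default_rng(0)
theta = np.array([1.0, 0.84, 0.75, 0.63, 0.5])   # 5 nodes, roughly 0 to -6 dB (illustrative)
X1 = rng.normal(0.0, 1.0, (2000, 5)) + theta     # observations under H1
print(dual_sprt(X1, theta, rng=rng))             # typically decides H1 within a few tens of samples
```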

Performance Analysis of DualSPRT: E_DD

Sample-path argument. [Figure: sample paths of the local LLR sum W_{k,l} under H_1 crossing the threshold γ (node l then transmits b), and of the FC sum F_k crossing β; the mean of the received signal at the FC steps up at the local crossing times t_1, t_2, t_3, ..., t_l.]

  lim_{γ→∞} P_1(N_l = N_l^1) = 1,  lim_{γ→∞} N_l/γ = lim_{γ→∞} N_l^1/γ,

  lim_{γ,β→∞} P_1(N = N^1) = 1,  lim_{γ,β→∞} N/γ = lim_{γ,β→∞} N^1/γ,

where N_l^1 is the time for W_{k,l} to cross γ under H_1 and N^1 is the corresponding FC crossing time of β.

Performance Analysis of DualSPRT: E_DD (contd.)

At each node l, the time to cross the threshold is approximately Gaussian:

  N_l^1 ≈ N( γ/δ_l, ρ_l² γ/δ_l³ ),  where δ_l = E_1[LLR_l] and ρ_l² = Var_1[LLR_l].

Let t_j be the j-th order statistic of N_1^1, N_2^1, ..., N_L^1, and let δ_FC^j be the mean drift of the FC sum when j local nodes are transmitting. Then

  l* = min{ j : δ_FC^j > 0 and (β - F̃_j)/δ_FC^j < E[t_{j+1}] - E[t_j] },

  F̃_j ≈ E[F_{t_j}],  F̃_j = F̃_{j-1} + δ_FC^{j-1} (E[t_j] - E[t_{j-1}]),  F̃_0 = 0,

  E_DD = E_1[N] ≈ E[t_{l*}] + (β - F̃_{l*}) / δ_FC^{l*}.
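The E_DD approximation above can be evaluated numerically. The sketch below assumes the Gaussian local model f_{0,l} = N(0,1), f_{1,l} = N(θ_l,1) and Gaussian MAC noise at the FC, and uses Monte Carlo for the order-statistic means E[t_j]; all parameter values are illustrative.

```python
import numpy as np

def edd_approx(theta, gamma, beta, b1=1.0, mu1=1.0, mu0=-1.0, mac_var=1.0,
               n_mc=200_000, rng=None):
    """Crude E_DD approximation for DualSPRT under H1 via the
    order-statistics argument (illustrative sketch)."""
    rng = rng or np.random.default_rng(0)
    theta = np.asarray(theta, dtype=float)
    L = len(theta)
    delta = theta**2 / 2.0                   # delta_l = E_1[LLR_l]
    rho2 = theta**2                          # rho_l^2 = Var_1[LLR_l]
    # Gaussian approximation N(gamma/delta_l, rho_l^2 gamma/delta_l^3) of N_l^1,
    # then Monte Carlo means of the order statistics t_1 <= ... <= t_L.
    Nl = rng.normal(gamma / delta, np.sqrt(rho2 * gamma / delta**3), size=(n_mc, L))
    Et = np.sort(Nl, axis=1).mean(axis=0)    # Et[j-1] = E[t_j]
    def fc_drift(m):                         # FC LLR drift when E[Y_k] = m
        return (mu1 - mu0) / mac_var * (m - (mu1 + mu0) / 2.0)
    F_tilde = 0.0                            # approximates E[F_{t_j}]
    for j in range(1, L + 1):
        d = fc_drift(j * b1)                 # j nodes transmitting b1
        gap = Et[j] - Et[j - 1] if j < L else np.inf
        if d > 0 and (beta - F_tilde) / d < gap:
            return float(Et[j - 1] + (beta - F_tilde) / d)
        if np.isfinite(gap):
            F_tilde += d * gap
    return float(Et[-1])                     # fallback, not reached for d > 0 at j = L

theta = [1.0, 0.84, 0.75, 0.63, 0.5]
print(round(edd_approx(theta, gamma=10.0, beta=50.0), 1))
```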

Performance Analysis of DualSPRT: P_MD

  P_MD = P_1(N^0 < N^1) ≈ P_1(N^0 < t_1, N^1 > t_1) ≈ P_1(N^0 < t_1).

Also,

  P_1(N^0 < N^1) ≤ P_1(N^0 < ∞) = P_1(N^0 < t_1) + P_1(t_1 ≤ N^0 < t_2) + P_1(t_2 ≤ N^0 < t_3) + ...

so

  P_MD ≈ P_1(N^0 < t_1) = Σ_{k=1}^∞ P[ {F_k ≤ -β_0} ∩ ∩_{n=1}^{k-1} {F_n > -β_0} | t_1 > k ] P[t_1 > k].

Comparison of Analysis with Simulations

5 secondary nodes, channel gains (0, -1.5, -2.5, -4 and -6 dB), f_0 ~ N(0,1) and f_{1,l} ~ N(θ_l, 1), Var(Z_k) = 1.

[Table: simulated vs. analytical values of P_MD and E_DD for DualSPRT (numerical entries lost in transcription).]

Asymptotic Properties of DualSPRT

Bayes risk:

  R_c(δ) = π [ c E_0(N) + W_0 P_0{reject H_0} ] + (1-π) [ c E_1(N) + W_1 P_1{reject H_1} ].

  D_tot^0 = Σ_{l=1}^L D(f_{0,l} || f_{1,l}),  D_tot^1 = Σ_{l=1}^L D(f_{1,l} || f_{0,l}),
  r_l = D(f_{0,l} || f_{1,l}) / D_tot^0,  ρ_l = D(f_{1,l} || f_{0,l}) / D_tot^1.

Theorem. Take γ_{0,l} = r_l |log c|, γ_{1,l} = ρ_l |log c|, β_0 = |log c|, β_1 = |log c|. Then

  lim_{c→0} E_0[N] / |log c| ≤ 1/D_tot^0 + M_0  and  lim_{c→0} E_1[N] / |log c| ≤ 1/D_tot^1 + M_1.

With Gaussian MAC noise at the FC and μ chosen appropriately,

  lim_{c→0} P_FA / (c |log c|) = 0  and  lim_{c→0} P_MD / (c |log c|) = 0,

  lim_{c→0} R_c(δ_Bayes) / R_c(δ) = 1  and  lim_{c→0} R_c(δ_cent.) / R_c(δ) = 1.

Modification to DualSPRT: SPRT-CSPRT Algorithm

Using an SPRT at the fusion centre is not optimal, since the optimality of the SPRT is known only for i.i.d. observations. What is the best test at the FC?

Sequential change-detection formulation: the mean μ_k = E[Y_k] of the FC observations changes at the first local crossing time t_1. [Figure: E[Y_k] vs. k with change point t_1.]

Goal: reduce the false alarms caused by the FC MAC noise before t_1.

SPRT-CSPRT (inspired by CUSUM)

FC algorithm, with (x)^+ = max(x, 0) and (x)^- = min(x, 0):

  F_k^1 = ( F_{k-1}^1 + log[ g_{μ_1}(Y_k) / g_Z(Y_k) ] )^+,
  F_k^0 = ( F_{k-1}^0 + log[ g_Z(Y_k) / g_{μ_0}(Y_k) ] )^-.

The FC decides at N = inf{k : F_k^1 ≥ β_1 or F_k^0 ≤ -β_0}: H_1 if F_N^1 ≥ β_1, H_0 if F_N^0 ≤ -β_0.

With μ_k = E[Y_k],

  E_{μ_k}[ log( g_{μ_1}(Y_k) / g_Z(Y_k) ) ] = D(g_{μ_k} || g_Z) - D(g_{μ_k} || g_{μ_1}).

The expected drift of F_k^1 is positive iff D(g_{μ_k} || g_Z) > D(g_{μ_k} || g_{μ_1}); the expected drift of F_k^0 is negative iff D(g_{μ_k} || g_{μ_0}) < D(g_{μ_k} || g_Z).

[Figure: sample paths of the SPRT and CSPRT sums at the FC; the clipping at zero keeps the CSPRT sum away from the wrong boundary before t_1.]
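The clipped recursions above can be sketched for Gaussian densities. Here g_Z = N(0, σ²) and g_μ = N(μ, σ²) are assumed, and the change point, thresholds and means are illustrative, not the thesis's parameters.

```python
import numpy as np

def csprt_fc(Y, mu1, mu0, sigma, beta1, beta0):
    """SPRT-CSPRT fusion-centre recursions (illustrative sketch).
    Y is the FC observation sequence; returns (decision, stopping_time)."""
    def llr(y, a, b):   # log g_a(y) - log g_b(y) for N(a, sigma^2) vs N(b, sigma^2)
        return (a - b) / sigma**2 * (y - (a + b) / 2.0)
    F1 = F0 = 0.0
    for k, y in enumerate(Y, start=1):
        F1 = max(F1 + llr(y, mu1, 0.0), 0.0)   # (.)^+ : clipped at zero from below
        F0 = min(F0 + llr(y, 0.0, mu0), 0.0)   # (.)^- : clipped at zero from above
        if F1 >= beta1:
            return 1, k
        if F0 <= -beta0:
            return 0, k
    return None, len(Y)

rng = np.random.default_rng(2)
# Before t_1 = 50 the FC sees pure MAC noise; afterwards one node transmits b_1 = 1.
Y = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(1.0, 1.0, 500)])
print(csprt_fc(Y, mu1=1.0, mu0=-1.0, sigma=1.0, beta1=15.0, beta0=15.0))
```

Before the change, both clipped sums hover near zero instead of drifting as zero-mean random walks, which is exactly the false-alarm reduction the modification targets.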

Modification of quantisation at SUs

[Figure: instead of a single level b_1, node l transmits one of b_1^1 < b_1^2 < b_1^3 < b_1^4 according to which of the thresholds γ_1, γ_1+Δ_1, γ_1+2Δ_1, γ_1+3Δ_1 the sum W_k has crossed; similarly one of b_0^1 > b_0^2 > b_0^3 > b_0^4 for the decreasing lower thresholds γ_0, γ_0-Δ_0, γ_0-2Δ_0, γ_0-3Δ_0.]

Performance Analysis of SPRT-CSPRT: P_MD

  P_{H_1}(MD) = P_{H_1}(MD before t_1) + P_{H_1}(MD in [t_1, t_2)) + ...; the first term dominates.

  P_{H_1}(FA before t_1) = Σ_{k=1}^∞ P(τ_{-β} = k | k < t_1) P(t_1 > k),  τ_{-β} = inf{k ≥ 1 : F_k^0 ≤ -β}.

For large β the first-passage time is approximately exponential:

  lim_{β→∞} P_0{ τ_{-β} > x | τ_{-β} < t_1 } = exp(-λ_β x),  x > 0,

giving

  P_{H_1}(FA before t_1) ≈ Σ_{k=1}^∞ (1 - e^{-λ_β k}) Π_{l=1}^L (1 - Φ_{τ_γ,l}(k)),

where Φ_{τ_γ,l} is the c.d.f. of the crossing time of γ at node l.

Comparison of Analysis with Simulations

5 secondary nodes, channel gains (0, -1.5, -2.5, -4 and -6 dB), f_0 ~ N(0,1) and f_{1,l} ~ N(θ_l, 1), Var(Z_k) = 1.

[Table: simulated vs. analytical P_MD and E_DD for SPRT-CSPRT (numerical entries lost in transcription).]

Comparison with asymptotically optimal tests

[Figure: E_DD vs. P_FA (×10^-3) for DualSPRT, DSPRT, SPRT-CSPRT and Mei's SPRT.]

Mei's SPRT: Mei, Trans. Inf. Theory 2008. DSPRT: Fellouris and Moustakides, Trans. Inf. Theory 2011.

Different and unknown SNRs: GLR-SPRT

- SNR at the local CR nodes is unknown.
- The energy detector output is used as the observation for the local-node SPRTs.
- Modelled as a Gaussian mean change under low SNR and a large number of samples.

At SU l:  H_0 : θ = θ_0;  H_1 : θ ≥ θ_1, where θ_0 = 0 and θ_1 is appropriately chosen.

  W_{n,l} = max{ Σ_{k=1}^n log[ f_{θ̂_n}(X_k) / f_{θ_0}(X_k) ],  Σ_{k=1}^n log[ f_{θ̂_n}(X_k) / f_{θ_1}(X_k) ] },

  N = inf{ n : W_{n,l} ≥ g(cn) },  g(t) ~ log(1/t) as t → 0.

Decide H_0 or H_1 according as θ̂_N ≤ θ* or θ̂_N ≥ θ*, where θ* solves

  D(f_{θ*} || f_{θ_0}) = D(f_{θ*} || f_{θ_1}),  D(f_θ || f_λ) = E_θ[ log( f_θ(X) / f_λ(X) ) ].

- Similar to the GLR test in the Neyman-Pearson approach for fixed-sample-size tests.
- The diminishing uncertainty of θ̂_n as an estimate of θ with n is incorporated through the varying stopping boundary g(cn).
- Asymptotically optimal over a broad range of θ as c → 0.
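For the Gaussian mean-change model the GLR statistic has a closed form (the MLE is the sample mean, and each sum collapses to n(θ̂ - a)²/2, with θ* the midpoint of θ_0 and θ_1). The sketch below assumes f_θ = N(θ, 1) and an illustrative cost c; it is not the thesis's exact implementation.

```python
import numpy as np
from math import log

def glr_sprt(x, theta0=0.0, theta1=1.0, c=1e-3):
    """GLR-SPRT sketch for a Gaussian mean change N(theta, 1).
    Stops when the GLR statistic crosses the shrinking boundary g(cn) = log(1/(cn))."""
    s = 0.0
    for n, xn in enumerate(x, start=1):
        s += float(xn)
        mean = s / n                                 # MLE theta_hat_n
        w = 0.5 * n * max((mean - theta0) ** 2, (mean - theta1) ** 2)
        if w >= log(1.0 / (c * n)):
            # For Gaussians, D(f_theta||f_theta0) = D(f_theta||f_theta1) at the midpoint.
            theta_star = 0.5 * (theta0 + theta1)
            return (1 if mean >= theta_star else 0), n
    return None, len(x)

rng = np.random.default_rng(3)
print(glr_sprt(rng.normal(0.8, 1.0, 1000)))   # H1-type data, SNR unknown: typically decides 1
print(glr_sprt(rng.normal(0.0, 1.0, 1000)))   # H0 data: typically decides 0
```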

Comparison between DualSPRT and GLR-SPRT

5 secondary nodes, channel gains (0, -1.5, -2.5, -4 and -6 dB), f_0 ~ N(0,1) and f_{1,l} ~ N(θ_l, 1), Var(Z_k) = 1.

[Table: E_DD under H_1 and H_0 for DualSPRT and GLR-SPRT at P_E = 0.1, 0.05, 0.01 (numerical entries lost in transcription).]

Rayleigh Slow-fading Case

- N is much less than the coherence time of the channel.
- The fading gain is unknown to the CR nodes.
- Under Rayleigh fading, the received signal power P_l is exponentially distributed.

  H_0 : f_{0,l} = N(0, σ²);  H_1 : f_{1,l} = N(θ, σ²),

where θ is an exponentially distributed r.v. reflecting the unknown channel gain.

Simulation Results

Number of nodes: 5, σ² = 1, θ ~ exp(1), Var(Z_k) = 1.

[Table: E_DD of DualSPRT and GLR-SPRT at P_FA = 0.1, 0.05, 0.01; slow fading between primary and secondary user under H_0 (numerical entries lost in transcription).]

[Table: E_DD of DualSPRT and GLR-SPRT at P_MD = 0.1, 0.08, 0.06; slow fading between primary and secondary user under H_1 (numerical entries lost in transcription).]

Comparison between SPRT-CSPRT and GLR-CSPRT

5 secondary nodes. Unknown SNR: channel gains (0, -1.5, -2.5, -4 and -6 dB), f_0 ~ N(0,1), f_{1,l} ~ N(θ_l, 1), Var(Z_k) = 1.

[Table: E_DD under H_1 and H_0 for SPRT-CSPRT and GLR-CSPRT at P_E = 0.1, 0.05, 0.01 (numerical entries lost in transcription).]

Rayleigh slow fading: σ² = 1, θ ~ exp(1), Var(Z_k) = 1.

[Table: E_DD under H_1 and H_0 for SPRT-CSPRT and GLR-CSPRT at P_E = 0.1, 0.07, 0.04 (numerical entries lost in transcription).]

Universal Sequential Hypothesis Testing

Universal sequential hypothesis testing problem

Model for a single cognitive radio; nonparametric (universal) setup:
- Noise statistics under no PU transmission are fully known.
- Transmit power, channel gains, modulation schemes etc. of PU transmissions are not available (SNR uncertainty).

i.i.d. observations X_i, i = 1, 2, ...:

  H_0 : X_i ~ P_0 (p.d.f. f_0), known;  H_1 : X_i ~ P_1 (p.d.f. f_1), unknown.

Discrete alphabet for P_0 and P_1

SPRT when P_0 and P_1 are fully known, for γ_0, γ_1 > 0:

  N = inf{ n : W_n = Σ_{k=1}^n log[ P_1(X_k) / P_0(X_k) ] ∉ (-γ_0, γ_1) },
  δ = H_1 if W_N ≥ γ_1;  H_0 if W_N ≤ -γ_0.

Here P_0 is known but P_1 is not. Replace W_n by

  Ŵ_n = -L_n(X_1^n) - log P_0(X_1^n) - nλ/2,  λ > 0,

where L_n(X_1^n) is the codelength of a universal lossless source code for X_1^n.
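The test above can be sketched for a binary alphabet by using the Krichevsky-Trofimov (KT) sequential code for L_n. The Bernoulli models, λ and the thresholds below are illustrative assumptions, not the thesis's parameters.

```python
import numpy as np
from math import log

def universal_test(x, p0, lam=0.2, gamma0=5.0, gamma1=5.0):
    """Universal sequential test sketch for a binary alphabet.
    W_hat_n = -L_n(x_1^n) - log P0(x_1^n) - n*lam/2, with L_n the sequential
    KT codelength; H0: Bernoulli(p0); P1 unknown with D(P1||P0) >= lam."""
    n0 = n1 = 0
    L = logp0 = 0.0
    for n, b in enumerate(x, start=1):
        p_kt = (n1 + 0.5) / (n0 + n1 + 1.0)    # KT predictive probability of a 1
        L -= log(p_kt if b else 1.0 - p_kt)    # codelength increment (nats)
        n1 += b
        n0 += 1 - b
        logp0 += log(p0 if b else 1.0 - p0)
        w = -L - logp0 - n * lam / 2.0
        if w >= gamma1:
            return 1, n
        if w <= -gamma0:
            return 0, n
    return None, len(x)

rng = np.random.default_rng(5)
print(universal_test(rng.binomial(1, 0.5, 2000), p0=0.1))  # P1 far from P0: typically decides 1
print(universal_test(rng.binomial(1, 0.1, 2000), p0=0.1))  # data from P0: typically decides 0
```

Note that -L_n - log P_0(x_1^n) ≤ 0 whenever the data really come from P_0 (a mixture code is never shorter than the ML code), so under H_0 the statistic drifts down at rate λ/2, matching the drift discussion that follows.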

Discrete alphabet case: discussion of the test

- By the Shannon-McMillan theorem, for any stationary ergodic source,
    lim_{n→∞} -n^{-1} log P(X_1^n) = H(X) a.s.  (H(X) is the entropy rate).
- Consider universal codes which satisfy lim_{n→∞} n^{-1} L_n = H(X) a.s., at least for i.i.d. sources.
- Under H_1: E_1[ -log P_0(X_1^n) ] = n H_1(X) + n D(P_1 || P_0), and L(X_1^n) ≈ n H_1(X) for large n.
- Thus the average drift under H_1 is D(P_1 || P_0) - λ/2, and the average drift under H_0 is -λ/2.
- Thus consider P_1 belonging to the class C = {P_1 : D(P_1 || P_0) ≥ λ}.

Discrete alphabet case: asymptotic properties

(A): a stronger version of pointwise universality.

Theorem.
(1) P_0(N < ∞) = 1 and P_1(N < ∞) = 1.
(2) P_FA = P_0(Ŵ_N ≥ γ_1) ≤ exp(-γ_1).
(3) Under (A), P_MD = P_1(Ŵ_N ≤ -γ_0) = O(exp(-γ_0 s)) for some s > 0.
(4) N / log γ_0 → 2/λ, P_0-a.s., and N / log γ_1 → 1/(D(P_1 || P_0) - λ/2), P_1-a.s., as γ_0, γ_1 → ∞.
(5) Under (A), if E_1[(log P_1(X_1))^{p+1}] < ∞ and E_1[(log P_0(X_1))^{p+1}] < ∞ for some p ≥ 1, then for all 0 < q ≤ p,
    E_0[N^q] / (log γ_0)^q → (2/λ)^q  and  E_1[N^q] / (log γ_1)^q → ( 1/(D(P_1 || P_0) - λ/2) )^q.

141 Continuous alphabet for P_0 and P_1

- Need to estimate sum_{k=1}^n log f_1(X_k).
- If E[|log f_1(X_1)|] < ∞, then by the SLLN, -(1/n) sum_{k=1}^n log f_1(X_k) → h(X_1) a.s.
- Quantize: X_k → X_k^Δ = Δ ⌊X_k/Δ⌋.
- Use universal lossless coding on X_1^Δ, X_2^Δ, ..., X_n^Δ.
- H(X_1^Δ) + log Δ → h(X_1) as Δ → 0.
- The test is modified to W_n = -L_n(X_{1:n}^Δ) - n log Δ - sum_{k=1}^n log f_0(X_k) - nλ/2.
- f_1 belongs to the class C = {f_1 : D(f_1 || f_0) ≥ λ}.
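The convergence H(X_1^Δ) + log Δ → h(X_1) used above can be illustrated in closed form. A minimal sketch, assuming an Exp(1) source (differential entropy 1 nat), for which the bin probabilities of the Δ-quantizer are available exactly:

```python
import math

def quantized_entropy_exponential(delta):
    # Exact entropy (nats) of X^Δ = Δ*floor(X/Δ) for X ~ Exp(1):
    # P(bin j) = e^{-jΔ}(1 - e^{-Δ}),  j = 0, 1, 2, ...
    # H = Δ*E[j] - log(1 - e^{-Δ}),  with E[j] = e^{-Δ}/(1 - e^{-Δ}).
    q = math.exp(-delta)
    ej = q / (1 - q)
    return delta * ej - math.log(1 - q)

h_true = 1.0   # differential entropy of Exp(1) is 1 nat
for delta in (0.5, 0.1, 0.01):
    approx = quantized_entropy_exponential(delta) + math.log(delta)
    print(delta, approx)   # approaches h_true as delta -> 0
```

Already at Δ = 0.01 the sum H(X^Δ) + log Δ agrees with h(X) to about four decimal places, which is why a fine uniform quantizer plus a universal code estimates the differential entropy term in W_n.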

147 Continuous alphabet case: Why uniform quantization?

- Uniform scalar quantization with optimal source coding is optimal at high rates.
- An adaptive uniform quantizer makes {X_n^Δ} non-i.i.d., and L_n is unable to learn the underlying distribution in that scenario.
- Non-uniform partitions, with step Δ_j in the j-th bin and p.m.f. p_j, require knowledge of p_j, which is unknown under H_1.
- Since f_0(X_i^Δ) Δ ≈ p_0(X_i^Δ), the statistic becomes W_n = -L_n(X_{1:n}^Δ) - sum_{i=1}^n log p_0(X_i^Δ) - nλ/2.
- Range of the quantizer: chosen so that f_1's tail probability beyond a fixed boundary is less than a small specified value.

151 LZSLRT algorithm

- Uses the Lempel-Ziv incremental parsing technique (LZ78).
- A correction term nε_n is added to W_n, with ε_n = C (1/log n + (log log n)/n + (log log n)/log n), where C depends on the size of the quantized alphabet.
- LZSLRT performance is close to that of the nearly optimal parametric GLR sequential tests, and is better for some classes of distributions (e.g., Pareto).
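LZ78 incremental parsing, the core of LZSLRT, is easy to sketch. This is a minimal illustration, not the thesis implementation; the codelength formula is a standard per-phrase bound (dictionary index plus one new symbol), not the exact bit count of a particular coder:

```python
import math

def lz78_parse(xs):
    # LZ78 incremental parsing: split xs into phrases, each phrase =
    # (longest previously seen phrase) + one new symbol.
    dictionary = {(): 0}
    phrases = []
    cur = ()
    for x in xs:
        nxt = cur + (x,)
        if nxt in dictionary:
            cur = nxt
        else:
            dictionary[nxt] = len(dictionary)
            phrases.append(nxt)
            cur = ()
    if cur:                    # trailing (already-seen) phrase
        phrases.append(cur)
    return phrases

def lz78_codelength(xs, alphabet_size):
    # Each phrase costs about log2(c) bits for the dictionary index
    # plus log2(|A|) bits for the new symbol, c = number of phrases.
    c = len(lz78_parse(xs))
    return c * (math.log2(c + 1) + math.log2(alphabet_size))

xs = [0, 1, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0] * 50
print(lz78_parse(xs[:7]))
print(lz78_codelength(xs, 2) / len(xs))   # bits per symbol
```

On a highly repetitive sequence like this one, the per-symbol codelength falls well below log2 |A| = 1 bit as the dictionary fills up, which is exactly the universality L_n/n → H(X) that the test relies on.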

155 KTSLRT algorithm

- L_n(X_{1:n}^Δ) is obtained from the combined use of the Krichevsky-Trofimov (K-T) estimator and an arithmetic encoder (AE).
- The K-T estimate of the source distribution, for a finite-alphabet source with alphabet A, is P_c(x_1^n) = prod_{t=1}^n (v(x_t | x_1^{t-1}) + 1/2) / (t - 1 + |A|/2), where v(x_t | x_1^{t-1}) is the number of occurrences of x_t in x_1^{t-1}.
- Using an AE with this estimate: L_n(x_1^n) < -log P_c(x_1^n) + 2.
- Universal codes defined via the K-T estimator and AE are nearly optimal.
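The K-T estimator above admits a direct sequential implementation. A minimal sketch for an i.i.d. finite-alphabet source, where the conditioning reduces to symbol counts; the alphabet and the example sequence are illustrative:

```python
import math

def kt_neg_log_prob(xs, alphabet):
    # -log2 P_c(x_1^n) for the Krichevsky-Trofimov estimator:
    #   P_c(x_1^n) = prod_t (v(x_t | x_1^{t-1}) + 1/2) / (t - 1 + |A|/2)
    # where v(. | x_1^{t-1}) counts past occurrences of the symbol.
    counts = {a: 0 for a in alphabet}
    A = len(alphabet)
    nlp = 0.0
    for t, x in enumerate(xs, start=1):
        p = (counts[x] + 0.5) / (t - 1 + A / 2)
        nlp -= math.log2(p)
        counts[x] += 1
    return nlp

# An arithmetic encoder driven by P_c then gives
# L_n(x_1^n) < -log2 P_c(x_1^n) + 2 bits.
seq = [0, 0, 0, 1]
print(kt_neg_log_prob(seq, [0, 1]))
```

For the binary sequence 0001, P_c = (1/2)(3/4)(5/6)(1/8) = 5/128, so the ideal codelength is 7 − log2 5 ≈ 4.68 bits.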

158 Performance Comparison: LZSLRT

- E_DD = 0.5 (E_{H_1}[N] + E_{H_0}[N]) versus P_E = 0.5 (P_FA + P_MD).
- Table (numerical values lost in transcription): E_DD for SPRT, GLR-Lai and LZSLRT at P_E = 0.05, 0.01, ...; f_0 ~ N(0, 5) and f_1 ~ N(3, 5).
- Table (numerical values lost in transcription): the same comparison for f_0 ~ P(10, 2) and f_1 ~ P(3, 2).

159 Performance Comparison: KTSLRT

- Gaussian case: f_1 ~ N(0, 5) and f_0 ~ N(0, 1), 8-bit uniform quantizer.
- Figure: E_DD versus P_E for KTSLRT (S=1), KTSLRT (S=0) and LZSLRT.

161 Performance Comparison: Lognormal and Pareto distribution examples

- f_1 ~ lnN(3, 3) and f_0 ~ lnN(0, 3).
- f_1 ~ P(3, 2) and f_0 ~ P(10, 2), support set (2, 10).
- Figures: E_DD versus P_E for KTSLRT (S=1) and LZSLRT in the two examples.

165 Performance Comparison: Different estimators

- Gaussian case (f_1 ~ N(0, 5) and f_0 ~ N(0, 1)) with different estimators.
- 1-NN differential entropy estimator: ĥ_n = (1/n) sum_{i=1}^n log ρ(i) + log(n - 1) + γ + 1, where ρ(i) = min_{1 ≤ j ≤ n, j ≠ i} |X_i - X_j| and γ is the Euler-Mascheroni constant.
- Kernel density estimator at a point z: f̂_n(z) = (1/(n w_n)) sum_{i=1}^n K((z - X_i)/w_n), with kernel K and bandwidth w_n.
- Figure: E_DD versus P_E for KTSLRT (S=1), the test with the kernel density estimator, and the test with the 1-NN differential entropy estimator.
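The 1-NN estimator above can be sketched directly; in one dimension the nearest neighbour of a sample is adjacent to it in sorted order, so no pairwise search is needed. The Gaussian data below are illustrative, and since some variants of this estimator use log 2 in place of the "+1", a small bias relative to the true entropy is to be expected:

```python
import math
import random

def nn_entropy(xs):
    # 1-NN differential entropy estimator from the slide (nats):
    #   h_n = (1/n) sum_i log rho(i) + log(n-1) + gamma + 1
    gamma = 0.5772156649015329   # Euler-Mascheroni constant
    n = len(xs)
    s = sorted(xs)
    total = 0.0
    for i in range(n):
        if i == 0:
            rho = s[1] - s[0]
        elif i == n - 1:
            rho = s[-1] - s[-2]
        else:
            rho = min(s[i] - s[i - 1], s[i + 1] - s[i])
        total += math.log(rho)
    return total / n + math.log(n - 1) + gamma + 1

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(2000)]
print(nn_entropy(xs))   # true h for N(0,1) is 0.5*log(2*pi*e) ≈ 1.419
```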

169 Performance Comparison: Hoeffding test

- Comparison with the Hoeffding test: P_0 ~ B(8, 0.2), P_1 ~ B(8, 0.5).
- Hoeffding test: an asymptotically optimal universal fixed-sample-size test for finite-alphabet sources.
- δ_FSS = I{D(Γ_n || P_0) ≥ η}, where Γ_n is the empirical distribution of X_1^n.
- Figure: E_DD versus P_E for the KT estimator with AE and the Hoeffding test.
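The Hoeffding test above can be sketched in a few lines; the binary alphabet and threshold below are illustrative, not the B(8, p) setup of the experiment:

```python
import math
from collections import Counter

def kl(p, q):
    # D(p || q) in nats; p, q are dicts over a common finite alphabet
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

def hoeffding_test(xs, p0, eta):
    # delta_FSS = I{ D(Gamma_n || P_0) >= eta },
    # Gamma_n = empirical distribution (type) of x_1^n
    n = len(xs)
    gamma_n = {x: c / n for x, c in Counter(xs).items()}
    return 1 if kl(gamma_n, p0) >= eta else 0

p0 = {0: 0.5, 1: 0.5}
print(hoeffding_test([0, 1, 0, 1, 0, 1, 0, 1], p0, 0.1))  # type equals P_0
print(hoeffding_test([1, 1, 1, 1, 1, 1, 1, 1], p0, 0.1))  # type far from P_0
```

Because the statistic only compares the type against P_0, the test needs no knowledge of P_1 — the same universality the sequential tests achieve through source coding.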

174 Decentralized case: Model and Algorithm

- Sequential + Universal + Cooperative.
- Observations are i.i.d. at any CR and independent across CRs.
- Algorithm: at the SUs, LZSLRT/KTSLRT; at the FC, SPRT.
- Noisy MAC between the SUs and the FC; the FC SPRT is a binary hypothesis test of g_{μ_1} versus g_{μ_0}.
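The FC-side SPRT can be sketched generically. The Gaussian densities standing in for g_{μ_1} and g_{μ_0} below (means ±1, unit variance, common constants dropped from the log-densities) are assumed for illustration only:

```python
import math
import random

def sprt(samples, logf1, logf0, a, b):
    # Wald's SPRT: accumulate the log-likelihood ratio; decide H1 when
    # it reaches b, H0 when it falls to -a.  Returns (decision, n), or
    # (None, n) if the stream ends before a threshold is crossed.
    w = 0.0
    for n, y in enumerate(samples, start=1):
        w += logf1(y) - logf0(y)
        if w >= b:
            return 1, n
        if w <= -a:
            return 0, n
    return None, len(samples)

mu1, mu0 = 1.0, -1.0
logf1 = lambda y: -0.5 * (y - mu1) ** 2   # log-density up to a constant
logf0 = lambda y: -0.5 * (y - mu0) ** 2
random.seed(3)
ys = [random.gauss(mu1, 1.0) for _ in range(1000)]
print(sprt(ys, logf1, logf0, a=4.0, b=4.0))
```

With symmetric thresholds a = b = 4 the error probabilities are roughly e^{-4} (before overshoot corrections), and the test typically stops within a handful of samples at this SNR.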

178 Decentralized case: Performance Comparison

- b_1 = 1, b_0 = -1, I = 2, L = 5 and Z_k ~ N(0, 1).
- Figures: E_DD versus P_E for DualSPRT, KTSLRT-SPRT (S=1) and LZSLRT-SPRT, with f_0,l ~ N(0, 1) and f_1,l ~ N(0, 5) for 1 ≤ l ≤ L, and with f_0,l ~ P(10, 2) and f_1,l ~ P(3, 2) for 1 ≤ l ≤ L.
- Analysis using perturbed random walk theory: Ŵ_n = S_n + ξ_n, with ξ_n/n → 0 a.s.

179 Multihypothesis Decentralized Sequential Testing (distributions completely known)

189 Multihypothesis Decentralized Sequential Testing: Algorithm-1 (DMSLRT-1)

- M hypotheses (M > 2), H_i : X_n ~ f_i, i = 0, ..., M - 1.
- E_i[log (f_k(X_1)/f_j(X_1))] = D(f_i || f_j) - D(f_i || f_k).
- min_{j ≠ k} E_i[log (f_k(X_1)/f_j(X_1))] is < 0 when k ≠ i and > 0 when k = i.
- At local node l, keep the statistics {W_{n,l}^{k,j}, 0 ≤ k, j ≤ M - 1}: W_{n,l}^{k,j} = W_{n-1,l}^{k,j} + log (f_l^k(X_{n,l})/f_l^j(X_{n,l})), with W_{0,l}^{k,j} = 0.
- N_l = inf{n : W_{n,l}^{k,j} > A for all j ≠ k and some k}, or equivalently N_l = inf{n : max_k min_{j ≠ k} W_{n,l}^{k,j} > A}.
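The local stopping rule max_k min_{j ≠ k} W_{n,l}^{k,j} > A can be sketched as follows; the three unit-variance Gaussian hypotheses and the threshold are illustrative:

```python
import random

def dmslrt_local_stop(xs, log_densities, A):
    # Run all pairwise LLR random walks W^{k,j}_n and stop at the first
    # n with max_k min_{j != k} W^{k,j}_n > A.
    # Returns (stopping time, declared hypothesis index).
    M = len(log_densities)
    W = [[0.0] * M for _ in range(M)]
    for n, x in enumerate(xs, start=1):
        ll = [f(x) for f in log_densities]
        for k in range(M):
            for j in range(M):
                if j != k:
                    W[k][j] += ll[k] - ll[j]
        for k in range(M):
            if min(W[k][j] for j in range(M) if j != k) > A:
                return n, k
    return None, None

# Illustration: M = 3 Gaussian hypotheses H_m : X ~ N(m, 1)
random.seed(7)
logf = [lambda x, m=m: -0.5 * (x - m) ** 2 for m in range(3)]
xs = [random.gauss(2.0, 1.0) for _ in range(500)]
print(dmslrt_local_stop(xs, logf, A=5.0))  # typically declares hypothesis 2
```

Only the true hypothesis k = i has all its pairwise walks drifting upward (the sign property on the slide), so the max-min statistic singles it out.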

193 Algorithm-1 (DMSLRT-1) contd.

- Use reflected random walks at the local nodes: max(W_{n,l}, 0).
- If node l decides H_i, then at times k ≥ N_l node l transmits b_i.
- Instead of physical-layer fusion, TDMA is used.
- The FC uses the same test with hypotheses G_m : Y_k ~ f_FC^m = N(b_m, σ²).

194 Numerical results

- H_m : X_k,l ~ N(m, 1), m = 0, ..., 4; number of local nodes = 5.
- Figure: E_DD versus P_FA, comparison among different multihypothesis schemes (DMSLRT-1, DualTest D1, MSLRT-1:Test D1, Test D1:MSLRT).

198 Algorithm-2 (DMSLRT-2)

- Goal: reduce false alarms caused by Gaussian noise before the first transmission from the local nodes.
- The FC statistic is modified as F_n^{i,j} = F_n^{i,0} - F_n^{j,0}, where F_n^{i,0} = max(F_{n-1}^{i,0} + log (f_FC^i(Y_n)/f_Z(Y_n)), 0).
- The expected drift of F_n^{i,0} is > 0 only when E[Y_n] > b_i/2.
- Positive b_i's make E[F_n^{i,j}] negligible before the first transmission.
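The modified FC recursion can be sketched as below; the levels b_i and the unit-variance densities are illustrative. Pure noise keeps the reflected statistics pinned at zero, matching the drift condition E[Y_n] > b_i/2:

```python
def fc_statistics(ys, log_f_fc, log_f_z):
    # DMSLRT-2 fusion-centre recursion:
    #   F^{i,0}_n = max(F^{i,0}_{n-1} + log f^i_FC(Y_n) - log f_Z(Y_n), 0)
    #   F^{i,j}_n = F^{i,0}_n - F^{j,0}_n
    # f_Z is the noise-only density; reflecting at 0 keeps the
    # statistics near 0 before the local nodes start transmitting.
    M = len(log_f_fc)
    F = [0.0] * M
    for y in ys:
        lz = log_f_z(y)
        F = [max(F[i] + log_f_fc[i](y) - lz, 0.0) for i in range(M)]
    return F

# Illustration (assumed model): f^i_FC = N(b_i, 1), f_Z = N(0, 1)
b = [1.0, 2.0]
log_f_fc = [lambda y, bi=bi: -0.5 * (y - bi) ** 2 for bi in b]
log_f_z = lambda y: -0.5 * y ** 2
# Pure noise: every F^{i,0} has negative drift and stays at zero...
print(fc_statistics([0.0] * 50, log_f_fc, log_f_z))
# ...while transmissions at level b_1 = 2 drive both statistics up,
# F^{1,0} fastest.
print(fc_statistics([2.0] * 5, log_f_fc, log_f_z))
```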

199 Numerical results

- SNRs (-10 dB, -6 dB, 0 dB and 6 dB); the PU with SNR -10 dB uses the channel.
- Figure: E_DD versus P_FA, comparison between DMSLRT-1 and DMSLRT-2.

205 Analysis of Algorithm-1 (DMSLRT-1)

- E_DD analysis at an SU: the dominant event is the reflected random walk with the minimum positive expected drift, giving E_DD^l = E_i[N_l] ≈ A / min_{j ≠ i} D(f_l^i || f_l^j).
- Nonlinear renewal theory is used to take care of the overshoots.
- P_FA analysis at an SU: the dominant event is in {W_{n,l}^{k,j}, k ≠ i} — when the expected drift is most negative.
- With N_l^{k,j} = inf{n : W_{n,l}^{k,j} > A} and ĵ = argmin_{j ≠ i} D(f_l^i || f_l^j), P_FA^l ≈ P(min_{k ≠ i} N_l^{k,i} < N_l^{i,ĵ}).
- E_DD analysis of DMSLRT-1: E_DD ≤ E_i[max_l N_l] + E_i[N_FC].
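The first-order delay approximation E_i[N_l] ≈ A / min_{j ≠ i} D(f_l^i || f_l^j) is easy to evaluate for the Gaussian setup of the numerical results (H_m : X ~ N(m, 1)); the threshold A = 5 is illustrative:

```python
# For unit-variance Gaussians, D(N(i,1) || N(j,1)) = (i - j)^2 / 2, so
# with means 0..4 the minimum divergence to a neighbouring hypothesis
# is 1/2 for every i, and the approximate local delay is 2A.

def kl_gauss_unit_var(mi, mj):
    # KL divergence between N(mi, 1) and N(mj, 1), in nats
    return (mi - mj) ** 2 / 2.0

def approx_delay(i, means, A):
    dmin = min(kl_gauss_unit_var(means[i], m) for m in means if m != means[i])
    return A / dmin

means = [0, 1, 2, 3, 4]
print([approx_delay(i, means, A=5.0) for i in range(5)])  # all 10.0
```

Overshoot corrections from nonlinear renewal theory add a constant to this first-order value; the delay at the FC, E_i[N_FC], then accounts for the rest of E_DD.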


More information

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy

Information Theory. Coding and Information Theory. Information Theory Textbooks. Entropy Coding and Information Theory Chris Williams, School of Informatics, University of Edinburgh Overview What is information theory? Entropy Coding Information Theory Shannon (1948): Information theory is

More information

Distributed Estimation and Detection for Smart Grid

Distributed Estimation and Detection for Smart Grid Distributed Estimation and Detection for Smart Grid Texas A&M University Joint Wor with: S. Kar (CMU), R. Tandon (Princeton), H. V. Poor (Princeton), and J. M. F. Moura (CMU) 1 Distributed Estimation/Detection

More information

Chapter 4: Continuous channel and its capacity

Chapter 4: Continuous channel and its capacity meghdadi@ensil.unilim.fr Reference : Elements of Information Theory by Cover and Thomas Continuous random variable Gaussian multivariate random variable AWGN Band limited channel Parallel channels Flat

More information

1.1 Basis of Statistical Decision Theory

1.1 Basis of Statistical Decision Theory ECE598: Information-theoretic methods in high-dimensional statistics Spring 2016 Lecture 1: Introduction Lecturer: Yihong Wu Scribe: AmirEmad Ghassami, Jan 21, 2016 [Ed. Jan 31] Outline: Introduction of

More information

Multi User Detection I

Multi User Detection I January 12, 2005 Outline Overview Multiple Access Communication Motivation: What is MU Detection? Overview of DS/CDMA systems Concept and Codes used in CDMA CDMA Channels Models Synchronous and Asynchronous

More information

Quickest Anomaly Detection: A Case of Active Hypothesis Testing

Quickest Anomaly Detection: A Case of Active Hypothesis Testing Quickest Anomaly Detection: A Case of Active Hypothesis Testing Kobi Cohen, Qing Zhao Department of Electrical Computer Engineering, University of California, Davis, CA 95616 {yscohen, qzhao}@ucdavis.edu

More information

Digital Band-pass Modulation PROF. MICHAEL TSAI 2011/11/10

Digital Band-pass Modulation PROF. MICHAEL TSAI 2011/11/10 Digital Band-pass Modulation PROF. MICHAEL TSAI 211/11/1 Band-pass Signal Representation a t g t General form: 2πf c t + φ t g t = a t cos 2πf c t + φ t Envelope Phase Envelope is always non-negative,

More information

Lecture 7: Wireless Channels and Diversity Advanced Digital Communications (EQ2410) 1

Lecture 7: Wireless Channels and Diversity Advanced Digital Communications (EQ2410) 1 Wireless : Wireless Advanced Digital Communications (EQ2410) 1 Thursday, Feb. 11, 2016 10:00-12:00, B24 1 Textbook: U. Madhow, Fundamentals of Digital Communications, 2008 1 / 15 Wireless Lecture 1-6 Equalization

More information

Scalable robust hypothesis tests using graphical models

Scalable robust hypothesis tests using graphical models Scalable robust hypothesis tests using graphical models Umamahesh Srinivas ipal Group Meeting October 22, 2010 Binary hypothesis testing problem Random vector x = (x 1,...,x n ) R n generated from either

More information

TCP over Cognitive Radio Channels

TCP over Cognitive Radio Channels 1/43 TCP over Cognitive Radio Channels Sudheer Poojary Department of ECE, Indian Institute of Science, Bangalore IEEE-IISc I-YES seminar 19 May 2016 2/43 Acknowledgments The work presented here was done

More information

Efficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel

Efficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel Efficient Use of Joint Source-Destination Cooperation in the Gaussian Multiple Access Channel Ahmad Abu Al Haija ECE Department, McGill University, Montreal, QC, Canada Email: ahmad.abualhaija@mail.mcgill.ca

More information

Algorithms for Dynamic Spectrum Access with Learning for Cognitive Radio

Algorithms for Dynamic Spectrum Access with Learning for Cognitive Radio Algorithms for Dynamic Spectrum Access with Learning for Cognitive Radio Jayakrishnan Unnikrishnan, Student Member, IEEE, and Venugopal V. Veeravalli, Fellow, IEEE 1 arxiv:0807.2677v4 [cs.ni] 6 Feb 2010

More information

Collaborative Spectrum Sensing in the Presence of Byzantine Attacks in Cognitive Radio Networks

Collaborative Spectrum Sensing in the Presence of Byzantine Attacks in Cognitive Radio Networks Collaborative Spectrum Sensing in the Presence of Byzantine Attacks in Cognitive Radio Networks Priyank Anand, Ankit Singh Rawat Department of Electrical Engineering Indian Institute of Technology Kanpur

More information

Introduction to Statistical Inference

Introduction to Statistical Inference Structural Health Monitoring Using Statistical Pattern Recognition Introduction to Statistical Inference Presented by Charles R. Farrar, Ph.D., P.E. Outline Introduce statistical decision making for Structural

More information

Improved Spectrum Utilization in Cognitive Radio Systems

Improved Spectrum Utilization in Cognitive Radio Systems Improved Spectrum Utilization in Cognitive Radio Systems Lei Cao and Ramanarayanan Viswanathan Department of Electrical Engineering The University of Mississippi Jan. 15, 2014 Lei Cao and Ramanarayanan

More information

Collaborative Spectrum Sensing Algorithm Based on Exponential Entropy in Cognitive Radio Networks

Collaborative Spectrum Sensing Algorithm Based on Exponential Entropy in Cognitive Radio Networks S S symmetry Article Collaborative Spectrum Sensing Algorithm Based on Exponential Entropy in Cognitive Radio Networks Fang Ye *, Xun Zhang and Yibing i College of Information and Communication Engineering,

More information

CHAPTER 14. Based on the info about the scattering function we know that the multipath spread is T m =1ms, and the Doppler spread is B d =0.2 Hz.

CHAPTER 14. Based on the info about the scattering function we know that the multipath spread is T m =1ms, and the Doppler spread is B d =0.2 Hz. CHAPTER 4 Problem 4. : Based on the info about the scattering function we know that the multipath spread is T m =ms, and the Doppler spread is B d =. Hz. (a) (i) T m = 3 sec (ii) B d =. Hz (iii) ( t) c

More information

LECTURE 3. Last time:

LECTURE 3. Last time: LECTURE 3 Last time: Mutual Information. Convexity and concavity Jensen s inequality Information Inequality Data processing theorem Fano s Inequality Lecture outline Stochastic processes, Entropy rate

More information

International Journal of Modern Trends in Engineering and Research e-issn No.: , Date: 2-4 July, 2015

International Journal of Modern Trends in Engineering and Research   e-issn No.: , Date: 2-4 July, 2015 International Journal of Modern Trends in Engineering and Research www.ijmter.com e-issn No.:2349-9745, Date: 2-4 July, 2015 Multiple Primary User Spectrum Sensing Using John s Detector at Low SNR Haribhau

More information

Statistical Models and Algorithms for Real-Time Anomaly Detection Using Multi-Modal Data

Statistical Models and Algorithms for Real-Time Anomaly Detection Using Multi-Modal Data Statistical Models and Algorithms for Real-Time Anomaly Detection Using Multi-Modal Data Taposh Banerjee University of Texas at San Antonio Joint work with Gene Whipps (US Army Research Laboratory) Prudhvi

More information

Decentralized Sequential Change Detection Using Physical Layer Fusion

Decentralized Sequential Change Detection Using Physical Layer Fusion Decentralized Sequential Change Detection Using hysical Layer Fusion Leena Zacharias ECE Department Indian Institute of Science Bangalore, India Email: zleena@ece.iisc.ernet.in ajesh Sundaresan ECE Department

More information

Bayesian Quickest Change Detection Under Energy Constraints

Bayesian Quickest Change Detection Under Energy Constraints Bayesian Quickest Change Detection Under Energy Constraints Taposh Banerjee and Venugopal V. Veeravalli ECE Department and Coordinated Science Laboratory University of Illinois at Urbana-Champaign, Urbana,

More information

Direct-Sequence Spread-Spectrum

Direct-Sequence Spread-Spectrum Chapter 3 Direct-Sequence Spread-Spectrum In this chapter we consider direct-sequence spread-spectrum systems. Unlike frequency-hopping, a direct-sequence signal occupies the entire bandwidth continuously.

More information

Decentralized Detection In Wireless Sensor Networks

Decentralized Detection In Wireless Sensor Networks Decentralized Detection In Wireless Sensor Networks Milad Kharratzadeh Department of Electrical & Computer Engineering McGill University Montreal, Canada April 2011 Statistical Detection and Estimation

More information

Transmitter-Receiver Cooperative Sensing in MIMO Cognitive Network with Limited Feedback

Transmitter-Receiver Cooperative Sensing in MIMO Cognitive Network with Limited Feedback IEEE INFOCOM Workshop On Cognitive & Cooperative Networks Transmitter-Receiver Cooperative Sensing in MIMO Cognitive Network with Limited Feedback Chao Wang, Zhaoyang Zhang, Xiaoming Chen, Yuen Chau. Dept.of

More information

4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information

4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information 4F5: Advanced Communications and Coding Handout 2: The Typical Set, Compression, Mutual Information Ramji Venkataramanan Signal Processing and Communications Lab Department of Engineering ramji.v@eng.cam.ac.uk

More information

EECS564 Estimation, Filtering, and Detection Exam 2 Week of April 20, 2015

EECS564 Estimation, Filtering, and Detection Exam 2 Week of April 20, 2015 EECS564 Estimation, Filtering, and Detection Exam Week of April 0, 015 This is an open book takehome exam. You have 48 hours to complete the exam. All work on the exam should be your own. problems have

More information

Chapter 9 Fundamental Limits in Information Theory

Chapter 9 Fundamental Limits in Information Theory Chapter 9 Fundamental Limits in Information Theory Information Theory is the fundamental theory behind information manipulation, including data compression and data transmission. 9.1 Introduction o For

More information

Performance Analysis of Spread Spectrum CDMA systems

Performance Analysis of Spread Spectrum CDMA systems 1 Performance Analysis of Spread Spectrum CDMA systems 16:33:546 Wireless Communication Technologies Spring 5 Instructor: Dr. Narayan Mandayam Summary by Liang Xiao lxiao@winlab.rutgers.edu WINLAB, Department

More information

A CUSUM approach for online change-point detection on curve sequences

A CUSUM approach for online change-point detection on curve sequences ESANN 22 proceedings, European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning. Bruges Belgium, 25-27 April 22, i6doc.com publ., ISBN 978-2-8749-49-. Available

More information

Lecture 22: Error exponents in hypothesis testing, GLRT

Lecture 22: Error exponents in hypothesis testing, GLRT 10-704: Information Processing and Learning Spring 2012 Lecture 22: Error exponents in hypothesis testing, GLRT Lecturer: Aarti Singh Scribe: Aarti Singh Disclaimer: These notes have not been subjected

More information

Class 2 & 3 Overfitting & Regularization

Class 2 & 3 Overfitting & Regularization Class 2 & 3 Overfitting & Regularization Carlo Ciliberto Department of Computer Science, UCL October 18, 2017 Last Class The goal of Statistical Learning Theory is to find a good estimator f n : X Y, approximating

More information

Binary Consensus for Cooperative Spectrum Sensing in Cognitive Radio Networks

Binary Consensus for Cooperative Spectrum Sensing in Cognitive Radio Networks Binary Consensus for Cooperative Spectrum Sensing in Cognitive Radio Networks Shwan Ashrafi, ehrzad almirchegini, and Yasamin ostofi Department of Electrical and Computer Engineering University of New

More information

Active Sequential Hypothesis Testing with Application to a Visual Search Problem

Active Sequential Hypothesis Testing with Application to a Visual Search Problem 202 IEEE International Symposium on Information Theory Proceedings Active Sequential Hypothesis Testing with Application to a Visual Search Problem Nidhin Koshy Vaidhiyan nidhinkv@ece.iisc.ernet.in Department

More information

Early Detection of a Change in Poisson Rate After Accounting For Population Size Effects

Early Detection of a Change in Poisson Rate After Accounting For Population Size Effects Early Detection of a Change in Poisson Rate After Accounting For Population Size Effects School of Industrial and Systems Engineering, Georgia Institute of Technology, 765 Ferst Drive NW, Atlanta, GA 30332-0205,

More information

Lecture 8 Inequality Testing and Moment Inequality Models

Lecture 8 Inequality Testing and Moment Inequality Models Lecture 8 Inequality Testing and Moment Inequality Models Inequality Testing In the previous lecture, we discussed how to test the nonlinear hypothesis H 0 : h(θ 0 ) 0 when the sample information comes

More information

ELEG 5633 Detection and Estimation Signal Detection: Deterministic Signals

ELEG 5633 Detection and Estimation Signal Detection: Deterministic Signals ELEG 5633 Detection and Estimation Signal Detection: Deterministic Signals Jingxian Wu Department of Electrical Engineering University of Arkansas Outline Matched Filter Generalized Matched Filter Signal

More information

Detection theory. H 0 : x[n] = w[n]

Detection theory. H 0 : x[n] = w[n] Detection Theory Detection theory A the last topic of the course, we will briefly consider detection theory. The methods are based on estimation theory and attempt to answer questions such as Is a signal

More information

Asymptotically Optimal Quickest Change Detection in Distributed Sensor Systems

Asymptotically Optimal Quickest Change Detection in Distributed Sensor Systems This article was downloaded by: [University of Illinois at Urbana-Champaign] On: 23 May 2012, At: 16:03 Publisher: Taylor & Francis Informa Ltd Registered in England and Wales Registered Number: 1072954

More information

Lecture 22: Final Review

Lecture 22: Final Review Lecture 22: Final Review Nuts and bolts Fundamental questions and limits Tools Practical algorithms Future topics Dr Yao Xie, ECE587, Information Theory, Duke University Basics Dr Yao Xie, ECE587, Information

More information

ELEC546 Review of Information Theory

ELEC546 Review of Information Theory ELEC546 Review of Information Theory Vincent Lau 1/1/004 1 Review of Information Theory Entropy: Measure of uncertainty of a random variable X. The entropy of X, H(X), is given by: If X is a discrete random

More information

CHANGE DETECTION WITH UNKNOWN POST-CHANGE PARAMETER USING KIEFER-WOLFOWITZ METHOD

CHANGE DETECTION WITH UNKNOWN POST-CHANGE PARAMETER USING KIEFER-WOLFOWITZ METHOD CHANGE DETECTION WITH UNKNOWN POST-CHANGE PARAMETER USING KIEFER-WOLFOWITZ METHOD Vijay Singamasetty, Navneeth Nair, Srikrishna Bhashyam and Arun Pachai Kannu Department of Electrical Engineering Indian

More information

Flat Rayleigh fading. Assume a single tap model with G 0,m = G m. Assume G m is circ. symmetric Gaussian with E[ G m 2 ]=1.

Flat Rayleigh fading. Assume a single tap model with G 0,m = G m. Assume G m is circ. symmetric Gaussian with E[ G m 2 ]=1. Flat Rayleigh fading Assume a single tap model with G 0,m = G m. Assume G m is circ. symmetric Gaussian with E[ G m 2 ]=1. The magnitude is Rayleigh with f Gm ( g ) =2 g exp{ g 2 } ; g 0 f( g ) g R(G m

More information

arxiv: v2 [math.st] 20 Jul 2016

arxiv: v2 [math.st] 20 Jul 2016 arxiv:1603.02791v2 [math.st] 20 Jul 2016 Asymptotically optimal, sequential, multiple testing procedures with prior information on the number of signals Y. Song and G. Fellouris Department of Statistics,

More information

Lecture 5: Asymptotic Equipartition Property

Lecture 5: Asymptotic Equipartition Property Lecture 5: Asymptotic Equipartition Property Law of large number for product of random variables AEP and consequences Dr. Yao Xie, ECE587, Information Theory, Duke University Stock market Initial investment

More information

Modulation Classification and Parameter Estimation in Wireless Networks

Modulation Classification and Parameter Estimation in Wireless Networks Syracuse University SURFACE Electrical Engineering - Theses College of Engineering and Computer Science 6-202 Modulation Classification and Parameter Estimation in Wireless Networks Ruoyu Li Syracuse University,

More information

F2E5216/TS1002 Adaptive Filtering and Change Detection. Course Organization. Lecture plan. The Books. Lecture 1

F2E5216/TS1002 Adaptive Filtering and Change Detection. Course Organization. Lecture plan. The Books. Lecture 1 Adaptive Filtering and Change Detection Bo Wahlberg (KTH and Fredrik Gustafsson (LiTH Course Organization Lectures and compendium: Theory, Algorithms, Applications, Evaluation Toolbox and manual: Algorithms,

More information

Detection theory 101 ELEC-E5410 Signal Processing for Communications

Detection theory 101 ELEC-E5410 Signal Processing for Communications Detection theory 101 ELEC-E5410 Signal Processing for Communications Binary hypothesis testing Null hypothesis H 0 : e.g. noise only Alternative hypothesis H 1 : signal + noise p(x;h 0 ) γ p(x;h 1 ) Trade-off

More information

Detection Theory. Chapter 3. Statistical Decision Theory I. Isael Diaz Oct 26th 2010

Detection Theory. Chapter 3. Statistical Decision Theory I. Isael Diaz Oct 26th 2010 Detection Theory Chapter 3. Statistical Decision Theory I. Isael Diaz Oct 26th 2010 Outline Neyman-Pearson Theorem Detector Performance Irrelevant Data Minimum Probability of Error Bayes Risk Multiple

More information

Nonparametric regression with martingale increment errors

Nonparametric regression with martingale increment errors S. Gaïffas (LSTA - Paris 6) joint work with S. Delattre (LPMA - Paris 7) work in progress Motivations Some facts: Theoretical study of statistical algorithms requires stationary and ergodicity. Concentration

More information

SPATIAL DIVERSITY AWARE DATA FUSION FOR COOPERATIVE SPECTRUM SENSING

SPATIAL DIVERSITY AWARE DATA FUSION FOR COOPERATIVE SPECTRUM SENSING 2th European Signal Processing Conference (EUSIPCO 22) Bucharest, Romania, August 27-3, 22 SPATIAL DIVERSITY AWARE DATA FUSIO FOR COOPERATIVE SPECTRUM SESIG uno Pratas (,2) eeli R. Prasad () António Rodrigues

More information

WIRELESS COMMUNICATIONS AND COGNITIVE RADIO TRANSMISSIONS UNDER QUALITY OF SERVICE CONSTRAINTS AND CHANNEL UNCERTAINTY

WIRELESS COMMUNICATIONS AND COGNITIVE RADIO TRANSMISSIONS UNDER QUALITY OF SERVICE CONSTRAINTS AND CHANNEL UNCERTAINTY University of Nebraska - Lincoln DigitalCommons@University of Nebraska - Lincoln Theses, Dissertations, and Student Research from Electrical & Computer Engineering Electrical & Computer Engineering, Department

More information