PARAMETER ESTIMATION OF CHIRP SIGNALS IN PRESENCE OF STATIONARY NOISE
Debasis Kundu and Swagata Nandi

Abstract. The problem of parameter estimation of chirp signals in the presence of stationary noise is addressed. We consider the least squares estimators and observe that they are strongly consistent. The asymptotic distributions of the least squares estimators are obtained. The multiple chirp signal model is also considered, and we obtain the asymptotic properties of the least squares estimators of its unknown parameters. We perform some small sample simulations to observe how the proposed estimators work for small sample sizes.

1. Introduction

In this paper we consider the estimation of the parameters of the following signal processing model:

(1)    y(n) = A^0 cos(α^0 n + β^0 n^2) + B^0 sin(α^0 n + β^0 n^2) + X(n);    n = 1, ..., N.

Here y(n) is the real-valued signal observed at n = 1, ..., N; A^0 and B^0 are real-valued amplitudes, and α^0 and β^0 are the frequency and the frequency rate, respectively. So, unlike the sinusoidal frequency model, the chirp signal model (1) does not have a constant frequency; the initial frequency changes over time at the rate β^0. The errors {X(n)} form a sequence of random variables with mean zero and finite fourth moment, and satisfy the following assumption.

Assumption 1. The error random variable X(n) can be written in the form

(2)    X(n) = Σ_{j=-∞}^{∞} a(j) e(n - j).

Date: April 8, 2006.
2000 Mathematics Subject Classification. 62J02; 62E20; 62C05.
Key words and phrases. Asymptotic distributions; Chirp signals; Least squares estimators; Multiple chirp signals.
The authors would like to thank the referee and the Associate Editor for some constructive suggestions and the Editor Professor Jane-Ling Wang for encouragement.
Here {e(n)} is a sequence of independent and identically distributed (i.i.d.) random variables with mean zero and finite fourth moment. The coefficients a(j) satisfy the condition

(3)    Σ_{j=-∞}^{∞} |a(j)| < ∞.

The signals described in (1) are known as chirp signals in the statistical signal processing literature (Djurić and Kay, 1990). Chirp signals are quite common in various areas of science and engineering, specifically in sonar, radar, communications, etc. Several authors have considered the chirp signal model (1) when the X(n) are i.i.d. random variables; see, for example, the work of Abatzoglou (1986), Kumaresan and Verma (1987), Djurić and Kay (1990), Gini, Montanari and Verrazzani (2000) and Nandi and Kundu (2004). Different approaches to the estimation of chirp parameters in similar kinds of models are found in Giannakis and Zhou (1995), Zhou, Giannakis and Swami (1996), Shamsunder, Giannakis and Friedlander (1995), Swami (1996) and Zhou and Giannakis (1995).

It is well known that in most practical situations the errors may not be independent. We assume stationarity through Assumption 1 to incorporate the dependence structure and make the model more realistic. Assumption 1 is a standard assumption for a stationary linear process: any finite dimensional stationary AR, MA or ARMA process can be represented as (2) when the coefficients a(j) satisfy (3). Thus, Assumption 1 holds for a large class of stationary random variables.

In this paper, we discuss the problem of parameter estimation of the chirp signal model in the presence of stationary noise. We consider the least squares estimators and study their properties when the errors satisfy Assumption 1. It is known, see Kundu (1997), that the simple sum of sinusoids model does not satisfy the sufficient conditions of Jennrich (1969) or Wu (1981) for the least squares estimators to be consistent.
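To fix ideas, the data-generation mechanism of model (1) under Assumption 1 can be sketched in a few lines. This is an illustrative sketch, not part of the paper: the parameter values and the MA(1) choice for X(n) are arbitrary assumptions, used only because an MA(1) process is one simple member of the linear-process class (2).

```python
import numpy as np

def chirp_signal(N, A, B, alpha, beta, sigma=1.0, ma=0.5, seed=None):
    """Generate y(n) = A cos(alpha n + beta n^2) + B sin(alpha n + beta n^2) + X(n),
    n = 1, ..., N, where X(n) = e(n) + ma * e(n-1) is a stationary MA(1)
    process -- its coefficients are trivially absolutely summable."""
    rng = np.random.default_rng(seed)
    n = np.arange(1, N + 1)
    phase = alpha * n + beta * n**2
    e = rng.normal(0.0, sigma, N + 1)   # i.i.d. innovations e(n)
    X = e[1:] + ma * e[:-1]             # MA(1) linear process
    return A * np.cos(phase) + B * np.sin(phase) + X

# Example: a noisy chirp of length 100 (illustrative parameter values)
y = chirp_signal(100, A=2.0, B=1.0, alpha=1.5, beta=0.1, seed=42)
```

Setting sigma to zero recovers the noise-free chirp, which is convenient for checking an implementation.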
So the chirp signal model defined in (1) also does not satisfy the sufficient conditions of Jennrich and Wu, and therefore their results cannot be applied directly to establish the strong consistency or the asymptotic normality of the LSEs. Interestingly, because of the structure of the model, the strong consistency and asymptotic normality results can still be obtained even though the standard sufficient conditions fail. It is also observed that the asymptotic variances of the amplitude, frequency and frequency rate estimators
are O(N^{-1}), O(N^{-3}) and O(N^{-5}) respectively. Based on the asymptotic distributions, asymptotic confidence intervals can also be constructed.

The rest of the paper is organized as follows. In section 2, we provide the asymptotic properties of the least squares estimators. The multiple chirp model is discussed in section 3. Some numerical results are presented in section 4 and we conclude the paper in section 5. The proofs of the results of section 2 are provided in the Appendix.

2. Asymptotic Properties of LSEs

Let us use the following notation: θ = (A, B, α, β), θ^0 = (A^0, B^0, α^0, β^0). Then the least squares estimator (LSE) of θ^0, say θ̂ = (Â, B̂, α̂, β̂), can be obtained by minimizing

(4)    Q(A, B, α, β) = Q(θ) = Σ_{n=1}^{N} [y(n) - A cos(αn + βn^2) - B sin(αn + βn^2)]^2,

with respect to A, B, α and β. In the following, we state the consistency property of θ̂ in Theorem 1.

Theorem 1. Let the true parameter vector θ^0 = (A^0, B^0, α^0, β^0) be an interior point of the parameter space Θ = (-∞, ∞) × (-∞, ∞) × (0, π) × (0, π), and let (A^0)^2 + (B^0)^2 > 0. If the error random variables X(n) satisfy Assumption 1, then θ̂, the LSE of θ^0, is a strongly consistent estimator of θ^0.

In this section we also compute the asymptotic joint distribution of the least squares estimators of the unknown parameters. We use Q'(θ) and Q''(θ) to denote the vector of first derivatives of Q(θ) and the matrix of second derivatives of Q(θ) respectively. Expanding Q'(θ̂) around the true parameter value θ^0 in a Taylor series, we obtain

(5)    Q'(θ̂) - Q'(θ^0) = (θ̂ - θ^0) Q''(θ̄),

where θ̄ is a point on the line joining θ̂ and θ^0. Let D be the diagonal matrix

D = diag{ N^{-1/2}, N^{-1/2}, N^{-3/2}, N^{-5/2} }.

Since Q'(θ̂) = 0, (5) can be written as

(6)    (θ̂ - θ^0) D^{-1} = -[Q'(θ^0) D] [D Q''(θ̄) D]^{-1},
as [D Q''(θ̄) D] is an invertible matrix a.e. for large N. Using Theorem 1, it follows that θ̂ converges a.e. to θ^0, and since each element of Q''(θ) is a continuous function of θ,

lim [D Q''(θ̄) D] = lim [D Q''(θ^0) D] = Σ(θ^0) (say).

Now let us look at the different elements of the matrix Σ(θ) = ((σ_jk(θ))). We will use the result

lim_{N→∞} (1/N^{p+1}) Σ_{n=1}^{N} n^p = 1/(p+1), for p = 1, 2, ...,

and the following notation:

(7)    lim_{N→∞} (1/N^{p+1}) Σ_{n=1}^{N} n^p cos^k(αn + βn^2) = δ_k(p, α, β),

(8)    lim_{N→∞} (1/N^{p+1}) Σ_{n=1}^{N} n^p sin^k(αn + βn^2) = γ_k(p, α, β).

Here k takes the values 1 and 2. Using this notation for the limits, we compute the elements of Σ(θ) by routine calculations. The random vector [Q'(θ^0) D] takes the form

( -(2/N^{1/2}) Σ_{n=1}^{N} X(n) cos(α^0 n + β^0 n^2),
  -(2/N^{1/2}) Σ_{n=1}^{N} X(n) sin(α^0 n + β^0 n^2),
  (2/N^{3/2}) Σ_{n=1}^{N} n X(n) [A^0 sin(α^0 n + β^0 n^2) - B^0 cos(α^0 n + β^0 n^2)],
  (2/N^{5/2}) Σ_{n=1}^{N} n^2 X(n) [A^0 sin(α^0 n + β^0 n^2) - B^0 cos(α^0 n + β^0 n^2)] ).

Now using the central limit theorem for stochastic processes (see Fuller, 1976), it follows that [Q'(θ^0) D] tends to a 4-variate normal distribution:

(9)    [Q'(θ^0) D] →_d N_4(0, G(θ^0)),
where the matrix G(θ^0) is the asymptotic dispersion matrix of [Q'(θ^0) D]. If we write G(θ) = ((g_jk(θ))), then with

S_1 = Σ_{n=1}^{N} X(n) cos(αn + βn^2),
S_2 = Σ_{n=1}^{N} X(n) sin(αn + βn^2),
S_3 = Σ_{n=1}^{N} n X(n) [A sin(αn + βn^2) - B cos(αn + βn^2)],
S_4 = Σ_{n=1}^{N} n^2 X(n) [A sin(αn + βn^2) - B cos(αn + βn^2)],

the entries g_jk(θ) for j ≤ k are

(10)   g_11(θ) = lim (4/N) E[S_1^2],       g_12(θ) = lim (4/N) E[S_1 S_2],
(11)   g_13(θ) = lim (4/N^2) E[S_1 S_3],   g_14(θ) = lim (4/N^3) E[S_1 S_4],
(12)   g_22(θ) = lim (4/N) E[S_2^2],       g_23(θ) = lim (4/N^2) E[S_2 S_3],
(13)   g_24(θ) = lim (4/N^3) E[S_2 S_4],   g_33(θ) = lim (4/N^3) E[S_3^2],
(14)   g_34(θ) = lim (4/N^4) E[S_3 S_4],   g_44(θ) = lim (4/N^5) E[S_4^2].

For k < j, g_jk(θ) = g_kj(θ). The limits in (10) to (14) exist for a fixed value of θ because of (7) and (8). Therefore, from (6) the following theorem follows.

Theorem 2. Under the same assumptions as in Theorem 1,

(15)   (θ̂ - θ^0) D^{-1} →_d N_4( 0, Σ^{-1}(θ^0) G(θ^0) Σ^{-1}(θ^0) ).

Remark 1. When the X(n) are i.i.d. random variables, the covariance matrix takes the simplified form Σ^{-1}(θ^0) G(θ^0) Σ^{-1}(θ^0) = 2σ^2 Σ^{-1}(θ^0).

Remark 2. Although we could not prove it theoretically, it is observed through extensive numerical computations that the right hand side limits of (7) and (8) for k = 1, 2 do not depend on α. So we assume that these quantities are independent of their second argument and write

δ_k(p; β) = δ_k(p, α, β),    γ_k(p; β) = γ_k(p, α, β).
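For concreteness, the least squares criterion Q defined above is straightforward to evaluate numerically; a minimal sketch follows. It is illustrative only and not taken from the paper: in practice Q is minimized over (A, B, α, β) with a standard optimizer started from good initial values.

```python
import numpy as np

def Q(theta, y):
    """Residual sum of squares Q(A, B, alpha, beta) for the single chirp model."""
    A, B, alpha, beta = theta
    n = np.arange(1, len(y) + 1)
    phase = alpha * n + beta * n**2
    return np.sum((y - A * np.cos(phase) - B * np.sin(phase)) ** 2)
```

At the true parameter values and noise-free data, Q vanishes, which makes a convenient sanity check for an implementation.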
Let us denote

c_c = Σ_{k=-∞}^{∞} a(k) cos(α^0 k + β^0 k^2),    c_s = Σ_{k=-∞}^{∞} a(k) sin(α^0 k + β^0 k^2).

Both c_c and c_s are functions of α^0 and β^0, but we do not make this explicit, to keep the notation simple. According to the assumption above, the δ's and γ's are independent of α, and based on this the elements of the matrix G(θ) can be computed explicitly for a given θ. The different entries of G(θ) in terms of the δ's and γ's can be found at www.isid.ac.in/statmath/eprints (isid/ms/2005/08), or can be obtained from the authors on request. Thus, the explicit expression of the entries of the variance-covariance matrix Σ^{-1}(θ^0) G(θ^0) Σ^{-1}(θ^0) of (θ̂ - θ^0) D^{-1} can be obtained by inverting the matrix Σ(θ) at θ^0. They are not provided here due to the complex (notational) structure of the matrices Σ(θ) and G(θ).

If the true value of β is zero (i.e. the frequency does not change over time) and this information is used in the model, then model (1) is nothing but the usual sinusoidal model. In that case the asymptotic distribution can be obtained in compact form, and the amplitude estimators are asymptotically independent of the frequency estimator. This is not the case for the chirp signal model.

3. Multiple Chirp Signal

In this section, we introduce the multiple chirp signal model in stationary noise. The complex-valued single chirp model was generalized to a superimposed chirp model by Saha and Kay (2002). The following model is a similar generalization of model (1). We assume that the observed data y(n) have the representation

(16)   y(n) = Σ_{k=1}^{p} [ A_k^0 cos(α_k^0 n + β_k^0 n^2) + B_k^0 sin(α_k^0 n + β_k^0 n^2) ] + X(n);    n = 1, ..., N.

As in the single chirp model, the parameters α_k^0, β_k^0 ∈ (0, π) are the frequencies and frequency rates respectively, and the A_k^0 and B_k^0 are real-valued amplitudes. Again our aim is to estimate the parameters and study their properties. We assume that the number of components p is known and that the X(n) satisfy Assumption 1.
Estimation of p is an important problem and will be addressed elsewhere. Now let θ_k = (A_k, B_k, α_k, β_k) and let ν = (θ_1, ..., θ_p) be the parameter vector. The least squares estimators of the parameters are obtained by minimizing the objective function, say R(ν), defined analogously to Q(θ) in section 2.
Let ν̂ and ν^0 denote the least squares estimator and the true value of ν. The consistency of ν̂ follows in the same way as the consistency of θ̂, considering the parameter vector as ν. We state the asymptotic distribution of ν̂ here; the proof involves routine calculations using the multiple Taylor series expansion and the central limit theorem for stochastic processes.

For the asymptotic distribution of ν̂, we introduce the notation

ψ_k^N = (θ̂_k - θ_k^0) D^{-1} = ( N^{1/2}(Â_k - A_k^0), N^{1/2}(B̂_k - B_k^0), N^{3/2}(α̂_k - α_k^0), N^{5/2}(β̂_k - β_k^0) );

moreover, c_c^k and c_s^k are obtained from c_c and c_s by replacing α^0 and β^0 by α_k^0 and β_k^0 respectively. Let us denote β_j + β_k = β_jk^+, β_j - β_k = β_jk^-, and

d_1 = c_c^j c_c^k + c_s^j c_s^k,    d_2 = c_c^j c_s^k + c_s^j c_c^k,
d_3 = c_c^j c_c^k - c_s^j c_s^k,    d_4 = c_c^j c_s^k - c_s^j c_c^k.

Then the asymptotic distribution of (ψ_1^N, ..., ψ_p^N) is

(17)   (ψ_1^N, ..., ψ_p^N) →_d N_{4p}( 0, σ^2 Λ^{-1}(ν^0) H(ν^0) Λ^{-1}(ν^0) ),

where Λ(ν) and H(ν) are the 4p × 4p block matrices

(18)   Λ(ν) = ((Λ_jk))_{j,k=1,...,p},    H(ν) = ((H_jk))_{j,k=1,...,p}.

The sub-matrices Λ_jk and H_jk are square matrices of order four, with Λ_jk ≡ Λ_jk(θ_j, θ_k) and H_jk ≡ H_jk(θ_j, θ_k). The diagonal blocks Λ_jj and H_jj are obtained from Σ(θ) and G(θ) by putting θ = θ_j. As with G(θ), the entries of the off-diagonal sub-matrices Λ_jk = ((λ_rs)) and H_jk = ((h_rs)) are available at www.isid.ac.in/statmath/eprints (isid/ms/2005/08). The elements of the matrices Λ_jk and H_jk are non-zero, so the estimators corresponding to different components, ψ_j^N and ψ_k^N for j ≠ k, are not asymptotically independent.

If the frequencies do not change over time, i.e. the frequency rates β_k all vanish, model (16) is equivalent to the multiple frequency model. Then the off-diagonal blocks of H and Λ are zero matrices and the estimators of the unknown parameters in different components are asymptotically independent. This is because δ_1(p, α, 0) = 0 = γ_1(p, α, 0) for all p ≥ 0 and α ∈ (0, π).
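The superposition in the multiple chirp model of section 3 is a direct extension of the single-component generator; a small sketch follows. It is illustrative only, with hypothetical parameter vectors; stationary noise X(n) would be added on top, as in the single chirp case.

```python
import numpy as np

def multiple_chirp(N, A, B, alpha, beta):
    """Noise-free part of the multiple chirp model: the sum over p = len(A)
    components of A_k cos(alpha_k n + beta_k n^2) + B_k sin(alpha_k n + beta_k n^2)."""
    n = np.arange(1, N + 1)
    y = np.zeros(N)
    for Ak, Bk, ak, bk in zip(A, B, alpha, beta):
        phase = ak * n + bk * n**2
        y += Ak * np.cos(phase) + Bk * np.sin(phase)
    return y

# Two components (illustrative values)
signal = multiple_chirp(100, A=[2.0, 1.0], B=[1.0, 0.5],
                        alpha=[0.5, 1.5], beta=[0.05, 0.1])
```

With p = 1 this reduces exactly to the noise-free single chirp, and components add linearly, which is easy to verify numerically.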
4. Numerical Experiments

In this section we present the results of numerical experiments based on simulations. For this purpose, we consider a single chirp model with A = .93, B = .9, α = .5 and β = .0. We use the sample sizes N = 50 and N = 100. Although α, β ∈ (0, π), we have taken the true value of β much smaller than the initial frequency α, since the frequency rate β is comparatively small in general. We consider different stationary processes as the error random variables for our simulations. The errors are generated from (a) X(t) =
ρ e(t + 1) + e(t), (b) X(t) = ρ_1 e(t - 1) + ρ_2 e(t - 2) + e(t) and (c) X(t) = ρ X(t - 1) + e(t). The random variables {e(t)} are distributed as N(0, σ^2). The processes (a), (b) and (c) are stationary MA(1), MA(2) and AR(1) processes, where MA(q) and AR(p) are the usual notation for the moving average process of order q and the autoregressive process of order p respectively. For the simulations, ρ = .5, ρ_1 = .5 and ρ_2 = . have been used. We consider different values of σ; accordingly, the variance of X(t) differs depending on the error process and its true parameter values.

We generate the data using (1) with the parameters mentioned above. The LSEs of the parameters are obtained by minimizing the residual sum of squares. The starting estimates of the frequency and the frequency rate are obtained by maximizing the following periodogram-like function

I(ω_1, ω_2) = (1/N) | Σ_{t=1}^{N} y(t) e^{-i(ω_1 t + ω_2 t^2)} |^2

over a fine two-dimensional (2-d) grid of (0, π) × (0, π). The linear parameters A and B are expressible in terms of α and β, so the minimization of Q(θ) with respect to θ involves a 2-d search: once the non-linear parameters α and β are estimated, A and B are estimated using the linear regression technique. We replicate this procedure of data generation and estimation 1000 times, resulting in 1000 estimated values of each parameter. Then we calculate the average estimate (AVEEST), the bias (BIAS) and the mean squared error (MSE) of each parameter. We summarize the results in Tables 1-2 for errors of type (c); Table 1 reports the results for N = 50, and Table 2 those for N = 100. We do not report the results for errors of types (a) and (b) due to space limitations; these results are available at www.isid.ac.in/statmath/eprints (isid/ms/2005/08), or can be obtained from the authors on request.
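The initial-value step described above -- maximizing the periodogram-like function I(ω_1, ω_2) over a 2-d grid and then recovering the linear parameters by regression -- can be sketched as follows. This is an illustrative sketch; the grid size and the least-squares recovery of A and B are implementation assumptions, not details taken from the paper.

```python
import numpy as np

def periodogram_like(y, w1, w2):
    """I(w1, w2) = (1/N) |sum_{t=1}^N y(t) exp(-i(w1 t + w2 t^2))|^2."""
    t = np.arange(1, len(y) + 1)
    s = np.sum(y * np.exp(-1j * (w1 * t + w2 * t**2)))
    return np.abs(s) ** 2 / len(y)

def initial_estimates(y, n_grid=50):
    """Maximize I over an n_grid x n_grid grid of (0, pi) x (0, pi) to get
    (alpha, beta); then recover A, B by linear regression given (alpha, beta)."""
    t = np.arange(1, len(y) + 1)
    grid = np.linspace(0.01, np.pi - 0.01, n_grid)
    best, alpha, beta = -np.inf, grid[0], grid[0]
    for w1 in grid:
        for w2 in grid:
            val = periodogram_like(y, w1, w2)
            if val > best:
                best, alpha, beta = val, w1, w2
    phase = alpha * t + beta * t**2
    X = np.column_stack([np.cos(phase), np.sin(phase)])
    A, B = np.linalg.lstsq(X, y, rcond=None)[0]
    return A, B, alpha, beta
```

Note that the grid for the frequency rate must in practice be much finer than the grid for the frequency, since β enters through βn^2; the grid maximizer is then refined by minimizing the residual sum of squares Q(θ).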
In section 2, we obtained the asymptotic distributions of the LSEs of the unknown parameters of a single chirp signal model under quite general assumptions. So it is possible to obtain confidence intervals for the unknown parameters from fixed finite length data using Theorem 2. But due to the complexity of the distribution, it is extremely complicated to implement in practice. Moreover, it has been observed in the numerical experiments that the convergence of the sequences of δ's and γ's depends strongly on the parameters, and in many cases a very large N is needed to stabilize the convergence. For this reason, we have used the percentile bootstrap method for interval estimation of the different parameters as a simple alternative, as suggested by Nandi, Iyer and Kundu (2002). In each replication of our experiment, we generate 1000 bootstrap resamples using the estimated parameters and then compute the bootstrap confidence intervals using the bootstrap quantiles at the 95% nominal level. So we have 1000 intervals for each parameter from the replicated experiment. We then estimate the 95% bootstrap coverage probability
Figure 1. Plot of the histograms of LSEs of A (left plot) and B (right plot).

Figure 2. Plot of the histograms of LSEs of α (left plot) and β (right plot).

by calculating the proportion of intervals covering the true parameter value across replications. We report this as B-COVP in Tables 1-2. We also report the average length of the bootstrap confidence intervals as B-AVEL. So in each table we report the average estimate, its bias and mean squared error, and the 95% bootstrap coverage probability and average length.

We have seen in the simulations that the maximizer of the periodogram-like function defined above over a fine grid provides reasonably good initial estimates of the non-linear parameters α and β in most cases. In the experiments discussed above, we collected the LSEs of all the parameters estimated in all replications for N = 50 with type (c) errors and σ = . The type (c) error, being an AR(1) process, has stationary variance σ^2/(1 - ρ^2) = .3. To understand their sample distributions, we plot the histograms of the LSEs of A and B in Fig. 1, and the histograms of the LSEs of α and β in Fig. 2. To see how the fitted signal looks, we generated a realization using the same type of error and σ as above; the fitted signal is plotted in Fig. 3 along with the original one.
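The percentile bootstrap used for the interval estimation can be sketched as follows. This is an illustrative sketch only: the resampling scheme shown (fitted signal plus i.i.d.-resampled residuals) is an assumed simplification, since the errors in the paper are dependent and a model-based or block resampling scheme would respect that dependence.

```python
import numpy as np

def percentile_boot_ci(y, estimator, n_boot=500, level=0.95, seed=0):
    """Percentile bootstrap intervals for theta = (A, B, alpha, beta).
    `estimator` maps a data vector to a tuple (A, B, alpha, beta)."""
    rng = np.random.default_rng(seed)
    A, B, alpha, beta = estimator(y)
    n = np.arange(1, len(y) + 1)
    phase = alpha * n + beta * n**2
    fitted = A * np.cos(phase) + B * np.sin(phase)
    resid = y - fitted
    est = np.empty((n_boot, 4))
    for b in range(n_boot):
        # i.i.d. residual resampling: a simplification; dependent errors
        # would call for a model-based or block bootstrap instead
        y_b = fitted + rng.choice(resid, size=len(y), replace=True)
        est[b] = estimator(y_b)
    lo = np.percentile(est, 100 * (1 - level) / 2, axis=0)
    hi = np.percentile(est, 100 * (1 + level) / 2, axis=0)
    return lo, hi
```

The coverage probability is then estimated, as in the text, by the proportion of replications in which the interval covers the true parameter value.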
Figure 3. Plot of the original signal (solid line) and the estimated signal (dotted line).

Table 1. Average estimates, biases, MSEs, coverage probabilities and average lengths using the bootstrap technique when the errors are of type (c) and the sample size is N = 50.

Now we summarize the findings of the experiments discussed above. We observe that the average estimates are quite good, which is reflected in the fact that the biases are quite small in absolute value. The MSEs are reasonably small, and we observe that they are in decreasing
Table 2. Average estimates, biases, MSEs, coverage probabilities and average lengths using the bootstrap technique when the errors are of type (c) and the sample size is N = 100.

order for the linear parameters, α and β. Similar findings are seen for the average bootstrap confidence lengths, i.e. the average lengths decrease in the order (A, B), α, β. The asymptotic distribution (Theorem 2) suggests as much, since the rates of convergence are N^{-1/2}, N^{-3/2} and N^{-5/2} respectively; this is reflected in the bootstrap intervals to some extent. Considering all the cases reported here, we can say that the orders of the MSEs approximately match the orders given in the asymptotic distribution of the LSEs, as we expect for finite samples of moderate size. For each type of error, the average lengths of the intervals, the biases and the MSEs increase as the error variance increases, for all the parameters. As we increase the sample size, these values decrease, in accordance with Theorem 1, which says that the LSEs are strongly consistent. Comparing the different types of errors and values of σ, we see that with N = 50 the coverage probabilities do not attain the nominal level, mainly for the frequency rate β, except for type (b) errors. However, for N = 100 the bootstrap coverage probabilities are quite close to the nominal level. In some cases, mainly for the linear parameters, the bootstrap method overestimates the
coverage probabilities. We understand that using the given sample size in calculating the limiting quantities, the δ's and γ's, may cause this overestimation.

We have plotted the histograms of the LSEs in Figs. 1 and 2. It is clear from the plots that the LSEs are distributed symmetrically around the true values of all the parameters. Although we have reported the MSEs, the histograms give a very good idea of the variability of the estimates. In Fig. 3, the fitted signal has been plotted with the observed one for a particular case; the fitted signal matches the observed one reasonably well. So, from this discussion, we see that the performance of the LSEs, and of the bootstrap method for obtaining confidence intervals, is quite good, and they can be used in practice.

5. Conclusions

In this paper we study the problem of estimating the parameters of the real single chirp signal model as well as the multiple chirp signal model in stationary noise. It is a generalization of the multiple frequency model in the same way that the complex-valued chirp signal is a generalization of the exponential model. We propose the least squares estimators for the unknown parameters and study their asymptotic properties. As the joint asymptotic distribution of the LSEs of the unknown parameters is quite complicated for practical implementation purposes, we have used a parametric bootstrap method for interval estimation. We observe that the results are quite satisfactory and can be used in practice. In the simulation study, the initial estimates of the frequency and the frequency rate are obtained by maximizing a periodogram-like function; it will be interesting to explore the properties of the estimators obtained by maximizing the periodogram-like function defined in section 4.
Also, the generalization of some of the existing iterative and non-iterative methods for the frequency model to the chirp signal model is another problem which needs to be addressed, as is the estimation of the number of chirp components in the multiple chirp model.

Appendix

In this appendix, we first state Lemmas 1 and 2, then state and prove Lemmas A-1 to A-6, which are used to prove Lemma 2.

Lemma 1: Let us denote S_{C,M} = { θ : θ = (A, B, α, β), |θ - θ^0| ≥ C, |A| ≤ M, |B| ≤ M }.
If for any C > 0 and for some M < ∞,

lim inf_{N→∞} inf_{θ ∈ S_{C,M}} (1/N) [Q(θ) - Q(θ^0)] > 0 a.s.,

then θ̂ is a strongly consistent estimator of θ^0.

Proof of Lemma 1: The proof can be obtained by contradiction along the same lines as the lemma of Wu (1981).

Lemma 2: As N → ∞,

sup_{α,β} (1/N) | Σ_{n=1}^{N} X(n) e^{i(αn + βn^2)} | → 0 a.s.

In the following, we first state and prove Lemmas A-1 to A-6, and then these lemmas are used to prove Lemma 2.

Lemma A-1: Let {e(n)} be a sequence of i.i.d. random variables with mean zero and finite fourth moment. Then

(19)   E[ Σ_{n=1}^{N-2} e(n) e(n+1)^2 e(n+2) ]^2 = O(N),

(20)   E[ Σ_{n=1}^{N-k-1} e(n) e(n+1) e(n+k) e(n+k+1) ]^2 = O(N), for k = 2, 3, ..., N.

Proof of Lemma A-1: We prove (19); (20) follows similarly. Since the e(n) have mean zero and are independent, every cross term in the expansion of the square contains at least one factor e(m) to the first power and so has zero expectation. Therefore

E[ Σ_{n=1}^{N-2} e(n) e(n+1)^2 e(n+2) ]^2 = Σ_{n=1}^{N-2} E[ e(n)^2 e(n+1)^4 e(n+2)^2 ] = O(N).

Lemma A-2: For an arbitrary integer m,

E sup_θ | Σ_{n=1}^{N-k} e(n) e(n+k) e^{imθn} |^4 = O(N^3).
Proof of Lemma A-2: Writing

| Σ_n e(n) e(n+k) e^{imθn} |^2 = Σ_n e(n)^2 e(n+k)^2 + 2 Re Σ_{j≥1} e^{imθj} Σ_n e(n) e(n+k) e(n+j) e(n+j+k),

we obtain

E sup_θ | Σ_n e(n) e(n+k) e^{imθn} |^4
  ≤ E[ ( Σ_n e(n)^2 e(n+k)^2 + 2 Σ_{j≥1} | Σ_n e(n) e(n+k) e(n+j) e(n+j+k) | )^2 ]
  = O(N^2 + N^2 · N)    (using Lemma A-1)
  = O(N^3).

Lemma A-3:

E sup_{α,β} | Σ_{n=1}^{N} e(n) e^{i(αn + βn^2)} |^8 = O(N^7).

Proof of Lemma A-3: Since

| Σ_n e(n) e^{i(αn+βn^2)} |^2 = Σ_n e(n)^2 + 2 Re Σ_{k≥1} e^{-i(αk+βk^2)} Σ_n e(n) e(n+k) e^{-i2βkn},

the supremum over (α, β) of each cross term is bounded by the supremum over θ appearing in Lemma A-2. Therefore

E sup_{α,β} | Σ_n e(n) e^{i(αn+βn^2)} |^8
  ≤ E[ ( Σ_n e(n)^2 + 2 Σ_{k≥1} sup_θ | Σ_n e(n) e(n+k) e^{iθn} | )^4 ]
  = O(N^4 + N^4 · N^3)    (using Lemma A-2)
  = O(N^7).

Lemma A-4:

E sup_{α,β} | Σ_{n=1}^{N} e(n-k) e^{i(αn + βn^2)} |^8 = O(N^7), uniformly in k.

Proof of Lemma A-4: Substituting m = n - k, the exponent becomes i((α + 2βk)m + βm^2) plus a constant phase, so the sum has the same structure as in Lemma A-3, and the bound holds uniformly in k.

Lemma A-5:

E sup_{α,β} | Σ_{n=1}^{N} X(n) e^{i(αn + βn^2)} |^8 = O(N^7).
Proof of Lemma A-5:

sup_{α,β} | Σ_{n=1}^{N} X(n) e^{i(αn+βn^2)} | = sup_{α,β} | Σ_{n=1}^{N} Σ_{k=-∞}^{∞} a(k) e(n-k) e^{i(αn+βn^2)} |
  ≤ Σ_{k=-∞}^{∞} |a(k)| sup_{α,β} | Σ_{n=1}^{N} e(n-k) e^{i(αn+βn^2)} |.

Note that the bound on E sup_{α,β} | Σ_n e(n-k) e^{i(αn+βn^2)} |^8 is independent of k, and therefore the result follows from (3) using Lemma A-4.

Lemma A-6:

sup_{α,β} (1/N) | Σ_{n=1}^{N} X(n) e^{i(αn + βn^2)} | → 0 a.s.

Proof of Lemma A-6: Consider first the subsequence N^9. Using Lemma A-5 and the Markov inequality, for any ε > 0,

P( sup_{α,β} (1/N^9) | Σ_{n=1}^{N^9} X(n) e^{i(αn+βn^2)} | > ε )
  ≤ E sup_{α,β} | Σ_{n=1}^{N^9} X(n) e^{i(αn+βn^2)} |^8 / (ε N^9)^8 = O( N^{63} / N^{72} ) = O(N^{-9}).

Since Σ_N N^{-9} < ∞, it follows from the Borel-Cantelli lemma that

sup_{α,β} (1/N^9) | Σ_{n=1}^{N^9} X(n) e^{i(αn+βn^2)} | → 0 a.s.

Now consider J such that N^9 < J ≤ (N+1)^9. Then

sup_{N^9 < J ≤ (N+1)^9} sup_{α,β} (1/J) | Σ_{n=1}^{J} X(n) e^{i(αn+βn^2)} |
  ≤ sup_{α,β} (1/N^9) | Σ_{n=1}^{N^9} X(n) e^{i(αn+βn^2)} |
    + (1/N^9) Σ_{n=N^9+1}^{(N+1)^9} |X(n)|
    + sup_{α,β} | Σ_{n=1}^{N^9} X(n) e^{i(αn+βn^2)} | ( 1/N^9 - 1/(N+1)^9 ).

The mean squared error of the second term is of the order O( N^{-18} ((N+1)^9 - N^9)^2 ) = O(N^{-2}), and similarly the mean squared error of the third term is of the order O(N^{-2}). Therefore both terms converge to zero almost surely, and that proves the lemma.
Proof of Theorem 1: In this proof, we write θ̂ as θ̂_N = (Â_N, B̂_N, α̂_N, β̂_N) to emphasize that θ̂ depends on the sample size. If θ̂_N is not consistent for θ^0, then there exists a subsequence {N_k} of {N} such that θ̂_{N_k} does not converge to θ^0. Then either:

Case I: Â_{N_k}^2 + B̂_{N_k}^2 is not bounded. Then Â_{N_k}^2 + B̂_{N_k}^2 → ∞, i.e. at least one of Â_{N_k}, B̂_{N_k} tends to infinity in absolute value. This implies (1/N_k) Q(θ̂_{N_k}) → ∞. Since lim (1/N_k) Q(θ^0) < ∞, it follows that

(1/N_k) [ Q(θ̂_{N_k}) - Q(θ^0) ] → ∞.

But as θ̂_{N_k} is the LSE of θ^0,

(1/N_k) [ Q(θ̂_{N_k}) - Q(θ^0) ] ≤ 0,

which leads to a contradiction.

Case II: Â_{N_k}^2 + B̂_{N_k}^2 is bounded; that means there exists a set S_{C,M} (as defined in Lemma 1) such that θ̂_{N_k} ∈ S_{C,M}, for some C > 0 and some 0 < M < ∞. Now write

(1/N) [ Q(θ) - Q(θ^0) ] = f_1(θ) + f_2(θ),

where

f_1(θ) = (1/N) Σ_{n=1}^{N} [ A^0 cos(α^0 n + β^0 n^2) - A cos(αn + βn^2) + B^0 sin(α^0 n + β^0 n^2) - B sin(αn + βn^2) ]^2,

f_2(θ) = (2/N) Σ_{n=1}^{N} X(n) [ A^0 cos(α^0 n + β^0 n^2) - A cos(αn + βn^2) + B^0 sin(α^0 n + β^0 n^2) - B sin(αn + βn^2) ].

Using Lemma 2, it follows that

(21)   lim sup_{θ ∈ S_{C,M}} |f_2(θ)| = 0 a.s.
Now consider the following sets:

S_{C,M,1} = { θ : θ = (A, B, α, β), |A - A^0| ≥ C, |A| ≤ M, |B| ≤ M },
S_{C,M,2} = { θ : θ = (A, B, α, β), |B - B^0| ≥ C, |A| ≤ M, |B| ≤ M },
S_{C,M,3} = { θ : θ = (A, B, α, β), |α - α^0| ≥ C, |A| ≤ M, |B| ≤ M },
S_{C,M,4} = { θ : θ = (A, B, α, β), |β - β^0| ≥ C, |A| ≤ M, |B| ≤ M }.

Note that S_{C,M} ⊆ S_{C,M,1} ∪ S_{C,M,2} ∪ S_{C,M,3} ∪ S_{C,M,4} = S (say). Therefore

(22)   lim inf_{θ ∈ S_{C,M}} (1/N) [Q(θ) - Q(θ^0)] ≥ lim inf_{θ ∈ S} (1/N) [Q(θ) - Q(θ^0)].

First we show that

(23)   lim inf_{θ ∈ S_{C,M,j}} (1/N) [Q(θ) - Q(θ^0)] > 0 a.s.

for j = 1, ..., 4; because of (22), this implies

lim inf_{θ ∈ S_{C,M}} (1/N) [Q(θ) - Q(θ^0)] > 0 a.s.

Therefore, due to Lemma 1, Theorem 1 is proved, provided we can show (23). Consider first j = 1. Using (21), it follows that

lim inf_{θ ∈ S_{C,M,1}} (1/N) [Q(θ) - Q(θ^0)]
  = lim inf_{θ ∈ S_{C,M,1}} f_1(θ)
  = lim inf_{|A - A^0| ≥ C} (1/N) Σ_{n=1}^{N} cos^2(α^0 n + β^0 n^2) (A - A^0)^2    (the cross terms vanish in the limit)
  ≥ C^2 lim (1/N) Σ_{n=1}^{N} cos^2(α^0 n + β^0 n^2) > 0.

For the other j the argument proceeds along the same lines, and that proves Theorem 1.
References

[1] Abatzoglou, T. (1986). Fast maximum likelihood joint estimation of frequency and frequency rate. IEEE Trans. Aerosp. Electron. Syst.
[2] Djurić, P.M. and Kay, S.M. (1990). Parameter estimation of chirp signals. IEEE Trans. Acoustics, Speech and Signal Process. 38.
[3] Fuller, W.A. (1976). Introduction to Statistical Time Series. John Wiley and Sons, New York.
[4] Giannakis, G.B. and Zhou, G. (1995). Harmonics in multiplicative and additive noise: parameter estimation using cyclic statistics. IEEE Trans. Signal Process. 43.
[5] Gini, F., Montanari, M. and Verrazzani, L. (2000). Estimation of chirp signals in compound Gaussian clutter: a cyclostationary approach. IEEE Trans. Acoustics, Speech and Signal Process. 48.
[6] Jennrich, R.I. (1969). Asymptotic properties of the nonlinear least squares estimators. Annals of Mathematical Statistics 40.
[7] Kumaresan, R. and Verma, S. (1987). On estimating the parameters of chirp signals using rank reduction techniques. Proc. 21st Asilomar Conf., Pacific Grove, CA.
[8] Kundu, D. (1997). Asymptotic properties of the least squares estimators of sinusoidal signals. Statistics 30.
[9] Nandi, S., Iyer, S.K. and Kundu, D. (2002). Estimation of frequencies in presence of heavy tail errors. Statistics and Probability Letters 58, No. 3.
[10] Nandi, S. and Kundu, D. (2004). Asymptotic properties of the least squares estimators of the parameters of the chirp signals. Annals of the Institute of Statistical Mathematics 56.
[11] Rao, C.R. (1973). Linear Statistical Inference and Its Applications, 2nd edition. Wiley and Sons, New York.
[12] Rihaczek, A.W. (1969). Principles of High Resolution Radar. McGraw Hill, New York.
[13] Saha, S. and Kay, S.M. (2002). Maximum likelihood parameter estimation of superimposed chirps using Monte Carlo importance sampling. IEEE Trans. Signal Processing 50.
[14] Shamsunder, S., Giannakis, G.B. and Friedlander, B.
(1995). Estimation of random amplitude polynomial phase signals: a cyclostationary approach. IEEE Trans. Signal Process. 43.
[15] Swami, A. (1996). Cramér-Rao bounds for deterministic signals in additive and multiplicative noise. Signal Process. 53.
[16] Wu, C.F.J. (1981). Asymptotic theory of the nonlinear least squares estimation. Annals of Statistics 9.
[17] Zhou, G. and Giannakis, G.B. (1995). Harmonics in multiplicative and additive noise: performance analysis of cyclic estimators. IEEE Trans. Signal Process. 43.
[18] Zhou, G., Giannakis, G.B. and Swami, A. (1996). On polynomial phase signals with time-varying amplitudes. IEEE Trans. Signal Process. 44.
Department of Mathematics and Statistics, Indian Institute of Technology Kanpur, Pin 208016, India, kundu@iitk.ac.in

Theoretical Statistics and Mathematics Unit, Indian Statistical Institute, 7, S.J.S. Sansanwal Marg, New Delhi 110016, India, nandi@isid.ac.in
Statistica Sinica 18 (2008). Debasis Kundu and Swagata Nandi, Indian Institute of Technology Kanpur and Indian Statistical Institute.
A New Two Sample Type-II Progressive Censoring Scheme arxiv:609.05805v [stat.me] 9 Sep 206 Shuvashree Mondal, Debasis Kundu Abstract Progressive censoring scheme has received considerable attention in
More informationAsymptotic inference for a nonstationary double ar(1) model
Asymptotic inference for a nonstationary double ar() model By SHIQING LING and DONG LI Department of Mathematics, Hong Kong University of Science and Technology, Hong Kong maling@ust.hk malidong@ust.hk
More informationThe Uniform Weak Law of Large Numbers and the Consistency of M-Estimators of Cross-Section and Time Series Models
The Uniform Weak Law of Large Numbers and the Consistency of M-Estimators of Cross-Section and Time Series Models Herman J. Bierens Pennsylvania State University September 16, 2005 1. The uniform weak
More informationAsymptotic properties of the least squares estimators of a two dimensional model
Metrika (998) 48: 83±97 > Springer-Verlag 998 Asymptotic properties of the least squares estimators of a two dimensional model Debasis Kundu,*, Rameshwar D. Gupta,** Department of Mathematics, Indian Institute
More informationAnalysis of Type-II Progressively Hybrid Censored Data
Analysis of Type-II Progressively Hybrid Censored Data Debasis Kundu & Avijit Joarder Abstract The mixture of Type-I and Type-II censoring schemes, called the hybrid censoring scheme is quite common in
More informationBootstrap. Director of Center for Astrostatistics. G. Jogesh Babu. Penn State University babu.
Bootstrap G. Jogesh Babu Penn State University http://www.stat.psu.edu/ babu Director of Center for Astrostatistics http://astrostatistics.psu.edu Outline 1 Motivation 2 Simple statistical problem 3 Resampling
More informationDiscriminating Between the Bivariate Generalized Exponential and Bivariate Weibull Distributions
Discriminating Between the Bivariate Generalized Exponential and Bivariate Weibull Distributions Arabin Kumar Dey & Debasis Kundu Abstract Recently Kundu and Gupta ( Bivariate generalized exponential distribution,
More informationAdjusted Empirical Likelihood for Long-memory Time Series Models
Adjusted Empirical Likelihood for Long-memory Time Series Models arxiv:1604.06170v1 [stat.me] 21 Apr 2016 Ramadha D. Piyadi Gamage, Wei Ning and Arjun K. Gupta Department of Mathematics and Statistics
More informationNonlinear time series
Based on the book by Fan/Yao: Nonlinear Time Series Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 27, 2009 Outline Characteristics of
More informationThe Restricted Likelihood Ratio Test at the Boundary in Autoregressive Series
The Restricted Likelihood Ratio Test at the Boundary in Autoregressive Series Willa W. Chen Rohit S. Deo July 6, 009 Abstract. The restricted likelihood ratio test, RLRT, for the autoregressive coefficient
More informationinferences on stress-strength reliability from lindley distributions
inferences on stress-strength reliability from lindley distributions D.K. Al-Mutairi, M.E. Ghitany & Debasis Kundu Abstract This paper deals with the estimation of the stress-strength parameter R = P (Y
More informationEstimation of parametric functions in Downton s bivariate exponential distribution
Estimation of parametric functions in Downton s bivariate exponential distribution George Iliopoulos Department of Mathematics University of the Aegean 83200 Karlovasi, Samos, Greece e-mail: geh@aegean.gr
More informationENEE 621 SPRING 2016 DETECTION AND ESTIMATION THEORY THE PARAMETER ESTIMATION PROBLEM
c 2007-2016 by Armand M. Makowski 1 ENEE 621 SPRING 2016 DETECTION AND ESTIMATION THEORY THE PARAMETER ESTIMATION PROBLEM 1 The basic setting Throughout, p, q and k are positive integers. The setup With
More informationThe Bootstrap: Theory and Applications. Biing-Shen Kuo National Chengchi University
The Bootstrap: Theory and Applications Biing-Shen Kuo National Chengchi University Motivation: Poor Asymptotic Approximation Most of statistical inference relies on asymptotic theory. Motivation: Poor
More informationAnalysis of Middle Censored Data with Exponential Lifetime Distributions
Analysis of Middle Censored Data with Exponential Lifetime Distributions Srikanth K. Iyer S. Rao Jammalamadaka Debasis Kundu Abstract Recently Jammalamadaka and Mangalam (2003) introduced a general censoring
More informationOn Moving Average Parameter Estimation
On Moving Average Parameter Estimation Niclas Sandgren and Petre Stoica Contact information: niclas.sandgren@it.uu.se, tel: +46 8 473392 Abstract Estimation of the autoregressive moving average (ARMA)
More informationASYMPTOTIC PROPERTIES OF THE LEAST SQUARES ESTIMATORS OF THE PARAMETERS OF THE CHIRP SIGNALS
Ann. Inst. Statist. Math. Vol. 56, o. 3, 529 544 (2004) (~)2004 The Institute of Statistical Mathematics ASYMPTOTIC PROPERTIES OF THE LEAST SQUARES ESTIMATORS OF THE PARAMETERS OF THE CHIRP SIGALS SWAGATA
More informationSGN Advanced Signal Processing: Lecture 8 Parameter estimation for AR and MA models. Model order selection
SG 21006 Advanced Signal Processing: Lecture 8 Parameter estimation for AR and MA models. Model order selection Ioan Tabus Department of Signal Processing Tampere University of Technology Finland 1 / 28
More informationFAST AND ACCURATE DIRECTION-OF-ARRIVAL ESTIMATION FOR A SINGLE SOURCE
Progress In Electromagnetics Research C, Vol. 6, 13 20, 2009 FAST AND ACCURATE DIRECTION-OF-ARRIVAL ESTIMATION FOR A SINGLE SOURCE Y. Wu School of Computer Science and Engineering Wuhan Institute of Technology
More informationSome Theoretical Properties and Parameter Estimation for the Two-Sided Length Biased Inverse Gaussian Distribution
Journal of Probability and Statistical Science 14(), 11-4, Aug 016 Some Theoretical Properties and Parameter Estimation for the Two-Sided Length Biased Inverse Gaussian Distribution Teerawat Simmachan
More informationConfidence Intervals in Ridge Regression using Jackknife and Bootstrap Methods
Chapter 4 Confidence Intervals in Ridge Regression using Jackknife and Bootstrap Methods 4.1 Introduction It is now explicable that ridge regression estimator (here we take ordinary ridge estimator (ORE)
More informationBootstrap inference for the finite population total under complex sampling designs
Bootstrap inference for the finite population total under complex sampling designs Zhonglei Wang (Joint work with Dr. Jae Kwang Kim) Center for Survey Statistics and Methodology Iowa State University Jan.
More informationMean Likelihood Frequency Estimation
IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 48, NO. 7, JULY 2000 1937 Mean Likelihood Frequency Estimation Steven Kay, Fellow, IEEE, and Supratim Saha Abstract Estimation of signals with nonlinear as
More informationStatistics of Stochastic Processes
Prof. Dr. J. Franke All of Statistics 4.1 Statistics of Stochastic Processes discrete time: sequence of r.v...., X 1, X 0, X 1, X 2,... X t R d in general. Here: d = 1. continuous time: random function
More informationEIE6207: Estimation Theory
EIE6207: Estimation Theory Man-Wai MAK Dept. of Electronic and Information Engineering, The Hong Kong Polytechnic University enmwmak@polyu.edu.hk http://www.eie.polyu.edu.hk/ mwmak References: Steven M.
More informationMultivariate Normal-Laplace Distribution and Processes
CHAPTER 4 Multivariate Normal-Laplace Distribution and Processes The normal-laplace distribution, which results from the convolution of independent normal and Laplace random variables is introduced by
More informationEC402: Serial Correlation. Danny Quah Economics Department, LSE Lent 2015
EC402: Serial Correlation Danny Quah Economics Department, LSE Lent 2015 OUTLINE 1. Stationarity 1.1 Covariance stationarity 1.2 Explicit Models. Special cases: ARMA processes 2. Some complex numbers.
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference
More informationInference For High Dimensional M-estimates. Fixed Design Results
: Fixed Design Results Lihua Lei Advisors: Peter J. Bickel, Michael I. Jordan joint work with Peter J. Bickel and Noureddine El Karoui Dec. 8, 2016 1/57 Table of Contents 1 Background 2 Main Results and
More informationBayes Estimation and Prediction of the Two-Parameter Gamma Distribution
Bayes Estimation and Prediction of the Two-Parameter Gamma Distribution Biswabrata Pradhan & Debasis Kundu Abstract In this article the Bayes estimates of two-parameter gamma distribution is considered.
More informationSupplemental Material for KERNEL-BASED INFERENCE IN TIME-VARYING COEFFICIENT COINTEGRATING REGRESSION. September 2017
Supplemental Material for KERNEL-BASED INFERENCE IN TIME-VARYING COEFFICIENT COINTEGRATING REGRESSION By Degui Li, Peter C. B. Phillips, and Jiti Gao September 017 COWLES FOUNDATION DISCUSSION PAPER NO.
More informationCovariance function estimation in Gaussian process regression
Covariance function estimation in Gaussian process regression François Bachoc Department of Statistics and Operations Research, University of Vienna WU Research Seminar - May 2015 François Bachoc Gaussian
More informationAn algorithm for robust fitting of autoregressive models Dimitris N. Politis
An algorithm for robust fitting of autoregressive models Dimitris N. Politis Abstract: An algorithm for robust fitting of AR models is given, based on a linear regression idea. The new method appears to
More informationEmpirical Market Microstructure Analysis (EMMA)
Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg
More informationSTA205 Probability: Week 8 R. Wolpert
INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and
More informationInference For High Dimensional M-estimates: Fixed Design Results
Inference For High Dimensional M-estimates: Fixed Design Results Lihua Lei, Peter Bickel and Noureddine El Karoui Department of Statistics, UC Berkeley Berkeley-Stanford Econometrics Jamboree, 2017 1/49
More informationTime Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7
More informationStatistics - Lecture One. Outline. Charlotte Wickham 1. Basic ideas about estimation
Statistics - Lecture One Charlotte Wickham wickham@stat.berkeley.edu http://www.stat.berkeley.edu/~wickham/ Outline 1. Basic ideas about estimation 2. Method of Moments 3. Maximum Likelihood 4. Confidence
More informationEstimation Theory Fredrik Rusek. Chapters
Estimation Theory Fredrik Rusek Chapters 3.5-3.10 Recap We deal with unbiased estimators of deterministic parameters Performance of an estimator is measured by the variance of the estimate (due to the
More informationAnalysis of Gamma and Weibull Lifetime Data under a General Censoring Scheme and in the presence of Covariates
Communications in Statistics - Theory and Methods ISSN: 0361-0926 (Print) 1532-415X (Online) Journal homepage: http://www.tandfonline.com/loi/lsta20 Analysis of Gamma and Weibull Lifetime Data under a
More informationA Subspace Approach to Estimation of. Measurements 1. Carlos E. Davila. Electrical Engineering Department, Southern Methodist University
EDICS category SP 1 A Subspace Approach to Estimation of Autoregressive Parameters From Noisy Measurements 1 Carlos E Davila Electrical Engineering Department, Southern Methodist University Dallas, Texas
More informationMassachusetts Institute of Technology Department of Economics Time Series Lecture 6: Additional Results for VAR s
Massachusetts Institute of Technology Department of Economics Time Series 14.384 Guido Kuersteiner Lecture 6: Additional Results for VAR s 6.1. Confidence Intervals for Impulse Response Functions There
More informationLecture 8: Information Theory and Statistics
Lecture 8: Information Theory and Statistics Part II: Hypothesis Testing and I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw December 23, 2015 1 / 50 I-Hsiang
More informationThe properties of L p -GMM estimators
The properties of L p -GMM estimators Robert de Jong and Chirok Han Michigan State University February 2000 Abstract This paper considers Generalized Method of Moment-type estimators for which a criterion
More informationAsymptotic Multivariate Kriging Using Estimated Parameters with Bayesian Prediction Methods for Non-linear Predictands
Asymptotic Multivariate Kriging Using Estimated Parameters with Bayesian Prediction Methods for Non-linear Predictands Elizabeth C. Mannshardt-Shamseldin Advisor: Richard L. Smith Duke University Department
More informationElements of Multivariate Time Series Analysis
Gregory C. Reinsel Elements of Multivariate Time Series Analysis Second Edition With 14 Figures Springer Contents Preface to the Second Edition Preface to the First Edition vii ix 1. Vector Time Series
More informationA Comparison of Robust Estimators Based on Two Types of Trimming
Submitted to the Bernoulli A Comparison of Robust Estimators Based on Two Types of Trimming SUBHRA SANKAR DHAR 1, and PROBAL CHAUDHURI 1, 1 Theoretical Statistics and Mathematics Unit, Indian Statistical
More informationNonconcave Penalized Likelihood with A Diverging Number of Parameters
Nonconcave Penalized Likelihood with A Diverging Number of Parameters Jianqing Fan and Heng Peng Presenter: Jiale Xu March 12, 2010 Jianqing Fan and Heng Peng Presenter: JialeNonconcave Xu () Penalized
More informationFall 2017 STAT 532 Homework Peter Hoff. 1. Let P be a probability measure on a collection of sets A.
1. Let P be a probability measure on a collection of sets A. (a) For each n N, let H n be a set in A such that H n H n+1. Show that P (H n ) monotonically converges to P ( k=1 H k) as n. (b) For each n
More informationMaximum Likelihood Estimation
Maximum Likelihood Estimation Assume X P θ, θ Θ, with joint pdf (or pmf) f(x θ). Suppose we observe X = x. The Likelihood function is L(θ x) = f(x θ) as a function of θ (with the data x held fixed). The
More informationECE 275A Homework 7 Solutions
ECE 275A Homework 7 Solutions Solutions 1. For the same specification as in Homework Problem 6.11 we want to determine an estimator for θ using the Method of Moments (MOM). In general, the MOM estimator
More informationAcomplex-valued harmonic with a time-varying phase is a
IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 46, NO. 9, SEPTEMBER 1998 2315 Instantaneous Frequency Estimation Using the Wigner Distribution with Varying and Data-Driven Window Length Vladimir Katkovnik,
More informationBinary choice 3.3 Maximum likelihood estimation
Binary choice 3.3 Maximum likelihood estimation Michel Bierlaire Output of the estimation We explain here the various outputs from the maximum likelihood estimation procedure. Solution of the maximum likelihood
More informationLong-range dependence
Long-range dependence Kechagias Stefanos University of North Carolina at Chapel Hill May 23, 2013 Kechagias Stefanos (UNC) Long-range dependence May 23, 2013 1 / 45 Outline 1 Introduction to time series
More informationand Srikanth K. Iyer Department of Mathematics Indian Institute of Technology, Kanpur, India. ph:
A SIMPLE ESTIMATE OF THE INDEX OF STABILITY FOR SYMMETRIC STABLE DISTRIBUTIONS by S. Rao Jammalamadaka Department of Statistics and Applied Probability University of California, Santa Barbara, USA ph:
More informationOn the Comparison of Fisher Information of the Weibull and GE Distributions
On the Comparison of Fisher Information of the Weibull and GE Distributions Rameshwar D. Gupta Debasis Kundu Abstract In this paper we consider the Fisher information matrices of the generalized exponential
More informationDiscrete time processes
Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following
More informationEmpirical likelihood and self-weighting approach for hypothesis testing of infinite variance processes and its applications
Empirical likelihood and self-weighting approach for hypothesis testing of infinite variance processes and its applications Fumiya Akashi Research Associate Department of Applied Mathematics Waseda University
More informationEXTENDED GLRT DETECTORS OF CORRELATION AND SPHERICITY: THE UNDERSAMPLED REGIME. Xavier Mestre 1, Pascal Vallet 2
EXTENDED GLRT DETECTORS OF CORRELATION AND SPHERICITY: THE UNDERSAMPLED REGIME Xavier Mestre, Pascal Vallet 2 Centre Tecnològic de Telecomunicacions de Catalunya, Castelldefels, Barcelona (Spain) 2 Institut
More informationOn Modifications to Linking Variance Estimators in the Fay-Herriot Model that Induce Robustness
Statistics and Applications {ISSN 2452-7395 (online)} Volume 16 No. 1, 2018 (New Series), pp 289-303 On Modifications to Linking Variance Estimators in the Fay-Herriot Model that Induce Robustness Snigdhansu
More informationLesson 2: Analysis of time series
Lesson 2: Analysis of time series Time series Main aims of time series analysis choosing right model statistical testing forecast driving and optimalisation Problems in analysis of time series time problems
More informationA comparison of inverse transform and composition methods of data simulation from the Lindley distribution
Communications for Statistical Applications and Methods 2016, Vol. 23, No. 6, 517 529 http://dx.doi.org/10.5351/csam.2016.23.6.517 Print ISSN 2287-7843 / Online ISSN 2383-4757 A comparison of inverse transform
More informationAsymptotic standard errors of MLE
Asymptotic standard errors of MLE Suppose, in the previous example of Carbon and Nitrogen in soil data, that we get the parameter estimates For maximum likelihood estimation, we can use Hessian matrix
More informationInfinitely Imbalanced Logistic Regression
p. 1/1 Infinitely Imbalanced Logistic Regression Art B. Owen Journal of Machine Learning Research, April 2007 Presenter: Ivo D. Shterev p. 2/1 Outline Motivation Introduction Numerical Examples Notation
More informationTIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.
TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION
More informationEcon 623 Econometrics II Topic 2: Stationary Time Series
1 Introduction Econ 623 Econometrics II Topic 2: Stationary Time Series In the regression model we can model the error term as an autoregression AR(1) process. That is, we can use the past value of the
More informationGeneralised AR and MA Models and Applications
Chapter 3 Generalised AR and MA Models and Applications 3.1 Generalised Autoregressive Processes Consider an AR1) process given by 1 αb)x t = Z t ; α < 1. In this case, the acf is, ρ k = α k for k 0 and
More informationDoes k-th Moment Exist?
Does k-th Moment Exist? Hitomi, K. 1 and Y. Nishiyama 2 1 Kyoto Institute of Technology, Japan 2 Institute of Economic Research, Kyoto University, Japan Email: hitomi@kit.ac.jp Keywords: Existence of moments,
More informationA PRACTICAL WAY FOR ESTIMATING TAIL DEPENDENCE FUNCTIONS
Statistica Sinica 20 2010, 365-378 A PRACTICAL WAY FOR ESTIMATING TAIL DEPENDENCE FUNCTIONS Liang Peng Georgia Institute of Technology Abstract: Estimating tail dependence functions is important for applications
More informationOn the estimation of the K parameter for the Rice fading distribution
On the estimation of the K parameter for the Rice fading distribution Ali Abdi, Student Member, IEEE, Cihan Tepedelenlioglu, Student Member, IEEE, Mostafa Kaveh, Fellow, IEEE, and Georgios Giannakis, Fellow,
More informationUniversity of Oxford. Statistical Methods Autocorrelation. Identification and Estimation
University of Oxford Statistical Methods Autocorrelation Identification and Estimation Dr. Órlaith Burke Michaelmas Term, 2011 Department of Statistics, 1 South Parks Road, Oxford OX1 3TG Contents 1 Model
More informationcovariance function, 174 probability structure of; Yule-Walker equations, 174 Moving average process, fluctuations, 5-6, 175 probability structure of
Index* The Statistical Analysis of Time Series by T. W. Anderson Copyright 1971 John Wiley & Sons, Inc. Aliasing, 387-388 Autoregressive {continued) Amplitude, 4, 94 case of first-order, 174 Associated
More informationIdentifiability, Invertibility
Identifiability, Invertibility Defn: If {ǫ t } is a white noise series and µ and b 0,..., b p are constants then X t = µ + b 0 ǫ t + b ǫ t + + b p ǫ t p is a moving average of order p; write MA(p). Q:
More informationSome Theories about Backfitting Algorithm for Varying Coefficient Partially Linear Model
Some Theories about Backfitting Algorithm for Varying Coefficient Partially Linear Model 1. Introduction Varying-coefficient partially linear model (Zhang, Lee, and Song, 2002; Xia, Zhang, and Tong, 2004;
More informationStatistics for analyzing and modeling precipitation isotope ratios in IsoMAP
Statistics for analyzing and modeling precipitation isotope ratios in IsoMAP The IsoMAP uses the multiple linear regression and geostatistical methods to analyze isotope data Suppose the response variable
More informationUNIVERSITÄT POTSDAM Institut für Mathematik
UNIVERSITÄT POTSDAM Institut für Mathematik Testing the Acceleration Function in Life Time Models Hannelore Liero Matthias Liero Mathematische Statistik und Wahrscheinlichkeitstheorie Universität Potsdam
More informationResearch Article The Laplace Likelihood Ratio Test for Heteroscedasticity
International Mathematics and Mathematical Sciences Volume 2011, Article ID 249564, 7 pages doi:10.1155/2011/249564 Research Article The Laplace Likelihood Ratio Test for Heteroscedasticity J. Martin van
More informationTime Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley
Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the
More information8.2 Harmonic Regression and the Periodogram
Chapter 8 Spectral Methods 8.1 Introduction Spectral methods are based on thining of a time series as a superposition of sinusoidal fluctuations of various frequencies the analogue for a random process
More informationA Simulation Comparison Study for Estimating the Process Capability Index C pm with Asymmetric Tolerances
Available online at ijims.ms.tku.edu.tw/list.asp International Journal of Information and Management Sciences 20 (2009), 243-253 A Simulation Comparison Study for Estimating the Process Capability Index
More informationLIST OF PUBLICATIONS
LIST OF PUBLICATIONS Papers in referred journals [1] Estimating the ratio of smaller and larger of two uniform scale parameters, Amit Mitra, Debasis Kundu, I.D. Dhariyal and N.Misra, Journal of Statistical
More informationIf we want to analyze experimental or simulated data we might encounter the following tasks:
Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction
More informationMeasure-Transformed Quasi Maximum Likelihood Estimation
Measure-Transformed Quasi Maximum Likelihood Estimation 1 Koby Todros and Alfred O. Hero Abstract In this paper, we consider the problem of estimating a deterministic vector parameter when the likelihood
More informationMGR-815. Notes for the MGR-815 course. 12 June School of Superior Technology. Professor Zbigniew Dziong
Modeling, Estimation and Control, for Telecommunication Networks Notes for the MGR-815 course 12 June 2010 School of Superior Technology Professor Zbigniew Dziong 1 Table of Contents Preface 5 1. Example
More informationOn prediction and density estimation Peter McCullagh University of Chicago December 2004
On prediction and density estimation Peter McCullagh University of Chicago December 2004 Summary Having observed the initial segment of a random sequence, subsequent values may be predicted by calculating
More informationSmooth nonparametric estimation of a quantile function under right censoring using beta kernels
Smooth nonparametric estimation of a quantile function under right censoring using beta kernels Chanseok Park 1 Department of Mathematical Sciences, Clemson University, Clemson, SC 29634 Short Title: Smooth
More information