Simple Examples


Let's look at a few simple examples of OI analysis.

Example 1: Consider a scalar problem. We have one observation y_o, which is located at the analysis point. We also have a background estimate x_b. In addition, we assume that we know the error statistics of y_o and x_b:

E[x_b - x_t] = 0,   E[(x_b - x_t)^2] = \sigma_b^2,
E[y_o - y_t] = 0,   E[(y_o - y_t)^2] = \sigma_o^2,
E[(y_o - y_t)(x_b - x_t)] = 0,

where x_t and y_t denote the true values. Using the notation of the OI solution, n = 1, p = 1, H = (1), R = (\sigma_o^2), B = (\sigma_b^2).

The OI analysis is then

x_a = x_b + W [ y_o - (1) x_b ],   with   W = \sigma_b^2 (1) [ (1) \sigma_b^2 (1) + \sigma_o^2 ]^{-1} = \frac{\sigma_b^2}{\sigma_b^2 + \sigma_o^2},

so

x_a = x_b + \frac{\sigma_b^2}{\sigma_b^2 + \sigma_o^2} (y_o - x_b),

and the analysis error variance is

\sigma_a^2 = (1 - W) \sigma_b^2 = \frac{\sigma_o^2 \sigma_b^2}{\sigma_b^2 + \sigma_o^2}.

The analysis equation can be rewritten as

x_a = \frac{\sigma_o^2}{\sigma_b^2 + \sigma_o^2} x_b + \frac{\sigma_b^2}{\sigma_b^2 + \sigma_o^2} y_o,

then

x_a = \sigma_a^2 \left( \frac{x_b}{\sigma_b^2} + \frac{y_o}{\sigma_o^2} \right),

therefore

\frac{x_a}{\sigma_a^2} = \frac{x_b}{\sigma_b^2} + \frac{y_o}{\sigma_o^2}.

We see that the analysis is a linear combination of the background and the observation, with weighting coefficients proportional to the inverses of the variances. In fact, if we have N samples of

observations, or more generally, N measured estimates of the state variable x, called y_{on}, plus one background estimate x_b, then the final estimate combining all available pieces of information is

x_a = \sigma_a^2 \left( \frac{x_b}{\sigma_b^2} + \sum_{n=1}^{N} \frac{y_{on}}{\sigma_{on}^2} \right),

where

\frac{1}{\sigma_a^2} = \frac{1}{\sigma_b^2} + \sum_{n=1}^{N} \frac{1}{\sigma_{on}^2}.

The variance of the analysis is smaller than that of both the background and the observations, i.e.,

\sigma_a^2 \le \min[ \sigma_b^2, \sigma_{o1}^2, \sigma_{o2}^2, \ldots, \sigma_{oN}^2 ].

This is because each variance is positive, so 1/\sigma_a^2 must be larger than the largest term on the right-hand side of the analysis covariance equation above; therefore \sigma_a^2 must be smaller than the smallest variance, the one corresponding to that largest term.

Now assume that the analysis is for temperature and

y_o = 2.0,   R = (\sigma_o^2) = 1.0,
x_b = 0.5,   B = (\sigma_b^2) = 2.0;

we obtain the analysis and its variance:

x_a = 1.5,   A = \sigma_a^2 \approx 0.67   (\sigma_a \approx 0.8).

The following figure shows the background, observation and analysis probability distribution functions, assuming the errors have Gaussian distributions. It is clear that the analysis lies between the background and the observation, and in this particular case it is closer to the observation because of the observation's smaller expected error. It is also seen that the analysis has a higher peak probability and smaller expected errors than both the background and the observation. Read also the corresponding section of Kalnay's book.

Figure: Probability distribution function for the analysis, given the observation and the background.
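As a quick numerical check of the example above, here is a minimal Python/NumPy sketch (added for illustration; the function name scalar_oi and its interface are ours, not from the notes). It combines a scalar background with one or more observations using inverse-variance weights and reproduces x_a = 1.5 and \sigma_a^2 \approx 0.67 for the values y_o = 2.0, \sigma_o^2 = 1.0, x_b = 0.5, \sigma_b^2 = 2.0.

import numpy as np

def scalar_oi(xb, sigma_b2, obs, sigma_o2):
    """Combine a scalar background with N observations using
    inverse-variance weights (the scalar OI solution)."""
    obs = np.atleast_1d(obs)
    sigma_o2 = np.atleast_1d(sigma_o2)
    inv_var_a = 1.0 / sigma_b2 + np.sum(1.0 / sigma_o2)   # 1 / sigma_a^2
    sigma_a2 = 1.0 / inv_var_a
    xa = sigma_a2 * (xb / sigma_b2 + np.sum(obs / sigma_o2))
    return xa, sigma_a2

# Temperature example from the text
xa, sigma_a2 = scalar_oi(xb=0.5, sigma_b2=2.0, obs=2.0, sigma_o2=1.0)
print(xa, sigma_a2)   # -> 1.5, 0.666..., i.e. sigma_a ~ 0.82

The same function also handles several observations, in which case it implements the N-observation combination formula given earlier.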

Example 2. Now, suppose we have one observation, y_o, located between two analysis points. We have background information at the two analysis points, denoted by x_{b1} and x_{b2}, and we can linearly interpolate the background information to the observation location as

H x_b = ( \alpha \quad 1-\alpha ) \begin{pmatrix} x_{b1} \\ x_{b2} \end{pmatrix} = \alpha x_{b1} + (1-\alpha) x_{b2},

where, for linear interpolation, \alpha = \frac{z_2 - z}{z_2 - z_1}, with z the spatial coordinate of the observation.

The assumed observation error is as given before, R = (\sigma_o^2), while the background error covariance matrix now takes the form

B = \sigma_b^2 \begin{pmatrix} 1 & \mu \\ \mu & 1 \end{pmatrix}.

Here we have assumed that the background error variances at the two points are equal (\sigma_{b1}^2 = \sigma_{b2}^2 = \sigma_b^2),

with \mu being the background error correlation coefficient between the two grid points. The OI solution is then

x_{a1} = x_{b1} + \frac{\sigma_b^2 [\alpha + (1-\alpha)\mu]}{\sigma_b^2 [\alpha^2 + 2\alpha(1-\alpha)\mu + (1-\alpha)^2] + \sigma_o^2} \{ y_o - [\alpha x_{b1} + (1-\alpha) x_{b2}] \},

x_{a2} = x_{b2} + \frac{\sigma_b^2 [\alpha\mu + (1-\alpha)]}{\sigma_b^2 [\alpha^2 + 2\alpha(1-\alpha)\mu + (1-\alpha)^2] + \sigma_o^2} \{ y_o - [\alpha x_{b1} + (1-\alpha) x_{b2}] \}.    (5)

Let's consider three cases:

Case 1. The observation is collocated with analysis grid point 1 (\alpha = 1) and the background errors are not correlated between points 1 and 2 (\mu = 0). The above solution now reduces to

x_{a1} = x_{b1} + \frac{\sigma_b^2}{\sigma_b^2 + \sigma_o^2} (y_o - x_{b1}),
x_{a2} = x_{b2} + 0.

In this case, the solution at point 1 is identical to that in Example 1. The solution at point 2 is equal to the background, and no information from the observation is added there.

Case 2. The observation is collocated with analysis grid point 1 (\alpha = 1) as in Case 1. However, the background errors are correlated between point 1 and point 2 (\mu \ne 0). The solution now reduces to

x_{a1} = x_{b1} + \frac{\sigma_b^2}{\sigma_b^2 + \sigma_o^2} (y_o - x_{b1}),
x_{a2} = x_{b2} + \mu \frac{\sigma_b^2}{\sigma_b^2 + \sigma_o^2} (y_o - x_{b1}).

In this case, the solution at point 1 is unchanged from Case 1, but the solution at point 2 is equal to the background plus \mu times the analysis increment added at point 1. Here we can see the role of the background error correlation in spreading observational information (or, more strictly, the analysis increment).

Case 3. The observation is located in between the analysis points (\alpha \ne 1) but the background errors are not correlated between points 1 and 2 (\mu = 0). Now the solution becomes

x_{a1} = x_{b1} + \frac{\alpha \sigma_b^2}{\sigma_b^2 [\alpha^2 + (1-\alpha)^2] + \sigma_o^2} \{ y_o - [\alpha x_{b1} + (1-\alpha) x_{b2}] \},
x_{a2} = x_{b2} + \frac{(1-\alpha) \sigma_b^2}{\sigma_b^2 [\alpha^2 + (1-\alpha)^2] + \sigma_o^2} \{ y_o - [\alpha x_{b1} + (1-\alpha) x_{b2}] \}.

In this case, the analysis increments at point 1 and point 2 are proportional to \alpha and 1-\alpha, respectively, i.e., the analysis result depends on the distance of the observation from the grid point. This also says that even in the absence of background error correlation, the grid points involved in the forward observation operator are usually influenced directly by the observations, through the link in the observation operator.
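The following short Python/NumPy sketch (added here for illustration; the numbers are arbitrary and the function two_point_oi is ours) evaluates the general matrix form W = B H^T (H B H^T + R)^{-1} for this two-point, one-observation problem and reproduces the three cases above by changing \alpha and \mu.

import numpy as np

def two_point_oi(xb1, xb2, yo, sigma_b2, sigma_o2, alpha, mu):
    """OI analysis for two grid points and one observation located at
    interpolation weight alpha; mu is the background error correlation
    between the two grid points (cf. equation (5))."""
    xb = np.array([xb1, xb2])
    H = np.array([[alpha, 1.0 - alpha]])           # 1 x 2 interpolation operator
    B = sigma_b2 * np.array([[1.0, mu], [mu, 1.0]])
    R = np.array([[sigma_o2]])
    W = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # 2 x 1 weight matrix
    return xb + W @ (np.array([yo]) - H @ xb)

# Illustrative values: x_b1 = 1.0, x_b2 = 3.0, y_o = 2.0, sigma_b2 = 2.0, sigma_o2 = 1.0
print(two_point_oi(1.0, 3.0, 2.0, 2.0, 1.0, alpha=1.0, mu=0.0))  # Case 1
print(two_point_oi(1.0, 3.0, 2.0, 2.0, 1.0, alpha=1.0, mu=0.5))  # Case 2
print(two_point_oi(1.0, 3.0, 2.0, 2.0, 1.0, alpha=0.7, mu=0.0))  # Case 3

In Case 1 only point 1 is changed, in Case 2 point 2 also receives \mu times the point-1 increment, and in Case 3 the two increments scale with \alpha and 1-\alpha, as derived above.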

Final Comments: From the full solution (5), we can see that both the observation operator and the error correlation contribute. However, when generalizing the solution from two analysis points to n points, the linear interpolation operator will only influence the analysis points surrounding the observation, while the error correlations may spread information to all analysis points whose errors are correlated with those of the first-guess observation equivalent H(x_b).

Approximations with Practical OI Implementation

The OI analysis is given by

x_a = x_b + W [ y_o - H(x_b) ],   with   W = B H^T ( R + H B H^T )^{-1}.

The actual implementation requires simplifications in the computation of the weight W. The equation for x_a can be regarded as a list of scalar analysis equations, one per model variable in the vector x. For each model variable, the analysis increment is given by the corresponding line of W times the vector of background departures ( y_o - H(x_b) ). Given

W = \begin{pmatrix} w_1^T \\ w_2^T \\ \vdots \\ w_n^T \end{pmatrix},   and   \begin{pmatrix} x_{a1} \\ x_{a2} \\ \vdots \\ x_{an} \end{pmatrix} = \begin{pmatrix} x_{b1} \\ x_{b2} \\ \vdots \\ x_{bn} \end{pmatrix} + \begin{pmatrix} w_1^T \\ w_2^T \\ \vdots \\ w_n^T \end{pmatrix} [ y_o - H(x_b) ],

therefore

x_{ai} = x_{bi} + w_i^T [ y_o - H(x_b) ].

The fundamental hypothesis in the typical implementation of OI is: for each model variable, only a few observations are important in determining the analysis increment. Based on this assumption, the problem of matrix products and inversion is reduced by including only a small number of observations in the analysis at a given grid point. The following two figures show two data selection strategies.

[Figure: first data selection strategy.]

[Figure: second data selection strategy.]

The actual implementation can be as follows:

1) For each model variable x_i, select a small number p_i of observations using empirical selection criteria.
2) Form the corresponding list of background departures ( y_o - H(x_b) )_i.
3) Form the p_i background error covariances between the model variable x_i and the model state interpolated to the p_i observation points (i.e., the relevant p_i coefficients of the i-th line of B H^T).
4) Form the p_i x p_i background and observation error covariance submatrices formed by the restrictions of H B H^T and R to the selected observations.
5) Invert the positive definite matrix formed by the restriction of ( H B H^T + R ) to the selected observations.
6) Multiply it by the i-th line of B H^T to get the necessary line of W.

It is possible to save some computer time on the matrix inversion by directly solving a symmetric positive definite linear system, since we know in advance the vector of departures to which the inverse matrix will be applied. Also, if the same set of observations is used to analyze several model variables, then the same matrix inverse (or factorization) can be reused.
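A minimal Python/NumPy sketch of this local procedure on a 1-D grid is given below (an added illustration, not an operational code; all names and numbers are ours). The selection criterion is simply "the p_max closest observations", the background error covariance is modelled as a continuous Gaussian function of separation so that the needed rows of B H^T and blocks of H B H^T can be evaluated directly at the observation locations, and the linear system is solved directly rather than explicitly inverted, as suggested above.

import numpy as np

def local_oi(xb, xgrid, yo, yloc, sigma_b2, length_scale, sigma_o2, p_max):
    """OI analysis on a 1-D grid using only the p_max observations closest
    to each grid point (steps 1-6 above)."""
    def bcov(z1, z2):
        # Modelled background error covariance between locations z1 and z2
        return sigma_b2 * np.exp(-0.5 * ((z1 - z2) / length_scale) ** 2)

    d = yo - np.interp(yloc, xgrid, xb)            # departures y_o - H(x_b)
    xa = xb.copy()
    for i, zi in enumerate(xgrid):
        sel = np.argsort(np.abs(yloc - zi))[:p_max]        # step 1: select obs
        di = d[sel]                                        # step 2: departures
        bh = bcov(zi, yloc[sel])                           # step 3: row of B H^T
        hbh = bcov(yloc[sel][:, None], yloc[sel][None, :]) # step 4: H B H^T block
        A = hbh + sigma_o2 * np.eye(len(sel))              #          ... + R
        w = np.linalg.solve(A, di)                         # steps 5-6: solve system
        xa[i] = xb[i] + bh @ w
    return xa

# Tiny illustrative example (arbitrary values)
xgrid = np.linspace(0.0, 10.0, 11)
xb = np.zeros_like(xgrid)
yloc = np.array([2.3, 2.6, 7.1])
yo = np.array([1.0, 0.8, -0.5])
print(local_oi(xb, xgrid, yo, yloc, sigma_b2=1.0, length_scale=2.0,
               sigma_o2=0.25, p_max=2))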

Models of Error Covariances

Correct specification of the background and observation error covariances is crucial: they determine the relative weights given to the background and to the observations.

Variances are essential for determining the magnitude of the errors, and therefore the relative weights. Covariances determine how observation information is spread in model space (when the model resolution does not match that of the observations).

Observation error variances include instrument errors and representativeness errors. Systematic observation biases should be removed before using the data.

Observation error correlations/covariances are often assumed to be zero, i.e., the measurements are assumed uncorrelated. Observation error correlation can show up when:
  o sets of observations are taken by the same platform, e.g., radar, rawinsonde, aircraft, satellite;
  o data preprocessing introduces systematic errors;
  o representativeness errors of close-by observations are similar;
  o the forward operator, e.g., an interpolator, contains similar errors.

The presence of (positive) observation error correlation reduces the weight given to the average of the observations, which is reasonable because these observations are alike.

Observation error correlations are difficult to estimate and to account for. In practice, efforts are made to minimize them, and to reduce biases, by:
  o avoiding unnecessary preprocessing;
  o thinning dense data (data denser than the grid resolution);
  o improving the model and observation operators (the model plays the role of forward operator in the case of 4DVAR).

After these are done, it is safer to assume the observation error correlation to be zero, i.e., the observation error covariance matrix R is diagonal.

Background error variances: they are usually estimates of the error variances in the forecast used to produce x_b. This is a difficult problem, because they are never observed directly; they can only be estimated in a statistical sense. If the analysis is of good quality (i.e., if there are a lot of observations), an estimate can be provided by the variance of the differences between the forecast and a verifying analysis. If the observations can be assumed to be uncorrelated, much better averaged background error variances can be obtained by using the observational method (the Hollingsworth-Lönnberg method, Tellus, 1986).

However, in a system like the atmosphere the actual background errors are expected to depend a lot on the weather situation, so ideally the background errors should be flow-dependent. This can be achieved by the Kalman filter, by 4D-Var to some extent, or by some empirical laws of error growth based on physical grounds. If the background error variances are badly specified, the analysis increments will be too large or too small. In least-squares analysis algorithms, only the relative magnitude of the background and observation error variances matters.

Background error correlations: they are essential because of

Information spreading.
  o In data-sparse areas, the shape of the analysis increment is completely determined by the covariance structures.
  o The correlations in B perform the spatial spreading of information from the observation points to a finite domain surrounding them.

Information smoothing.

  o In data-dense areas, the amount of smoothing of the observed information is governed by the correlations in B, which can be understood by noting that the leftmost term in W is B.
  o The smoothing of the increments is important in ensuring that the analysis contains scales which are statistically compatible with the smoothness properties of the physical fields. For instance, when analyzing stratospheric or anticyclonic air masses, it is desirable to smooth the increments a lot in the horizontal in order to average and spread the measurements efficiently. When doing a low-level analysis in frontal, coastal or mountainous areas, or near temperature inversions, it is desirable, on the contrary, to limit the extent of the increments so as not to produce an unphysically smooth analysis. This has to be reflected in the specification of the background error correlations.
  o Both of the above address only correlations of a variable with itself, i.e., autocorrelations.

Balance properties: correlations across variables, or cross-correlations.
  o There are sometimes more degrees of freedom in a model than in reality (i.e., not all model variables are free from each other). For instance, the large-scale atmosphere is usually hydrostatic and almost geostrophic; these relationships introduce balances among the fields.

  o These balance properties show up as correlations in the background errors.
  o Because of these correlations / balances, observations can be used more effectively, i.e., observing one model variable yields information about all variables that are in balance with it. For example, a low-level wind observation allows one to correct the surface pressure field by assuming some amount of geostrophy. When combined with the spatial smoothing of increments, this can lead to a considerable impact on the quality of the analysis; e.g., a properly spread observation of geopotential height can produce a complete three-dimensional correction to the geostrophic wind field (see the figure below, and the small numerical sketch that follows it). The relative amplitude of the increments in the various model fields depends directly on the specified amount of correlation, as well as on the assumed error variances of all the parameters concerned.
  o Accurate estimation and use of background error (cross-)correlations can do the magic of retrieving quantities that are not directly observed, something that the ensemble Kalman filter attempts to do in, e.g., the assimilation of radar data.
  o Accurate background errors are flow-dependent.

[Figure: three-dimensional analysis increments produced by a single geopotential height observation spread to the geostrophic wind field.]
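To make the effect of cross-correlations concrete, here is a small Python/NumPy sketch (an added illustration with made-up numbers and names): a single observation of variable 1 produces an increment in the unobserved variable 2 only when B contains a nonzero cross-covariance between the two, mimicking a balance relationship.

import numpy as np

def increments_from_one_obs(B, sigma_o2, innovation):
    """State = [var1, var2]; a single observation of var1 with the given
    innovation y_o - H(x_b). Returns the analysis increments."""
    H = np.array([[1.0, 0.0]])                     # observe variable 1 only
    W = B @ H.T @ np.linalg.inv(H @ B @ H.T + np.array([[sigma_o2]]))
    return (W * innovation).ravel()

B_uncorrelated = np.array([[1.0, 0.0],
                           [0.0, 1.0]])
B_correlated   = np.array([[1.0, 0.8],             # cross-covariance standing in
                           [0.8, 1.0]])            # for a balance relationship

print(increments_from_one_obs(B_uncorrelated, 0.5, innovation=1.0))  # [0.67, 0.0]
print(increments_from_one_obs(B_correlated,   0.5, innovation=1.0))  # [0.67, 0.53]

With the cross-correlated B, the unobserved variable is corrected as well, which is the mechanism described above.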

The observational (or Hollingsworth-Lönnberg) method for estimating background error

This method relies on the use of background departures (y_o - H(x_b)) in an observing network that is dense and large enough to provide information on many scales, and that can be assumed to consist of uncorrelated and discrete observations. The principle (illustrated in Fig. 8) is to calculate a histogram of background departure covariances, stratified by separation distance (for instance). At zero separation the histogram provides averaged information about the background and observation errors; at nonzero separations it gives the averaged background error correlation.

In most systems the background error covariances should go to zero for very large separations. If this is not the case, it is usually a sign of biases in the background and/or in the observations, and the method may not work correctly (Hollingsworth and Lönnberg 1986). The formula is as follows. For the innovation covariance c_{ij} between points i and j,

c_{ij} = E[ ( y_i - H_i x_b )( y_j - H_j x_b )^T ]
       = E[ \{ ( y_i - H_i x_t ) - ( H_i x_b - H_i x_t ) \} \{ ( y_j - H_j x_t ) - ( H_j x_b - H_j x_t ) \}^T ]
       = E[ ( y_i - H_i x_t )( y_j - H_j x_t )^T ] + H_i E[ ( x_b - x_t )( x_b - x_t )^T ] H_j^T + 0 + 0
       = R_{ij} + H_i B H_j^T,

where the two cross terms vanish because the observation and background errors are uncorrelated.

Reference: Hollingsworth, A. and P. Lönnberg, 1986: The statistical structure of short-range forecast errors as determined from radiosonde data. Part I: The wind field. Tellus, 38A.
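The binning step can be sketched in a few lines of Python/NumPy (an added illustration; the data here are synthetic innovations, and all names are ours). The covariance of the departures at zero separation is approximately \sigma_b^2 + \sigma_o^2, while extrapolating the nonzero-separation curve back to zero separation isolates \sigma_b^2.

import numpy as np

def innovation_covariance_by_separation(d, locs, bins):
    """d: (n_times, n_stations) background departures y_o - H(x_b);
    locs: (n_stations,) station coordinates.
    Returns the mean departure covariance in each separation bin."""
    d = d - d.mean(axis=0)                       # remove the time-mean (bias)
    cov = d.T @ d / (d.shape[0] - 1)             # station-pair covariances
    sep = np.abs(locs[:, None] - locs[None, :])  # pairwise separations
    out = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (sep >= lo) & (sep < hi)
        out.append(cov[mask].mean() if mask.any() else np.nan)
    return np.array(out)

# Synthetic test: correlated background errors plus white observation errors
rng = np.random.default_rng(0)
locs = np.sort(rng.uniform(0.0, 100.0, 80))
L, sig_b, sig_o, ntimes = 15.0, 1.0, 0.7, 2000
Bcorr = sig_b**2 * np.exp(-0.5 * (np.abs(locs[:, None] - locs[None, :]) / L)**2)
Bcorr += 1e-6 * np.eye(locs.size)                # jitter for positive definiteness
eb = rng.multivariate_normal(np.zeros(locs.size), Bcorr, size=ntimes)
d = rng.normal(0.0, sig_o, eb.shape) - eb        # innovations = obs error - bkg error
print(innovation_covariance_by_separation(d, locs, np.arange(0.0, 60.0, 5.0)))
# First bin ~ sig_b^2 + sig_o^2; subsequent bins trace the background covariance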

The NMC method for estimating background errors

The so-called "NMC method" (Parrish and Derber, 1992) estimates the forecast error covariance according to

B \approx E\{ [ x_f(48 hr) - x_f(24 hr) ][ x_f(48 hr) - x_f(24 hr) ]^T \},

i.e., the structure of the forecast or background error covariance is estimated as the average over many (e.g., 50) differences between two short-range model forecasts verifying at the same time. The magnitude of the covariance is then appropriately scaled. In this approximation, rather than estimating the structure of the forecast error covariance from differences with observations, the model-forecast differences themselves provide a multivariate global forecast difference covariance. Strictly speaking, this is the covariance of the forecast differences and is only a proxy for the structure of the forecast errors. Nevertheless, it has been shown to produce better results than previous estimates computed from forecast-minus-observation statistics. An important reason is that the rawinsonde observational network is not dense enough to allow a proper estimate of the global structures. The NMC method has been in use at most operational centers because of its simplicity and comparative effectiveness.
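A schematic Python/NumPy version of this estimate is sketched below (an added illustration; the 48-h and 24-h forecast arrays would come from an archive of real forecasts and are represented here by random placeholders, and the scaling factor is arbitrary).

import numpy as np

def nmc_background_covariance(fcst48, fcst24, scale=1.0):
    """NMC-method estimate of B: sample covariance of the differences between
    48-h and 24-h forecasts valid at the same time, then rescaled.
    fcst48, fcst24: arrays of shape (n_cases, n_state)."""
    diff = fcst48 - fcst24
    diff = diff - diff.mean(axis=0)           # remove the mean difference
    B = diff.T @ diff / (diff.shape[0] - 1)   # (n_state, n_state) covariance
    return scale * B

# Placeholder data standing in for ~50 real forecast pairs
rng = np.random.default_rng(1)
n_cases, n_state = 50, 10
f48 = rng.normal(size=(n_cases, n_state))
f24 = rng.normal(size=(n_cases, n_state))
B_nmc = nmc_background_covariance(f48, f24, scale=0.5)
print(B_nmc.shape)   # (10, 10)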

Being based on many past forecasts (over, e.g., 1-2 months), the estimate is at best seasonally dependent, however.

Reference: Parrish, D. F. and J. C. Derber, 1992: The National Meteorological Center's spectral statistical-interpolation analysis system. Mon. Wea. Rev., 120.

The Modeling of background correlations

The full B matrix is usually too big to be specified explicitly. The variances are just the n diagonal terms of B, and they are usually specified completely.
  o The off-diagonal terms are more difficult to specify. They must generate a symmetric positive definite matrix.
  o Additionally, B is often required to have some physical properties, which must be reflected in the analysis:
    o the correlations must be smooth in physical space, on sensible scales,
    o the correlations should go to zero for very large separations, if it is believed that observations should only have a local effect on the increments,
    o the correlations should not exhibit physically unjustifiable variations according to direction or location,

    o the most fundamental balance properties, like geostrophy, must be reasonably well enforced,
    o the correlations should not lead to unreasonable effective background error variances for any parameter that is observed, used in the subsequent model forecast, or output to the users as an analysis product.

The complexity and subtlety of these requirements mean that the specification of background error covariances is a problem similar to physical parameterization. Some of the more popular techniques are listed below.
  o Correlation models can be specified independently of the variance fields, under the condition that the scales of variation of the variances are much larger than the correlation scales.
  o Vertical autocorrelation matrices for each parameter are usually small enough to be specified explicitly.
  o Horizontal autocorrelations cannot be specified explicitly, but they can be reduced to sparse matrices by assuming that they are homogeneous and isotropic to some extent (a small numerical sketch of such a correlation model is given after the references below).
  o Three-dimensional multivariate correlation models can be built by carefully combining separability, homogeneity and independence hypotheses such as: zero correlations in the vertical between distinct spectral wavenumbers, homogeneity of the vertical correlations in the horizontal and/or of the horizontal correlations in the vertical, and the property of the correlations being products of horizontal and vertical correlations. Numerically, these hypotheses imply that the correlation matrix is sparse, because it is made of block matrices which are themselves block-diagonal.

  o Balance constraints can be enforced by transforming the model variables into suitably defined complementary spaces of balanced and unbalanced variables. The latter are supposed to have smaller background error variances than the former, meaning that they will contribute less to the increment structures.
  o The geostrophic balance constraint can be enforced using the classical f-plane or β-plane balance equations, or projections onto subspaces spanned by the so-called Rossby and gravity normal modes.
  o More general kinds of balance properties can be expressed using linear regression operators calibrated on actual background error fields, if no analytical formulation is available.

Many such treatments are done in the NCEP operational 3DVAR system. Good discussions can be found in:

Purser, R. J., W.-S. Wu, D. F. Parrish, and N. M. Roberts, 2003: Numerical aspects of the application of recursive filters to variational statistical analysis. Part II: Spatially inhomogeneous and anisotropic general covariances. Mon. Wea. Rev., 131.

Purser, R. J., W.-S. Wu, D. F. Parrish, and N. M. Roberts, 2003: Numerical aspects of the application of recursive filters to variational statistical analysis. Part I: Spatially homogeneous and isotropic Gaussian covariances. Mon. Wea. Rev., 131.
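As a small illustration of the simplest of these ideas (added, not from the original notes; names and numbers are ours): a homogeneous, isotropic Gaussian correlation model on a 1-D grid, combined with a separately specified variance field, yields a symmetric positive (semi-)definite B whose correlations are smooth and decay to zero at large separation.

import numpy as np

def gaussian_correlation_B(z, variances, length_scale):
    """Build B = D^(1/2) C D^(1/2), where C is a homogeneous, isotropic
    Gaussian correlation matrix and D is a diagonal matrix of variances.
    z: (n,) grid coordinates; variances: (n,) background error variances."""
    dz = z[:, None] - z[None, :]
    C = np.exp(-0.5 * (dz / length_scale) ** 2)   # correlation model
    s = np.sqrt(variances)
    return s[:, None] * C * s[None, :]            # covariance matrix B

z = np.linspace(0.0, 1000.0, 51)                  # grid coordinates (e.g., km)
variances = np.full(z.size, 1.5)                  # constant variance field here
B = gaussian_correlation_B(z, variances, length_scale=150.0)
print(np.allclose(B, B.T), np.all(np.linalg.eigvalsh(B) > -1e-8))  # True True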

Figure: Correlation coefficients between Z at the black dot and model state variables at t = 80 min, from an experiment assimilating Z (> 10 dBZ) only (see Tong and Xue, MWR, 2005). Color indicates the variables; contours show the correlations.

Comment on OI (and 3DVAR) versus other schemes

Perhaps the most important advantage of statistical interpolation schemes such as Optimal Interpolation and 3D-Var over empirical schemes such as the successive correction method (SCM) is that the correlation between observational increments can be taken into account. With SCM, the weights of the observational increments depend only on their distance to the grid point. Therefore, if a number of observations are "bunched up" in one quadrant, with just a single observation in a different quadrant, all the observations will be given similar weights. In Optimal Interpolation (or 3D-Var), by contrast, the isolated observational increment will be given more weight in the analysis than the observations that are close together and therefore less independent.

When several observations are too close together, the OI solution becomes an ill-posed problem. In such cases, it is common to compute a "super-observation" combining the close individual observations. This has the advantage of removing the ill-posedness, while at the same time reducing, by averaging, the random errors of the individual observations. The super-observation should be a weighted average that takes into account the relative observation errors of the original close observations.
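A minimal sketch of such a super-observation is given below (added for illustration; it uses the same inverse-variance weighting as Example 1 and assumes the individual observation errors are uncorrelated).

import numpy as np

def super_observation(obs, sigma_o2):
    """Combine close-by observations into one super-observation using
    inverse-variance weights; returns its value and error variance."""
    obs = np.asarray(obs, dtype=float)
    sigma_o2 = np.asarray(sigma_o2, dtype=float)
    w = 1.0 / sigma_o2
    value = np.sum(w * obs) / np.sum(w)
    variance = 1.0 / np.sum(w)       # smaller than any individual variance
    return value, variance

# Three close-by observations with different error variances (made-up numbers)
print(super_observation([2.1, 1.9, 2.4], [1.0, 0.5, 2.0]))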

The role of the observation operator H

Earlier, we said that the H matrix (a p x n matrix) transforms vectors in model space (e.g., x_b, which is a vector of length n) into their corresponding values in observation space (vectors of length p). The transpose or adjoint of H, H^T (an n x p matrix), transforms vectors in observation space (e.g., y_o, a vector of length p) into vectors in model space (vectors of length n).

The observation operator and its adjoint can also operate on the error covariance matrices B and R, and when they do so they have a similar effect as on vectors. For example, we indicated earlier that B H^T contains the background error covariances between the grid points and the observation points; H^T therefore plays the role of taking the background error from the grid points to the observation points, but only partially, because the product still represents covariances between grid and observation points. The background error is brought completely into observation space by the H operator and its adjoint (transpose), so that H B H^T represents the error covariances of the background in terms of the observed quantities. This can be seen from the following:

Define y_b \equiv H(x_b). Then

y_b - y_t = H( x_t + x_b - x_t ) - H( x_t ) \approx H ( x_b - x_t ),

so that

E[ ( y_b - y_t )( y_b - y_t )^T ] \approx H E[ ( x_b - x_t )( x_b - x_t )^T ] H^T = H B H^T,

where an approximation has been made to linearize the observation operator H. For a linear observation operator, the result is exact.

Further illustration of the point. Suppose we have three observations, y_1^o, y_2^o and y_3^o, taken between two grid points with background values x_{b1} and x_{b2}:

    x_{b1}   y_1^o   y_3^o   y_2^o   x_{b2}

The forward operator is simply the linear interpolator, so that

H x_b = \begin{pmatrix} \alpha_1 & 1-\alpha_1 \\ \alpha_2 & 1-\alpha_2 \\ \alpha_3 & 1-\alpha_3 \end{pmatrix} \begin{pmatrix} x_{b1} \\ x_{b2} \end{pmatrix}.

The background error covariance matrix B is

B = \begin{pmatrix} \sigma_{11} & \sigma_{12} \\ \sigma_{21} & \sigma_{22} \end{pmatrix},

therefore

B H^T = \begin{pmatrix} \alpha_1 \sigma_{11} + (1-\alpha_1)\sigma_{12} & \alpha_2 \sigma_{11} + (1-\alpha_2)\sigma_{12} & \alpha_3 \sigma_{11} + (1-\alpha_3)\sigma_{12} \\ \alpha_1 \sigma_{21} + (1-\alpha_1)\sigma_{22} & \alpha_2 \sigma_{21} + (1-\alpha_2)\sigma_{22} & \alpha_3 \sigma_{21} + (1-\alpha_3)\sigma_{22} \end{pmatrix}.

Indeed, the background error covariances have been interpolated by the observation operator (an interpolator in this case). For example, the first element of B H^T represents the background error covariance between grid point 1 and observation 1, and is equal to the interpolation, to the y_1^o observation point, of the background error covariance of the x_1 point with itself (\sigma_{11}) and the background error covariance between the x_1 point and the x_2 point (\sigma_{12}), using the interpolation coefficients \alpha_1 and 1-\alpha_1. Let's denote the matrix

B H^T = \begin{pmatrix} c_{11} & c_{12} & c_{13} \\ c_{21} & c_{22} & c_{23} \end{pmatrix}.

The first index of c denotes the grid point location and the second index indicates the observation point.

Applying the H operator again to B H^T takes the other (grid point) end of the covariances to the observation points as well, so that we are left with covariances between observation points, but still of the background errors:

H B H^T = \begin{pmatrix} \alpha_1 & 1-\alpha_1 \\ \alpha_2 & 1-\alpha_2 \\ \alpha_3 & 1-\alpha_3 \end{pmatrix} \begin{pmatrix} c_{11} & c_{12} & c_{13} \\ c_{21} & c_{22} & c_{23} \end{pmatrix} = \begin{pmatrix} \alpha_1 c_{11} + (1-\alpha_1) c_{21} & \alpha_1 c_{12} + (1-\alpha_1) c_{22} & \alpha_1 c_{13} + (1-\alpha_1) c_{23} \\ \alpha_2 c_{11} + (1-\alpha_2) c_{21} & \alpha_2 c_{12} + (1-\alpha_2) c_{22} & \alpha_2 c_{13} + (1-\alpha_2) c_{23} \\ \alpha_3 c_{11} + (1-\alpha_3) c_{21} & \alpha_3 c_{12} + (1-\alpha_3) c_{22} & \alpha_3 c_{13} + (1-\alpha_3) c_{23} \end{pmatrix} = \begin{pmatrix} d_{11} & d_{12} & d_{13} \\ d_{21} & d_{22} & d_{23} \\ d_{31} & d_{32} & d_{33} \end{pmatrix}.

Here the covariances c_{ij} are interpolated again, from the grid points (indicated by the first index) to the observation points. For example, d_{12} represents the background error covariance between the y_1^o and y_2^o points, and is equal to the interpolated value (using weights \alpha_1 and 1-\alpha_1) of the covariance between the x_1 point and the y_2^o point and the covariance between the x_2 point and the y_2^o point. The error variances and covariances for x_b are defined as

\sigma_{11} = \frac{1}{N-1} \sum_{i=1}^{N} ( x_{1i} - x_{1t} )( x_{1i} - x_{1t} ) = \frac{1}{N-1} \sum_{i=1}^{N} \varepsilon_{1i} \varepsilon_{1i},

\sigma_{12} = \frac{1}{N-1} \sum_{i=1}^{N} ( x_{1i} - x_{1t} )( x_{2i} - x_{2t} ) = \frac{1}{N-1} \sum_{i=1}^{N} \varepsilon_{1i} \varepsilon_{2i},

\sigma_{21} = \frac{1}{N-1} \sum_{i=1}^{N} ( x_{2i} - x_{2t} )( x_{1i} - x_{1t} ) = \frac{1}{N-1} \sum_{i=1}^{N} \varepsilon_{2i} \varepsilon_{1i},

\sigma_{22} = \frac{1}{N-1} \sum_{i=1}^{N} ( x_{2i} - x_{2t} )( x_{2i} - x_{2t} ) = \frac{1}{N-1} \sum_{i=1}^{N} \varepsilon_{2i} \varepsilon_{2i},

where the summations are over the number of samples, and \varepsilon_{1i} = x_{1i} - x_{1t}, \varepsilon_{2i} = x_{2i} - x_{2t} are the errors of x_b at points 1 and 2; x_t denotes the truth. The covariance d_{12} is then

d_{12} = \alpha_1 c_{12} + (1-\alpha_1) c_{22}
       = \frac{1}{N-1} \left[ \alpha_1 \left( \alpha_2 \sum_{i=1}^{N} \varepsilon_{1i}\varepsilon_{1i} + (1-\alpha_2) \sum_{i=1}^{N} \varepsilon_{1i}\varepsilon_{2i} \right) + (1-\alpha_1) \left( \alpha_2 \sum_{i=1}^{N} \varepsilon_{2i}\varepsilon_{1i} + (1-\alpha_2) \sum_{i=1}^{N} \varepsilon_{2i}\varepsilon_{2i} \right) \right]
       = \frac{1}{N-1} \sum_{i=1}^{N} [ \alpha_1 \varepsilon_{1i} + (1-\alpha_1)\varepsilon_{2i} ][ \alpha_2 \varepsilon_{1i} + (1-\alpha_2)\varepsilon_{2i} ]
       = \frac{1}{N-1} \sum_{i=1}^{N} ( y_{1i} - y_{1t} )( y_{2i} - y_{2t} ),

where y_{1i} = \alpha_1 x_{1i} + (1-\alpha_1) x_{2i} and y_{2i} = \alpha_2 x_{1i} + (1-\alpha_2) x_{2i}, which is clearly the covariance between the x_b errors interpolated to observation points 1 and 2.

For this reason, in the ensemble Kalman filter procedure, where we need H B H^T from ensemble samples, we apply the observation operator H to x_b first and then directly calculate the background error covariance between observation points, which is much cheaper than calculating B first and then multiplying by H on the left and H^T on the right. When the observation operator H is linear, the two approaches are identical. When H is not linear, the results are not identical, but if the linearization approximation is reasonably good, the difference is small.
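The equivalence just described can be checked numerically. The Python/NumPy sketch below (an added illustration; the ensemble perturbations and interpolation weights are made up) computes H B H^T both by forming B explicitly and by applying the linear H to each sample first, as in the ensemble Kalman filter.

import numpy as np

rng = np.random.default_rng(2)
N = 500                                          # number of samples
alpha = np.array([1.0, 0.5, 0.0])                # interpolation weights
H = np.column_stack([alpha, 1.0 - alpha])        # p x n linear interpolator (3 x 2)

# Random background "error" samples at the two grid points
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.4], [0.4, 2.0]], size=N)
Xp = X - X.mean(axis=0)                          # perturbations about the mean

# Approach 1: form B explicitly, then H B H^T
B = Xp.T @ Xp / (N - 1)
HBHt_explicit = H @ B @ H.T

# Approach 2 (EnKF style): apply H to each sample, then take the covariance
Y = Xp @ H.T                                     # samples in observation space
HBHt_sampled = Y.T @ Y / (N - 1)

print(np.allclose(HBHt_explicit, HBHt_sampled))  # True for a linear H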

Here we have two grid points and three observations, so n = 2 and p = 3; therefore B is a 2 x 2 array while H B H^T is a 3 x 3 array. Suppose the background error cross-covariance is zero, i.e., there is no correlation between the errors at the two grid points; then

B = \begin{pmatrix} \sigma_{11} & 0 \\ 0 & \sigma_{22} \end{pmatrix},
B H^T = \begin{pmatrix} \alpha_1 \sigma_{11} & \alpha_2 \sigma_{11} & \alpha_3 \sigma_{11} \\ (1-\alpha_1)\sigma_{22} & (1-\alpha_2)\sigma_{22} & (1-\alpha_3)\sigma_{22} \end{pmatrix},

and H B H^T has elements ( H B H^T )_{ij} = \alpha_i \alpha_j \sigma_{11} + (1-\alpha_i)(1-\alpha_j)\sigma_{22}.

Consider the special case where y_1^o is located at the x_1 point, y_3^o is located at the x_2 point, and y_2^o is located between x_1 and x_2, so that \alpha_1 = 1 and \alpha_3 = 0 while 0 \le \alpha_2 \le 1. Then

B H^T = \begin{pmatrix} \sigma_{11} & \alpha_2 \sigma_{11} & 0 \\ 0 & (1-\alpha_2)\sigma_{22} & \sigma_{22} \end{pmatrix},

H B H^T = \begin{pmatrix} \sigma_{11} & \alpha_2 \sigma_{11} & 0 \\ \alpha_2 \sigma_{11} & \alpha_2^2 \sigma_{11} + (1-\alpha_2)^2 \sigma_{22} & (1-\alpha_2)\sigma_{22} \\ 0 & (1-\alpha_2)\sigma_{22} & \sigma_{22} \end{pmatrix}.

Assuming

R = \begin{pmatrix} \sigma_o^2 & 0 & 0 \\ 0 & \sigma_o^2 & 0 \\ 0 & 0 & \sigma_o^2 \end{pmatrix},

derive the formulas for \sigma_a^2, x_{a1} and x_{a2} (your homework; no need to turn it in, but make sure you do it!). Consider three situations: (1) y_2^o is located at equal distance from x_1 and x_2, (2) y_2^o is located at the same point as y_1^o and x_1, (3) y_2^o does not exist. Discuss your results.


More information

A HYBRID ENSEMBLE KALMAN FILTER / 3D-VARIATIONAL ANALYSIS SCHEME

A HYBRID ENSEMBLE KALMAN FILTER / 3D-VARIATIONAL ANALYSIS SCHEME A HYBRID ENSEMBLE KALMAN FILTER / 3D-VARIATIONAL ANALYSIS SCHEME Thomas M. Hamill and Chris Snyder National Center for Atmospheric Research, Boulder, Colorado 1. INTRODUCTION Given the chaotic nature of

More information

PHY451, Spring /5

PHY451, Spring /5 PHY451, Spring 2011 Notes on Optical Pumping Procedure & Theory Procedure 1. Turn on the electronics and wait for the cell to warm up: ~ ½ hour. The oven should already e set to 50 C don t change this

More information

THE INFLATION-RESTRICTION SEQUENCE : AN INTRODUCTION TO SPECTRAL SEQUENCES

THE INFLATION-RESTRICTION SEQUENCE : AN INTRODUCTION TO SPECTRAL SEQUENCES THE INFLATION-RESTRICTION SEQUENCE : AN INTRODUCTION TO SPECTRAL SEQUENCES TOM WESTON. Example We egin with aelian groups for every p, q and maps : + (here, as in all of homological algera, all maps are

More information

P 1.86 A COMPARISON OF THE HYBRID ENSEMBLE TRANSFORM KALMAN FILTER (ETKF)- 3DVAR AND THE PURE ENSEMBLE SQUARE ROOT FILTER (EnSRF) ANALYSIS SCHEMES

P 1.86 A COMPARISON OF THE HYBRID ENSEMBLE TRANSFORM KALMAN FILTER (ETKF)- 3DVAR AND THE PURE ENSEMBLE SQUARE ROOT FILTER (EnSRF) ANALYSIS SCHEMES P 1.86 A COMPARISON OF THE HYBRID ENSEMBLE TRANSFORM KALMAN FILTER (ETKF)- 3DVAR AND THE PURE ENSEMBLE SQUARE ROOT FILTER (EnSRF) ANALYSIS SCHEMES Xuguang Wang*, Thomas M. Hamill, Jeffrey S. Whitaker NOAA/CIRES

More information

15B.7 A SEQUENTIAL VARIATIONAL ANALYSIS APPROACH FOR MESOSCALE DATA ASSIMILATION

15B.7 A SEQUENTIAL VARIATIONAL ANALYSIS APPROACH FOR MESOSCALE DATA ASSIMILATION 21st Conference on Weather Analysis and Forecasting/17th Conference on Numerical Weather Prediction 15B.7 A SEQUENTIAL VARIATIONAL ANALYSIS APPROACH FOR MESOSCALE DATA ASSIMILATION Yuanfu Xie 1, S. E.

More information

Yubao Liu *, Fei Chen, Tom Warner and Scott Swerdlin National Center for Atmospheric Research/RAP, Boulder, Colorado

Yubao Liu *, Fei Chen, Tom Warner and Scott Swerdlin National Center for Atmospheric Research/RAP, Boulder, Colorado IMPROVEMENTS TO SURFACE FLUX COMPUTATIONS IN THE MRF PBL SCHEME, AND REFINEMENTS TO URBAN PROCESSES IN THE NOAH LAND-SURFACE MODEL WITH THE NCAR/ATEC REAL-TIME FDDA AND FORECAST SYSTEM Yuao Liu *, Fei

More information

The University of Reading

The University of Reading The University of Reading Radial Velocity Assimilation and Experiments with a Simple Shallow Water Model S.J. Rennie 2 and S.L. Dance 1,2 NUMERICAL ANALYSIS REPORT 1/2008 1 Department of Mathematics 2

More information

Chaos and Dynamical Systems

Chaos and Dynamical Systems Chaos and Dynamical Systems y Megan Richards Astract: In this paper, we will discuss the notion of chaos. We will start y introducing certain mathematical concepts needed in the understanding of chaos,

More information

A732: Exercise #7 Maximum Likelihood

A732: Exercise #7 Maximum Likelihood A732: Exercise #7 Maximum Likelihood Due: 29 Novemer 2007 Analytic computation of some one-dimensional maximum likelihood estimators (a) Including the normalization, the exponential distriution function

More information

Feb 21 and 25: Local weighted least squares: Quadratic loess smoother

Feb 21 and 25: Local weighted least squares: Quadratic loess smoother Feb 1 and 5: Local weighted least squares: Quadratic loess smoother An example of weighted least squares fitting of data to a simple model for the purposes of simultaneous smoothing and interpolation is

More information