Variable selection in principal components analysis of qualitative data using the accelerated ALS algorithm


1 Variable selection in principal components analysis of qualitative data using the accelerated ALS algorithm

Masahiro Kuroda (Okayama University of Science), Yuichi Mori (Okayama University of Science), Masaya Iizuka (Okayama University), Michio Sakakihara (Okayama University of Science)

* Supported by the Japan Society for the Promotion of Science (JSPS), Grant-in-Aid for Scientific Research (C), No.

2 Motivation

Variable selection in PCA of qualitative data.
- The ALS algorithm for quantifying qualitative data (PCA.ALS):
  PRINCIPALS: Young, Takane & de Leeuw (1978) (SAS); PRINCALS: Gifi (1990) (SPSS).
- Acceleration of PCA.ALS using the vector ε (vε) algorithm:
  vε-PCA.ALS: Kuroda, Mori, Iizuka & Sakakihara (2011) in CSDA.
- Modified PCA (M.PCA):
  Formulation of M.PCA: Tanaka & Mori (1997); backward elimination & forward selection procedures: Mori et al. (1998, 2006).
- Application of vε-PCA.ALS to variable selection in M.PCA of qualitative data.

3 PCA with variables measured by mixed scale levels

X: n × p data matrix (n observations on p variables; columnwise standardized).

In PCA, X is postulated to be approximated by a bilinear structure of the form
  X̂ = ZA',
where Z is an n × r matrix of component scores on r components (1 ≤ r ≤ p) and A is a p × r matrix consisting of the eigenvectors of X'X/n, with A'A = I_r.

We find Z and A such that
  θ = tr (X − X̂)'(X − X̂) = tr (X − ZA')'(X − ZA')
is minimized for the prescribed number of components r.
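As a reference point for the qualitative case discussed next, the quantitative-data solution above can be written in a few lines of numpy. This is a minimal illustrative sketch (the function name and toy data are ours, not the authors' code), assuming X is already column-standardized.

```python
import numpy as np

def pca_bilinear(X, r):
    """Find Z (n x r scores) and A (p x r loadings) minimizing
    theta = tr (X - ZA')'(X - ZA') for column-standardized X."""
    n, p = X.shape
    # A: eigenvectors of X'X/n for the r largest eigenvalues, so A'A = I_r.
    eigvals, eigvecs = np.linalg.eigh(X.T @ X / n)
    A = eigvecs[:, np.argsort(eigvals)[::-1][:r]]
    Z = X @ A                                   # component scores
    theta = np.trace((X - Z @ A.T).T @ (X - Z @ A.T))
    return Z, A, theta

# Toy usage with standardized random data and r = 3 components.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
X = (X - X.mean(axis=0)) / X.std(axis=0)
Z, A, theta = pca_bilinear(X, r=3)
```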

4 PCA with variables measured by mixed scale levels

For only quantitative variables (interval and ratio scales):
  we can find Z and A (or X̂ = ZA') minimizing
    θ = tr (X − X̂)'(X − X̂) = tr (X − ZA')'(X − ZA').

For mixed scale variables (nominal, ordinal, interval and ratio scales):
  optimal scaling is necessary to quantify the observed qualitative data, i.e., we need to find an optimally scaled observation X* minimizing
    θ = tr (X* − X̂)'(X* − X̂) = tr (X* − ZA')'(X* − ZA'),
  where X*'1_n = 0_p and diag(X*'X*/n) = I_p, simultaneously with Z and A.

5 Alternating least squares algorithm to find the optimally scaled observation X*

To find the model parameters (Z, A) and the optimal scaling parameter X*, the alternating least squares (ALS) algorithm can be utilized: PCA.ALS (PRINCIPALS, PRINCALS).

Model parameter estimation step: estimate Z and A conditionally on fixed X*.
Optimal scaling step: find X* minimizing θ conditionally on fixed Z and A.

[Diagram: X* (n × p) ≈ Z (n × r) A' (r × p); the algorithm alternates between the model parameter estimation and optimal scaling steps.]

6 Alternating least squares algorithm to find the optimally scaled observation X*

[PCA.ALS algorithm] PRINCIPALS (Young et al., 1978). The superscript (t) indicates the t-th iteration.

Model parameter estimation step:
  Obtain A(t) by solving [X*(t)'X*(t)/n] A = A D_r, where A'A = I_r and D_r is an r × r diagonal matrix of eigenvalues.
  Compute Z(t) from Z(t) = X*(t) A(t).

Optimal scaling step:
  Calculate X̂(t+1) = Z(t) A(t)'.
  Find X*(t+1) such that X*(t+1) = argmin_{X*} tr (X* − X̂(t+1))'(X* − X̂(t+1)) for fixed X̂(t+1) under the measurement restrictions on each variable.
  Scale X*(t+1) by columnwise centering and normalizing.
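The following is a minimal Python sketch of one PCA.ALS (PRINCIPALS-type) iteration under the simplifying assumption that every variable is nominal, in which case the least squares optimal scaling replaces each category by the mean of the corresponding fitted values in X̂ before centering and normalizing. The function names, the numpy implementation and the all-nominal assumption are ours, not the authors' code.

```python
import numpy as np

def optimal_scaling_nominal(categories, xhat_col):
    """Optimal scaling of one nominal variable: quantify each category by the
    mean of the fitted values xhat_col over the observations in that category,
    then center and normalize the column."""
    x = np.empty_like(xhat_col)
    for cat in np.unique(categories):
        mask = categories == cat
        x[mask] = xhat_col[mask].mean()
    x = x - x.mean()
    sd = x.std()
    return x / sd if sd > 0 else x          # guard against a constant column

def principals_step(Xstar, raw_categories, r):
    """One PCA.ALS iteration: model parameter estimation, then optimal scaling."""
    n, p = Xstar.shape
    # Model parameter estimation: eigenvectors of X*'X*/n for the r largest eigenvalues.
    eigvals, eigvecs = np.linalg.eigh(Xstar.T @ Xstar / n)
    A = eigvecs[:, np.argsort(eigvals)[::-1][:r]]
    Z = Xstar @ A
    # Optimal scaling toward Xhat = Z A', variable by variable.
    Xhat = Z @ A.T
    Xnew = np.column_stack([
        optimal_scaling_nominal(raw_categories[:, j], Xhat[:, j]) for j in range(p)
    ])
    return Xnew, Z, A
```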

7 Acceleration of PCA.ALS by the vector ε accelerator

To accelerate the computation, we can use the vector ε accelerator (vε accelerator) of Wynn (1962), which
- speeds up the convergence of a slowly convergent vector sequence,
- is very effective for linearly converging sequences,
- generates a sequence {Ẏ(t)}_{t≥0} from the iterative sequence {Y(t)}_{t≥0}.

Convergence: the accelerated sequence {Ẏ(t)}_{t≥0} converges to the stationary point Y(∞) of {Y(t)}_{t≥0} faster than {Y(t)}_{t≥0} does.
Computational cost: at each iteration, the vε algorithm requires only O(d) arithmetic operations, whereas the Newton-Raphson and quasi-Newton algorithms require O(d³) and O(d²), where d is the dimension of Y.
Convergence speed: the best attainable speed of convergence is superlinear.

8 Acceleration of PCA.ALS by the vector ε accelerator

The vε accelerator is given by
  Ẏ(t−1) = Y(t) + [ [Y(t−1) − Y(t)]⁻¹ + [Y(t+1) − Y(t)]⁻¹ ]⁻¹,
where [Y]⁻¹ = Y/‖Y‖² and ‖Y‖ is the Euclidean norm of Y.

- The accelerated vector Ẏ(t−1) is obtained from the three consecutive iterates (Y(t−1), Y(t), Y(t+1)) of the original sequence.
- The vε accelerator does not depend on the statistical model underlying {Y(t)}_{t≥0}. Therefore, when the vε algorithm is applied to ALS, the convergence properties of the ALS algorithm are preserved.
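The update above translates directly into code. This is a small illustrative helper (the names are ours), assuming the iterates are flattened into numpy vectors; the inverse [·]⁻¹ is the Samelson inverse Y/‖Y‖².

```python
import numpy as np

def samelson_inverse(y):
    """Samelson (vector) inverse used by the vε accelerator: [y]^-1 = y / ||y||^2."""
    return y / np.dot(y, y)

def v_epsilon_update(y_prev, y_curr, y_next):
    """vε accelerator: accelerated vector from three consecutive iterates
    (Y(t-1), Y(t), Y(t+1)) of the original sequence."""
    inv_sum = samelson_inverse(y_prev - y_curr) + samelson_inverse(y_next - y_curr)
    return y_curr + samelson_inverse(inv_sum)
```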

9 Acceleration of PCA.ALS by the vector ε accelerator

To accelerate PCA.ALS, we introduce the vε algorithm into PCA.ALS: from the sequence {X*(t)}_{t≥0} = {X*(0), X*(1), ..., X*(∞)} generated by PCA.ALS, we make an accelerated sequence {Ẋ*(t)}_{t≥0} = {Ẋ*(0), Ẋ*(1), ..., X*(∞)}.

[General procedure of vε-PCA.ALS] Alternate the following two steps until convergence:

PCA.ALS step: compute the model parameters A(t) and Z(t) and determine the optimal scaling parameter X*(t+1).
Acceleration step: calculate Ẋ*(t−1) from {X*(t−1), X*(t), X*(t+1)} using the vε algorithm:
  vec Ẋ*(t−1) = vec X*(t) + [ [vec(X*(t−1) − X*(t))]⁻¹ + [vec(X*(t+1) − X*(t))]⁻¹ ]⁻¹,
where vec X* stacks the columns of X* into a single vector, and check convergence by
  ‖vec(Ẋ*(t−1) − Ẋ*(t−2))‖² < δ,
where δ is the desired accuracy.
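A driver loop for this procedure might look as follows; it reuses principals_step() and v_epsilon_update() from the sketches above and, as the next slides state, uses the accelerated iterate only for the convergence check and the final estimate, never feeding it back into the PCA.ALS step. An illustrative sketch under those assumptions, not the authors' implementation.

```python
import numpy as np

def v_epsilon_pca_als(X0, raw_categories, r, delta=1e-8, max_iter=500):
    """vε-PCA.ALS sketch: alternate a PCA.ALS step with a vε acceleration step
    and stop when successive accelerated vectors differ by less than delta."""
    history = [X0.copy()]
    accel_prev = None
    Z = A = None
    for _ in range(max_iter):
        X_next, Z, A = principals_step(history[-1], raw_categories, r)
        history.append(X_next)
        if len(history) >= 3:
            y_prev, y_curr, y_next = (h.ravel(order="F") for h in history[-3:])
            accel = v_epsilon_update(y_prev, y_curr, y_next)   # vec of accelerated X*
            if accel_prev is not None and np.sum((accel - accel_prev) ** 2) < delta:
                Xstar = accel.reshape(X0.shape, order="F")     # final estimate of X*
                return Xstar, Z, A
            accel_prev = accel
    return history[-1], Z, A
```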

10 Acceleration of PCA.ALS by the vector ε accelerator

[Diagram: the PCA.ALS step (alternating model parameter estimation of Z (n × r) and A (p × r) with optimal scaling of X* (n × p)) generates the sequence {X*(0), ..., X*(t)}; the acceleration step generates the accelerated sequence {Ẋ*(0), ..., Ẋ*(s)} from it.]

11 Acceleration of PCA.ALS by the vector ε accelerator

Since vε-PCA.ALS is designed to generate {Ẋ*(t)}_{t≥0} converging to X*(∞),
- the estimate of X* can be obtained from the final value of {Ẋ*(t)}_{t≥0} when vε-PCA.ALS terminates, and
- the estimates of Z and A can then be calculated immediately from the estimate of X* in the model parameter estimation step of PCA.ALS.

Note that Ẋ*(t−1), obtained at the t-th iteration of the acceleration step, is not used as the estimate X*(t+1) at the (t+1)-th iteration of the PCA.ALS step. Thus vε-PCA.ALS speeds up the convergence of {X*(t)}_{t≥0} without affecting the convergence properties of the PCA.ALS procedure.

12 Modified PCA (M.PCA) of Tanaka & Mori (1997)

Modified PCA (M.PCA) derives principal components that are computed as linear combinations of a subset of variables but can reproduce all the variables very well.

Let X be decomposed into an n × q submatrix X_V1 and an n × (p − q) remaining submatrix X_V2. M.PCA then finds r linear combinations Z = X_V1 A.

[Diagram: variable selection splits X (n × p) into X_V1 (n × q) and X_V2 (n × (p − q)); M.PCA computes Z (n × r) from X_V1 so that Z approximates the ordinary PCA scores Z_PCA:
  Z_PCA = XA, with A from [X'X/n]A = A D_r;
  Z = X_V1 A, with A from [(S11² + S12 S21) − D S11]A = 0.]

13 Formulation of Modified PCA (M.PCA)

The matrix A consists of the eigenvectors associated with the largest r eigenvalues λ1 ≥ λ2 ≥ ... ≥ λr and is obtained by solving the eigenvalue problem
  [(S11² + S12 S21) − D S11] A = 0,
where
  S = ( S11  S12 ; S21  S22 )
is the covariance matrix of X = (X_V1, X_V2) and D is a q × q diagonal matrix of eigenvalues.

A best subset of q variables has the largest value of
- the proportion P = Σ_{j=1}^{r} λ_j / tr(S), or
- the RV-coefficient RV = { Σ_{j=1}^{r} λ_j² / tr(S²) }^{1/2}.

We use P as the variable selection criterion.
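Under this formulation, evaluating a candidate subset amounts to solving a generalized symmetric eigenproblem. The sketch below (our own helper, assuming scipy is available and S11 is positive definite) computes the loadings A and the criterion P for a given index set of q variables.

```python
import numpy as np
from scipy.linalg import eigh

def mpca_criterion(S, selected, r):
    """M.PCA criterion P and loadings A for the candidate subset `selected`,
    from the generalized eigenproblem (S11^2 + S12 S21) a = lambda S11 a."""
    selected = np.asarray(selected)
    rest = np.setdiff1d(np.arange(S.shape[0]), selected)
    S11 = S[np.ix_(selected, selected)]
    S12 = S[np.ix_(selected, rest)]
    S21 = S[np.ix_(rest, selected)]
    eigvals, eigvecs = eigh(S11 @ S11 + S12 @ S21, S11)   # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:r]
    A = eigvecs[:, order]
    P = eigvals[order].sum() / np.trace(S)                # proportion criterion
    return P, A
```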

14 Variable selection procedures in M.PCA

In order to find a subset of q variables, we employ the two variable selection procedures of Mori et al. (1998, 2006): the backward elimination procedure and the forward selection procedure. These are cost-saving stepwise selection procedures that remove or add only one variable at a time.

15 Backward elimination

[Backward elimination]
Stage A: Initial fixed-variables stage
  A-1 Assign q variables to the subset X_V1; usually q := p.
  A-2 Solve the eigenvalue problem.
  A-3 Looking carefully at the eigenvalues, determine the number r of principal components to be used.
  A-4 Specify kernel variables which should be involved in X_V1, if necessary. The number of kernel variables must be less than q.
Stage B: Variable selection stage (backward)
  B-1 Remove one variable from among the q variables in X_V1, make a temporary subset of size q − 1, and compute P based on that subset. Repeat this for each variable in X_V1 to obtain q values of P. Find the best subset of size q − 1, i.e., the one providing the largest P among these q values, and remove the corresponding variable from the present X_V1. Put q := q − 1.
  B-2 If P or q is larger than the preassigned values, go to B-1. Otherwise stop.
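A compact sketch of Stage B, built on the mpca_criterion() helper above; the stopping rule is simplified to "continue while both q and P exceed their bounds", and the names are illustrative rather than the authors' code.

```python
def backward_elimination(S, r, q_min, p_min=0.0):
    """Backward elimination (Stage B): repeatedly drop the single variable
    whose removal keeps the criterion P largest."""
    selected = list(range(S.shape[0]))
    P, _ = mpca_criterion(S, selected, r)
    history = [(list(selected), P)]
    while len(selected) > max(q_min, r) and P > p_min:
        # Evaluate every temporary subset of size q - 1 and keep the best one.
        candidates = [
            (mpca_criterion(S, [v for v in selected if v != drop], r)[0], drop)
            for drop in selected
        ]
        P, dropped = max(candidates)
        selected.remove(dropped)
        history.append((list(selected), P))
    return history
```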

16 Forward selection

[Forward selection]
Stage A: Initial fixed-variables stage
  A-1 to A-3 Same as A-1 to A-3 in backward elimination.
  A-4 Redefine q as the number of kernel variables (here, q ≥ r). If there are kernel variables, assign them to X_V1. If not, put q := r, find the best subset of q variables, i.e., the one providing the largest P among all possible subsets of size q, and assign it to X_V1.
Stage B: Variable selection stage (forward)
  B-1 Adding one of the p − q variables in X_V2 to X_V1, make a temporary subset of size q + 1 and obtain its P. Repeat this for each variable in X_V2 to obtain p − q values of P. Find the best subset of size q + 1, i.e., the one providing the largest P among these p − q values, and add the corresponding variable to the present X_V1. Put q := q + 1.
  B-2 If P or q is smaller than the preassigned values, go back to B-1. Otherwise stop.
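The corresponding sketch of Stage A-4 and Stage B for forward selection, again using mpca_criterion() and assuming no kernel variables; the exhaustive search over all r-subsets and the simplified stopping rule are our illustrative choices.

```python
from itertools import combinations

def forward_selection(S, r, q_max, p_target=1.0):
    """Forward selection: start from the best subset of size r (A-4 with no
    kernel variables), then add one variable at a time (Stage B)."""
    p = S.shape[0]
    best = max(combinations(range(p), r),
               key=lambda subset: mpca_criterion(S, list(subset), r)[0])
    selected = list(best)
    P, _ = mpca_criterion(S, selected, r)
    history = [(list(selected), P)]
    while len(selected) < q_max and P < p_target:
        # Evaluate every temporary subset of size q + 1 and keep the best one.
        candidates = [
            (mpca_criterion(S, selected + [add], r)[0], add)
            for add in range(p) if add not in selected
        ]
        P, added = max(candidates)
        selected.append(added)
        history.append((list(selected), P))
    return history
```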

17 Variable selection in M.PCA

Variable selection is to find a subset of q variables that best approximates all the variables: find X_V1 using the backward elimination or forward selection procedure.

[Diagram (as on slide 12): X (n × p) is split into X_V1 (n × q) and X_V2 (n × (p − q)); M.PCA computes Z = X_V1 A (n × r) so that Z approximates the ordinary PCA scores Z_PCA = XA, where A solves [X'X/n]A = A D_r for PCA and [(S11² + S12 S21) − D S11]A = 0 for M.PCA.]

18 Variable selection in M.PCA of qualitative data

M.PCA of qualitative data: iteration between variable selection and PCA.ALS.

[Diagram: variable selection splits X (n × p) into X*_V1 (n × q) and X*_V2; M.PCA using PCA.ALS computes Z (n × r).]

Steps of variable selection and PCA.ALS (a combined sketch follows below):
- Variable selection: select X_V1 using backward elimination or forward selection.
- PCA.ALS: quantify X*_V1 and compute A and Z using the ALS algorithm.
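The sketch below combines the earlier pieces into one subset-evaluation routine. It is a deliberate simplification of the loop described on this slide: it quantifies all p variables with the plain PCA.ALS sketch for a fixed number of iterations and then scores the candidate subset with mpca_criterion(), whereas the procedure presented here quantifies X*_V1 within M.PCA itself. The names, defaults, and initial quantification are our illustrative assumptions.

```python
import numpy as np

def evaluate_subset_qualitative(raw_categories, selected, r, n_iter=100):
    """Evaluate P for one candidate subset of qualitative variables:
    quantify the data with (unaccelerated) PCA.ALS, then compute P from
    the covariance matrix of the quantified data."""
    n, p = raw_categories.shape
    # Crude initial quantification: integer category codes, centered and normalized.
    Xstar = raw_categories.astype(float)
    Xstar = (Xstar - Xstar.mean(axis=0)) / Xstar.std(axis=0)
    for _ in range(n_iter):
        Xstar, _, _ = principals_step(Xstar, raw_categories, r)
    S = Xstar.T @ Xstar / n            # covariance of the quantified data
    return mpca_criterion(S, selected, r)[0]
```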

19 vε-PCA.ALS for variable selection in M.PCA of qualitative data

[Diagram: as on slide 18, but M.PCA uses vε-PCA.ALS, i.e., PCA.ALS with the acceleration step.]

Steps of variable selection and vε-PCA.ALS:
- Variable selection: select X_V1 using backward elimination or forward selection.
- vε-PCA.ALS:
    PCA.ALS: quantify X*_V1 and compute A and Z using the ALS algorithm.
    vε acceleration: generate an accelerated sequence of X*_V1 using the vε algorithm.

20 Numerical experiments: variable selection in M.PCA of qualitative data

Computation
- Algorithms: PRINCIPALS and vε-PRINCIPALS.
- Convergence criterion: δ = 10⁻⁸.
- Number of principal components (PCs): r = 3.

Comparison
- Number of iterations and CPU time.
- Iteration and CPU time speed-ups:
    iteration (CPU time) speed-up = number of iterations (CPU time) of PRINCIPALS / number of iterations (CPU time) of vε-PRINCIPALS.

[Simulation 1]: Computation of PCs
- Sample size (n): 100.
- Number of items (p): 40 items with 20 levels (Data 1); 20 items with 10 levels (Data 2).
- Replications: 50.

[Simulation 2]: Variable selection in M.PCA
- Sample size (n): 100.
- Number of items (p): 10 items with 5 levels (Data 3).

21 Numerical experiments: Data 1 & 2

Table 1: Summary statistics and speed-ups. For (a) Data 1 (40 items with 20 levels) and (b) Data 2 (20 items with 10 levels), the table reports the minimum, 1st quartile, median, mean, 3rd quartile and maximum of the numbers of iterations and CPU times of PRINCIPALS and vε-PRINCIPALS, together with the iteration and CPU time speed-ups, over the 50 replications. [Numeric entries omitted.]

22 Numerical experiments: boxplots of iteration and CPU time speed-ups

[Figure: boxplots of the iteration and CPU time speed-ups for Data 1 (20 levels) and Data 2 (10 levels).]

23 Numerical experiments: Data 3

Table 2(a): Numbers of iterations and CPU times of PRINCIPALS and vε-PRINCIPALS and their speed-ups in application to variable selection for finding a subset of q variables using simulated data. (a) Backward elimination: for each subset size q, the table lists the number of candidate combinations examined, the iteration counts and CPU times of both algorithms, and the corresponding speed-ups, with totals in the last row. [Numeric entries omitted.]

24 Numerical experiments: Data 3

Table 2(b): Numbers of iterations and CPU times of PRINCIPALS and vε-PRINCIPALS and their speed-ups in application to variable selection for finding a subset of q variables using simulated data. (b) Forward selection: same layout as Table 2(a). [Numeric entries omitted.]

25 Numerical experiments: Data 3

[Figure: the number of iterations and the CPU time plotted against the subset size q, for backward elimination and for forward selection.]

26 Conclusion

[Numerical experiments] The vε acceleration works well to reduce the computational time of PRINCIPALS.

Means of iteration & CPU time speed-ups, computation of PCs:
  Data 1 (20 levels): iteration (36%), CPU time (38%)
  Data 2 (10 levels): iteration (31%), CPU time (34%)

Iteration & CPU time speed-ups, variable selection (Data 3):
  Backward elimination: iteration 3.68 (27%), CPU time 3.52 (28%)
  Forward selection: iteration 5.50 (18%), CPU time 5.16 (19%)

[Future work] We will apply vε-PRINCIPALS with a re-start procedure to variable selection in PCA of qualitative data.

27 References

GIFI, A. (1989): Algorithm descriptions for ANACOR, HOMALS, PRINCALS, and OVERALS. Report RR. Leiden: Department of Data Theory, University of Leiden.

KURODA, M. and SAKAKIHARA, M. (2006): Accelerating the convergence of the EM algorithm using the vector ε algorithm. Computational Statistics and Data Analysis 51.

KURODA, M., MORI, Y., IIZUKA, M. and SAKAKIHARA, M. (2011): Acceleration of the alternating least squares algorithm for principal components analysis. Computational Statistics and Data Analysis 55.

MICHAILIDIS, G. and DE LEEUW, J. (1998): The Gifi system of descriptive multivariate analysis. Statistical Science 13.

MORI, Y., TANAKA, Y. and TARUMI, T. (1997): Principal component analysis based on a subset of variables for qualitative data. In: C. Hayashi, K. Yajima, H. Bock, N. Ohsumi, Y. Tanaka, Y. Baba (Eds.): Data Science, Classification, and Related Methods (Proceedings of IFCS-96). Springer-Verlag.

YOUNG, F.W., TAKANE, Y. and DE LEEUW, J. (1978): Principal components of mixed measurement level multivariate data: An alternating least squares method with optimal scaling features. Psychometrika 43.

WANG, M., KURODA, M., SAKAKIHARA, M. and GENG, Z. (2008): Acceleration of the EM algorithm using the vector epsilon algorithm. Computational Statistics 23.

WYNN, P. (1962): Acceleration techniques for iterated vector and matrix problems. Mathematics of Computation 16.
