Bilinear Formulated Multiple Kernel Learning for Multi-class Classification Problem

Takumi Kobayashi and Nobuyuki Otsu
National Institute of Advanced Industrial Science and Technology, 1-1-1 Umezono, Tsukuba, Japan

Abstract. In this paper, we propose a method of multiple kernel learning (MKL) to inherently deal with multi-class classification problems. The performances of kernel-based classification methods depend on the employed kernel functions, and it is difficult to predefine the optimal kernel. In the framework of MKL, multiple types of kernel functions are linearly integrated with optimizing the weights for the kernels. However, multi-class problems are rarely incorporated in the formulation and the optimization is time-consuming. We formulate multi-class MKL in a bilinear form and propose a scheme for computationally efficient optimization. The scheme makes the method favorably applicable to large-scale samples in real-world problems. In experiments on multi-class classification using several datasets, the proposed method exhibits favorable performance and low computation time compared to the previous methods.

Keywords: Kernel methods, multiple kernel learning, multi-class classification, bilinear form.

1 Introduction

Kernel-based methods have attracted keen attention, exhibiting state-of-the-art performances, such as in support vector machines (SVM) [10] and kernel multivariate analyses [8]. These methods are applied in various real-world tasks, e.g., in the fields of computer vision and signal processing. In kernel-based methods, the input vectors are implicitly embedded in a high-dimensional space (called the kernel feature space) via kernel functions which efficiently compute inner products of those vectors in the kernel feature space. Thus, the performance of kernel-based methods depends on how the kernel functions are constructed. In recent years, Lanckriet et al. [5] proposed a method to integrate different kernel functions with optimizing the weights for the kernels, which is called multiple kernel learning (MKL). By combining multiple types of kernels, heterogeneous information, which is complementary to each other, can be effectively incorporated, possibly improving the performance. The composite kernel is successfully applied to, for example, object recognition [11]. In MKL, the weights for combining the kernels are obtained via optimization processes based on a certain criterion, mainly for classification. Since the criterion can be defined in different forms, various methods for MKL have been proposed by treating different optimization problems with different approaches, e.g., semi-definite programming [5] and semi-infinite linear programming [9,13].

K.W. Wong et al. (Eds.): ICONIP 2010, Part II, LNCS 6444. © Springer-Verlag Berlin Heidelberg 2010

Most of the methods, however, are intended for classifying binary classes, while real-world problems contain multiple classes in general. In addition, for application to practical problems, the optimization process should be computationally efficient.

In this paper, we propose an MKL method for multi-class classification problems. Without decomposing the multi-class problem into several binary-class problems, the proposed method inherently deals with it based on the formulation of Crammer & Singer [2], which first proposed multi-class SVM using a single kernel. The contributions of this paper are as follows: We extend the formulation of multi-class classification in [2] to cope with multiple kernel functions, and formulate multi-class MKL (MC-MKL) in a bilinear form. In the formulation, the optimal weights for the kernel functions are obtained for the respective classes. We propose a scheme to effectively optimize the bilinear formulated problem, which makes the method applicable to large-scale samples. In experiments on various datasets, we demonstrate the effectiveness of the proposed method, compared to the existing MKL methods [7,13]. While Zien & Ong [13] proposed a method of MC-MKL based on a similar formulation, we employ a different criterion for the margin of the multi-class classifiers and propose a more efficient optimization scheme.

2 Bilinear Formulation for MC-MKL

To consider multiple kernels, we introduce multiple types of features x^{(r)} (r in {1,..,R}, where R is the number of feature types). The inner products of those features can be replaced with the respective types of kernels via the kernel trick: x_i^{(r)T} x_j^{(r)} -> k_r(x_i^{(r)}, x_j^{(r)}). Crammer & Singer [2] have proposed a formulation for multi-class SVM, considering only a single type of feature x. We extend the formulation to incorporate the multiple types of features (kernels). We additionally introduce the weights v for the feature types as well as the weights w within the features, similarly to MKL methods [13]. These two kinds of weights are mathematically integrated into the following bilinear form to constitute multi-class classification [2]:

    \hat{c} = \arg\max_{c \in \{1,..,C\}} \sum_{r=1}^{R} v_c^{(r)} w_c^{(r)\top} x^{(r)} = \arg\max_{c} \, w_c^\top X v_c = \arg\max_{c} \, \langle X, w_c v_c^\top \rangle_F,    (1)

where C is the number of classes, \langle \cdot,\cdot \rangle_F indicates the Frobenius inner product, v_c^{(r)} is a weight for the r-th type of feature, w_c^{(r)} is a classifier vector for the r-th type of feature vector in class c, and these variables are concatenated into long vectors (and a block-diagonal matrix), respectively:

    v_c \triangleq [v_c^{(1)},..,v_c^{(R)}]^\top, \quad X \triangleq \mathrm{blockdiag}\bigl(x^{(1)},..,x^{(R)}\bigr), \quad w_c \triangleq [w_c^{(1)\top},..,w_c^{(R)\top}]^\top.    (2)
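To make the shapes in Eqs. (1)-(2) concrete, here is a minimal NumPy sketch of the bilinear decision rule; the array layout and the function names are illustrative assumptions, not something specified in the paper.

```python
import numpy as np

def bilinear_scores(x_list, W, V):
    """Class scores of the bilinear classifier in Eq. (1).

    x_list : list of R feature vectors x^(r), each of shape (D_r,)
    W      : list of R arrays; W[r][c] = w_c^(r), so W[r] has shape (C, D_r)
    V      : array of shape (C, R), V[c, r] = v_c^(r)
    """
    C, R = V.shape
    scores = np.zeros(C)
    for c in range(C):
        # sum_r v_c^(r) * w_c^(r)^T x^(r)  (= <X, w_c v_c^T>_F for block-diagonal X)
        scores[c] = sum(V[c, r] * W[r][c] @ x_list[r] for r in range(R))
    return scores

def predict(x_list, W, V):
    return int(np.argmax(bilinear_scores(x_list, W, V)))
```

Because X is block-diagonal, the Frobenius inner product in Eq. (1) reduces to exactly the double sum computed in the loop above.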

Then, we can consider the margin of the above-defined bilinear classifiers (projections) by using the Frobenius norm \|w_c v_c^\top\|_F, and define the following optimization problem based on a large-margin criterion:

    \mathrm{OP1:}\quad \min_{\{w_c,v_c\},\{\xi_i\}} \; \frac{1}{2}\sum_{c=1}^{C} \|w_c v_c^\top\|_F + \kappa \sum_{i=1}^{N} \xi_i    (3)
    \mathrm{s.t.}\ \forall i,c,\quad \langle X_i, w_{y_i} v_{y_i}^\top \rangle_F - \langle X_i, w_c v_c^\top \rangle_F + \delta_{y_i,c} \ge 1 - \xi_i,

where N is the number of samples, y_i indicates the class label of the i-th sample (y_i in {1,..,C}), \delta_{y_i,c} equals 1 if c = y_i and 0 otherwise, \xi_i is a slack variable for the soft margin, and \kappa is a parameter to control the trade-off between the margin and the training errors. Note that we minimize the Frobenius norm, not the squared one as in SVM. Since this problem is difficult to optimize directly, we employ the upper bound of the Frobenius norm:

    \|w_c v_c^\top\|_F = \|w_c\|\,\|v_c\| \le \tfrac{1}{2}\bigl(\|w_c\|^2 + \|v_c\|^2\bigr).    (4)

Therefore, the optimization problem OP1 is modified to

    \mathrm{P1:}\quad \min_{\{w_c,v_c\},\{\xi_i\}} \; \frac{1}{2}\sum_{c=1}^{C}\bigl(\|w_c\|^2 + \|v_c\|^2\bigr) + \kappa \sum_{i=1}^{N} \xi_i    (5)
    \mathrm{s.t.}\ \forall i,c,\quad w_{y_i}^\top X_i v_{y_i} - w_c^\top X_i v_c + \delta_{y_i,c} \ge 1 - \xi_i.

The weights w_c and v_c separately emerge as standard squared norms, which facilitates the optimization. It can be shown that OP1 and P1 have the identical optimum solution by using the rescaling technique described in Sec. 3.3. In the problem P1, if the optimal weights v_c^* are obtained, the optimal classifier vectors are represented as w_c^* = \sum_{i=1}^{N} \tau_{i,c} X_i v_c^*, where \tau_{i,c} are the optimal dual variables [2]. Thus, the multi-class bilinear classifier in Eq. (1) results in

    w_c^{*\top} X v_c^* = \sum_{i=1}^{N} \tau_{i,c}\, v_c^{*\top} X_i^\top X v_c^* = \sum_{i=1}^{N} \tau_{i,c} \sum_{r=1}^{R} \bigl(v_c^{(r)*}\bigr)^2 x_i^{(r)\top} x^{(r)} \;\rightarrow\; \sum_{i=1}^{N} \tau_{i,c} \sum_{r=1}^{R} \bigl(v_c^{(r)*}\bigr)^2 k_r\bigl(x_i^{(r)}, x^{(r)}\bigr),    (6)

where k_r(x_i^{(r)}, x^{(r)}) is a kernel function standing in for the inner product of the r-th type of features, x_i^{(r)\top} x^{(r)}, via the kernel trick. Note that the kernel functions can be defined differently for the respective feature types. The squared weights (v_c^{(r)})^2 play the role of weighting the kernel functions as in MKL, and produce a composite kernel function specialized to class c. In this case, we can introduce alternative non-negative variables d_c^{(r)} = (v_c^{(r)})^2 >= 0 without loss of generality. The variables d_c are the weights for the kernel functions, and therefore the above bilinear formulation is applicable to MC-MKL.
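The substitution d_c^{(r)} = (v_c^{(r)})^2 turns Eq. (6) into a per-class composite kernel. A small sketch (assuming the base kernels are precomputed as an (R, N, M) array; the names and layout are illustrative) shows how the class-specific weights enter:

```python
import numpy as np

def composite_kernel(kernel_mats, d_c):
    """Class-specific composite kernel K_c = sum_r d_c^(r) * K_r.

    kernel_mats : array (R, N, M), base kernel evaluations k_r(x_i, x)
    d_c         : array (R,), non-negative kernel weights for class c
    """
    return np.tensordot(d_c, kernel_mats, axes=1)   # shape (N, M)

def class_score(kernel_mats_test, tau_c, d_c):
    """Score as in Eq. (8): f_c(x) = sum_i tau_{i,c} sum_r d_c^(r) k_r(x_i, x).

    kernel_mats_test : array (R, N_train, N_test)
    tau_c            : array (N_train,) of dual variables for class c
    """
    K_c = composite_kernel(kernel_mats_test, d_c)   # (N_train, N_test)
    return tau_c @ K_c                              # (N_test,)
```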

The primal problem P1 is reformulated to

    \mathrm{P2:}\quad \min_{\{w_c,d_c\},\{\xi_i\}} \; \frac{1}{2}\sum_{c=1}^{C}\bigl(\|w_c\|^2 + \|\sqrt{d_c}\|^2\bigr) + \kappa \sum_{i=1}^{N} \xi_i    (7)
    \mathrm{s.t.}\ \forall i,c,\quad w_{y_i}^\top X_i \sqrt{d_{y_i}} - w_c^\top X_i \sqrt{d_c} + \delta_{y_i,c} \ge 1 - \xi_i, \quad d_c \ge 0,

where d_c = [d_c^{(1)},..,d_c^{(R)}]^\top and \sqrt{d_c} is the component-wise square root of the vector d_c. In the problem P2, a non-negativity constraint is additionally introduced compared to the problem P1 (or OP1). The bilinear classifier is finally obtained as

    w_c^\top X \sqrt{d_c} = \sum_{i=1}^{N} \tau_{i,c} \sum_{r=1}^{R} d_c^{(r)} k_r\bigl(x_i^{(r)}, x^{(r)}\bigr).    (8)

We describe a scheme to efficiently optimize P2 in the following section.

3 Optimization Methods

The primal problem P2 in Eq. (7) has the following dual form, similarly to [2]:

    \max_{\{\tau_i\}} \; \sum_{i=1}^{N} e_{y_i}^\top \tau_i, \quad \mathrm{s.t.}\ \forall i,\ \tau_i \le \kappa e_{y_i},\ \mathbf{1}^\top \tau_i = 0, \quad \forall r,c,\ \sum_{i,j=1}^{N} k_r\bigl(x_i^{(r)}, x_j^{(r)}\bigr)\, \tau_{i,c}\tau_{j,c} \le \kappa,

where \tau_i is the i-th C-dimensional dual variable, e_{y_i} is a C-dimensional vector in which only the y_i-th element is 1 and the others are 0, and \mathbf{1} is a C-dimensional vector of which all elements are 1. This is a convex problem having the global optimum. However, it is actually solved by second-order cone programming, which requires an exhaustive computational cost, and it is not applicable to large-scale samples. Therefore, we take an alternative scheme to optimize the primal problem P2 in a manner similar to [7,2]. The scheme is based on the iterative optimization of w and d, applying projected gradient descent.

3.1 Optimization with Respect to w

At the t-th iteration, with the variable d_c fixed to d_c^{[t]}, in a manner similar to [2], the problem P2 results in the dual form:

    \mathrm{D_w:}\quad \max_{\{\tau_i\}} \; -\frac{1}{2}\sum_{i,j=1}^{N}\sum_{c=1}^{C} \bigl(v_c^{[t]\top} X_i^\top X_j v_c^{[t]}\bigr)\, \tau_{i,c}\tau_{j,c} + \sum_{i=1}^{N} e_{y_i}^\top \tau_i
        \;=\; \max_{\{\tau_i\}} \; -\frac{1}{2}\sum_{i,j=1}^{N} \tau_i^\top \Lambda_{ij} \tau_j + \sum_{i=1}^{N} e_{y_i}^\top \tau_i, \quad \mathrm{s.t.}\ \forall i,\ \tau_i \le \kappa e_{y_i},\ \mathbf{1}^\top \tau_i = 0,    (9)

where \Lambda_{ij} is a C-dimensional diagonal matrix with \{\Lambda_{ij}\}_{cc} = \sum_{r=1}^{R} d_c^{(r)[t]} k_r(x_i^{(r)}, x_j^{(r)}). In this dual problem, the constants derived from d_c^{[t]} are omitted. This is optimized by iteratively solving the decomposed small subproblems [2,3], as follows.
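Before detailing the subproblem, here is a sketch of that decomposition loop: it builds the diagonal entries of \Lambda_{ij} from the base kernel matrices and the current weights d_c, then sweeps over the samples, re-optimizing each \tau_i with the others fixed. The helper solve_subproblem is assumed to solve SubD_w for one sample (a sketch of it follows Algorithm 1 in the next part); all array shapes and names are assumptions.

```python
import numpy as np

def build_lambda(kernel_mats, D):
    """Diagonal entries of Lambda_{ij} in Eq. (9).

    kernel_mats : array (R, N, N), K_r[i, j] = k_r(x_i^(r), x_j^(r))
    D           : array (C, R), D[c, r] = d_c^(r) at the current iteration
    Returns Lam of shape (N, N, C) with Lam[i, j, c] = sum_r d_c^(r) K_r[i, j].
    """
    return np.einsum('cr,rij->ijc', D, kernel_mats)

def optimize_w(kernel_mats, D, y, kappa, solve_subproblem, n_sweeps=10):
    """Block-coordinate sweeps over samples for the dual D_w:
    each round re-optimizes tau_i via SubD_w with the other tau_j fixed."""
    R, N, _ = kernel_mats.shape
    C = D.shape[0]
    Lam = build_lambda(kernel_mats, D)
    Tau = np.zeros((N, C))
    for _ in range(n_sweeps):
        for i in range(N):
            # beta = sum_{j != i} Lambda_{ij} tau_j - e_{y_i}
            beta = np.einsum('jc,jc->c', Lam[i], Tau) - Lam[i, i] * Tau[i]
            beta[y[i]] -= 1.0
            Tau[i] = solve_subproblem(Lam[i, i], beta, y[i], kappa)
    return Tau
```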

Algorithm 1. Optimization for subproblem SubD_w
  Require: Reindex b_bar = b * lambda^{1/2} (component-wise) and lambda such that the b_bar_c are sorted in decreasing order.
  Initialize c = 1, zeta_num = lambda_1^{-1/2} b_1 - kappa, zeta_den = lambda_1^{-1}.
  while c < C and zeta = zeta_num / zeta_den <= b_bar_{c+1} do
    zeta_num <- zeta_num + lambda_{c+1}^{-1/2} b_{c+1},  zeta_den <- zeta_den + lambda_{c+1}^{-1},  c <- c + 1.
  end while
  Output: tau_tilde_c = min(b_c, zeta lambda_c^{-1/2}),  tau_c = min{kappa delta_{y,c}, lambda_c^{-1}(zeta - beta_c)}.

The dual problem D_w is decomposed into N small subproblems; the i-th subproblem focuses on the dual variable \tau_i associated with the i-th sample, while fixing the others \tau_j (j \ne i):

    \mathrm{SubD_w:}\quad \max_{\tau_i} \; -\frac{1}{2}\tau_i^\top \Lambda_{ii} \tau_i - \beta^\top \tau_i - \gamma, \quad \mathrm{s.t.}\ \tau_i \le \kappa e_{y_i},\ \mathbf{1}^\top \tau_i = 0,    (10)
    \mathrm{where}\quad \beta = \sum_{j \ne i} \Lambda_{ij}\tau_j - e_{y_i}, \qquad \gamma = \frac{1}{2}\sum_{j \ne i,\, k \ne i} \tau_j^\top \Lambda_{jk}\tau_k - \sum_{j \ne i} e_{y_j}^\top \tau_j.

For the optimization of D_w, the process of solving SubD_w works in rounds over all i, and the dual variables \tau_i are updated until convergence. The subproblems are rather more complex than those in [2], since they include not the scalar value x_i^\top x_j but the diagonal matrix \Lambda_{ij} derived from the multiple features (kernels). However, it is noteworthy that they can be solved at a quite low computational cost, as follows.

Optimization for Subproblem SubD_w. In the following, we omit the index i for simplicity. By ignoring the constant, the subproblem SubD_w in Eq. (10) is reformulated to

    \min_{\tilde{\tau}} \; \frac{1}{2}\|\tilde{\tau}\|^2, \quad \mathrm{s.t.}\ \tilde{\tau} \le \kappa\lambda_y^{1/2} e_y + \Lambda^{-1/2}\beta, \quad \lambda^{-1/2\top}\tilde{\tau} = \lambda^{-1/2\top}\Lambda^{-1/2}\beta,

where \tilde{\tau} = \Lambda^{1/2}\tau + \Lambda^{-1/2}\beta, \lambda is the C-dimensional vector composed of the diagonal elements of \Lambda (powers of \lambda are taken component-wise), and \lambda_y is the y-th element of the vector \lambda. By using b = \kappa\lambda_y^{1/2} e_y + \Lambda^{-1/2}\beta, the constraints are rewritten as

    \mathrm{s.t.}\ \tilde{\tau} \le b, \quad \lambda^{-1/2\top}\tilde{\tau} = \lambda^{-1/2\top} b - \kappa.    (11)

The Lagrangian for this problem is

    L = \frac{1}{2}\|\tilde{\tau}\|^2 - \alpha^\top\bigl(b - \tilde{\tau}\bigr) - \zeta\bigl(\lambda^{-1/2\top}\tilde{\tau} - \lambda^{-1/2\top} b + \kappa\bigr),    (12)

where \alpha \ge 0 and \zeta are Lagrange multipliers. When the subproblem is optimized, the following conditions hold:

    \frac{\partial L}{\partial \tilde{\tau}} = \tilde{\tau} + \alpha - \zeta\lambda^{-1/2} = 0, \qquad \mathrm{KKT:}\ \alpha_c\bigl(b_c - \tilde{\tau}_c\bigr) = 0.    (13)
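Under the reconstruction above, Algorithm 1 amounts to a sorted threshold search for \zeta followed by the closed-form recovery of \tau derived in Eqs. (14)-(15) just below. A NumPy transcription of that reading is sketched here (the exact notation of the original algorithm cannot be fully recovered, so this is a best-effort sketch); it matches the solve_subproblem signature assumed in the earlier decomposition sketch.

```python
import numpy as np

def solve_subproblem(lam, beta, y, kappa):
    """Solve SubD_w for one sample via the threshold search of Algorithm 1.

    lam   : (C,) diagonal of Lambda_ii (assumed strictly positive)
    beta  : (C,) linear term of the subproblem
    y     : class label of the sample
    kappa : soft-margin trade-off parameter
    Returns tau : (C,) satisfying tau <= kappa*e_y and sum(tau) = 0.
    """
    C = lam.shape[0]
    sqrt_lam = np.sqrt(lam)
    b = beta / sqrt_lam
    b[y] += kappa * sqrt_lam[y]
    b_bar = b * sqrt_lam                      # thresholds b_c * sqrt(lambda_c)
    order = np.argsort(-b_bar)                # decreasing order of b_bar
    # grow the active set while the next threshold still exceeds the current zeta
    num = b[order[0]] / sqrt_lam[order[0]] - kappa
    den = 1.0 / lam[order[0]]
    zeta = num / den
    for c in order[1:]:
        if b_bar[c] < zeta:
            break
        num += b[c] / sqrt_lam[c]
        den += 1.0 / lam[c]
        zeta = num / den
    # closed-form recovery: tau_c = min(kappa*delta_{y,c}, (zeta - beta_c)/lambda_c)
    return np.minimum(kappa * (np.arange(C) == y), (zeta - beta) / lam)
```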

Therefore, we obtain

    \alpha_c = 0 \;\Rightarrow\; \tilde{\tau}_c = \zeta\lambda_c^{-1/2}\ \ (\zeta \le b_c\lambda_c^{1/2}); \qquad \alpha_c > 0 \;\Rightarrow\; \tilde{\tau}_c = b_c\ \ (\zeta > b_c\lambda_c^{1/2}).    (14)

By using the above, the second constraint in Eq. (11) results in

    \lambda^{-1/2\top}\tilde{\tau} = \zeta \sum_{c:\,\alpha_c=0} \lambda_c^{-1} + \sum_{c:\,\alpha_c>0} \lambda_c^{-1/2} b_c = \sum_{c=1}^{C} \lambda_c^{-1/2} b_c - \kappa, \qquad \zeta = \frac{\sum_{c:\,\alpha_c=0} \lambda_c^{-1/2} b_c - \kappa}{\sum_{c:\,\alpha_c=0} \lambda_c^{-1}}.    (15)

Thus, for solving the subproblem, we only have to seek \zeta satisfying Eqs. (14) and (15), and a simple algorithm is constructed in Algorithm 1. The optimization of D_w is the core and most exhaustive process in the whole optimization of P2. Therefore, the effective algorithm (Algorithm 1) for solving the subproblem SubD_w makes the whole optimization process computationally efficient.

3.2 Optimization with Respect to d

Then, the optimization of P2 is performed with respect to d_c. In this study, we simply employ a projected gradient descent approach, although other methods such as that in [12] would be applicable. In this approach, the objective cost function is minimized by a line search [6] along the projected gradient under the constraints d_c >= 0. Based on the principle of strong duality, the primal P2 is represented by using D_w in Eq. (9) with the optimal dual variables \tau^{[t]} as

    \min_{\{d_c\}} \; W(d) \triangleq \sum_{c=1}^{C} d_c^\top \theta_c + \sum_{i=1}^{N} e_{y_i}^\top \tau_i^{[t]}, \quad \mathrm{s.t.}\ d_c \ge 0,

where \theta_c is an R-dimensional vector with \theta_c^{(r)} = \frac{1}{2}\bigl(1 - \sum_{i,j=1}^{N} \tau_{i,c}^{[t]}\tau_{j,c}^{[t]} k_r(x_i^{(r)}, x_j^{(r)})\bigr). In this case, W is differentiable with respect to d_c (see [7]), and thus the gradients are obtained as \partial W / \partial d_c = \theta_c. Thereby, the optimization in P2 is performed by projected gradient descent, d_c^{[t+1]} = d_c^{[t]} - \epsilon\,\partial W / \partial d_c. We apply a line search [6] to greedily seek the parameter \epsilon such that W, i.e., the objective cost function in P2, is minimized while ensuring d_c >= 0. Note that, in this greedy search, the cost function is evaluated several times via the optimization of D_w.

3.3 Rescaling

After the optimization of D_w with fixed d_c, the cost function is further decreased by simply rescaling the variables \tau and d_c so as to reach the lower bound in Eq. (4). The rescaling, \hat{\tau}_{i,c} = s_c \tau_{i,c}, \hat{d}_c = d_c / s_c, does not affect the bilinear projection in Eq. (8), and thus the constraints in P2 are kept:

    \hat{w}_c^\top X \sqrt{\hat{d}_c} = \sum_{i=1}^{N} s_c\tau_{i,c} \sum_{r=1}^{R} \frac{d_c^{(r)}}{s_c}\, k_r\bigl(x^{(r)}, x_i^{(r)}\bigr) = w_c^\top X \sqrt{d_c},    (16)

while the first term in the cost function is transformed to

    \frac{1}{2}\sum_{c=1}^{C}\bigl(\|\hat{w}_c\|^2 + \|\sqrt{\hat{d}_c}\|^2\bigr) = \frac{1}{2}\sum_{c=1}^{C}\Bigl(s_c\|w_c\|^2 + \frac{\|\sqrt{d_c}\|^2}{s_c}\Bigr).    (17)

The optimal rescaling that minimizes the above is analytically obtained as s_c = \|\sqrt{d_c}\| / \|w_c\|, and Eq. (17) then equals the lower bound (the Frobenius norm):

    \frac{1}{2}\sum_{c=1}^{C}\bigl(\|\hat{w}_c\|^2 + \|\sqrt{\hat{d}_c}\|^2\bigr) = \sum_{c=1}^{C} \|\sqrt{d_c}\|\,\|w_c\| = \sum_{c=1}^{C} \|w_c \sqrt{d_c}^\top\|_F.    (18)

Although the rescaled \hat{\tau} is not necessarily the solution of the problem D_w with the rescaled \hat{d}_c, in the greedy optimization for d_c the gradients using \hat{\tau} are employed as an approximation of \nabla W(\hat{d}). This rescaling contributes to fast convergence.
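A compact sketch of the d-step and the rescaling follows, using the gradient \theta_c as reconstructed in Sec. 3.2. The eval_cost callback (which would re-optimize D_w for a candidate d) and the fixed step grid stand in for the line search of [6]; they, the array shapes, and the availability of the classifier norms ||w_c|| are assumptions.

```python
import numpy as np

def theta_gradient(kernel_mats, Tau):
    """Gradient of W w.r.t. d: theta[c, r] = 0.5*(1 - sum_{ij} tau_ic tau_jc K_r[i, j]).

    kernel_mats : (R, N, N), Tau : (N, C). Returns (C, R).
    """
    quad = np.einsum('ic,rij,jc->cr', Tau, kernel_mats, Tau)
    return 0.5 * (1.0 - quad)

def projected_gradient_step(D, grad, eval_cost, steps=(1.0, 0.5, 0.25, 0.1, 0.01)):
    """One projected gradient update with a crude backtracking search on epsilon."""
    best_D, best_cost = D, eval_cost(D)
    for eps in steps:
        D_new = np.maximum(D - eps * grad, 0.0)   # projection onto d >= 0
        cost = eval_cost(D_new)                   # re-optimizes D_w internally
        if cost < best_cost:
            best_D, best_cost = D_new, cost
    return best_D

def rescale(Tau, D, W_norms):
    """Rescaling of Sec. 3.3: s_c = ||sqrt(d_c)|| / ||w_c||; tau <- s_c*tau, d_c <- d_c/s_c.

    Tau : (N, C), D : (C, R), W_norms : (C,) current values of ||w_c||.
    """
    s = np.sqrt(D.sum(axis=1)) / W_norms          # ||sqrt(d_c)|| = sqrt(1^T d_c)
    return Tau * s[None, :], D / s[:, None]
```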

4 Experimental Results

We show the classification performances and computation time of the proposed method in comparison with the other MKL methods [7,13] on various datasets. We employed the 1-vs-all version of [7] to cope with multi-class problems. The proposed method is implemented using MATLAB with C-mex on a Xeon 3 GHz PC. For the methods of [7,13], we used the MATLAB codes provided by the authors and combined them with LIBSVM [1] and the MOSEK optimization toolbox in order to speed up those methods as much as possible. In this experiment, the parameter values in all the methods are set as follows: \kappa is determined from \kappa in {0.5, 1, 10} based on 3-fold cross validation, and the maximum number of iterations is set to 40 for a fair comparison of computation time. These methods almost converge on the various datasets within 40 iterations.

First, we used four benchmark datasets: waveform from the UCI Machine Learning Repository, satimage and segment from the STATLOG project, and USPS [4]. Multiple RBF kernels with 10 values of \sigma (uniformly selected on the logarithmic scale) were employed. We drew 1,000 random training samples and classified the remaining samples. The trial is repeated 10 times and the average performance is reported. Fig. 1(a,b) shows the classification results (error rates) and computation time on those datasets. While the performances of the proposed method are competitive with the others, the computation time is much reduced; especially, it is more than 10 times faster than the method of [13].

Next, we applied the proposed method to other practical classification problems in cell biology. The task is to predict the sub-cellular localizations of proteins, and in this case it results in multi-class classification problems. We employed a total of 69 kernels, the details of which are described in [13].
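For reference, a bank of RBF kernels like the one used on the benchmark datasets can be precomputed as below; the \sigma range used in the paper could not be recovered, so the bounds here are placeholders.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    """Gaussian RBF kernel matrix k(x, x') = exp(-||x - x'||^2 / (2 sigma^2))."""
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def rbf_kernel_bank(X1, X2, n_kernels=10, sigma_min=0.1, sigma_max=100.0):
    """Bank of RBF kernels with sigmas log-spaced in [sigma_min, sigma_max];
    sigma_min and sigma_max are placeholder values, not the paper's setting."""
    sigmas = np.logspace(np.log10(sigma_min), np.log10(sigma_max), n_kernels)
    return np.stack([rbf_kernel(X1, X2, s) for s in sigmas])   # (R, N1, N2)
```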

[Fig. 1 bar charts: (a) classification performance (error rate, %) and (b) computation time (sec, log scale) on the benchmark datasets waveform (3) [1000], satimage (6) [1000], segment (7) [1000], and USPS (10) [1000]; (c) classification performance and (d) computation time on the biological datasets plant (4) [376], nonplant (3) [1093], psort+ (4) [217], and psort- (5) [578]. Methods compared: Bilinear MC-MKL (proposed), SimpleMKL [7], and MC-MKL SILP [13].]

Fig. 1. The classification performances (error rates) and computation time on the four benchmark datasets. The number of classes is indicated in parentheses and that of training samples in brackets. The left bar shows the result of the proposed method.

MKL would be effectively applied to these substantial types of kernels. In this experiment, we used four biological datasets [13]: plant, nonplant, psort+, and psort-. We randomly split each dataset into 40% for training and 60% for testing. The trial is repeated 10 times and the average performance is reported. The results are shown in Fig. 1(c,d), demonstrating that the proposed method is quite effective; the proposed method is superior to, and faster than, the methods of [7,13]. The experimental results show that the proposed method effectively and efficiently combines a large number of heterogeneous kernel functions.

5 Conclusion

We have proposed a multiple kernel learning (MKL) method to deal with multi-class problems. In the proposed method, the multi-class classification using multiple kernels is formulated in a bilinear form, and a computationally efficient optimization scheme is proposed in order to be applicable to large-scale samples. In the experiments on the benchmark and biological datasets, the proposed method exhibited favorable performance and computation time compared to the previous MKL methods.

References

1. Chang, C.C., Lin, C.J.: LIBSVM: a library for support vector machines (2001), software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
2. Crammer, K., Singer, Y.: On the algorithmic implementation of multiclass kernel-based vector machines. Journal of Machine Learning Research 2, 265-292 (2001)
3. Fan, R.E., Chang, K.W., Hsieh, C.J., Wang, X.R., Lin, C.J.: LIBLINEAR: A library for large linear classification. Journal of Machine Learning Research 9, 1871-1874 (2008)
4. Hull, J.: A database for handwritten text recognition research. IEEE Trans. Pattern Analysis and Machine Intelligence 16(5), 550-554 (1994)
5. Lanckriet, G., Cristianini, N., Bartlett, P., Ghaoui, L.E., Jordan, M.I.: Learning the kernel matrix with semidefinite programming. Journal of Machine Learning Research 5, 27-72 (2004)
6. Nocedal, J., Wright, S. (eds.): Numerical Optimization. Springer, New York (1999)
7. Rakotomamonjy, A., Bach, F.R., Canu, S., Grandvalet, Y.: SimpleMKL. Journal of Machine Learning Research 9, 2491-2521 (2008)
8. Schölkopf, B., Smola, A.: Learning with Kernels. MIT Press, Cambridge (2002)
9. Sonnenburg, S., Rätsch, G., Schäfer, C., Schölkopf, B.: Large scale multiple kernel learning. Journal of Machine Learning Research 7, 1531-1565 (2006)
10. Vapnik, V.: Statistical Learning Theory. Wiley, Chichester (1998)
11. Varma, M., Ray, D.: Learning the discriminative power-invariance trade-off. In: Proceedings of the 11th IEEE International Conference on Computer Vision (2007)
12. Xu, Z., Jin, R., King, I., Lyu, M.R.: An extended level method for efficient multiple kernel learning. In: Advances in Neural Information Processing Systems, vol. 21 (2008)
13. Zien, A., Ong, C.S.: Multiclass multiple kernel learning. In: Proceedings of the 24th International Conference on Machine Learning (2007)
