Research Article: Robust ε-Support Vector Regression


Mathematical Problems in Engineering, Article ID , 5 pages

Yuan Lv and Zhong Gan

School of Mechanical Engineering, Northwestern Polytechnical University, Mailbox 301, No. 127 Youyi Road, Xi'an, Shaanxi, China

Correspondence should be addressed to Yuan Lv; lvyuan85@live.com

Received 1 August 2013; Revised 25 November 2013; Accepted 25 November 2013; Published 20 February 2014

Academic Editor: Andrzej Swierniak

Copyright 2014 Y. Lv and Z. Gan. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Spheroid disturbance of the input data poses great challenges to support vector regression, so it is essential to study robust regression models. This paper establishes a robust regression model that makes the regression function robust against disturbances of the data and the system parameters. First, two theorems show that the robust linear ε-support vector regression problem can be settled by solving its dual problem. Second, a robust support vector regression algorithm is developed and extended from the linear domain to the nonlinear domain. Finally, the numerical experiments demonstrate the effectiveness of the models and algorithms proposed in this paper.

1. Introduction

As a machine learning algorithm, the support vector machine (SVM) is based on the statistical learning theory proposed by Vapnik [1] and on Wolfe duality theory. Compared with other learning algorithms, the SVM possesses a rigorous mathematical foundation, good generalization ability, and a global optimum, with the result that it is widely used in pattern recognition and function regression. In general, the input data are assumed to be accurate.
However, in practice, statistical and measurement errors in the observed data are inevitable; that is, the data are disturbed. The effect of perturbations of the input data on solving the regression problem is often neglected. Although SVR is robust when its parameters are optimal, those parameters are usually quite sensitive to noisy data and outliers. The robust optimization method, which focuses on computational tractability when the data points are disturbed within convex sets, was first proposed by Soyster [2] and developed, respectively, by Ben-Tal and Nemirovski [3, 4] and El Ghaoui et al. [5, 6]. Later, Melvyn [7] presented a more comprehensive treatment of disturbances that does not increase the computational difficulty, which enabled the rapid development of the robust optimization method. Research has aimed at solving programming problems with polyhedral or spheroid disturbance. Bertsimas and Sim [8] and Alizadeh and Goldfarb [9] showed that robust linear programming degenerates into second-order cone programming, robust second-order cone programming into semidefinite programming, and robust semidefinite programming into an NP-hard problem. Zhixia et al. [10] applied this method to the classification problem and proposed robust support vector classification. This paper studies regression problems with spheroid disturbance by means of the robust optimization method and establishes a robust regression model that makes the regression function highly robust to perturbed data and system parameters.

2. Theory

2.1. Training Set. Assume that the training set is

    T = {(χ_1, y_1), ..., (χ_m, y_m)},    (1)

where

    χ_i = {x_i + r_i u_i : ||u_i|| ≤ 1},    i = 1, ..., m.    (2)

The input set χ_i can be viewed as a sphere of radius r_i centered on the measured value x_i; r_i is closely related to the measurement accuracy, and m is the sample size.
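The construction in formulae (1)-(2) is simple to state concretely: each measured input is the center of a ball whose radius reflects the measurement accuracy, and a disturbed sample is one draw from that ball. A minimal sketch (the function name and the one-dimensional setup are illustrative, not from the paper):

```python
import numpy as np

def make_perturbed_inputs(x, r, rng):
    """Return one disturbed input x_i + r_i * u_i per sample, with |u_i| <= 1."""
    u = rng.uniform(-1.0, 1.0, size=x.shape)  # scalar inputs, so |u_i| <= 1 directly
    return x + r * u

# Measured values as in Example 1: 10 equidistant points on [1, 10], r_i = 0.5.
x = np.linspace(1.0, 10.0, 10)
rng = np.random.default_rng(0)
chi = make_perturbed_inputs(x, 0.5, rng)
print(np.all(np.abs(chi - x) <= 0.5))  # every disturbed point stays inside its ball
```

For vector-valued inputs the draw would instead be projected onto the unit ball of the chosen norm; the scalar case suffices for the experiments in Section 3.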

2.2. Original Problem. Referring to the original problem of linear ε-support vector regression, it is not difficult to derive the original problem of robust linear ε-support vector regression, a convex quadratic program, by replacing the original input with the input collection χ_i:

    min_{w,b,ξ,ξ*}  (1/2)||w||^2 + C Σ_{i=1}^{m} (ξ_i + ξ_i*),
    s.t.  (w·(x_i + r_i u_i)) + b − y_i ≤ ε + ξ_i,
          y_i − ((w·(x_i + r_i u_i)) + b) ≤ ε + ξ_i*,
          ξ_i ≥ 0,  ξ_i* ≥ 0,  i = 1, ..., m.    (3)

Theorem 1. w, b, ξ, and ξ* solve the original optimization problem (3) if and only if they are part of a solution of the following second-order cone program:

    min_{w,b,ξ,ξ*,u,v,t}  (1/2)(u − v) + C Σ_{i=1}^{m} (ξ_i + ξ_i*),
    s.t.  (w·x_i) + b − y_i + r_i t ≤ ε + ξ_i,
          y_i − ((w·x_i) + b) + r_i t ≤ ε + ξ_i*,  i = 1, ..., m,
          u + v = 1,
          (u; v; t) ∈ L^3,  that is, (v^2 + t^2)^{1/2} ≤ u,
          (t; w) ∈ L^{n+1},  that is, ||w|| ≤ t,    (4)

where n is the dimensionality of the input data χ_i.

Proof. Since

    max{r_i (w·u_i) : ||u_i|| ≤ 1} = r_i ||w||,
    min{r_i (w·u_i) : ||u_i|| ≤ 1} = −r_i ||w||,    (5)

the original problem (3) can be converted to

    min_{w,b,ξ,ξ*}  (1/2)||w||^2 + C Σ_{i=1}^{m} (ξ_i + ξ_i*),
    s.t.  (w·x_i) + b − y_i + r_i ||w|| ≤ ε + ξ_i,
          y_i − ((w·x_i) + b) + r_i ||w|| ≤ ε + ξ_i*,  i = 1, ..., m.    (6)

If a variable t satisfying ||w|| ≤ t is introduced, the above problem becomes a second-order cone program:

    min_{w,b,ξ,ξ*,t}  (1/2)t^2 + C Σ_{i=1}^{m} (ξ_i + ξ_i*),
    s.t.  (w·x_i) + b − y_i + r_i t ≤ ε + ξ_i,
          y_i − ((w·x_i) + b) + r_i t ≤ ε + ξ_i*,  i = 1, ..., m,
          ||w|| ≤ t.    (7)

When variables u and v satisfying the linear constraint u + v = 1 and the second-order cone constraint (v^2 + t^2)^{1/2} ≤ u are brought in, the nonlinear term t^2 in the objective function can be replaced by u − v, since t^2 ≤ u^2 − v^2 = (u + v)(u − v) = u − v; therefore, problem (7) can be written as problem (4). From the above, problem (3) is equivalent to problem (4).

2.3. Dual Problem. Introduce the Lagrange function

    L = (1/2)(u − v) + C Σ_{i=1}^{m} (ξ_i + ξ_i*)
        + Σ_{i=1}^{m} α_i ((w·x_i) + b − y_i + r_i t − ε − ξ_i)
        + Σ_{i=1}^{m} β_i (y_i − ((w·x_i) + b) + r_i t − ε − ξ_i*)
        − Σ_{i=1}^{m} θ_i ξ_i − Σ_{i=1}^{m} η_i ξ_i*
        + φ(u + v − 1) − z_u u − z_v v − δ t − z_t t − z_w·w,    (8)

where α, β, θ, η ∈ R^m and φ, z_u, z_v, δ, z_t ∈ R, z_w ∈ R^n are the multipliers corresponding to the constraints.
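The conversion from (3) to (6) in the proof of Theorem 1 rests on the identity max{w·u : ||u|| ≤ 1} = ||w||, attained at u = w/||w||. A quick numerical check of that identity (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(size=5)

# Analytic maximizer of w.u over the unit ball: u* = w / ||w||.
u_star = w / np.linalg.norm(w)

# No random unit vector should beat it, and its value equals ||w||.
U = rng.normal(size=(1000, 5))
U /= np.linalg.norm(U, axis=1, keepdims=True)  # project samples onto the unit sphere
print(np.isclose(w @ u_star, np.linalg.norm(w)))  # True
print((U @ w).max() <= w @ u_star + 1e-12)        # True
```

The minimum in (5) follows the same way with u = −w/||w||, which is why the worst-case disturbance simply widens each constraint by r_i||w||.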
Eliminating the variables θ, η, z_t, and z_w yields the following theorem.

Theorem 2. The following second-order cone program is the dual problem of the second-order cone program (4):

    max_{α,β,φ,δ,z_u,z_v}  −Σ_{i=1}^{m} α_i (y_i + ε) + Σ_{i=1}^{m} β_i (y_i − ε) − φ,
    s.t.  z_u − φ = 1/2,  z_v − φ = −1/2,
          0 ≤ α_i ≤ C,  0 ≤ β_i ≤ C,  i = 1, ..., m,
          z_u ≥ (δ^2 + z_v^2)^{1/2},
          Σ_{i=1}^{m} (α_i − β_i) = 0,
          δ ≤ Σ_{i=1}^{m} r_i (α_i + β_i) − ( Σ_{i=1}^{m} Σ_{j=1}^{m} (α_i − β_i)(α_j − β_j)(x_i·x_j) )^{1/2}.    (9)

Proof. The feasible region of the second-order cone programming problem (4) is

    D = { (w, b, ξ, ξ*, u, v, t) : (w·x_i) + b − y_i + r_i t ≤ ε + ξ_i,
          y_i − ((w·x_i) + b) + r_i t ≤ ε + ξ_i*,  ξ_i, ξ_i* ≥ 0,  i = 1, ..., m,
          u ≥ (t^2 + v^2)^{1/2},  t ≥ ( Σ_{j=1}^{n} w_j^2 )^{1/2},  u + v = 1 }.    (10)

The feasible region of the dual problem (9) is

    E = { (α, β, φ, δ, z_u, z_v) : 0 ≤ α_i ≤ C,  0 ≤ β_i ≤ C,  i = 1, ..., m,
          z_u ≥ (δ^2 + z_v^2)^{1/2},
          δ ≤ Σ_{i=1}^{m} r_i (α_i + β_i) − ( Σ_{i=1}^{m} Σ_{j=1}^{m} (α_i − β_i)(α_j − β_j)(x_i·x_j) )^{1/2},
          z_u − φ = 1/2,  z_v − φ = −1/2,  Σ_{i=1}^{m} (α_i − β_i) = 0 }.    (11)

Assume that the optimal value of problem (4) is p*; thus

    p* = inf { (1/2)(u − v) + C Σ_{i=1}^{m} (ξ_i + ξ_i*) : (w, b, ξ, ξ*, u, v, t) ∈ D }.    (12)

Proceeding from a lower-bound estimate of p*, the Lagrange function of the second-order cone program (4), written L(w, b, ξ, ξ*, u, v, t, α, β, φ, δ, z_u, z_v), is introduced. For all (w, b, ξ, ξ*, u, v, t) ∈ D and all (α, β, φ, δ, z_u, z_v) ∈ E,

    L(w, b, ξ, ξ*, u, v, t, α, β, φ, δ, z_u, z_v) ≤ (1/2)(u − v) + C Σ_{i=1}^{m} (ξ_i + ξ_i*).    (13)

Let

    g(α, β, φ, δ, z_u, z_v) = inf over all (w, b, ξ, ξ*, u, v, t) of L(w, b, ξ, ξ*, u, v, t, α, β, φ, δ, z_u, z_v);    (14)

then

    g(α, β, φ, δ, z_u, z_v) ≤ inf over (w, b, ξ, ξ*, u, v, t) ∈ D of L ≤ p*.    (15)

Formula (15) implies that, for all (w, b, ξ, ξ*, u, v, t) ∈ D and (α, β, φ, δ, z_u, z_v) ∈ E, g(α, β, φ, δ, z_u, z_v) is a lower bound of p*. To reach the optimal value, the maximum of g(α, β, φ, δ, z_u, z_v) over E must be found; maximizing it yields the dual problem (9).

2.4. Robust SVR Algorithm

2.4.1. Linear Robust SVR Algorithm. (1) Write the given training data in the form of formulae (1)-(2). (2) Construct and solve the second-order cone programming problem (9) to obtain a solution (α_i*, β_i*, φ*, δ*, z_u*, z_v*). Beforehand, it is necessary to select an appropriate penalty parameter C > 0.
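Before setting up the full cone program, it can help to see the reduced primal (6) attacked directly: after the reduction of Theorem 1, robustness amounts to adding r||w|| inside the ε-tube constraints, which in penalized form gives the convex objective 0.5||w||^2 + C Σ max(0, |w·x_i + b − y_i| + r||w|| − ε). The toy solver below minimizes this by subgradient descent; it is an illustrative sketch under our own naming and parameter choices, not the paper's SOCP route, and assumes a common radius r.

```python
import numpy as np

def robust_linear_svr(X, y, r=0.5, C=1.0, eps=0.1, lr=1e-3, iters=4000):
    """Subgradient descent on 0.5||w||^2 + C*sum(max(0, |f_i - y_i| + r||w|| - eps))."""
    m, n = X.shape
    w, b = np.zeros(n), 0.0
    for _ in range(iters):
        f = X @ w + b
        viol = np.abs(f - y) + r * np.linalg.norm(w) - eps  # tube shrunk by r||w||
        active = viol > 0
        s = np.sign(f - y)
        gw, gb = w.copy(), 0.0                              # gradient of 0.5||w||^2
        if active.any():
            gw = gw + C * (X[active].T @ s[active])
            nw = np.linalg.norm(w)
            if nw > 0:
                gw = gw + C * active.sum() * r * w / nw     # subgradient of r||w||
            gb = C * s[active].sum()
        w, b = w - lr * gw, b - lr * gb
    return w, b

# Example 1 setup: y = 2x on 10 equidistant points of [1, 10].
X = np.linspace(1.0, 10.0, 10).reshape(-1, 1)
y = 2.0 * X.ravel()
w, b = robust_linear_svr(X, y)
```

On clean data the fitted slope settles near 2; the r||w|| term acts exactly like the r_i t term in (7), trading a slightly flatter fit for insensitivity to points whose disturbance ball already straddles the tube.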
(3) The solution (w*, b*) of the original problem (4) is acquired in the following way:

    w* = δ* Σ_{i=1}^{m} (α_i* − β_i*) x_i / ( Σ_{i=1}^{m} (α_i* + β_i*) r_i − δ* ).    (16)

Select a component α_j* of α* or a component β_k* of β* lying in the open interval (0, C); if α_j* is chosen, then

    b* = y_j − δ* Σ_{i=1}^{m} (α_i* − β_i*)(x_i·x_j) / ( Σ_{i=1}^{m} (α_i* + β_i*) r_i − δ* ) + ε + r_j δ*.    (17)

If β_k* is chosen, then

    b* = y_k − δ* Σ_{i=1}^{m} (α_i* − β_i*)(x_i·x_k) / ( Σ_{i=1}^{m} (α_i* + β_i*) r_i − δ* ) − ε − r_k δ*.    (18)

(4) The regression function is then established:

    f(x) = δ* Σ_{i=1}^{m} (α_i* − β_i*)(x_i·x) / ( Σ_{i=1}^{m} (α_i* + β_i*) r_i − δ* ) + b*.    (19)

2.4.2. Nonlinear Robust SVR Algorithm. By making the following modifications to the algorithm of Section 2.4.1, the nonlinear robust SVR algorithm can be constructed.

(1) A kernel function K(x_i, x_j) is introduced (common kernel functions include the Gaussian, polynomial, Cauchy, and Laplace kernels); the inner products (x_i·x_j) and (x_i·x) appearing in the linear robust SVR algorithm of Section 2.4.1 are replaced with K(x_i, x_j) and K(x_i, x).

(2) Substitute R_i for r_i in the second-order cone program (9). To obtain R_i, write the mapping as X = Φ(x); clearly, ||χ_i − x_i|| ≤ r_i is equivalent to

    ||Φ(χ_i) − Φ(x_i)||^2 = (Φ(χ_i) − Φ(x_i))·(Φ(χ_i) − Φ(x_i))
                          = K(χ_i, χ_i) − 2 K(χ_i, x_i) + K(x_i, x_i)
                          = F(||χ_i − x_i||) ≤ R_i^2,    (20)

so R_i = F(r_i).

3. Numerical Experiments

3.1. Linear Numerical Experiment

Example 1. Given the objective function y = 2x (x ∈ [1, 10]), the training set is written in the form of formulae (1)-(2), where x_i ranges over 10 equidistant points in the interval [1, 10] and r_i is a constant equal to 0.5. The disturbance u_i is a random number in the interval [−1, 1], and m = 10. The standard SVR method and the robust SVR method of Section 2.4.1 are employed to regress and fit the objective function; the errors at ten randomly selected test points are then analyzed and compared. The results are shown in Table 1.

3.2. Nonlinear Numerical Experiment

Example 2. Given the objective function y = x^2 (x ∈ [0.5, 10]), the training set is written in the form of formulae (1)-(2), where x_i ranges over 20 equidistant points in the interval [0.5, 10].
Again r_i can be regarded as a constant. The disturbance u_i is a random number in the interval [−1, 1], and m = 20. The standard SVR method and the robust SVR method of Section 2.4.2 are employed to regress and fit the objective function; the errors at ten randomly selected test points are then analyzed and compared. The results are given in Table 2.

[Table 1: error analysis of linear function regression; rows Point 1 through Point 10 and AVG, comparing linear standard SVR with linear robust SVR. Numeric entries were not recovered.]

[Table 2: error analysis of nonlinear function regression; rows Point 1 through Point 10 and AVG, comparing nonlinear standard SVR with nonlinear robust SVR. Numeric entries were not recovered.]

3.3. Discussion. Evidently, the method is robust with respect to the system parameters. In addition, Table 1 shows that the forecast errors of the robust method are lower than those of the standard SVM method, and a similar conclusion can be drawn from Table 2. In short, the robust SVR method has been validated to be advantageous for regression analysis under perturbation.

4. Conclusions

(1) In this paper, a robust ε-SVR model has been proposed that treats the input data as a convex set with spheroid disturbance, and two important theorems are given. Theorem 1 proves that the

convex quadratic programming problem can be written as a second-order cone program. Based on duality theory, the dual problem is deduced in Theorem 2; by solving the dual problem, the optimal solution of the original problem is obtained.

(2) By virtue of the kernel function, the robust linear ε-SVR model can be extended to nonlinear domains. Finally, the numerical experiments demonstrate the robustness and effectiveness of the proposed algorithm.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The theorems and algorithms proposed in this paper are an extension of the robust support vector classification method [10], which was of great assistance to the authors. The authors express their sincere gratitude.

References

[1] V. N. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, NY, USA.
[2] A. L. Soyster, "Inexact linear programming with generalized resource sets," European Journal of Operational Research, vol. 3, no. 4.
[3] A. Ben-Tal and A. Nemirovski, "Robust convex optimization," Mathematics of Operations Research, vol. 23, no. 4.
[4] A. Ben-Tal and A. Nemirovski, "Robust solutions of uncertain linear programs," Operations Research Letters, vol. 25, no. 1, pp. 1-13.
[5] L. El Ghaoui and H. Lebret, "Robust solutions to least-squares problems with uncertain data," SIAM Journal on Matrix Analysis and Applications, vol. 18, no. 4.
[6] L. El Ghaoui, F. Oustry, and H. Lebret, "Robust solutions to uncertain semidefinite programs," SIAM Journal on Optimization, vol. 9, no. 1, pp. 33-52, 1999.
[7] S. Melvyn, Robust Optimization, Ph.D. thesis, 2004.
[8] D. Bertsimas and M. Sim, "Tractable approximations to robust conic optimization problems," Mathematical Programming, vol. 107, no. 1-2, pp. 5-36, 2006.
[9] F. Alizadeh and D. Goldfarb, "Second-order cone programming," Mathematical Programming, Series B, vol. 95, no. 1, pp. 3-51.
[10] Y. Zhixia, T. Yingjie, and D. Naiyang, "Second order cone programming formulations for robust support vector ordinal regression machine," in Proceedings of the 1st International Symposium on Optimization and Systems Biology, 2013.

6 Advances in Operations Research Advances in Decision Sciences Applied Matheatics Algebra Probability and Statistics The Scientific World Journal International Differential Equations Subit your anuscripts at International Advances in Cobinatorics Matheatical Physics Coplex Analysis International Matheatics and Matheatical Sciences Matheatical Probles in Engineering Matheatics Discrete Matheatics Discrete Dynaics in Nature and Society Function Spaces Abstract and Applied Analysis International Stochastic Analysis Optiization

Kernel Methods and Support Vector Machines

Kernel Methods and Support Vector Machines Intelligent Systes: Reasoning and Recognition Jaes L. Crowley ENSIAG 2 / osig 1 Second Seester 2012/2013 Lesson 20 2 ay 2013 Kernel ethods and Support Vector achines Contents Kernel Functions...2 Quadratic

More information

Support Vector Machine Classification of Uncertain and Imbalanced data using Robust Optimization

Support Vector Machine Classification of Uncertain and Imbalanced data using Robust Optimization Recent Researches in Coputer Science Support Vector Machine Classification of Uncertain and Ibalanced data using Robust Optiization RAGHAV PAT, THEODORE B. TRAFALIS, KASH BARKER School of Industrial Engineering

More information

Support Vector Machines. Maximizing the Margin

Support Vector Machines. Maximizing the Margin Support Vector Machines Support vector achines (SVMs) learn a hypothesis: h(x) = b + Σ i= y i α i k(x, x i ) (x, y ),..., (x, y ) are the training exs., y i {, } b is the bias weight. α,..., α are the

More information

1 Bounding the Margin

1 Bounding the Margin COS 511: Theoretical Machine Learning Lecturer: Rob Schapire Lecture #12 Scribe: Jian Min Si March 14, 2013 1 Bounding the Margin We are continuing the proof of a bound on the generalization error of AdaBoost

More information

Research Article Some Formulae of Products of the Apostol-Bernoulli and Apostol-Euler Polynomials

Research Article Some Formulae of Products of the Apostol-Bernoulli and Apostol-Euler Polynomials Discrete Dynaics in Nature and Society Volue 202, Article ID 927953, pages doi:055/202/927953 Research Article Soe Forulae of Products of the Apostol-Bernoulli and Apostol-Euler Polynoials Yuan He and

More information

Bayes Decision Rule and Naïve Bayes Classifier

Bayes Decision Rule and Naïve Bayes Classifier Bayes Decision Rule and Naïve Bayes Classifier Le Song Machine Learning I CSE 6740, Fall 2013 Gaussian Mixture odel A density odel p(x) ay be ulti-odal: odel it as a ixture of uni-odal distributions (e.g.

More information

Support Vector Machines. Goals for the lecture

Support Vector Machines. Goals for the lecture Support Vector Machines Mark Craven and David Page Coputer Sciences 760 Spring 2018 www.biostat.wisc.edu/~craven/cs760/ Soe of the slides in these lectures have been adapted/borrowed fro aterials developed

More information

Model Fitting. CURM Background Material, Fall 2014 Dr. Doreen De Leon

Model Fitting. CURM Background Material, Fall 2014 Dr. Doreen De Leon Model Fitting CURM Background Material, Fall 014 Dr. Doreen De Leon 1 Introduction Given a set of data points, we often want to fit a selected odel or type to the data (e.g., we suspect an exponential

More information

Intelligent Systems: Reasoning and Recognition. Perceptrons and Support Vector Machines

Intelligent Systems: Reasoning and Recognition. Perceptrons and Support Vector Machines Intelligent Systes: Reasoning and Recognition Jaes L. Crowley osig 1 Winter Seester 2018 Lesson 6 27 February 2018 Outline Perceptrons and Support Vector achines Notation...2 Linear odels...3 Lines, Planes

More information

Feature Extraction Techniques

Feature Extraction Techniques Feature Extraction Techniques Unsupervised Learning II Feature Extraction Unsupervised ethods can also be used to find features which can be useful for categorization. There are unsupervised ethods that

More information

Support Vector Machines. Machine Learning Series Jerry Jeychandra Blohm Lab

Support Vector Machines. Machine Learning Series Jerry Jeychandra Blohm Lab Support Vector Machines Machine Learning Series Jerry Jeychandra Bloh Lab Outline Main goal: To understand how support vector achines (SVMs) perfor optial classification for labelled data sets, also a

More information

Geometrical intuition behind the dual problem

Geometrical intuition behind the dual problem Based on: Geoetrical intuition behind the dual proble KP Bennett, EJ Bredensteiner, Duality and Geoetry in SVM Classifiers, Proceedings of the International Conference on Machine Learning, 2000 1 Geoetrical

More information

Soft-margin SVM can address linearly separable problems with outliers

Soft-margin SVM can address linearly separable problems with outliers Non-linear Support Vector Machines Non-linearly separable probles Hard-argin SVM can address linearly separable probles Soft-argin SVM can address linearly separable probles with outliers Non-linearly

More information

Research Article On the Isolated Vertices and Connectivity in Random Intersection Graphs

Research Article On the Isolated Vertices and Connectivity in Random Intersection Graphs International Cobinatorics Volue 2011, Article ID 872703, 9 pages doi:10.1155/2011/872703 Research Article On the Isolated Vertices and Connectivity in Rando Intersection Graphs Yilun Shang Institute for

More information

Research Article Applications of Normal S-Iterative Method to a Nonlinear Integral Equation

Research Article Applications of Normal S-Iterative Method to a Nonlinear Integral Equation e Scientific World Journal Volue 2014, Article ID 943127, 5 pages http://dx.doi.org/10.1155/2014/943127 Research Article Applications of Noral S-Iterative Method to a Nonlinear Integral Equation Faik Gürsoy

More information

UNIVERSITY OF TRENTO ON THE USE OF SVM FOR ELECTROMAGNETIC SUBSURFACE SENSING. A. Boni, M. Conci, A. Massa, and S. Piffer.

UNIVERSITY OF TRENTO ON THE USE OF SVM FOR ELECTROMAGNETIC SUBSURFACE SENSING. A. Boni, M. Conci, A. Massa, and S. Piffer. UIVRSITY OF TRTO DIPARTITO DI IGGRIA SCIZA DLL IFORAZIO 3823 Povo Trento (Italy) Via Soarive 4 http://www.disi.unitn.it O TH US OF SV FOR LCTROAGTIC SUBSURFAC SSIG A. Boni. Conci A. assa and S. Piffer

More information

Optimum Value of Poverty Measure Using Inverse Optimization Programming Problem

Optimum Value of Poverty Measure Using Inverse Optimization Programming Problem International Journal of Conteporary Matheatical Sciences Vol. 14, 2019, no. 1, 31-42 HIKARI Ltd, www.-hikari.co https://doi.org/10.12988/ijcs.2019.914 Optiu Value of Poverty Measure Using Inverse Optiization

More information

Pattern Recognition and Machine Learning. Learning and Evaluation for Pattern Recognition

Pattern Recognition and Machine Learning. Learning and Evaluation for Pattern Recognition Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lesson 1 4 October 2017 Outline Learning and Evaluation for Pattern Recognition Notation...2 1. The Pattern Recognition

More information

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation Course Notes for EE7C (Spring 018: Convex Optiization and Approxiation Instructor: Moritz Hardt Eail: hardt+ee7c@berkeley.edu Graduate Instructor: Max Sichowitz Eail: sichow+ee7c@berkeley.edu October 15,

More information

Ştefan ŞTEFĂNESCU * is the minimum global value for the function h (x)

Ştefan ŞTEFĂNESCU * is the minimum global value for the function h (x) 7Applying Nelder Mead s Optiization Algorith APPLYING NELDER MEAD S OPTIMIZATION ALGORITHM FOR MULTIPLE GLOBAL MINIMA Abstract Ştefan ŞTEFĂNESCU * The iterative deterinistic optiization ethod could not

More information

MSEC MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL SOLUTION FOR MAINTENANCE AND PERFORMANCE

MSEC MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL SOLUTION FOR MAINTENANCE AND PERFORMANCE Proceeding of the ASME 9 International Manufacturing Science and Engineering Conference MSEC9 October 4-7, 9, West Lafayette, Indiana, USA MSEC9-8466 MODELING OF DEGRADATION PROCESSES TO OBTAIN AN OPTIMAL

More information

Estimating Parameters for a Gaussian pdf

Estimating Parameters for a Gaussian pdf Pattern Recognition and achine Learning Jaes L. Crowley ENSIAG 3 IS First Seester 00/0 Lesson 5 7 Noveber 00 Contents Estiating Paraeters for a Gaussian pdf Notation... The Pattern Recognition Proble...3

More information

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation

Course Notes for EE227C (Spring 2018): Convex Optimization and Approximation Course Notes for EE227C (Spring 2018): Convex Optiization and Approxiation Instructor: Moritz Hardt Eail: hardt+ee227c@berkeley.edu Graduate Instructor: Max Sichowitz Eail: sichow+ee227c@berkeley.edu October

More information

Introduction to Discrete Optimization

Introduction to Discrete Optimization Prof. Friedrich Eisenbrand Martin Nieeier Due Date: March 9 9 Discussions: March 9 Introduction to Discrete Optiization Spring 9 s Exercise Consider a school district with I neighborhoods J schools and

More information

Robustness and Regularization of Support Vector Machines

Robustness and Regularization of Support Vector Machines Robustness and Regularization of Support Vector Machines Huan Xu ECE, McGill University Montreal, QC, Canada xuhuan@ci.cgill.ca Constantine Caraanis ECE, The University of Texas at Austin Austin, TX, USA

More information

Lecture 9: Multi Kernel SVM

Lecture 9: Multi Kernel SVM Lecture 9: Multi Kernel SVM Stéphane Canu stephane.canu@litislab.eu Sao Paulo 204 April 6, 204 Roadap Tuning the kernel: MKL The ultiple kernel proble Sparse kernel achines for regression: SVR SipleMKL:

More information

Handwriting Detection Model Based on Four-Dimensional Vector Space Model

Handwriting Detection Model Based on Four-Dimensional Vector Space Model Journal of Matheatics Research; Vol. 10, No. 4; August 2018 ISSN 1916-9795 E-ISSN 1916-9809 Published by Canadian Center of Science and Education Handwriting Detection Model Based on Four-Diensional Vector

More information

Stochastic Subgradient Methods

Stochastic Subgradient Methods Stochastic Subgradient Methods Lingjie Weng Yutian Chen Bren School of Inforation and Coputer Science University of California, Irvine {wengl, yutianc}@ics.uci.edu Abstract Stochastic subgradient ethods

More information

Research Article Approximate Multidegree Reduction of λ-bézier Curves

Research Article Approximate Multidegree Reduction of λ-bézier Curves Matheatical Probles in Engineering Volue 6 Article ID 87 pages http://dxdoiorg//6/87 Research Article Approxiate Multidegree Reduction of λ-bézier Curves Gang Hu Huanxin Cao and Suxia Zhang Departent of

More information

Support Vector Machines MIT Course Notes Cynthia Rudin

Support Vector Machines MIT Course Notes Cynthia Rudin Support Vector Machines MIT 5.097 Course Notes Cynthia Rudin Credit: Ng, Hastie, Tibshirani, Friedan Thanks: Şeyda Ertekin Let s start with soe intuition about argins. The argin of an exaple x i = distance

More information

Hybrid System Identification: An SDP Approach

Hybrid System Identification: An SDP Approach 49th IEEE Conference on Decision and Control Deceber 15-17, 2010 Hilton Atlanta Hotel, Atlanta, GA, USA Hybrid Syste Identification: An SDP Approach C Feng, C M Lagoa, N Ozay and M Sznaier Abstract The

More information

CONTROL SYSTEMS, ROBOTICS, AND AUTOMATION Vol. IX Uncertainty Models For Robustness Analysis - A. Garulli, A. Tesi and A. Vicino

CONTROL SYSTEMS, ROBOTICS, AND AUTOMATION Vol. IX Uncertainty Models For Robustness Analysis - A. Garulli, A. Tesi and A. Vicino UNCERTAINTY MODELS FOR ROBUSTNESS ANALYSIS A. Garulli Dipartiento di Ingegneria dell Inforazione, Università di Siena, Italy A. Tesi Dipartiento di Sistei e Inforatica, Università di Firenze, Italy A.

More information

The Solution of One-Phase Inverse Stefan Problem. by Homotopy Analysis Method

The Solution of One-Phase Inverse Stefan Problem. by Homotopy Analysis Method Applied Matheatical Sciences, Vol. 8, 214, no. 53, 2635-2644 HIKARI Ltd, www.-hikari.co http://dx.doi.org/1.12988/as.214.43152 The Solution of One-Phase Inverse Stefan Proble by Hootopy Analysis Method

More information

Machine Learning Basics: Estimators, Bias and Variance

Machine Learning Basics: Estimators, Bias and Variance Machine Learning Basics: Estiators, Bias and Variance Sargur N. srihari@cedar.buffalo.edu This is part of lecture slides on Deep Learning: http://www.cedar.buffalo.edu/~srihari/cse676 1 Topics in Basics

More information

HIGH RESOLUTION NEAR-FIELD MULTIPLE TARGET DETECTION AND LOCALIZATION USING SUPPORT VECTOR MACHINES

HIGH RESOLUTION NEAR-FIELD MULTIPLE TARGET DETECTION AND LOCALIZATION USING SUPPORT VECTOR MACHINES ICONIC 2007 St. Louis, O, USA June 27-29, 2007 HIGH RESOLUTION NEAR-FIELD ULTIPLE TARGET DETECTION AND LOCALIZATION USING SUPPORT VECTOR ACHINES A. Randazzo,. A. Abou-Khousa 2,.Pastorino, and R. Zoughi

More information

Computational and Statistical Learning Theory

Computational and Statistical Learning Theory Coputational and Statistical Learning Theory Proble sets 5 and 6 Due: Noveber th Please send your solutions to learning-subissions@ttic.edu Notations/Definitions Recall the definition of saple based Radeacher

More information

Probabilistic Machine Learning

Probabilistic Machine Learning Probabilistic Machine Learning by Prof. Seungchul Lee isystes Design Lab http://isystes.unist.ac.kr/ UNIST Table of Contents I.. Probabilistic Linear Regression I... Maxiu Likelihood Solution II... Maxiu-a-Posteriori

More information

PAC-Bayes Analysis Of Maximum Entropy Learning

PAC-Bayes Analysis Of Maximum Entropy Learning PAC-Bayes Analysis Of Maxiu Entropy Learning John Shawe-Taylor and David R. Hardoon Centre for Coputational Statistics and Machine Learning Departent of Coputer Science University College London, UK, WC1E

More information

Convex Programming for Scheduling Unrelated Parallel Machines

Convex Programming for Scheduling Unrelated Parallel Machines Convex Prograing for Scheduling Unrelated Parallel Machines Yossi Azar Air Epstein Abstract We consider the classical proble of scheduling parallel unrelated achines. Each job is to be processed by exactly

More information

Shannon Sampling II. Connections to Learning Theory

Shannon Sampling II. Connections to Learning Theory Shannon Sapling II Connections to Learning heory Steve Sale oyota echnological Institute at Chicago 147 East 60th Street, Chicago, IL 60637, USA E-ail: sale@athberkeleyedu Ding-Xuan Zhou Departent of Matheatics,

More information

Introduction to Machine Learning. Recitation 11

Introduction to Machine Learning. Recitation 11 Introduction to Machine Learning Lecturer: Regev Schweiger Recitation Fall Seester Scribe: Regev Schweiger. Kernel Ridge Regression We now take on the task of kernel-izing ridge regression. Let x,...,

More information

Study on Markov Alternative Renewal Reward. Process for VLSI Cell Partitioning

Study on Markov Alternative Renewal Reward. Process for VLSI Cell Partitioning Int. Journal of Math. Analysis, Vol. 7, 2013, no. 40, 1949-1960 HIKARI Ltd, www.-hikari.co http://dx.doi.org/10.12988/ia.2013.36142 Study on Markov Alternative Renewal Reward Process for VLSI Cell Partitioning

More information

The Methods of Solution for Constrained Nonlinear Programming

The Methods of Solution for Constrained Nonlinear Programming Research Inventy: International Journal Of Engineering And Science Vol.4, Issue 3(March 2014), PP 01-06 Issn (e): 2278-4721, Issn (p):2319-6483, www.researchinventy.co The Methods of Solution for Constrained

More information

CS Lecture 13. More Maximum Likelihood

CS Lecture 13. More Maximum Likelihood CS 6347 Lecture 13 More Maxiu Likelihood Recap Last tie: Introduction to axiu likelihood estiation MLE for Bayesian networks Optial CPTs correspond to epirical counts Today: MLE for CRFs 2 Maxiu Likelihood

More information

Ensemble Based on Data Envelopment Analysis

Ensemble Based on Data Envelopment Analysis Enseble Based on Data Envelopent Analysis So Young Sohn & Hong Choi Departent of Coputer Science & Industrial Systes Engineering, Yonsei University, Seoul, Korea Tel) 82-2-223-404, Fax) 82-2- 364-7807

More information

ESTIMATING AND FORMING CONFIDENCE INTERVALS FOR EXTREMA OF RANDOM POLYNOMIALS
A Thesis Presented to The Faculty of the Department of Mathematics, San Jose State University, In Partial Fulfillment of the Requirements

Anton Bourdine. 1. Introduction. and approximate propagation constants by following simple ratio (Equation (32.22) in [1]):
Mathematical Problems in Engineering, Volume 5, Article ID 843, http://dx.doi.org/.55/5/843. Research Article: Fast and Simple Method for Evaluation of Polarization Correction to Propagation Constant of

1 Proof of learning bounds
COS 511: Theoretical Machine Learning. Lecturer: Rob Schapire. Lecture #4. Scribe: Akshay Mittal, February 13, 2013. For intuition of the following theorem, suppose there exists a

Research Article Perturbations of Polynomials with Operator Coefficients
Complex Analysis, Volume 2013, Article ID 801382, 5 pages, http://dx.doi.org/10.1155/2013/801382. Michael Gil, Department of Mathematics,

Research Article Norm Estimates for Solutions of Polynomial Operator Equations
Mathematics, Volume 5, Article ID 5489, 7 pages, http://dxdoiorg/55/5/5489. Michael Gil, Department of Mathematics, Ben Gurion University

Bipartite subgraphs and the smallest eigenvalue
Noga Alon, Benny Sudakov. Abstract: Two results dealing with the relation between the smallest eigenvalue of a graph and its bipartite subgraphs are obtained.

FPTAS for optimizing polynomials over the mixed-integer points of polytopes in fixed dimension
Mathematical Programming manuscript No. (will be inserted by the editor). Jesús A. De Loera, Raymond Hemmecke, Matthias Köppe, Robert Weismantel.

Predicting FTSE 100 Close Price Using Hybrid Model
SAI Intelligent Systems Conference 2015, November 10-11, 2015, London, UK. Bashar Al-hnaity, Department of Electronic and Computer Engineering, Brunel University

E0 370 Statistical Learning Theory Lecture 6 (Aug 30, 2011) Margin Analysis
Lecturer: Shivani Agarwal. Scribe: Narasimhan R. Introduction: In the last few lectures we have seen how to obtain high confidence bounds

e-companion ONLY AVAILABLE IN ELECTRONIC FORM
OPERATIONS RESEARCH, doi 10.1287/opre.1070.0427ec, pp. ec1-ec5, INFORMS. Electronic Companion: A Learning Approach for Interactive Marketing to a Customer

Complex Quadratic Optimization and Semidefinite Programming
Shuzhong Zhang, Yongwei Huang. Abstract: In this paper we study the approximation algorithms for a class of discrete quadratic optimization problems

Outline. Basic concepts: SVM and kernels SVM primal/dual problems. Chih-Jen Lin (National Taiwan Univ.) 1 / 22
Outline: Basic concepts (SVM and kernels); SVM primal/dual problems. Chih-Jen Lin (National Taiwan Univ.)

Perturbation on Polynomials
Ismaila Diouf, Babacar Diakhaté & Abdoul O. Watt, Département Maths-Infos, Université Cheikh Anta Diop, Dakar, Senegal. Journal of Mathematics Research, Vol. 5, No. 3, 2013.

Comparison of Stability of Selected Numerical Methods for Solving Stiff Semi-Linear Differential Equations
International Journal of Applied Science and Technology, Vol. 7, No. 3, September 2017. Kwaku Darkwah

An l 1 Regularized Method for Numerical Differentiation Using Empirical Eigenfunctions
Journal of Mathematical Research with Applications, Jul. 2017, Vol. 37, No. 4, pp. 496-504.

The Algorithms Optimization of Artificial Neural Network Based on Particle Swarm
The Open Cybernetics & Systemics Journal, 2014, 8. Open Access.

Research Article A New Biparametric Family of Two-Point Optimal Fourth-Order Multiple-Root Finders
Applied Mathematics, Article ID 737305, 7 pages, http://dx.doi.org/10.1155/2014/737305. Young Ik Kim and Young

REDUCTION OF FINITE ELEMENT MODELS BY PARAMETER IDENTIFICATION
INFORMATION TECHNOLOGY AND CONTROL, 2008, Vol. 37, No. 3. Rimantas Barauskas, Vidmantas Rimavičius, Department of System Analysis, Kaunas

Intelligent Systems: Reasoning and Recognition. Artificial Neural Networks
James L. Crowley, MOSIG M1, Winter Semester 2018, Lesson 7, 1 March 2018. Outline: Artificial Neural Networks, Notation, Introduction, Key Equations

On the Communication Complexity of Lipschitzian Optimization for the Coordinated Model of Computation
Journal of Complexity 6, 459-473 (2000), doi:0.006jco.2000.0544, available online at http://www.idealibrary.com

Multi-view Discriminative Manifold Embedding for Pattern Classification
X. Wang, Department of Information, Zhengzhou 450053, China; Y. Guo, Department of Digestive, Zhengzhou 450053, China; Z. Wang, Henan University

arxiv: v1 [cs.ds] 29 Jan 2012
A parallel approximation algorithm for mixed packing covering semidefinite programs, arXiv:1201.6090v1 [cs.ds] 29 Jan 2012. Rahul Jain, National U. Singapore; Penghui Yao, National U. Singapore. January 28, 2012.

On Fuzzy Three Level Large Scale Linear Programming Problem
J. Stat. Appl. Pro. 3, No. 3, 307-315 (2014). Journal of Statistics Applications & Probability, An International Journal. http://dx.doi.org/10.12785/jsap/030302

An Improved Particle Filter with Applications in Ballistic Target Tracking
Sensors & Transducers, Vol. 72, Issue 6, June 2014, pp. 96-20. 2014 by IFSA Publishing, S. L. http://www.sensorsportal.com

Asynchronous Gossip Algorithms for Stochastic Optimization
S. Sundhar Ram, ECE Dept., University of Illinois, Urbana, IL, ssrini@illinois.edu; A. Nedić, IESE Dept., University of Illinois, Urbana, IL, angelia@illinois.edu

Block designs and statistics
Notes for Math 447, May 3, 2011. The main parameters of a block design are number of varieties v, block size, number of blocks b. A design is built on a set of v elements. Each element

COS 424: Interacting with Data. Written Exercises
Homework #4, Spring 2007: Regression. Due: Wednesday, April 18. See the course website for important information about collaboration and late policies, as well

3D acoustic wave modeling with a time-space domain dispersion-relation-based Finite-difference scheme
P-8. Yang Liu and Mrinal K. Sen, State Key Laboratory of Petroleum Resource and Prospecting (China University

Tight Complexity Bounds for Optimizing Composite Objectives
Blake Woodworth, Toyota Technological Institute at Chicago, Chicago, IL 60637, blake@ttic.edu; Nathan Srebro, Toyota Technological Institute at Chicago

Non-Parametric Non-Line-of-Sight Identification 1
Sinan Gezici, Hisashi Kobayashi and H. Vincent Poor, Department of Electrical Engineering, School of Engineering and Applied Science, Princeton University, Princeton,

Extension of CSRSM for the Parametric Study of the Face Stability of Pressurized Tunnels
Guilhem Mollon, Daniel Dias, and Abdul-Hamid Soubra, M.ASCE. LGCIE, INSA Lyon, Université de Lyon, Domaine scientifique

Pattern Recognition and Machine Learning. Artificial Neural networks
James L. Crowley, ENSIMAG 3 - MMIS, Fall Semester 2017, Lessons 7, 20 Dec 2017. Outline: Artificial Neural networks, Notation, Introduction, Key Equations

Probability Distributions
In Chapter 1, we emphasized the central role played by probability theory in the solution of pattern recognition problems. We turn now to an exploration of some particular examples

Solving initial value problems by residual power series method
Theoretical Mathematics & Applications, vol. 3, no. 1, 2013, 199-1. ISSN: 179-9687 (print), 179-979 (online). Scienpress Ltd, 2013. Mohammed H. Al-Smadi

Using a De-Convolution Window for Operating Modal Analysis
Brian Schwarz, Vibrant Technology, Inc., Scotts Valley, CA; Mark Richardson, Vibrant Technology, Inc., Scotts Valley, CA. Abstract: Operating Modal Analysis

A MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION
C.C. Ishiekwene, S.M. Ogbonwan and J.E. Osemwenkhae, Department of Mathematics, University

arxiv: v1 [cs.lg] 8 Jan 2019
Data Masking with Privacy Guarantees. Anh T. Pham, Oregon State University, phatheanhbka@gmail.com; Shalini Ghosh, Samsung Research, shalini.ghosh@gmail.com; Vinod Yegneswaran, SRI International, vinod@csl.sri.com

Recovering Data from Underdetermined Quadratic Measurements (CS 229a Project: Final Writeup)
Mahdi Soltanolkotabi, December 16, 2011. Introduction: Data that arises from engineering applications often contains

On a Multisection Style Binomial Summation Identity for Fibonacci Numbers
Int. J. Contemp. Math. Sciences, Vol. 9, 2014, no. 4, 75-86, HIKARI Ltd. Bernhard A. Moser, Software Competence

1. INTRODUCTION AND RESULTS
SOME IDENTITIES INVOLVING THE FIBONACCI NUMBERS AND LUCAS NUMBERS. Wenpeng Zhang, Research Center for Basic Science, Xi'an Jiaotong University, Xi'an, Shaanxi, People's Republic of China (Submitted August 00

Construction of Some New Classes of Boolean Bent Functions and Their Duals
International Journal of Algebra, Vol. 11, 2017, no. 2, 53-64, HIKARI Ltd, www.m-hikari.com, https://doi.org/10.12988/ija.2017.61168

A Simple Regression Problem
R. M. Castro, March 23. In this brief note a simple regression problem will be introduced, illustrating clearly the bias-variance tradeoff. Let Yi = f(xi) + Wi, i = 1, ..., n, where

On Constant Power Water-filling
Wei Yu and John M. Cioffi, Electrical Engineering Department, Stanford University, Stanford, CA 94305, U.S.A. Emails: {weiyu,cioffi}@stanford.edu. Abstract: This paper derives

Methodology of Projection of Wave Functions of. Light Nuclei on Cluster Channels on
Advanced Studies in Theoretical Physics, Vol. 0, 06, no., 89-97, HIKARI Ltd, www.m-hikari.com, http://dx.doi.org/0.988/astp.06.65

Sharp Time Data Tradeoffs for Linear Inverse Problems
Samet Oymak, Benjamin Recht, Mahdi Soltanolkotabi, January 2016. Abstract: In this paper we characterize sharp time-data tradeoffs for optimization problems used

Computational and Statistical Learning Theory
TTIC 31120, Prof. Nati Srebro. Lecture 2: PAC Learning and VC Theory I. From Adversarial Online to Statistical: Three reasons to move from worst-case deterministic

Experimental Design For Model Discrimination And Precise Parameter Estimation In WDS Analysis
City University of New York (CUNY), CUNY Academic Works, International Conference on Hydroinformatics, 8-1-2014. Giovanna

On Rough Interval Three Level Large Scale Quadratic Integer Programming Problem
J. Stat. Appl. Pro. 6, No. 2, 305-318 (2017). Journal of Statistics Applications & Probability, An International Journal. http://dx.doi.org/10.18576/jsap/060206

A Simplified Analytical Approach for Efficiency Evaluation of the Weaving Machines with Automatic Filling Repair
Proceedings of the 6th WSEAS International Conference on Simulation, Modelling and Optimization, Lisbon, Portugal, September -4, 2006

Q ESTIMATION WITHIN A FORMATION PROGRAM q_estimation
Formation Attributes Program q_estimation. Estimating Q between stratal slices: Program q_estimation estimates seismic attenuation (1/Q) on complex stratal slices using

An Approximate Model for the Theoretical Prediction of the Velocity Increase in the Intermediate Ballistics Period
Central European Journal of Energetic Materials, 2015, 2(), 77-88, ISSN 2353-843

1 Generalization bounds based on Rademacher complexity
COS 511: Theoretical Machine Learning. Lecturer: Rob Schapire. Lecture #0. Scribe: Suqi Liu, March 07, 2018. Last time we started proving this very general result about how quickly the empirical average converges

Results regarding the argument of certain p-valent analytic functions defined by a generalized integral operator
El-Ashwah, Journal of Inequalities and Applications, 1:35, http://www.journalofinequalitiesandapplications.com/content/1/1/35

Introduction to Kernel methods
ML Workshop, ISI Kolkata. Chiranjib Bhattacharyya, Machine Learning Lab, Dept. of CSA, IISc, chiru@csa.iisc.ernet.in, http://drona.csa.iisc.ernet.in/~chiru. 19th Oct, 2012.

The linear sampling method and the MUSIC algorithm
INSTITUTE OF PHYSICS PUBLISHING, Inverse Problems 17 (2001) 591-595, www.iop.org/journals/ip, PII: S0266-5611(01)16989-3. Margaret Cheney, Department