A CLASSIFICATION OF REMOTE SENSING IMAGE BASED ON IMPROVED COMPOUND KERNELS OF SVM

Jianing Zhao 1, Wanlin Gao 1,*, Zili Liu 1, Guifen Mou 1, Lin Lu 2, Lina Yu 3

1 College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, P. R. China
2 Yunnan Xin Nan Nan Agricultural Technology Co., Ltd., Yunnan Province, P. R. China
3 Kunming Agriculture Machinery Research Institute, Yunnan Province 650034, P. R. China
* Corresponding author. Address: College of Information and Electrical Engineering, China Agricultural University, No. 17 Qinghua Dong Lu, Haidian, Beijing 100083, P. R. China. Tel: +86-010-62736755, Fax: +86-010-62736755, Email: gaowl@cau.edu.cn

Abstract: The accuracy of remote sensing (RS) image classification based on SVM, which is developed from statistical learning theory, is high even with a small number of training samples, which makes SVM methods satisfactory for RS classification. The traditional RS classification method combines visual interpretation with computer classification. The accuracy of RS classification, however, is improved a lot by the SVM-based method, because it saves much of the labor and time otherwise used to interpret images and collect training samples. Kernel functions play an important part in the SVM algorithm. This paper uses an improved compound kernel function and therefore achieves a higher accuracy of classification on RS images. Moreover, the compound kernel improves the generalization and learning ability of the kernel.

Keywords: compound kernel, remote sensing image classification, support vector machine

1. INTRODUCTION

Classification of remote sensing images is very important in the field of remote sensing processing. Among the classification algorithms, the accuracy of SVM (C. J. Burges, 1998) classification is one of the highest

methods. SVM is short for Support Vector Machine, a machine learning method proposed by Vapnik according to statistical learning theory (Vapnik, 1995; Vapnik, 1998). It integrates many techniques, including the optimal hyperplane (Cristianini, 2005), Mercer kernels, slack variables, convex quadratic programming and so on. The Support Vector Machine successfully handles many practical problems, such as small sample sizes, non-linearity, high dimensionality, local minima and so on. In several challenging applications, SVM achieves the best performance so far (ZHANG Xue-Gong, 2000). In this paper a compound kernel is proposed, which is better than a single kernel at improving generalization and learning ability.

2. SVM METHOD

2.1 SVM Basis

The Support Vector Machine is a machine learning algorithm based on statistical learning theory. It uses the principle of structural risk minimization: minimizing the errors on the training samples while shrinking the upper bound of the generalization error of the model, thereby improving the generalization of the model. Compared with other machine learning algorithms, which are based on the principle of empirical risk minimization, statistical learning theory proposes a new strategy. The mechanism of SVM is to find an optimal classification hyperplane which satisfies the requirement of classification: separate the two classes as much as possible and maximize the margin on both sides of the hyperplane, namely make the separated data farthest from the hyperplane. A training sample set can be separated by different hyperplanes; when the margin of the hyperplane is largest, that hyperplane is the optimal separating hyperplane (Cristianini et al., 2005).

2.1.1 Linear Separability

A two-class classification problem can be stated in the following way. N training samples are represented as a set of pairs (x_i, y_i), i = 1, ..., N, where y_i ∈ {−1, +1} is the label of the class and x_i ∈ R^d is the feature vector with d components. The hyperplane is defined by g(x) = w · x + b = 0. Finding the optimal hyperplane amounts to maximizing the margin, and the optimization is translated into seeking the minimization of the following function over w and b:

    min Φ(w, ξ) = (1/2)‖w‖² + C Σ_{i=1}^{N} ξ_i                                  (1)

    subject to  y_i (w · x_i + b) ≥ 1 − ξ_i,  ξ_i ≥ 0,  i = 1, ..., N           (2)

where w is the normal to the hyperplane, C is a regularization parameter and b is the offset. The dual of the above problem is searching for the maximization of the following function:

    Q(α) = Σ_{i=1}^{N} α_i − (1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} α_i α_j y_i y_j (x_i · x_j)      (3)

which is constrained by

    0 ≤ α_i ≤ C,  Σ_{i=1}^{N} α_i y_i = 0                                        (4)

The classification rule based on the optimal hyperplane is the optimal classification function:

    f(x) = sgn{w · x + b} = sgn{ Σ_{i=1}^{N} α_i y_i (x_i · x) + b }             (5)

Through solving the above problems, the coefficient α_i of any non-support vector is zero.

2.1.2 Non-linear Separability

For a non-linear problem we use kernel functions, which satisfy Mercer's condition, to project the data onto a higher-dimensional space where the data are considered to be linearly separable. With kernel functions introduced, the non-linear algorithm can be implemented without increasing the complexity of the algorithm. If we use inner products K(x, x') = Φ(x) · Φ(x') to replace the dot products in the optimal hyperplane, which is equal to converting the original feature space into a new feature space, the maximized function (3) turns into:

    Q(α) = Σ_{i=1}^{N} α_i − (1/2) Σ_{i=1}^{N} Σ_{j=1}^{N} α_i α_j y_i y_j K(x_i, x_j)      (6)

and the corresponding discrimination function (5) turns into:

    f(x) = sgn{ Σ_{i=1}^{N} α_i y_i K(x_i, x) + b }                              (7)
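The dual form in (6)-(7) can be checked numerically. The following is a minimal sketch in Python using scikit-learn rather than the libsvm program used later in the paper; the toy blob data and the values C = 1.0, γ = 0.5 are illustrative assumptions, not the paper's settings. It trains an RBF-kernel SVM and rebuilds the decision function (7) directly from the stored dual coefficients α_i y_i and support vectors, which also illustrates that non-support vectors (with α_i = 0) contribute nothing.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Hypothetical two-class toy data standing in for pixel feature vectors.
X, y = make_blobs(n_samples=60, centers=2, random_state=0)
y = np.where(y == 0, -1, 1)  # labels in {-1, +1} as in the derivation

gamma = 0.5
clf = SVC(kernel="rbf", C=1.0, gamma=gamma).fit(X, y)

def rbf(u, v, gamma):
    """RBF kernel K(u, v) = exp(-gamma * ||u - v||^2)."""
    return np.exp(-gamma * np.sum((u - v) ** 2))

def decision(x):
    # f(x) before the sign: sum_i (alpha_i y_i) K(x_i, x) + b, eq. (7).
    # sklearn's dual_coef_ stores alpha_i * y_i for support vectors only,
    # since alpha_i = 0 for every non-support vector.
    s = sum(coef * rbf(sv, x, gamma)
            for coef, sv in zip(clf.dual_coef_[0], clf.support_vectors_))
    return s + clf.intercept_[0]

manual = np.array([decision(x) for x in X])
assert np.allclose(manual, clf.decision_function(X))
```

The reconstruction agrees with the library's own decision values, confirming that only the support vectors enter the sum in (7).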

2.2 Kernel Function

Kernel functions have the following properties:

Property 1: If K1 and K2 are two kernels and a1, a2 are two positive real numbers, then K(u, v) = a1 · K1(u, v) + a2 · K2(u, v) is also a kernel which satisfies Mercer's condition.

Property 2: If K1 and K2 are two kernels, then K(u, v) = K1(u, v) · K2(u, v) is also a kernel which satisfies Mercer's condition.

Property 3: If K1 is a kernel, then the exponential of K1 is also a kernel, that is, K(u, v) = exp(K1(u, v)) (XIA Hongxia, 2009).

There are four types of kernels which are often used: linear kernels, polynomial kernels, Gaussian RBF kernels and sigmoid kernels:

1. linear kernel: K(x, x_i) = x · x_i
2. polynomial kernel: K(x, x_i) = (x · x_i + r)^d
3. RBF kernel: K(x, x_i) = exp(−γ‖x − x_i‖²)
4. sigmoid kernel: K(x, x_i) = tanh(γ(x · x_i) + r)

3. COMPOUND KERNELS

Currently there are many kernels, each of which has its individual characteristics, but they can be classified into two main types, that is, local kernels and global kernels (Smits et al., 2002).

1. Local kernels: only data whose values approach each other have an influence on the kernel value. Basically, all kernels based on a distance function are local kernels. A typical local kernel is the RBF kernel: K(x, x_i) = exp(−γ‖x − x_i‖²).

2. Global kernels: samples that are far away from each other still have an influence on the kernel value. All kernels based on the dot product are global: the linear kernel K(x, x_i) = x · x_i, the polynomial kernel K(x, x_i) = (x · x_i + r)^d and the sigmoid kernel K(x, x_i) = tanh(γ(x · x_i) + r).

The upper bound of the expected risk of SVM is the proportion of the expected number of support vectors to the total number of training samples:

    E[P(error)] ≤ E[number of support vectors] / (number of training samples − 1)

From this we can see that if the number of support vectors is reduced, the generalization ability of SVM can be improved. Therefore the Gaussian kernel can be improved as follows:

    K(x, x_i) = a · exp(−γ‖x − x_i‖²),  a > 1                                    (8)

Through adding the coefficient a, which is a real number greater than 1, the absolute value of the coefficient of the quadratic term of the quadratic programming function in equation (8) is increased. Hence the optimal value of α is reduced and the number of support vectors is reduced; the generalization ability therefore is improved. If the total number of training samples is fixed, the error rate of classification can be reduced by decreasing the number of support vectors.

If kernel functions satisfy Mercer's condition, linear combinations of them are eligible as kernels. An example is:

    K = a · K1 + b · K2                                                          (9)

where both a and b are real numbers greater than 0, so that K can be a kernel.

Global kernels have good generalization while local kernels have good learning ability. Hence combining the two kinds of kernels makes full use of their merits, achieving both good learning ability and good generalization. For a compound kernel, four parameters need to be confirmed, and their values have a great effect on the accuracy of classification: an optimal (C, γ, a, b) is needed for the compound kernel.

4. EXPERIMENTAL RESULTS AND DISCUSSION

In this paper a remote sensing image of rice paddy in Guangdong Province, whose resolution is 30 meters, is selected to test the results; the classification program is written based on libsvm. The results are shown in Table 1:

Table 1. The classification accuracy using different kernels.

Number of pixels | Kernel                                           | Number of SVs | Accuracy
350              | RBF (C=, γ=0.)                                   | 30            | 9.%
350              | linear                                           | 0             | 85.3%
350              | compound of linear and RBF (C=, γ=0.5, a=, b=3)  | 70            | 93.%
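As a sketch of how the compound kernel of equation (9) and the repeated-trials parameter selection can be wired up in code, the snippet below builds K = a·K_lin + b·K_RBF as a custom Gram-matrix callable for scikit-learn's SVC and scans a small grid over (a, b, γ). The synthetic dataset, the grid values, and the use of scikit-learn in place of the paper's libsvm program are all illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Hypothetical stand-in data; the paper's rice-paddy image is not available.
X, y = make_classification(n_samples=200, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def compound_kernel(a, b, gamma):
    """K(u, v) = a * (u . v) + b * exp(-gamma * ||u - v||^2), as in eq. (9)."""
    def k(U, V):
        linear = U @ V.T  # Gram matrix of the linear (global) kernel
        sq = (np.sum(U**2, axis=1)[:, None]
              + np.sum(V**2, axis=1)[None, :] - 2 * linear)
        return a * linear + b * np.exp(-gamma * sq)
    return k

# The paper fixes (C, gamma, a, b) through repeated trials; a small grid
# search is one systematic way of doing the same thing (grid values are
# arbitrary here).
best, best_score = None, -1.0
for a in (1, 2):
    for b in (1, 3):
        for gamma in (0.1, 0.5):
            clf = SVC(C=1.0, kernel=compound_kernel(a, b, gamma))
            score = clf.fit(X_tr, y_tr).score(X_te, y_te)
            if score > best_score:
                best, best_score = (a, b, gamma), score
```

Because a, b > 0, the callable returns a valid Mercer kernel by Property 1, so SVC can use it exactly like a built-in kernel.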

Table 1 shows the result of the classification of the remote sensing image. The classification method with the compound kernel has the highest accuracy and needs fewer support vectors. From this result we can conclude that the compound kernel has good generalization and learning ability. Compared with the RBF kernel, the compound kernel achieves a higher classification accuracy while its number of support vectors is lower; hence the compound kernel achieves good generalization ability. In the test, the value of (C, γ, a, b) has a great effect on the accuracy of classification. For different remote sensing images, different compound kernels and different values of (C, γ, a, b) should be selected. The optimal combination of kernels and the values of (C, γ, a, b) should be fixed through repeated trials.

5. CONCLUSION

In conclusion, compound kernels yield better results than single kernels. The compound kernels need a smaller number of support vectors, which means that these kernels have good generalization ability, and they achieve higher classification accuracy than single kernels.

REFERENCES

C. J. Burges. A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, U. Fayyad, Ed., Kluwer Academic, 1998.
N. Cristianini, J. Shawe-Taylor. An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. House of Electronics Industry, 2005.
G. Smits, E. Jordaan. Improved SVM regression using mixtures of kernels. IJCNN, 2002.
V. N. Vapnik. The Nature of Statistical Learning Theory. New York: Springer-Verlag, 1995.
V. N. Vapnik. Statistical Learning Theory. Wiley, New York, 1998.
XIA Hongxia, DING Zichun, LI Zhen, GUO Cuicui, SONG Huazhu. An adaptive compound kernel function of support vector machines. Journal of Wuhan University of Technology, 2009.
ZHANG Xue-Gong. Introduction to statistical learning theory and support vector machines. Acta Automatica Sinica, 2000, 26(1): 32-42.