CHARACTER RECOGNITION USING A SELF-ADAPTIVE TRAINING
Dr. Eng. Shamsuddin Ahmed
College of Business and Economics (AACSB Accredited), United Arab Emirates University, P O Box 7555, Al Ain, UAE; and School of Engineering and Mathematics, Edith Cowan University, 100 Joondalup Drive, Joondalup, Perth, WA 6027, Australia. Email: Dr_Shamsuddin_Ahmed@Yahoo.Com

ABSTRACT

A robust algorithm to train a neural network (NN) with individual characters presented in pixel format is described. The output of the trained NN is related to the pixel binary format, and the trained NN performs the classification based on the original pixel-format code presented to it. A mapping scheme is proposed to determine self-adaptive variable learning rates for rapid convergence. The proposed character recognition method trains at a faster rate than the standard back propagation training method and prevents oscillations during training.

Key Words: Self-adaptive training, ANN, Dynamic variable learning rates, Character recognition.

1. Introduction

A self-adaptive ANN training scheme that identifies variable learning rates is discussed in this study. Consider an ANN error function containing m training weights. The actual error function in the higher dimension is decomposed into a lower-dimensional error function (Ahmed et al., 2000). The transformed error function retains the true convex characteristics of the original error function. The training scheme selects the learning rate for a weight parameter, say w_j, j = 1, 2, ..., m, by perturbing it with a small factor Δw_j, such that an improvement in training is noted. Once this phase is over, all the network weights are updated and the error function is evaluated.

2. Character Recognition Problem

A training experiment is considered to recognize the letters L and T. An ANN with nine inputs, two hidden units and a single output unit is considered for this problem. Each input pattern consists of a 3x3-pixel binary image of the letter.
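The training patterns described above can be sketched in code. This is a minimal illustration, assuming particular pixel maps for L and T (the paper's Figures 1 and 2 are not reproduced here), with target 0 for L and 1 for T:

```python
import numpy as np

# Assumed 3x3 binary templates for the letters L and T
# (the paper does not reproduce the exact pixel maps).
L = np.array([[1, 0, 0],
              [1, 0, 0],
              [1, 1, 1]])
T = np.array([[1, 1, 1],
              [0, 1, 0],
              [0, 1, 0]])

def orientations(img):
    """Return the four 90-degree rotations of a 3x3 image,
    flattened to 9-element input vectors for the 9-2-1 network."""
    return [np.rot90(img, k).reshape(9) for k in range(4)]

# Training set: four orientations of each letter.
X = np.array(orientations(L) + orientations(T))
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
```

Each 3x3 image is flattened into the nine-element input vector that feeds the nine input neurons of the 9-2-1 network.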
The training set is formed by the four orientations of each letter, as shown in Figure 1 and Figure 2. A 9-2-1 ANN configuration is chosen to compare the results given in Kamarthi and Pittner (1999). Figure 3 shows the variable learning rates per epoch. As training continues, the variable learning rates seen in Figure 4 converge to lower magnitudes; consequently, the training converges rapidly. The error function value reduces to the order of 10^-3 at the early stage of training and is trained further to a 10^-6 tolerance limit for accuracy. Notice that the function converges as the self-adaptive training parameters gradually reduce to small magnitudes. Figure 5 shows training performance with standard back propagation (BP) training. The training performance is compared with the standard BP training in Figure 6. The binary input is passed through the input neurons of the 9-2-1 ANN (Figure 7), and the output neuron provides the necessary signal to recognize the L-T input pattern. It is evident from Figure 3 that a considerable reduction in the error function is achieved within a few epochs. The standard BP takes on average 278 epochs to train the character recognition problem, while the proposed method takes on average 79 epochs.

IACIS 00

Figure 1: Four orientations (a)-(d) of the letter L
Figure 2: Four orientations (a)-(d) of the letter T
Figure 3: Convergence with the self-adaptive back propagation training method (L-T letter recognition)
Figure 4: Self-adaptive parameter generation with the self-adaptive back propagation training method
Figure 5: Convergence with the standard BP training method (L-T letter recognition problem)
Figure 6: Function evaluations and epoch measure comparison

2.1 Identifying self-adaptive learning rates

The standard BP training method and its variants are not self-adaptive but ad hoc in their training (Fahlman, 1988; Hinton, 1987; Jacobs, 1988; Weir, 1991; Vogl et al., 1988; Kamarthi and Pittner, 1999). General references on BP neural networks include Rumelhart and McClelland (1989); Muller and Reinhardt (1990); and Hertz, Krogh and Palmer (1991). More discussion of neural networks as statistical models can be found in Cheng and Titterington (1994), Werbos (1994), Kuan and White (1994), and Bishop (1995).
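For comparison, one epoch of the standard BP baseline discussed above can be sketched as follows. The sketch assumes sigmoid activations, a squared-error function and a fixed learning rate eta; the paper does not spell out these details, so this is illustrative rather than a reproduction of its exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Small random starting weights for a 9-2-1 network, as in the paper.
W_in = rng.normal(scale=0.1, size=(9, 2))   # input -> hidden (w_in)
W_no = rng.normal(scale=0.1, size=(2, 1))   # hidden -> output (w_no)

def forward(X):
    h = sigmoid(X @ W_in)          # hidden activations
    o = sigmoid(h @ W_no)          # network output
    return h, o

def bp_step(X, y, eta):
    """One standard BP epoch with a fixed learning rate eta."""
    global W_in, W_no
    h, o = forward(X)
    err = o - y[:, None]
    delta_o = err * o * (1 - o)                 # output-layer delta
    delta_h = (delta_o @ W_no.T) * h * (1 - h)  # hidden-layer delta
    W_no -= eta * h.T @ delta_o
    W_in -= eta * X.T @ delta_h
    return 0.5 * np.sum(err ** 2)               # error function f(w)
```

Repeated calls to `bp_step` drive the error function down slowly at the fixed rate eta; the self-adaptive scheme replaces eta with a rate recomputed each epoch.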
Figure 7: Three-layered feed-forward 9-2-1 ANN, with the input pattern on the input layer (i = 1, ..., 9), hidden-layer weights w_in and output-layer weights w_no

To describe the self-adaptive iterative training method, an ANN error function f(w) with a log activation function is defined in the hidden layer. When the algorithm is applied to the error function with an initial arbitrary weight vector w^k at epoch k, it generates a sequence of vectors w^(k+1), w^(k+2), ... during epochs k+1, k+2, ..., and so on. The iterative algorithm is globally convergent if the sequence of vectors converges to a feasible solution set Ω. Consider, for example, the following training problem, where w is defined over the dimension E^m:

minimize f(w)    (1)
subject to: w ∈ E^m.

Let Ω ⊆ E^m be the solution set. If the application of an algorithmic interpolation map, B, generates the sequence w^(k+1), w^(k+2), ..., starting with the weight vector w^k, such that (w^k, w^(k+1), w^(k+2), ...) ∈ Ω, then the algorithm converges globally and the algorithmic map is closed over Ω. Consider further that the error function poses a minimization problem, since it is possible to train such a function (Haykin, 1994). Let Ω be a non-empty compact feasible subset of E^m. If the algorithmic map B generates a sequence {w^k} ⊂ Ω such that f(w) decreases at each iteration, satisfying f(w^k) > f(w^(k+1)) > f(w^(k+2)) > ..., then the error function f(w) is a descent function. The expression {w^k} denotes the sequence (w^1, w^2, w^3, ..., w^n). In ANN computation f(w) is assumed to possess descent properties, which implies it is convex in nature (Haykin, 1994).
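The descent-function condition f(w^k) > f(w^(k+1)) > ... can be checked numerically. A minimal sketch, using a convex quadratic as a stand-in for the reduced ANN error function:

```python
import numpy as np

# Convex quadratic error surface f(w) = 0.5 w^T Q w (illustrative
# stand-in for the reduced ANN error function).
Q = np.array([[3.0, 1.0], [1.0, 2.0]])

def f(w):
    return 0.5 * w @ Q @ w

def grad(w):
    return Q @ w

w = np.array([2.0, -1.5])
values = [f(w)]
for _ in range(20):
    d = -grad(w)            # descent direction: grad(w)^T d < 0
    w = w + 0.1 * d         # fixed small learning rate
    values.append(f(w))
# values is strictly decreasing, so f is a descent function here.
```

With a fixed small step the sequence {f(w^k)} decreases monotonically, exactly the property the self-adaptive scheme is designed to preserve while choosing the step size adaptively.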
Therefore, it is possible to define a descent direction along which the error function can be trained. The following sections demonstrate how the self-adaptive variable learning rates are generated, such that the training direction is always correct.

Property 1. Suppose that f : E^m → E^1 and the gradient of the error function, ∇f(w), is defined. If there is a direction vector d such that ∇f(w)^T d < 0 and f(w + ηd) < f(w) for all η ∈ (0, δ), δ > 0, then the vector d is a descent direction of f(w), where δ is an arbitrary positive scalar.

A directional derivative conceptualizes the direction along which the error function converges to a minimum. Assume also that the error function is smooth and continuous; the negative gradient of the error function therefore points towards the minimum of the ANN error space. Next, the directional derivative is defined. Let f : E^m → E^1, w ∈ E^m, and let d be a non-zero vector satisfying (w + ηd) ∈ E^m for η > 0. The directional derivative at location w along the descent direction d is given by:

f'(w; d) = lim_(η→0+) [f(w + ηd) − f(w)] / η.    (2)

Property 2. Let f : E^m → E^1 be a descent function. Consider any training weight w ∈ E^m and d ∈ E^m, d ≠ 0. Then the directional derivative f'(w; d) of the error function f(w) in the direction d always exists.

The central theme behind the self-adaptive BP training is the computation of the directional search vector d and the learning rate parameter η in weight space. Since the exact location of the minimum valley is not known immediately, there is uncertainty in identifying the boundary or region in weight space over which the training may explore. The uncertainty can be reduced if it is possible to eliminate the sections of the error function that do not contain the minimum (Kiefer, 1953). An interpolation-training map can do this by operating over the error function in a constrained interval; hence a closed mapping is established.
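Equation (2) lends itself to a direct numerical check. The sketch below estimates f'(w; d) by a forward difference with a small η and confirms the descent condition of Property 1 on a simple convex stand-in for the error function:

```python
import numpy as np

def directional_derivative(f, w, d, eta=1e-6):
    """Forward-difference estimate of f'(w; d) from equation (2):
    lim_{eta -> 0+} (f(w + eta*d) - f(w)) / eta."""
    return (f(w + eta * d) - f(w)) / eta

# Illustrative convex error surface and a candidate direction.
def f(w):
    return np.sum(w ** 2)

w = np.array([1.0, -2.0])
g = 2 * w                     # gradient of f at w
d = -g                        # negative gradient direction

# Property 1: grad^T d < 0, so d is a descent direction and f'(w; d) < 0.
```

Here grad^T d = -|grad|^2 = -20, and the forward difference reproduces that value up to O(η).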
Therefore, what is needed is a training scheme that explores a constrained region, L, of the error surface. Consider that the interpolation map B samples the error surface at discrete lengths in a given direction. A few definitions are needed to describe the interpolation-training map. For convenience, define the following expression to update the ANN connection weights:

u^(k+1) = w^k + η^k d^k, or u = w + ηd.    (3)

Now consider an ANN training problem expressed as:

η^k = arg min_η f(w^k + η d^k)    (4)
subject to: η ∈ L,

where L defines a closed interval L = {η : η ∈ E^1} over the error function.
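The constrained problem (4) is a one-dimensional minimization over the closed interval L, and the section-elimination idea attributed to Kiefer (1953) is golden-section search. A minimal sketch, assuming the restriction phi(η) = f(w + ηd) is unimodal over L:

```python
import math

def golden_section_eta(phi, lo=0.0, hi=1.0, tol=1e-6):
    """Solve eq. (4): eta = argmin phi(eta) over the closed interval
    L = [lo, hi] by interval elimination (Kiefer, 1953), where phi is
    the one-dimensional restriction phi(eta) = f(w + eta*d)."""
    invphi = (math.sqrt(5) - 1) / 2          # 1 / golden ratio
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if phi(c) < phi(d):                  # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                                # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Example: phi(eta) = (eta - 0.3)^2 has its minimum at eta = 0.3.
eta = golden_section_eta(lambda t: (t - 0.3) ** 2)
```

Each iteration discards the section of L that cannot contain the minimum, which is precisely the uncertainty-reduction argument made above.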
Property 3. The interpolation map that computes the learning rates is defined in the restricted error space, written as A : E^m × E^1 → E^m, such that:

A(w, d) = {u : u = w + ηd, η ∈ L}    (5)
f(u) = min_(η ∈ L) f(w + ηd).    (6)

Suppose that the algorithm B operating on f(w) produces descent directions. The correct learning rate is obtained during the training phase when the map is closed as a set-valued mapping (Luenberger, 1984). It is therefore possible, by any standard technique, though difficult, to identify the learning rate η for each epoch while descending along the minimum trajectory and exploring all the dimensions. As a result, the error function monotonically converges to a minimum value. The algorithm gradually converges to the acceptable minimum limit in a few epochs.

2.2 Training

One of the most difficult computational aspects of identifying the trained weights of an ANN is that the ANN error function contains several local minima within the reasonable range of the w_j. The reduced ANN error function can be considered a continuous function of the training weights w_j, j = 1, 2, 3, ..., m, describing a hypersurface in m-dimensional space (Moller, 1997). A suitable reduction of the error surface constitutes a parabolic hypersurface, and the learning rates are determined from the reduced function with the initial random weights estimated at the beginning of training. As a result, few epochs are needed to converge to the minimum trajectory of the error function.

3. Character recognition problem

Ten simulation experiments are carried out to test the proposed training method against the standard BP training method (Rumelhart and McClelland, 1989) with small random starting weights. Simulation results are shown in Table 1 and Table 2. The small random starting weights are used to initiate the training. Table 1 shows the result against the standard BP training method.
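Since the reduced error surface is treated as parabolic in the Training discussion above, one concrete way to realize an interpolation map such as A(w, d) is to fit a parabola through three sampled learning rates and take its vertex. This is an illustrative choice, not the paper's exact map:

```python
def parabolic_eta(phi, e0=0.0, e1=0.5, e2=1.0):
    """Fit a parabola through (e_i, phi(e_i)) and return its vertex:
    one candidate interpolation map for the learning rate, where phi
    is the restriction phi(eta) = f(w + eta*d).  Falls back to the
    best sampled point if the three samples are collinear."""
    p0, p1, p2 = phi(e0), phi(e1), phi(e2)
    num = (e1 - e0) ** 2 * (p1 - p2) - (e1 - e2) ** 2 * (p1 - p0)
    den = (e1 - e0) * (p1 - p2) - (e1 - e2) * (p1 - p0)
    if den == 0:
        return min((e0, e1, e2), key=phi)
    return e1 - 0.5 * num / den

# Exact for a quadratic restriction: minimum of 2*(eta - 0.4)^2 + 1.
eta = parabolic_eta(lambda t: 2 * (t - 0.4) ** 2 + 1.0)
```

On a truly parabolic restriction the vertex is exact in one step; on a general error surface it serves as the per-epoch estimate of the variable learning rate.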
The average number of epochs with the self-adaptive back propagation method is 78.8. The maximum and minimum numbers of epochs are 133 and 53, respectively. The average terminal function value is of the order of 10^-9, which indicates that the ANN is trained to produce accurate results. The average number of epochs is 277.7 with the standard BP training; the maximum and minimum numbers of epochs are 365 and 80, respectively. The relative efficiency of the self-adaptive back propagation training algorithm over the standard BP training algorithm, in number of epochs, is (277.7/78.8) ≈ 3.5. Table 2 shows the comparison with different training methods, including the result given in Kamarthi and Pittner (1999). The proposed method performs better than the weight extrapolation method suggested by Kamarthi and Pittner (1999). They train the L-T letter recognition problem with 8 epochs; the self-adaptive training method finds a terminal function value that is comparatively smaller than those of the weight extrapolation and standard BP training methods.

[Table 1: Training performance with the 9-2-1 ANN on the L-T letter recognition problem — mean, median, standard deviation, range, minimum and maximum over the ten experiments, for self-adaptive back propagation and standard back propagation.]

[Table 2: Comparison of self-adaptive BP, standard back propagation and the weight extrapolation method of Kamarthi and Pittner (1999) — average, standard deviation, maximum, minimum and speed-up.]

CONCLUSION

A strictly self-adaptive training scheme to train a feed-forward ANN is shown. The important feature of this ANN training is that the learning rates are dynamically computed at each epoch by an interpolation map. The ANN error function is transformed into a lower-dimensional error space, and the reduced error function is employed to identify the variable learning rates. As the training progresses, the geometry of the ANN error function constantly changes, and the interpolation map therefore always identifies variable learning rates that gradually reduce to a lower magnitude. As a result, the error function also reduces to a smaller terminal function value. The method improves over the standard BP training method and the weight extrapolation method demonstrated by Kamarthi and Pittner (1999) on the character recognition problem.
The proposed self-adaptive training method needs 79 epochs, while the standard BP method needs 278 epochs and the weight extrapolation method needs 8 epochs to train the L-T character recognition problem with the 9-2-1 ANN configuration.

REFERENCES

Ahmed, S., and Cross, J. (2000). Convergence in Artificial Neural Network without Learning Parameter. Second International Computer Science Conventions on Neural Computations, Berlin, Germany.
Ahmed, S., Cross, J., and Bouzerdoum, A. (2000). Performance Analysis of a New Multi-directional Training Algorithm for Feed-Forward Neural Network. World Neural Network Journal.
Bishop, C. M. (1995). Neural Networks for Pattern Recognition. Oxford, UK: Clarendon.
Cheng, B., and Titterington, D. M. (1994). Neural Networks: A Review from a Statistical Perspective. Statistical Science, 9 (1).
Fahlman, S. E. (1988). Faster-Learning Variations on Back-Propagation: An Empirical Study. In Touretzky, D., Hinton, G., and Sejnowski, T. (Eds.), Proceedings of the Connectionist Models Summer School. San Mateo, CA: Morgan Kaufmann.
Haykin, S. (1994). Neural Networks: A Comprehensive Foundation. Macmillan College Publishing.
Hertz, J., Krogh, A., and Palmer, R. G. (1991). Introduction to the Theory of Neural Computation. Reading, MA: Addison-Wesley.
Hinton, G. E. (1987). Learning Translation Invariant Recognition in Massively Parallel Networks. In De Bakker, J. W., Nijman, A. J., and Treleaven, P. C. (Eds.), Proceedings of the PARLE Conference on Parallel Architectures and Languages Europe. Berlin: Springer-Verlag.
Jacobs, R. A. (1988). Increased Rates of Convergence Through Learning Rate Adaptation. Neural Networks, 1.
Jang, J.-S. R., Sun, C.-T., and Mizutani, E. (1997). Neuro-Fuzzy and Soft Computing. Prentice Hall International, NJ, USA.
Kamarthi, S. V., and Pittner, S. (1999). Accelerating Neural Network Training Using Weight Extrapolations. Neural Networks, 12.
Kiefer, J. (1953). Sequential Minimax Search for a Maximum. Proceedings of the American Mathematical Society, 4.
Kuan, C.-M., and White, H. (1994). Artificial Neural Networks: An Econometric Perspective. Econometric Reviews, 13 (1).
Luenberger, D. G. (1984). Introduction to Linear and Nonlinear Programming, 2nd ed. Reading, MA: Addison-Wesley.
Moller, M. (1997). Efficient Training of Feed-Forward Neural Networks. In Brown, A. (Ed.), Neural Network: Analysis, Architectures and Applications. Bristol and Philadelphia: Institute of Physics Publishing.
Muller, B., and Reinhardt, J. (1990). Neural Networks: An Introduction. Berlin: Springer.
Rumelhart, D. E., and McClelland, J. L. (1989). Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1. Cambridge, MA: MIT Press.
Vogl, T. P., Mangis, J. K., Rigler, A. K., Zink, W. T., and Alkon, D. L. (1988). Accelerating the Convergence of the Back-Propagation Method. Biological Cybernetics, 59.
Weir, M. K. (1991). A Method for Self-Determination of Adaptive Learning Rates in Back Propagation. Neural Networks, 4.
Werbos, P. (1994). The Roots of Backpropagation: From Ordered Derivatives to Neural Networks and Political Forecasting. New York: John Wiley.
More informationOptimization of ripple filter for pencil beam scanning
Nuclear Science and Techniques 24 (2013) 060404 Optiization of ripple filter for pencil bea scanning YANG Zhaoxia ZHANG Manzhou LI Deing * Shanghai Institute of Applied Physics, Chinese Acadey of Sciences,
More informationRecovering Data from Underdetermined Quadratic Measurements (CS 229a Project: Final Writeup)
Recovering Data fro Underdeterined Quadratic Measureents (CS 229a Project: Final Writeup) Mahdi Soltanolkotabi Deceber 16, 2011 1 Introduction Data that arises fro engineering applications often contains
More informationNUMERICAL MODELLING OF THE TYRE/ROAD CONTACT
NUMERICAL MODELLING OF THE TYRE/ROAD CONTACT PACS REFERENCE: 43.5.LJ Krister Larsson Departent of Applied Acoustics Chalers University of Technology SE-412 96 Sweden Tel: +46 ()31 772 22 Fax: +46 ()31
More informationA MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION
A eshsize boosting algorith in kernel density estiation A MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION C.C. Ishiekwene, S.M. Ogbonwan and J.E. Osewenkhae Departent of Matheatics, University
More informationStatistical clustering and Mineral Spectral Unmixing in Aviris Hyperspectral Image of Cuprite, NV
CS229 REPORT, DECEMBER 05 1 Statistical clustering and Mineral Spectral Unixing in Aviris Hyperspectral Iage of Cuprite, NV Mario Parente, Argyris Zynis I. INTRODUCTION Hyperspectral Iaging is a technique
More informationarxiv: v2 [math.co] 3 Dec 2008
arxiv:0805.2814v2 [ath.co] 3 Dec 2008 Connectivity of the Unifor Rando Intersection Graph Sion R. Blacburn and Stefanie Gere Departent of Matheatics Royal Holloway, University of London Egha, Surrey TW20
More informationSpine Fin Efficiency A Three Sided Pyramidal Fin of Equilateral Triangular Cross-Sectional Area
Proceedings of the 006 WSEAS/IASME International Conference on Heat and Mass Transfer, Miai, Florida, USA, January 18-0, 006 (pp13-18) Spine Fin Efficiency A Three Sided Pyraidal Fin of Equilateral Triangular
More informationIdentical Maximum Likelihood State Estimation Based on Incremental Finite Mixture Model in PHD Filter
Identical Maxiu Lielihood State Estiation Based on Increental Finite Mixture Model in PHD Filter Gang Wu Eail: xjtuwugang@gail.co Jing Liu Eail: elelj20080730@ail.xjtu.edu.cn Chongzhao Han Eail: czhan@ail.xjtu.edu.cn
More informationSupport Vector Machine Classification of Uncertain and Imbalanced data using Robust Optimization
Recent Researches in Coputer Science Support Vector Machine Classification of Uncertain and Ibalanced data using Robust Optiization RAGHAV PAT, THEODORE B. TRAFALIS, KASH BARKER School of Industrial Engineering
More informationSPECTRUM sensing is a core concept of cognitive radio
World Acadey of Science, Engineering and Technology International Journal of Electronics and Counication Engineering Vol:6, o:2, 202 Efficient Detection Using Sequential Probability Ratio Test in Mobile
More informationNeural Network Learning as an Inverse Problem
Neural Network Learning as an Inverse Proble VĚRA KU RKOVÁ, Institute of Coputer Science, Acadey of Sciences of the Czech Republic, Pod Vodárenskou věží 2, 182 07 Prague 8, Czech Republic. Eail: vera@cs.cas.cz
More informationBootstrapping Dependent Data
Bootstrapping Dependent Data One of the key issues confronting bootstrap resapling approxiations is how to deal with dependent data. Consider a sequence fx t g n t= of dependent rando variables. Clearly
More informationFairness via priority scheduling
Fairness via priority scheduling Veeraruna Kavitha, N Heachandra and Debayan Das IEOR, IIT Bobay, Mubai, 400076, India vavitha,nh,debayan}@iitbacin Abstract In the context of ulti-agent resource allocation
More informationUfuk Demirci* and Feza Kerestecioglu**
1 INDIRECT ADAPTIVE CONTROL OF MISSILES Ufuk Deirci* and Feza Kerestecioglu** *Turkish Navy Guided Missile Test Station, Beykoz, Istanbul, TURKEY **Departent of Electrical and Electronics Engineering,
More informationCombining Classifiers
Cobining Classifiers Generic ethods of generating and cobining ultiple classifiers Bagging Boosting References: Duda, Hart & Stork, pg 475-480. Hastie, Tibsharini, Friedan, pg 246-256 and Chapter 10. http://www.boosting.org/
More informationDEPARTMENT OF ECONOMETRICS AND BUSINESS STATISTICS
ISSN 1440-771X AUSTRALIA DEPARTMENT OF ECONOMETRICS AND BUSINESS STATISTICS An Iproved Method for Bandwidth Selection When Estiating ROC Curves Peter G Hall and Rob J Hyndan Working Paper 11/00 An iproved
More informationSymbolic Analysis as Universal Tool for Deriving Properties of Non-linear Algorithms Case study of EM Algorithm
Acta Polytechnica Hungarica Vol., No., 04 Sybolic Analysis as Universal Tool for Deriving Properties of Non-linear Algoriths Case study of EM Algorith Vladiir Mladenović, Miroslav Lutovac, Dana Porrat
More informationResearch Article On the Isolated Vertices and Connectivity in Random Intersection Graphs
International Cobinatorics Volue 2011, Article ID 872703, 9 pages doi:10.1155/2011/872703 Research Article On the Isolated Vertices and Connectivity in Rando Intersection Graphs Yilun Shang Institute for
More informationMultiscale Entropy Analysis: A New Method to Detect Determinism in a Time. Series. A. Sarkar and P. Barat. Variable Energy Cyclotron Centre
Multiscale Entropy Analysis: A New Method to Detect Deterinis in a Tie Series A. Sarkar and P. Barat Variable Energy Cyclotron Centre /AF Bidhan Nagar, Kolkata 700064, India PACS nubers: 05.45.Tp, 89.75.-k,
More informationAn Inverse Interpolation Method Utilizing In-Flight Strain Measurements for Determining Loads and Structural Response of Aerospace Vehicles
An Inverse Interpolation Method Utilizing In-Flight Strain Measureents for Deterining Loads and Structural Response of Aerospace Vehicles S. Shkarayev and R. Krashantisa University of Arizona, Tucson,
More informationA New Approach to Solving Dynamic Traveling Salesman Problems
A New Approach to Solving Dynaic Traveling Salesan Probles Changhe Li 1 Ming Yang 1 Lishan Kang 1 1 China University of Geosciences(Wuhan) School of Coputer 4374 Wuhan,P.R.China lchwfx@yahoo.co,yanging72@gail.co,ang_whu@yahoo.co
More informationNeural Dynamic Optimization for Control Systems Part III: Applications
502 IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS PART B: CYBERNETICS, VOL. 31, NO. 4, AUGUST 2001 Neural Dynaic Optiization for Control Systes Part III: Applications Chang-Yun Seong, Meber, IEEE,
More informationInternational Scientific and Technological Conference EXTREME ROBOTICS October 8-9, 2015, Saint-Petersburg, Russia
International Scientific and Technological Conference EXTREME ROBOTICS October 8-9, 215, Saint-Petersburg, Russia LEARNING MOBILE ROBOT BASED ON ADAPTIVE CONTROLLED MARKOV CHAINS V.Ya. Vilisov University
More informationarxiv: v1 [cs.ds] 3 Feb 2014
arxiv:40.043v [cs.ds] 3 Feb 04 A Bound on the Expected Optiality of Rando Feasible Solutions to Cobinatorial Optiization Probles Evan A. Sultani The Johns Hopins University APL evan@sultani.co http://www.sultani.co/
More informationEffects of landscape characteristics on accuracy of land cover change detection
Effects of landscape characteristics on accuracy of land cover change detection Yingying Mei, Jingxiong Zhang School of Reote Sensing and Inforation Engineering, Wuhan University, 129 Luoyu Road, Wuhan
More informationSupplementary Material for Fast and Provable Algorithms for Spectrally Sparse Signal Reconstruction via Low-Rank Hankel Matrix Completion
Suppleentary Material for Fast and Provable Algoriths for Spectrally Sparse Signal Reconstruction via Low-Ran Hanel Matrix Copletion Jian-Feng Cai Tianing Wang Ke Wei March 1, 017 Abstract We establish
More informationMulti-view Discriminative Manifold Embedding for Pattern Classification
Multi-view Discriinative Manifold Ebedding for Pattern Classification X. Wang Departen of Inforation Zhenghzou 450053, China Y. Guo Departent of Digestive Zhengzhou 450053, China Z. Wang Henan University
More informationPredictive Vaccinology: Optimisation of Predictions Using Support Vector Machine Classifiers
Predictive Vaccinology: Optiisation of Predictions Using Support Vector Machine Classifiers Ivana Bozic,2, Guang Lan Zhang 2,3, and Vladiir Brusic 2,4 Faculty of Matheatics, University of Belgrade, Belgrade,
More informationA Note on the Applied Use of MDL Approximations
A Note on the Applied Use of MDL Approxiations Daniel J. Navarro Departent of Psychology Ohio State University Abstract An applied proble is discussed in which two nested psychological odels of retention
More informationNumerical Solution of the MRLW Equation Using Finite Difference Method. 1 Introduction
ISSN 1749-3889 print, 1749-3897 online International Journal of Nonlinear Science Vol.1401 No.3,pp.355-361 Nuerical Solution of the MRLW Equation Using Finite Difference Method Pınar Keskin, Dursun Irk
More informationAnalysis of Hu's Moment Invariants on Image Scaling and Rotation
Edith Cowan University Research Online ECU Publications Pre. 11 1 Analysis of Hu's Moent Invariants on Iage Scaling and Rotation Zhihu Huang Edith Cowan University Jinsong Leng Edith Cowan University 1.119/ICCET.1.548554
More informationAdaptive Stabilization of a Class of Nonlinear Systems With Nonparametric Uncertainty
IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL. 46, NO. 11, NOVEMBER 2001 1821 Adaptive Stabilization of a Class of Nonlinear Systes With Nonparaetric Uncertainty Aleander V. Roup and Dennis S. Bernstein
More informationGrafting: Fast, Incremental Feature Selection by Gradient Descent in Function Space
Journal of Machine Learning Research 3 (2003) 1333-1356 Subitted 5/02; Published 3/03 Grafting: Fast, Increental Feature Selection by Gradient Descent in Function Space Sion Perkins Space and Reote Sensing
More informationDecentralized Adaptive Control of Nonlinear Systems Using Radial Basis Neural Networks
050 IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL. 44, NO., NOVEMBER 999 Decentralized Adaptive Control of Nonlinear Systes Using Radial Basis Neural Networks Jeffrey T. Spooner and Kevin M. Passino Abstract
More informationGREY FORECASTING AND NEURAL NETWORK MODEL OF SPORT PERFORMANCE
Journal of heoretical and Applied Inforation echnology 3 st March 03 Vol 49 No3 005-03 JAI & LLS All rights reserved ISSN: 99-8645 wwwatitorg E-ISSN: 87-395 GREY FORECASING AND NEURAL NEWORK MODEL OF SPOR
More informationSensorless Control of Induction Motor Drive Using SVPWM - MRAS Speed Observer
Journal of Eerging Trends in Engineering and Applied Sciences (JETEAS) 2 (3): 509-513 Journal Scholarlink of Eerging Research Trends Institute in Engineering Journals, 2011 and Applied (ISSN: 2141-7016)
More informationThis article appeared in a journal published by Elsevier. The attached copy is furnished to the author for internal non-commercial research and
This article appeared in a ournal published by Elsevier. The attached copy is furnished to the author for internal non-coercial research and education use, including for instruction at the authors institution
More informationOn Constant Power Water-filling
On Constant Power Water-filling Wei Yu and John M. Cioffi Electrical Engineering Departent Stanford University, Stanford, CA94305, U.S.A. eails: {weiyu,cioffi}@stanford.edu Abstract This paper derives
More information