Nonlinear Regression using Particle Swarm Optimization and Genetic Algorithm


International Journal of Computer Applications (0975-8887), Volume 153, November 2016

Nonlinear Regression using Particle Swarm Optimization and Genetic Algorithm

Pakize Erdoğmuş, Düzce University, Computer Engineering Department, DUZCE
Simge Ekiz, Düzce University, Computer Engineering Department, DUZCE

ABSTRACT
Nonlinear regression is a type of regression used for modeling the relation between independent and dependent variables. Finding the proper regression model and coefficients is important for all disciplines. This study aims at finding nonlinear model coefficients with two well-known population-based optimization algorithms. Genetic Algorithms (GA) and Particle Swarm Optimization (PSO) were used to find the coefficients of some nonlinear regression models. It is shown that both algorithms can be used as an alternative way of estimating the coefficients of nonlinear regression models.

Keywords
Nonlinear regression, Genetic Algorithm, Particle Swarm Optimization

1. INTRODUCTION
Regression is the prediction of the coefficients of a function, so it is also known as function estimation or approximation. Depending on the relationship between the dependent and independent variables, a regression problem can be of two types: the regression function can be linear or nonlinear (Kumar, 2015). Nonlinear-regression-based prediction has already been implemented successfully in various areas of scientific research and technology. It is mostly used for function estimation, or for solutions that enable modeling a dependence between two variables (Akoa, Simeu, & Lebowsky, 2013). Lu, Sugano, Okabe, and Sato (2011) used adaptive linear regression to estimate gaze coordinates with sub-pixel eye alignment. Martinez, Carbone, and Pissaloux (2012) realized gaze estimation using local features and nonlinear regression.
In another study, a nonlinear regression analysis technique was used to develop a model for predicting the flashover voltage of ceramic (porcelain) and non-ceramic insulators (NCIs) (Venkataraman & Gorur, 2006). Nonlinear regression was applied to Internet traffic monitoring by Frecon et al. (2016): a nonlinear regression procedure, based on an original branch-and-bound resolution procedure, was devised for the full identification of bivariate OfBm.

A nonlinear regression model can be estimated in several ways. One solution method is linear model approximation, but converting a nonlinear regression model to a linear one is not possible for all models, so there are dedicated methods for nonlinear regression. One of them is iterative estimation of the parameters, as in Gauss-Newton (Gray, Docherty, Fisk, & Murray, 2016). The main idea is the minimization of the nonlinear least squares, so nonlinear regression is an optimization problem. Not only classic optimization methods but also heuristic algorithms can be used to find the optimum parameters of a nonlinear regression model. The disadvantage of a classic optimization method like Gauss-Newton is that it can be trapped in local minima (Lu, Yang, Qin, Luo, & Momayez, 2016). Heuristic methods can therefore be an alternative for nonlinear regression parameter estimation. In this study, two well-known population-based algorithms are used to find the optimum parameters of nonlinear regression models.

The rest of the paper is organized as follows. In the following section, nonlinear regression is explained. In the third section, nonlinear regression with GA and PSO is applied to some test problems. In the fourth section, experimental results with comparisons are presented. Finally, conclusions about the results are given.

2. NONLINEAR REGRESSION
For a scientific study, the first step is modeling the variation between the dependent and independent values.
After deciding which model describes the variation best, parameter estimation of the model is performed. With the development of computer technology, algorithms have been developed for estimating the models' parameters. A regression model can be linear or nonlinear. Unlike linear regression, there are few limitations on a nonlinear regression model; the way in which the unknown parameters in the function are estimated, however, is conceptually the same as in linear least squares regression ("Nonlinear Least Squares Regression", n.d.). A nonlinear model has the basic form given by Equation (1):

Y = f(X, A) + e    (1)

The function f in Equation (1) is nonlinear and has parameters A = (A1, A2, ...). The aim is to estimate these parameters. Some of the methods for nonlinear regression are the Gauss-Newton and Levenberg-Marquardt methods. Both are based on the minimization of the nonlinear least squares. The Gauss-Newton method is a simple iterative optimization method: at each iteration it modifies the model parameters so as to minimize the least squares between the modeled response and the observed values (Minot, Lu, & Li, 2015). But this method requires starting values for the unknown parameters, so its performance depends on the initial parameter values; bad starting values can cause convergence to a local minimum, as in most classical optimization methods. In comparison to gradient-based methods, Newton-type methods are advantageous with respect to convergence rate, which is usually quadratic. The

difficulty is that Newton-type methods require solving a matrix inversion at each iteration (Minot et al., 2015). The Levenberg-Marquardt algorithm is an iterative technique that locates a local minimum of a multivariate function expressed as the sum of squares of several nonlinear, real-valued functions. It has become a standard technique for nonlinear least-squares problems, widely adopted in various disciplines for data-fitting applications. It solves nonlinear least squares problems by combining steepest descent and the Gauss-Newton method: a damping parameter λ is introduced into the classical Gauss-Newton algorithm (Polykarpou & Kyriakides, 2016). When the parameters are far from the optimum, the damping factor takes larger values and the method acts more like steepest descent, but is guaranteed to converge. When the current solution is close to the optimum, the damping factor takes small values and the method acts like Gauss-Newton (Lourakis & Argyros, 2005). The Gauss-Newton and Levenberg-Marquardt algorithms are given in Figure 1 ("Basics on Continuous Optimization", 2011).

Figure 1. Gauss-Newton and Levenberg-Marquardt algorithms for nonlinear regression

Gauss-Newton
  input:  f: R^n -> R, f(x) = sum_{i=1..m} (f_i(x))^2; an initial solution x(0)
  output: x, a local minimum of the cost function f
  begin
    k <- 0
    while not STOP-CRIT and k < k_max do
      x(k+1) <- x(k) + d(k), with d(k) = arg min_d || F(x(k)) + J_F(x(k)) d ||^2
      k <- k + 1
    return x(k)
  end

Levenberg-Marquardt
  input:  f: R^n -> R, f(x) = sum_{i=1..m} (f_i(x))^2; an initial solution x(0)
  output: x, a local minimum of the cost function f
  begin
    k <- 0; lambda <- max(diag(J'J)); x <- x(0)
    while not STOP-CRIT and k < k_max do
      find d such that (J'J + lambda * diag(J'J)) d = J'f
      x' <- x + d
      if f(x') < f(x) then x <- x'; lambda <- lambda / v
      else lambda <- v * lambda
      k <- k + 1
    return x
  end

3. NONLINEAR REGRESSION WITH GA AND PSO
3.1 Genetic Algorithm
GA, introduced by John Holland and later popularized by David Goldberg, is one of the most studied evolutionary computation techniques (Holland, 1975; Ijjina & Chalavadi, 2016). GA is a population-based heuristic search algorithm inspired by the theory of evolution. In the algorithm, the best properties are transferred from generation to generation with crossover and elitism. The algorithm starts with some initial random solutions of the optimization problem, called individuals. Each individual consists of the variables of the optimization problem, called a chromosome. GA uses genetic operators such as crossover, mutation, and elitism in order to find the optimum solution. In each generation, fitness function values are calculated for each individual. The best individuals are selected for the next generation by methods such as tournament selection or roulette wheel. After the selection, elitism, crossover, and mutation are applied to the population. The pseudo code of GA is given in Figure 2. The advantage of GA is that it is popular and has been applied successfully in nearly every area; because of the cost of the GA operators, however, it can sometimes be impractical to use GA for the solution of an optimization problem.

Figure 2. The pseudo code of GA
  Create initial population and calculate fitness values
  do
    Select best individuals for the next generation
    Apply elitism
    Apply crossover
    Apply mutation
  until terminating condition met

3.2 Particle Swarm Optimization
PSO is one of the most successfully studied population-based heuristic search algorithms, inspired by the social behavior of bird flocks (Bamakan, Wang, & Ravasan, 2016). In PSO, each solution is called a particle, and the particles make up the swarm. The algorithm starts with initial solutions; each initial solution represents the coordinates of a particle. Particles start to fly through the hyperspace, adapting their velocities according to social and individual information. Over the iterations, the particle coordinates converge to the coordinates of the best particle, which is the global optimum solution (Liu & Zhou, 2015). PSO is a quite simple and fast-converging algorithm, and it has no genetic operators.
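Before moving to the update equations, the GA loop of Figure 2 can be sketched minimally as follows. This is an illustrative sketch, not the paper's Matlab implementation; the parameter values and the tournament-selection and arithmetic-crossover choices are assumptions.

```python
import random

# Minimal GA sketch following the pseudo code of Figure 2.
# Population size, crossover fraction, mutation rate, tournament selection
# and arithmetic crossover are illustrative assumptions, not the paper's
# exact settings.

def ga_minimize(fobj, bounds, pop_size=50, generations=100,
                crossover_frac=0.8, mutation_rate=0.1, elite_count=1):
    dim = len(bounds)

    def rand_ind():
        return [random.uniform(lo, hi) for lo, hi in bounds]

    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fobj)                                # evaluate and rank fitness
        new_pop = [ind[:] for ind in pop[:elite_count]]   # elitism
        while len(new_pop) < pop_size:
            # tournament selection of two parents
            p1 = min(random.sample(pop, 3), key=fobj)
            p2 = min(random.sample(pop, 3), key=fobj)
            if random.random() < crossover_frac:
                a = random.random()                       # arithmetic crossover
                child = [a * u + (1 - a) * w for u, w in zip(p1, p2)]
            else:
                child = p1[:]
            if random.random() < mutation_rate:
                j = random.randrange(dim)                 # mutate one random gene
                child[j] = random.uniform(*bounds[j])
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=fobj)

# Toy usage: minimize a 2-D sphere function; the optimum is the origin.
best = ga_minimize(lambda v: sum(c * c for c in v), [(-5.0, 5.0)] * 2)
```

Elitism guarantees that the best individual found so far is never lost, so the best fitness value is non-increasing over generations.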
There are two important formulas in PSO. Particles move according to the formulas given in Equations (2) and (3) (Cavuslu, Karakuzu, & Karakaya, 2012):

v_i^(k+1) = K [ v_i^(k) + φ1 · rand · (p_best,i^(k) − x_i^(k)) + φ2 · rand · (g_best − x_i^(k)) ]    (2)

x_i^(k+1) = x_i^(k) + v_i^(k+1)    (3)

PSO has few parameters. The constriction factor (K) is a damping effect on the amplitude of an individual particle's

oscillations. φ1 and φ2 represent the cognitive and social parameters, respectively; rand is a random number uniformly distributed in [0, 1]. p_best,i^(k) is the best position of the i-th particle at the k-th iteration, g_best is the global best position, and x_i^(k) and v_i^(k) are the position and velocity of the i-th particle at the k-th iteration, respectively. Although PSO is a population-based algorithm, it has many advantages, such as simplicity, few parameters to adjust, and rapid convergence. The pseudo code of PSO is given in Figure 3; P is the number of particles.

Figure 3. The pseudo code of PSO
  Generate initial swarm (P particles)
  do
    for i = 1:P
      update local best
      update global best
      update velocity and location
    end
  until stopping criteria met

3.3 Optimization of the parameters of the nonlinear regression model with GA and PSO
In this study, some nonlinear regression problems were selected for testing the performance of GA and PSO against the classical nonlinear methods ("Nonlinear Least Squares Regression", n.d.). The average and standard deviation of the estimated parameters are given in the tables. The reference results are accepted as the real solutions, and the results found with GA and PSO are compared with them. The nonlinear models of the problems are given in Table 1.

Table 1. Nonlinear Regression Test Problems
1   Chwirut1:  y = exp(-β1·x) / (β2 + β3·x) + e
2   Chwirut2:  y = exp(-β1·x) / (β2 + β3·x) + e
3   Gauss1:    y = β1·exp(-β2·x) + β3·exp(-(x-β4)²/β5²) + β6·exp(-(x-β7)²/β8²) + e
4   Nelson:    log(y) = β1 - β2·x1·exp(-β3·x2) + e
5   Kirby2:    y = (β1 + β2·x + β3·x²) / (1 + β4·x + β5·x²) + e
6   Gauss3:    y = β1·exp(-β2·x) + β3·exp(-(x-β4)²/β5²) + β6·exp(-(x-β7)²/β8²) + e
7   ENSO:      y = β1 + β2·cos(2πx/12) + β3·sin(2πx/12) + β5·cos(2πx/β4) + β6·sin(2πx/β4) + β8·cos(2πx/β7) + β9·sin(2πx/β7) + e
8   Thurber:   y = (β1 + β2·x + β3·x² + β4·x³) / (1 + β5·x + β6·x² + β7·x³) + e
9   Rat43:     y = β1 / (1 + exp(β2 - β3·x))^(1/β4) + e
10  Bennett5:  y = β1·(β2 + x)^(-1/β3) + e
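As a concrete illustration, the sketch below implements the velocity and position updates of Equations (2) and (3) with the loop of Figure 3, and uses them to minimize an Equation (4)-style sum of squared errors. The toy model, data set, and parameter values (K, φ1, φ2, swarm size, iteration count) are illustrative assumptions, not the paper's exact settings.

```python
import math
import random

# Sketch of PSO per equations (2)-(3) and the pseudo code of Figure 3,
# minimizing an Equation (4)-style sum of squared errors. The toy model,
# data set and parameter values are illustrative assumptions.

def make_fobj(data, model):
    # Equation (4): Fobj = sum_i (Y_i - f(x_i))^2
    return lambda b: sum((y - model(x, b)) ** 2 for x, y in data)

def pso_minimize(fobj, bounds, n_particles=30, iterations=300,
                 K=0.729, phi1=1.49, phi2=1.49):
    dim = len(bounds)
    x = [[random.uniform(lo, hi) for lo, hi in bounds]
         for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    pbest_val = [fobj(xi) for xi in x]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dim):
                # velocity update, equation (2), with constriction factor K
                v[i][d] = K * (v[i][d]
                               + phi1 * random.random() * (pbest[i][d] - x[i][d])
                               + phi2 * random.random() * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]        # position update, equation (3)
            val = fobj(x[i])
            if val < pbest_val[i]:        # update local best
                pbest[i], pbest_val[i] = x[i][:], val
                if val < gbest_val:       # update global best
                    gbest, gbest_val = x[i][:], val
    return gbest, gbest_val

# Toy regression in the spirit of Table 1: y = b1*exp(-b2*x), generated
# here without noise from b = (2.0, 0.5).
model = lambda x, b: b[0] * math.exp(-b[1] * x)
data = [(x / 2.0, 2.0 * math.exp(-0.5 * x / 2.0)) for x in range(10)]
best, err = pso_minimize(make_fobj(data, model), [(0.0, 5.0), (0.0, 5.0)])
```

Any candidate particle is scored by the same objective, so the exact same `make_fobj` function can serve as the fitness function for GA as well.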
Some initial starting values for the nonlinear regression models are also given. The difficulty levels of the problems, the number of parameters, and the model classifications are given in Table 2. The estimates of the parameters found by classical nonlinear regression analysis are given together with the GA and PSO results in the tables in the appendix.

Table 2. Properties of Nonlinear Regression Test Problems

Problem  Name      Difficulty Level  Model Classification  Parameters/Observations
1        Chwirut1  Lower             Exponential           3/214
2        Chwirut2  Lower             Exponential           3/54
3        Gauss1    Lower             Exponential           8/250
4        Nelson    Average           Exponential           3/128
5        Kirby2    Average           Rational              5/151
6        Gauss3    Average           Exponential           8/250
7        ENSO      Average           Miscellaneous         9/168
8        Thurber   Higher            Rational              7/37
9        Rat43     Higher            Exponential           4/15
10       Bennett5  Higher            Miscellaneous         3/154

The objective function in this study is the difference between the real values and the values calculated with the estimated models. There are no constraints, so the problem is an unconstrained optimization problem. The aim is to minimize the sum of squares of the objective function, given in Equation (4):

Fobj = Σ_{i=1..m} (Y_i − f_i(x))²    (4)

The GA and PSO parameters used in this study are given in Table 3.

Table 3. GA and PSO parameters
GA:  PopulationSize: 1, Generations: 1, CrossoverFraction: ., EliteCount: 1, SelectionFcn: Roulette
PSO: SwarmSize: 1, SelfAdjustment: 1.4, SocialAdjustment: 1.4, Max iteration: * number of variables

4. EXPERIMENTAL RESULTS AND ANALYSIS
The selected nonlinear regression problems were solved with GA and PSO. The codes were written in Matlab and run on an Intel Core Duo 3 GHz processor with a 64-bit Windows operating system. Results are given in Tables 4-13. As seen in the tables, the parameters found by GA are mostly nearer to the nonlinear regression parameters. The absolute errors of the PSO solutions have been compared with those of the GA solutions. As seen in Table 4 (Chwirut1), Table 5 (Chwirut2), Table 10 (ENSO), Table 11 (Thurber), Table 12 (Rat43), and Table 13 (Bennett5), the results found with GA are nearer to the real values found by nonlinear regression. GA also outperforms PSO in terms of solution time.

5. CONCLUSIONS
As is known, nonlinear regression with classical methods like Gauss-Newton and Levenberg-Marquardt has some disadvantages.
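To make these disadvantages concrete, the sketch below shows one way a plain, undamped Gauss-Newton iteration can be written for a least-squares fit. The finite-difference Jacobian, toy model, data, and starting values are illustrative assumptions; note the linear system built from JᵀJ that must be solved at every iteration, and the dependence on the starting values.

```python
import math

# Sketch of a plain (undamped) Gauss-Newton iteration for nonlinear least
# squares. The toy model, data, starting values and the finite-difference
# Jacobian are illustrative assumptions.

def residuals(b, data, model):
    return [y - model(x, b) for x, y in data]

def jacobian(b, data, model, h=1e-6):
    # finite-difference Jacobian of the model: J[j][i] = d f(x_i; b) / d b_j
    base = [model(x, b) for x, _ in data]
    cols = []
    for j in range(len(b)):
        bp = list(b)
        bp[j] += h
        pert = [model(x, bp) for x, _ in data]
        cols.append([(pert[i] - base[i]) / h for i in range(len(base))])
    return cols

def solve2(A, c):
    # solve the 2x2 linear system A * delta = c (Cramer's rule)
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(c[0] * A[1][1] - c[1] * A[0][1]) / det,
            (A[0][0] * c[1] - A[1][0] * c[0]) / det]

def gauss_newton(b0, data, model, steps=20):
    b = list(b0)
    for _ in range(steps):
        r = residuals(b, data, model)
        J = jacobian(b, data, model)
        # normal equations (J^T J) delta = J^T r: a linear system (in effect
        # a matrix inversion) must be solved at every single iteration
        JtJ = [[sum(J[p][i] * J[q][i] for i in range(len(r))) for q in range(2)]
               for p in range(2)]
        Jtr = [sum(J[p][i] * r[i] for i in range(len(r))) for p in range(2)]
        delta = solve2(JtJ, Jtr)
        b = [b[j] + delta[j] for j in range(2)]
    return b

model = lambda x, b: b[0] * math.exp(-b[1] * x)          # y = b1*exp(-b2*x)
data = [(x / 2.0, 2.0 * math.exp(-0.5 * x / 2.0)) for x in range(10)]
# A start near the solution converges quickly; a poor start can make the
# undamped iteration diverge, which is the starting-value issue noted here.
b = gauss_newton([1.0, 0.6], data, model)
```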
The first disadvantage of the classical methods is that they require many mathematical operations: matrix operations, gradient computation, Jacobian matrix calculation, and other mathematical operators are required by both the Gauss-Newton and Levenberg-Marquardt methods. Another disadvantage is that classic methods like Gauss-Newton can be trapped in local minima, and convergence to a local minimum can be too slow, so the number of iterations for the minimization of the nonlinear least squares can be time-consuming. Both classical methods also require starting values for the unknown parameters, so their performance is related to the initial values of the parameters; bad starting values can cause convergence to a local minimum, as in most classical optimization methods. In order to overcome these difficulties, heuristic search algorithms are suggested as an alternative. In this study, the nonlinear least squares problems were solved with the same starting values given in the reference. According to the reference web site, the reported results for nonlinear regression were confirmed by at least two different algorithms and software packages using analytic derivatives. The results prove that GA and PSO are good alternatives to classical nonlinear least squares regression, with GA the more successful in terms of parameter estimation. For future studies, it is aimed to test classical methods like Gauss-Newton against heuristic optimization algorithms and to show the performance of the methods in terms of both solution time and optimal values.

6. REFERENCES
[1] Kumar, T. (2015, February). Linear and Non Linear Regression Problem by K Nearest Neighbor Approach: By Using Three Sigma Rule. Computational Intelligence & Communication Technology (CICT), pp. 11. doi: 1.11/CICT
[2] Akoa, B. E., Simeu, E., and Lebowsky, F. (2013, July). Video decoder monitoring using non-linear regression. 2013 IEEE 19th International On-Line Testing Symposium (IOLTS), pp. doi: 1.11/IOLTS.13.43G.
[3] Lu, F., Sugano, Y., Okabe, T., and Sato, Y. (2011, November). Inferring human gaze from appearance via adaptive linear regression. 2011 International Conference on Computer Vision, pp. doi: 1.11/ICCV
[4] Martinez, F., Carbone, A., and Pissaloux, E. (2012, September). Gaze estimation using local features and non-linear regression. 19th IEEE International

Conference on Image Processing, pp. doi: .11/icip.1.41.
[5] Venkataraman, S., and Gorur, R. S. (2006). Non-linear regression model to predict flashover of non-ceramic insulators. 38th Annual North American Power Symposium (NAPS), pp. 3. doi: 1.11/NAPS
[6] Frecon, J., Fontugne, R., Didier, G., Pustelnik, N., Fukuda, K., and Abry, P. (2016, March). Non-linear regression for bivariate self-similarity identification: application to anomaly detection in Internet traffic based on a joint scaling analysis of packet and byte counts. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp.
[7] Gray, R. A., Docherty, P. D., Fisk, L. M., and Murray, R. (2016). A modified approach to objective surface generation within the Gauss-Newton parameter identification to ignore outlier data points. Biomedical Signal Processing and Control, 3, pp.
[8] Lu, Z., Yang, C., Qin, D., Luo, Y., and Momayez, M. (2016). Estimating ultrasonic time-of-flight through echo signal envelope and modified Gauss-Newton method. Measurement, 4, pp.
[9] Nonlinear Least Squares Regression. (n.d.). Engineering Statistics Handbook. Retrieved from md14.htm
[10] Minot, A., Lu, Y. M., and Li, N. (2015). A distributed Gauss-Newton method for power system state estimation. IEEE Transactions on Power Systems, 31(5), pp. doi: 1.11/TPWRS
[11] Polykarpou, E., and Kyriakides, E. (2016, April). Parameter estimation for measurement-based load modeling using the Levenberg-Marquardt algorithm. Electrotechnical Conference (MELECON), 2016 18th Mediterranean, pp. 1. doi: 1.11/MELCON
[12] Lourakis, M. L. A., and Argyros, A. A. (2005, October). Is Levenberg-Marquardt the most efficient optimization algorithm for implementing bundle adjustment? Tenth IEEE International Conference on Computer Vision (ICCV'05), 1, pp. doi: 1.11/ICCV.5.1.
[13] Basics on Continuous Optimization. (2011, July). Retrieved from
[14] Holland, J. H. (1975).
Adaptation in natural and artificial systems: an introductory analysis with applications to biology, control, and artificial intelligence. U Michigan Press.
[15] Ijjina, E. P., and Chalavadi, K. M. (2016). Human action recognition using genetic algorithms and convolutional neural networks. Pattern Recognition, 5, pp.
[16] Bamakan, S. M. H., Wang, H., and Ravasan, A. Z. (2016). Parameters Optimization for Nonparallel Support Vector Machine by Particle Swarm Optimization. Procedia Computer Science, 1, pp.
[17] Liu, F., and Zhou, Z. (2015). A new data classification method based on chaotic particle swarm optimization and least square support vector machine. Chemometrics and Intelligent Laboratory Systems, 14, pp.
[18] Cavuslu, M. A., Karakuzu, C., and Karakaya, F. (2012). Neural identification of dynamic systems on FPGA with improved PSO learning. Applied Soft Computing, 1(), pp. 1.

APPENDIX

Table 4. The Chwirut1 parameters estimation with GA, PSO and classical methods

Table 5. The Chwirut2 parameters estimation with GA, PSO and classical methods

Table 6. The Gauss1 parameters estimation with GA, PSO and classical methods

Table 7. The Nelson parameters estimation with GA, PSO and classical methods

Table 8. The Kirby2 parameters estimation with GA, PSO and classical methods

Table 9. The Gauss3 parameters estimation with GA, PSO and classical methods

Table 10. The ENSO parameters estimation with GA, PSO and classical methods

Table 11. The Thurber parameters estimation with GA, PSO and classical methods

Table 12. The Rat43 parameters estimation with GA, PSO and classical methods

Table 13. The Bennett5 parameters estimation with GA, PSO and classical methods


More information

Sections 18.6 and 18.7 Artificial Neural Networks

Sections 18.6 and 18.7 Artificial Neural Networks Sections 18.6 and 18.7 Artificial Neural Networks CS4811 - Artificial Intelligence Nilufer Onder Department of Computer Science Michigan Technological University Outline The brain vs. artifical neural

More information

Genetic Algorithm for Solving the Economic Load Dispatch

Genetic Algorithm for Solving the Economic Load Dispatch International Journal of Electronic and Electrical Engineering. ISSN 0974-2174, Volume 7, Number 5 (2014), pp. 523-528 International Research Publication House http://www.irphouse.com Genetic Algorithm

More information

CAPACITOR PLACEMENT USING FUZZY AND PARTICLE SWARM OPTIMIZATION METHOD FOR MAXIMUM ANNUAL SAVINGS

CAPACITOR PLACEMENT USING FUZZY AND PARTICLE SWARM OPTIMIZATION METHOD FOR MAXIMUM ANNUAL SAVINGS CAPACITOR PLACEMENT USING FUZZY AND PARTICLE SWARM OPTIMIZATION METHOD FOR MAXIMUM ANNUAL SAVINGS M. Damodar Reddy and V. C. Veera Reddy Department of Electrical and Electronics Engineering, S.V. University,

More information

A PARTICLE SWARM OPTIMIZATION TO OPTIMAL SHUNT-CAPACITOR PLACEMENT IN RADIAL DISTRIBUTION SYSTEMS

A PARTICLE SWARM OPTIMIZATION TO OPTIMAL SHUNT-CAPACITOR PLACEMENT IN RADIAL DISTRIBUTION SYSTEMS ISSN (Print) : 30 3765 ISSN (Online): 78 8875 (An ISO 397: 007 Certified Organization) ol., Issue 0, October 03 A PARTICLE SWARM OPTIMIZATION TO OPTIMAL SHUNT-CAPACITOR PLACEMENT IN RADIAL DISTRIBUTION

More information

Selected Topics in Optimization. Some slides borrowed from

Selected Topics in Optimization. Some slides borrowed from Selected Topics in Optimization Some slides borrowed from http://www.stat.cmu.edu/~ryantibs/convexopt/ Overview Optimization problems are almost everywhere in statistics and machine learning. Input Model

More information

Rprop Using the Natural Gradient

Rprop Using the Natural Gradient Trends and Applications in Constructive Approximation (Eds.) M.G. de Bruin, D.H. Mache & J. Szabados International Series of Numerical Mathematics Vol. 1?? c 2005 Birkhäuser Verlag Basel (ISBN 3-7643-7124-2)

More information

CS325 Artificial Intelligence Chs. 18 & 4 Supervised Machine Learning (cont)

CS325 Artificial Intelligence Chs. 18 & 4 Supervised Machine Learning (cont) CS325 Artificial Intelligence Cengiz Spring 2013 Model Complexity in Learning f(x) x Model Complexity in Learning f(x) x Let s start with the linear case... Linear Regression Linear Regression price =

More information

Regular paper. Particle Swarm Optimization Applied to the Economic Dispatch Problem

Regular paper. Particle Swarm Optimization Applied to the Economic Dispatch Problem Rafik Labdani Linda Slimani Tarek Bouktir Electrical Engineering Department, Oum El Bouaghi University, 04000 Algeria. rlabdani@yahoo.fr J. Electrical Systems 2-2 (2006): 95-102 Regular paper Particle

More information

Applying Particle Swarm Optimization to Adaptive Controller Leandro dos Santos Coelho 1 and Fabio A. Guerra 2

Applying Particle Swarm Optimization to Adaptive Controller Leandro dos Santos Coelho 1 and Fabio A. Guerra 2 Applying Particle Swarm Optimization to Adaptive Controller Leandro dos Santos Coelho 1 and Fabio A. Guerra 2 1 Production and Systems Engineering Graduate Program, PPGEPS Pontifical Catholic University

More information

Development. biologically-inspired computing. lecture 16. Informatics luis rocha x x x. Syntactic Operations. biologically Inspired computing

Development. biologically-inspired computing. lecture 16. Informatics luis rocha x x x. Syntactic Operations. biologically Inspired computing lecture 16 -inspired S S2 n p!!! 1 S Syntactic Operations al Code:N Development x x x 1 2 n p S Sections I485/H400 course outlook Assignments: 35% Students will complete 4/5 assignments based on algorithms

More information

NEAREST NEIGHBOR CLASSIFICATION WITH IMPROVED WEIGHTED DISSIMILARITY MEASURE

NEAREST NEIGHBOR CLASSIFICATION WITH IMPROVED WEIGHTED DISSIMILARITY MEASURE THE PUBLISHING HOUSE PROCEEDINGS OF THE ROMANIAN ACADEMY, Series A, OF THE ROMANIAN ACADEMY Volume 0, Number /009, pp. 000 000 NEAREST NEIGHBOR CLASSIFICATION WITH IMPROVED WEIGHTED DISSIMILARITY MEASURE

More information

Fuzzy adaptive catfish particle swarm optimization

Fuzzy adaptive catfish particle swarm optimization ORIGINAL RESEARCH Fuzzy adaptive catfish particle swarm optimization Li-Yeh Chuang, Sheng-Wei Tsai, Cheng-Hong Yang. Institute of Biotechnology and Chemical Engineering, I-Shou University, Kaohsiung, Taiwan

More information

AN INTRODUCTION TO NEURAL NETWORKS. Scott Kuindersma November 12, 2009

AN INTRODUCTION TO NEURAL NETWORKS. Scott Kuindersma November 12, 2009 AN INTRODUCTION TO NEURAL NETWORKS Scott Kuindersma November 12, 2009 SUPERVISED LEARNING We are given some training data: We must learn a function If y is discrete, we call it classification If it is

More information

13. Nonlinear least squares

13. Nonlinear least squares L. Vandenberghe ECE133A (Fall 2018) 13. Nonlinear least squares definition and examples derivatives and optimality condition Gauss Newton method Levenberg Marquardt method 13.1 Nonlinear least squares

More information

Lecture 10. Neural networks and optimization. Machine Learning and Data Mining November Nando de Freitas UBC. Nonlinear Supervised Learning

Lecture 10. Neural networks and optimization. Machine Learning and Data Mining November Nando de Freitas UBC. Nonlinear Supervised Learning Lecture 0 Neural networks and optimization Machine Learning and Data Mining November 2009 UBC Gradient Searching for a good solution can be interpreted as looking for a minimum of some error (loss) function

More information

Joint Entropy based Sampling in PBIL for MLFS

Joint Entropy based Sampling in PBIL for MLFS Joint Entropy based Sampling in PBIL for MLFS In Jun Yu( 유인준 ) 2017 08 28 Artificial Intelligence Lab 1 1. Introduction Evolutionary Algorithms(EA) have recently received much attention from the Feature

More information

Competitive Self-adaptation in Evolutionary Algorithms

Competitive Self-adaptation in Evolutionary Algorithms Competitive Self-adaptation in Evolutionary Algorithms Josef Tvrdík University of Ostrava josef.tvrdik@osu.cz Ivan Křivý University of Ostrava ivan.krivy@osu.cz Abstract Heuristic search for the global

More information

4. Multilayer Perceptrons

4. Multilayer Perceptrons 4. Multilayer Perceptrons This is a supervised error-correction learning algorithm. 1 4.1 Introduction A multilayer feedforward network consists of an input layer, one or more hidden layers, and an output

More information

Introduction to Optimization

Introduction to Optimization Introduction to Optimization Gradient-based Methods Marc Toussaint U Stuttgart Gradient descent methods Plain gradient descent (with adaptive stepsize) Steepest descent (w.r.t. a known metric) Conjugate

More information

Data Fitting and Uncertainty

Data Fitting and Uncertainty TiloStrutz Data Fitting and Uncertainty A practical introduction to weighted least squares and beyond With 124 figures, 23 tables and 71 test questions and examples VIEWEG+ TEUBNER IX Contents I Framework

More information

Crossing Genetic and Swarm Intelligence Algorithms to Generate Logic Circuits

Crossing Genetic and Swarm Intelligence Algorithms to Generate Logic Circuits Crossing Genetic and Swarm Intelligence Algorithms to Generate Logic Circuits Cecília Reis and J. A. Tenreiro Machado GECAD - Knowledge Engineering and Decision Support Group / Electrical Engineering Department

More information

Numerical Optimization: Basic Concepts and Algorithms

Numerical Optimization: Basic Concepts and Algorithms May 27th 2015 Numerical Optimization: Basic Concepts and Algorithms R. Duvigneau R. Duvigneau - Numerical Optimization: Basic Concepts and Algorithms 1 Outline Some basic concepts in optimization Some

More information

Multi-wind Field Output Power Prediction Method based on Energy Internet and DBPSO-LSSVM

Multi-wind Field Output Power Prediction Method based on Energy Internet and DBPSO-LSSVM , pp.128-133 http://dx.doi.org/1.14257/astl.16.138.27 Multi-wind Field Output Power Prediction Method based on Energy Internet and DBPSO-LSSVM *Jianlou Lou 1, Hui Cao 1, Bin Song 2, Jizhe Xiao 1 1 School

More information

An Evolution Strategy for the Induction of Fuzzy Finite-state Automata

An Evolution Strategy for the Induction of Fuzzy Finite-state Automata Journal of Mathematics and Statistics 2 (2): 386-390, 2006 ISSN 1549-3644 Science Publications, 2006 An Evolution Strategy for the Induction of Fuzzy Finite-state Automata 1,2 Mozhiwen and 1 Wanmin 1 College

More information

Optimization of Threshold for Energy Based Spectrum Sensing Using Differential Evolution

Optimization of Threshold for Energy Based Spectrum Sensing Using Differential Evolution Wireless Engineering and Technology 011 130-134 doi:10.436/wet.011.3019 Published Online July 011 (http://www.scirp.org/journal/wet) Optimization of Threshold for Energy Based Spectrum Sensing Using Differential

More information

Last update: October 26, Neural networks. CMSC 421: Section Dana Nau

Last update: October 26, Neural networks. CMSC 421: Section Dana Nau Last update: October 26, 207 Neural networks CMSC 42: Section 8.7 Dana Nau Outline Applications of neural networks Brains Neural network units Perceptrons Multilayer perceptrons 2 Example Applications

More information

Secondary Frequency Control of Microgrids In Islanded Operation Mode and Its Optimum Regulation Based on the Particle Swarm Optimization Algorithm

Secondary Frequency Control of Microgrids In Islanded Operation Mode and Its Optimum Regulation Based on the Particle Swarm Optimization Algorithm International Academic Institute for Science and Technology International Academic Journal of Science and Engineering Vol. 3, No. 1, 2016, pp. 159-169. ISSN 2454-3896 International Academic Journal of

More information

Local Strong Convexity of Maximum-Likelihood TDOA-Based Source Localization and Its Algorithmic Implications

Local Strong Convexity of Maximum-Likelihood TDOA-Based Source Localization and Its Algorithmic Implications Local Strong Convexity of Maximum-Likelihood TDOA-Based Source Localization and Its Algorithmic Implications Huikang Liu, Yuen-Man Pun, and Anthony Man-Cho So Dept of Syst Eng & Eng Manag, The Chinese

More information

Numerical Methods For Optimization Problems Arising In Energetic Districts

Numerical Methods For Optimization Problems Arising In Energetic Districts Numerical Methods For Optimization Problems Arising In Energetic Districts Elisa Riccietti, Stefania Bellavia and Stefano Sello Abstract This paper deals with the optimization of energy resources management

More information

Stochastic Velocity Threshold Inspired by Evolutionary Programming

Stochastic Velocity Threshold Inspired by Evolutionary Programming Stochastic Velocity Threshold Inspired by Evolutionary Programming Zhihua Cui Xingjuan Cai and Jianchao Zeng Complex System and Computational Intelligence Laboratory, Taiyuan University of Science and

More information

Verification of a hypothesis about unification and simplification for position updating formulas in particle swarm optimization.

Verification of a hypothesis about unification and simplification for position updating formulas in particle swarm optimization. nd Workshop on Advanced Research and Technology in Industry Applications (WARTIA ) Verification of a hypothesis about unification and simplification for position updating formulas in particle swarm optimization

More information

On Optimal Power Flow

On Optimal Power Flow On Optimal Power Flow K. C. Sravanthi #1, Dr. M. S. Krishnarayalu #2 # Department of Electrical and Electronics Engineering V R Siddhartha Engineering College, Vijayawada, AP, India Abstract-Optimal Power

More information

Artificial Neural Networks

Artificial Neural Networks Artificial Neural Networks 鮑興國 Ph.D. National Taiwan University of Science and Technology Outline Perceptrons Gradient descent Multi-layer networks Backpropagation Hidden layer representations Examples

More information

A Method of HVAC Process Object Identification Based on PSO

A Method of HVAC Process Object Identification Based on PSO 2017 3 45 313 doi 10.3969 j.issn.1673-7237.2017.03.004 a a b a. b. 201804 PID PID 2 TU831 A 1673-7237 2017 03-0019-05 A Method of HVAC Process Object Identification Based on PSO HOU Dan - lin a PAN Yi

More information

Capacitor Placement for Economical Electrical Systems using Ant Colony Search Algorithm

Capacitor Placement for Economical Electrical Systems using Ant Colony Search Algorithm Capacitor Placement for Economical Electrical Systems using Ant Colony Search Algorithm Bharat Solanki Abstract The optimal capacitor placement problem involves determination of the location, number, type

More information

Electric Load Forecasting Using Wavelet Transform and Extreme Learning Machine

Electric Load Forecasting Using Wavelet Transform and Extreme Learning Machine Electric Load Forecasting Using Wavelet Transform and Extreme Learning Machine Song Li 1, Peng Wang 1 and Lalit Goel 1 1 School of Electrical and Electronic Engineering Nanyang Technological University

More information

CONTROL OF ROBOT CAMERA SYSTEM WITH ACTUATOR S DYNAMICS TO TRACK MOVING OBJECT

CONTROL OF ROBOT CAMERA SYSTEM WITH ACTUATOR S DYNAMICS TO TRACK MOVING OBJECT Journal of Computer Science and Cybernetics, V.31, N.3 (2015), 255 265 DOI: 10.15625/1813-9663/31/3/6127 CONTROL OF ROBOT CAMERA SYSTEM WITH ACTUATOR S DYNAMICS TO TRACK MOVING OBJECT NGUYEN TIEN KIEM

More information

Department of Mathematics, Graphic Era University, Dehradun, Uttarakhand, India

Department of Mathematics, Graphic Era University, Dehradun, Uttarakhand, India Genetic Algorithm for Minimization of Total Cost Including Customer s Waiting Cost and Machine Setup Cost for Sequence Dependent Jobs on a Single Processor Neelam Tyagi #1, Mehdi Abedi *2 Ram Gopal Varshney

More information

Fast Nonnegative Matrix Factorization with Rank-one ADMM

Fast Nonnegative Matrix Factorization with Rank-one ADMM Fast Nonnegative Matrix Factorization with Rank-one Dongjin Song, David A. Meyer, Martin Renqiang Min, Department of ECE, UCSD, La Jolla, CA, 9093-0409 dosong@ucsd.edu Department of Mathematics, UCSD,

More information

Local Search & Optimization

Local Search & Optimization Local Search & Optimization CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2018 Soleymani Artificial Intelligence: A Modern Approach, 3 rd Edition, Chapter 4 Some

More information

Local Search & Optimization

Local Search & Optimization Local Search & Optimization CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2017 Soleymani Artificial Intelligence: A Modern Approach, 3 rd Edition, Chapter 4 Outline

More information

System Identification and Optimization Methods Based on Derivatives. Chapters 5 & 6 from Jang

System Identification and Optimization Methods Based on Derivatives. Chapters 5 & 6 from Jang System Identification and Optimization Methods Based on Derivatives Chapters 5 & 6 from Jang Neuro-Fuzzy and Soft Computing Model space Adaptive networks Neural networks Fuzzy inf. systems Approach space

More information

Gas Detection System Based on Multi-Sensor Fusion with BP Neural Network

Gas Detection System Based on Multi-Sensor Fusion with BP Neural Network Sensors & Transducers 2013 by IFSA http://www.sensorsportal.com Gas Detection System Based on Multi-Sensor Fusion with BP Neural Network Qiu-Xia LIU Department of Physics, Heze University, Heze Shandong

More information

Gaussian and Linear Discriminant Analysis; Multiclass Classification

Gaussian and Linear Discriminant Analysis; Multiclass Classification Gaussian and Linear Discriminant Analysis; Multiclass Classification Professor Ameet Talwalkar Slide Credit: Professor Fei Sha Professor Ameet Talwalkar CS260 Machine Learning Algorithms October 13, 2015

More information

Bounded Approximation Algorithms

Bounded Approximation Algorithms Bounded Approximation Algorithms Sometimes we can handle NP problems with polynomial time algorithms which are guaranteed to return a solution within some specific bound of the optimal solution within

More information

Neural Networks and Deep Learning

Neural Networks and Deep Learning Neural Networks and Deep Learning Professor Ameet Talwalkar November 12, 2015 Professor Ameet Talwalkar Neural Networks and Deep Learning November 12, 2015 1 / 16 Outline 1 Review of last lecture AdaBoost

More information

Acceleration of Levenberg-Marquardt method training of chaotic systems fuzzy modeling

Acceleration of Levenberg-Marquardt method training of chaotic systems fuzzy modeling ISSN 746-7233, England, UK World Journal of Modelling and Simulation Vol. 3 (2007) No. 4, pp. 289-298 Acceleration of Levenberg-Marquardt method training of chaotic systems fuzzy modeling Yuhui Wang, Qingxian

More information

Lecture 5: Logistic Regression. Neural Networks

Lecture 5: Logistic Regression. Neural Networks Lecture 5: Logistic Regression. Neural Networks Logistic regression Comparison with generative models Feed-forward neural networks Backpropagation Tricks for training neural networks COMP-652, Lecture

More information

DESIGN OF MULTILAYER MICROWAVE BROADBAND ABSORBERS USING CENTRAL FORCE OPTIMIZATION

DESIGN OF MULTILAYER MICROWAVE BROADBAND ABSORBERS USING CENTRAL FORCE OPTIMIZATION Progress In Electromagnetics Research B, Vol. 26, 101 113, 2010 DESIGN OF MULTILAYER MICROWAVE BROADBAND ABSORBERS USING CENTRAL FORCE OPTIMIZATION M. J. Asi and N. I. Dib Department of Electrical Engineering

More information

STEPPED-FREQUENCY ISAR MOTION COMPENSATION USING PARTICLE SWARM OPTIMIZATION WITH AN ISLAND MODEL

STEPPED-FREQUENCY ISAR MOTION COMPENSATION USING PARTICLE SWARM OPTIMIZATION WITH AN ISLAND MODEL Progress In Electromagnetics Research, PIER 85, 25 37, 2008 STEPPED-FREQUENCY ISAR MOTION COMPENSATION USING PARTICLE SWARM OPTIMIZATION WITH AN ISLAND MODEL S. H. Park and H. T. Kim Department of Electronic

More information

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others)

Machine Learning. Neural Networks. (slides from Domingos, Pardo, others) Machine Learning Neural Networks (slides from Domingos, Pardo, others) For this week, Reading Chapter 4: Neural Networks (Mitchell, 1997) See Canvas For subsequent weeks: Scaling Learning Algorithms toward

More information

A Comparison of Nonlinear Regression Codes

A Comparison of Nonlinear Regression Codes Journal of Modern Applied Statistical Methods Volume 4 Issue 1 Article 31 5-1-2005 A Comparison of Nonlinear Regression Codes Paul Fredrick Mondragon United States Navy Brian Borchers New Mexico Tech Follow

More information

Application of GA and PSO Tuned Fuzzy Controller for AGC of Three Area Thermal- Thermal-Hydro Power System

Application of GA and PSO Tuned Fuzzy Controller for AGC of Three Area Thermal- Thermal-Hydro Power System International Journal of Computer Theory and Engineering, Vol. 2, No. 2 April, 2 793-82 Application of GA and PSO Tuned Fuzzy Controller for AGC of Three Area Thermal- Thermal-Hydro Power System S. K.

More information

Coordinate Descent and Ascent Methods

Coordinate Descent and Ascent Methods Coordinate Descent and Ascent Methods Julie Nutini Machine Learning Reading Group November 3 rd, 2015 1 / 22 Projected-Gradient Methods Motivation Rewrite non-smooth problem as smooth constrained problem:

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY /ESD.77 MSDO /ESD.77J Multidisciplinary System Design Optimization (MSDO) Spring 2010.

MASSACHUSETTS INSTITUTE OF TECHNOLOGY /ESD.77 MSDO /ESD.77J Multidisciplinary System Design Optimization (MSDO) Spring 2010. 16.888/ESD.77J Multidisciplinary System Design Optimization (MSDO) Spring 2010 Assignment 4 Instructors: Prof. Olivier de Weck Prof. Karen Willcox Dr. Anas Alfaris Dr. Douglas Allaire TAs: Andrew March

More information

Available online at AASRI Procedia 1 (2012 ) AASRI Conference on Computational Intelligence and Bioinformatics

Available online at  AASRI Procedia 1 (2012 ) AASRI Conference on Computational Intelligence and Bioinformatics Available online at www.sciencedirect.com AASRI Procedia ( ) 377 383 AASRI Procedia www.elsevier.com/locate/procedia AASRI Conference on Computational Intelligence and Bioinformatics Chaotic Time Series

More information

Matrix Derivatives and Descent Optimization Methods

Matrix Derivatives and Descent Optimization Methods Matrix Derivatives and Descent Optimization Methods 1 Qiang Ning Department of Electrical and Computer Engineering Beckman Institute for Advanced Science and Techonology University of Illinois at Urbana-Champaign

More information

CPSC 540: Machine Learning

CPSC 540: Machine Learning CPSC 540: Machine Learning First-Order Methods, L1-Regularization, Coordinate Descent Winter 2016 Some images from this lecture are taken from Google Image Search. Admin Room: We ll count final numbers

More information

A SEASONAL FUZZY TIME SERIES FORECASTING METHOD BASED ON GUSTAFSON-KESSEL FUZZY CLUSTERING *

A SEASONAL FUZZY TIME SERIES FORECASTING METHOD BASED ON GUSTAFSON-KESSEL FUZZY CLUSTERING * No.2, Vol.1, Winter 2012 2012 Published by JSES. A SEASONAL FUZZY TIME SERIES FORECASTING METHOD BASED ON GUSTAFSON-KESSEL * Faruk ALPASLAN a, Ozge CAGCAG b Abstract Fuzzy time series forecasting methods

More information

Neural Networks. Chapter 18, Section 7. TB Artificial Intelligence. Slides from AIMA 1/ 21

Neural Networks. Chapter 18, Section 7. TB Artificial Intelligence. Slides from AIMA   1/ 21 Neural Networks Chapter 8, Section 7 TB Artificial Intelligence Slides from AIMA http://aima.cs.berkeley.edu / 2 Outline Brains Neural networks Perceptrons Multilayer perceptrons Applications of neural

More information

Evolutionary Computation Theory. Jun He School of Computer Science University of Birmingham Web: jxh

Evolutionary Computation Theory. Jun He School of Computer Science University of Birmingham Web:   jxh Evolutionary Computation Theory Jun He School of Computer Science University of Birmingham Web: www.cs.bham.ac.uk/ jxh Outline Motivation History Schema Theorem Convergence and Convergence Rate Computational

More information

NONLINEAR IDENTIFICATION ON BASED RBF NEURAL NETWORK

NONLINEAR IDENTIFICATION ON BASED RBF NEURAL NETWORK DAAAM INTERNATIONAL SCIENTIFIC BOOK 2011 pp. 547-554 CHAPTER 44 NONLINEAR IDENTIFICATION ON BASED RBF NEURAL NETWORK BURLAK, V. & PIVONKA, P. Abstract: This article is focused on the off-line identification

More information

The Story So Far... The central problem of this course: Smartness( X ) arg max X. Possibly with some constraints on X.

The Story So Far... The central problem of this course: Smartness( X ) arg max X. Possibly with some constraints on X. Heuristic Search The Story So Far... The central problem of this course: arg max X Smartness( X ) Possibly with some constraints on X. (Alternatively: arg min Stupidness(X ) ) X Properties of Smartness(X)

More information

Group Sparse Non-negative Matrix Factorization for Multi-Manifold Learning

Group Sparse Non-negative Matrix Factorization for Multi-Manifold Learning LIU, LU, GU: GROUP SPARSE NMF FOR MULTI-MANIFOLD LEARNING 1 Group Sparse Non-negative Matrix Factorization for Multi-Manifold Learning Xiangyang Liu 1,2 liuxy@sjtu.edu.cn Hongtao Lu 1 htlu@sjtu.edu.cn

More information

ACTA UNIVERSITATIS APULENSIS No 11/2006

ACTA UNIVERSITATIS APULENSIS No 11/2006 ACTA UNIVERSITATIS APULENSIS No /26 Proceedings of the International Conference on Theory and Application of Mathematics and Informatics ICTAMI 25 - Alba Iulia, Romania FAR FROM EQUILIBRIUM COMPUTATION

More information

Shape of Gaussians as Feature Descriptors

Shape of Gaussians as Feature Descriptors Shape of Gaussians as Feature Descriptors Liyu Gong, Tianjiang Wang and Fang Liu Intelligent and Distributed Computing Lab, School of Computer Science and Technology Huazhong University of Science and

More information

Learning Tetris. 1 Tetris. February 3, 2009

Learning Tetris. 1 Tetris. February 3, 2009 Learning Tetris Matt Zucker Andrew Maas February 3, 2009 1 Tetris The Tetris game has been used as a benchmark for Machine Learning tasks because its large state space (over 2 200 cell configurations are

More information

Optimization 2. CS5240 Theoretical Foundations in Multimedia. Leow Wee Kheng

Optimization 2. CS5240 Theoretical Foundations in Multimedia. Leow Wee Kheng Optimization 2 CS5240 Theoretical Foundations in Multimedia Leow Wee Kheng Department of Computer Science School of Computing National University of Singapore Leow Wee Kheng (NUS) Optimization 2 1 / 38

More information

Classification goals: Make 1 guess about the label (Top-1 error) Make 5 guesses about the label (Top-5 error) No Bounding Box

Classification goals: Make 1 guess about the label (Top-1 error) Make 5 guesses about the label (Top-5 error) No Bounding Box ImageNet Classification with Deep Convolutional Neural Networks Alex Krizhevsky, Ilya Sutskever, Geoffrey E. Hinton Motivation Classification goals: Make 1 guess about the label (Top-1 error) Make 5 guesses

More information

Optimal conductor selection in radial distribution system using discrete particle swarm optimization

Optimal conductor selection in radial distribution system using discrete particle swarm optimization ISSN 1 746-7233, England, UK World Journal of Modelling and Simulation Vol. 5 (2009) No. 2, pp. 96-104 Optimal conductor selection in radial distribution system using discrete particle swarm optimization

More information