Image Segmentation and Compression using Neural Networks
Constantino Carlos Reyes-Aldasoro, Ana Laura Aldeco
Departamento de Sistemas Digitales, Instituto Tecnológico Autónomo de México
Río Hondo No. 1, Tizapán San Angel, México D.F.

ABSTRACT

Kohonen [1] has developed an algorithm with self-organising properties for a network of adaptive elements. These elements receive an input signal and the signal representations are automatically mapped onto a set of output responses so that these responses acquire the same topological order as the input signal. Images can be used as input signals and the networks can adjust to extract certain topological features, so image segmentation can be performed satisfactorily. Empirically, it can be supposed that as the number of neurones increases, so does the quality of the segmentation. This paper concentrates on the relationship between the quality of segmentation and the number of neurones that constitute a Kohonen Neural Network. Several experiments were conducted in which the Euclidean distance between adjacent neurones measured the quality of the segmentation; this distance tended to remain constant after a certain optimum level. The amount of information in the original set of images was compared with the information in the segmented structure, and results are presented. Compression rates higher than 250:1 were obtained.

Keywords: Neural Networks, Self-Organising Maps, Image Segmentation.

1. Introduction

Artificial Neural Networks are software or hardware systems that attempt to simulate a structure similar to the one the human brain is believed to have. Most neural networks in the brain, especially in the cortex, are formed by two-dimensional layers of cellular modules that are densely interconnected. This area of the brain is organised into several sensory modalities such as speech or hearing. The response signals of these areas are obtained in the same topographical order on the cortex in which they were received at the sensory organs.
The theoretical investigations into self-organising maps (SOMs) [1] were motivated by the possibility that the representation of knowledge in a particular category of things might in general assume the form of a feature map that is geometrically organised over a part of the brain. In this neural model, each neurone or node is densely interconnected with the rest of the neurones. The temporal status of a neurone, as well as the input signal, is represented by its topological position (x, y, z). The interconnection of neurones is considered as a lateral coupling. The function that defines the coupling has two different actions: excitatory and inhibitory. The excitatory interaction exists in a region defined by a short range up to a certain radius with the neurone as its centre, and the inhibitory region surrounds the excitatory area up to a bigger radius. Outside the inhibitory range, a weaker and much bigger excitatory zone exists. The intensity of the action decreases as the distance from the neurone increases. A cluster or bubble, called the neighbourhood, forms around one particular node of the network because of the lateral coupling around a given cell. The primary input received by the network determines a "winner" neurone. Around this winner neurone, excitatory and inhibitory regions will form. The winner node will adapt to the input signal and then the neurones that lie within the excitatory and inhibitory regions will adapt
themselves accordingly. This process of adaptation will continue for a number of iterations until a certain degree of adaptation is reached. When the input is an image, certain features can be extracted from the final adaptation of the neurones.

The remainder of the document is organised as follows: the next section describes the implementation of the algorithm. Section 3 deals with the experiments over medical images, in section 4 a definition of the quality in terms of the number of neurones is presented, and section 5 discusses the compression rate of the algorithm. Finally, conclusions are presented.

2. Implementation of the Kohonen Algorithm

The Self-Organising algorithm proposed by Kohonen [1] follows two basic equations: matching, i.e. finding the winner neurone determined by the minimum Euclidean distance to the input (1), and the update of the position of the neurones inside the cluster (2):

    ||x(t) - m_c(t)|| = min_i ||x(t) - m_i(t)||                         (1)

    m_i(t+1) = m_i(t) + α(t)[x(t) - m_i(t)]    if i ∈ N_c
    m_i(t+1) = m_i(t)                          if i ∉ N_c              (2)

where, for time t and a network with n neurones: x is the input; N_c is the neighbourhood of the winner, 1 < N_c < n; α is the gain sequence, 0 < α < 1; m_i is any node, 1 ≤ i ≤ n; and m_c is the winner.

It should be noted from equation (2) that the inhibitory region is not being considered, and the intensity of the action inside the excitatory region was considered constant. The original "Mexican Hat" function was reduced to a gate function with satisfactory results (see Figure 1).

Figure 1 Lateral degree of interaction (action vs. lateral distance): Mexican Hat and Step functions

Figure 2 describes graphically the Self-Organising algorithm. First, an input signal x is received and the network determines a "winner" neurone by calculating the Euclidean distances with equation (1). The updating process of equation (2) is a variation of the topological location of the neurone, proportional to the Euclidean distance from the winner node to the input. The gain sequence α is a value between 0 and 1 that reduces with time. In Figure 2.a only the winner neurone adapts to the input signal and in Figure 2.b other neurones that lie within the
neighbourhood of the winner adapt their topological co-ordinates. Neurones outside the neighbourhood remain unaltered. This process is reiterated until a certain criterion is satisfied.

Figure 2 Graphical description of the Self-Organising process: (a) the winner m_c(t) adapts towards the input x(t); (b) the neurones inside the neighbourhood N_c(t) adapt as well.

Kohonen [1] stated that the neighbourhood should shrink in time and that α is a linearly decreasing function; the process stops when α = 0. This behaviour allows a fast and coarse adaptation at the beginning of the process and a fine and slow adaptation at the end. Figure 3 shows an n = 64 network (8 by 8) adapted to a square input region after 4000 iterations. The initial values of the neighbourhood and the gain sequence, and their variation with time, are studied in [2].

Figure 3 Adaptation of an 8*8 network to a square input region.

3. Experiments over medical images

Medical images have received considerable attention in several areas, segmentation being one of the most interesting ones [3]. The SOM algorithm can be related to medical image segmentation in the following way. The input signal received by the SOM in figure 3 was a simple square region with equal probability for every position. If an image is to be used as input signal for a self-organising algorithm, a probability and a weight can be assigned to each pixel of the image.
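The assignment of a probability and a weight to each pixel can be sketched as follows. This is an illustrative reading of the text, not the authors' original code: each pixel above a threshold becomes a candidate (x, y) input point, drawn with probability proportional to its grey-level intensity. The function name, sample count and threshold are assumptions.

```python
import numpy as np

def image_to_samples(image, n_samples=5000, threshold=0, seed=0):
    """Turn a grey-level image into SOM input samples: each pixel above
    the threshold becomes a candidate (x, y) point, drawn with
    probability proportional to its intensity (the pixel weight)."""
    rng = np.random.default_rng(seed)
    ys, xs = np.nonzero(image > threshold)        # candidate pixel positions
    weights = image[ys, xs].astype(float)         # grey levels as weights
    p = weights / weights.sum()                   # normalise to probabilities
    idx = rng.choice(len(xs), size=n_samples, p=p)
    return np.column_stack([xs[idx], ys[idx]]).astype(float)
```

Brighter pixels are then presented to the map more often, so the neurones concentrate where the image carries intensity.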
Figure 4 shows a Magnetic Resonance (MR) image of a human head. This image can be converted to a two-dimensional matrix where, for each (x, y) position, a z value is assigned according to the grey-level intensity of the current pixel. This transformation allows the image to be used as input for a SOM. As the map receives the input, a winner is selected and then the neighbourhood is updated towards the image. This process can present interesting segmentation results for different structures of the image.

Figure 4 Human head Magnetic Resonance image

As an example of segmentation, the image in figure 4 is pre-segmented by grey level and then an annular SOM with 80 neurones is used to segment the surface of the image. The result is presented in Figure 5.

Figure 5 Segmentation of the Magnetic Resonance image of figure 4.

It can be noted that the number of neurones plays a critical role in the segmentation process and, intuitively, as the number of neurones increases, the distance between them is reduced and therefore the quality of the segmentation is also increased. In [4] a database of MR images of a human head is used to extract the border of the images, i.e. the shape of the head. In figure 6 the segmentation of 54 slices of a human head MR image set is shown; for each slice, 58 neurones are used in the segmentation.
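A minimal sketch of the training loop of equations (1) and (2) for an annular (ring-shaped) SOM, with the linearly decreasing gain and shrinking neighbourhood described in section 2, could look like the following. This is an assumed reconstruction, not the authors' implementation; all names and default parameters are illustrative.

```python
import numpy as np

def train_som(inputs, n_neurones, iterations=4000, alpha0=0.2,
              nc0_frac=0.4, seed=0):
    """Annular SOM: find the winner by minimum Euclidean distance
    (equation 1), then move every neurone inside the shrinking
    neighbourhood towards the input (equation 2)."""
    rng = np.random.default_rng(seed)
    # Initialise neurone positions randomly inside the input range.
    lo, hi = inputs.min(axis=0), inputs.max(axis=0)
    m = rng.uniform(lo, hi, size=(n_neurones, inputs.shape[1]))
    for t in range(iterations):
        x = inputs[rng.integers(len(inputs))]               # random input sample
        winner = np.argmin(np.linalg.norm(m - x, axis=1))   # equation (1)
        frac = 1.0 - t / iterations                         # linear decrease
        alpha = alpha0 * frac                               # gain sequence α(t)
        radius = max(1, int(nc0_frac * n_neurones * frac))  # shrinking N_c
        for i in range(winner - radius, winner + radius + 1):
            m[i % n_neurones] += alpha * (x - m[i % n_neurones])  # equation (2)
    return m
```

The modulo index makes the map annular, so the first and last neurones are neighbours, which is what allows the map to wrap around a closed contour such as the head boundary.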
Figure 6 Segmentation of MR images of a human head

If the previous points are used to reconstruct the surface of the head, several problems arise. The algorithm presented in [5] is used to obtain the surface defined by the 3D collection of points depicted in figure 6. Along the surface, several "holes" are observed due to the lack of information in those specific positions. This lack of information is caused by the absence of a neurone in a critical (x, y, z) position, and could have been avoided if more neurones had been used in the segmentation.

Figure 7 Reconstruction of the surface defined by the points in figure 6, shown as a shaded image

It is obvious that adding neurones to the SOM will increase the computational complexity of the process defined by equations (1) and (2); therefore the number of neurones should not be increased at will. Indeed, the question arises: is there a limit in the quality of the segmentation as the number of neurones is increased? The next section studies the segmentation quality as a function of the number of neurones.

4. Number of neurones and quality definition

The algorithm of the SOM that was presented in section 2 depends on a series of neurones that will self-adapt to a certain input signal. As the neurones adapt to a signal, so does the Euclidean
distance between the neurones. This distance can define the quality of the final segmentation, as can be seen in figure 8. The figure presents the final state of four SOMs with different numbers of neurones: 10, 20, 40 and 80. It can be observed that as the number of neurones increases, the shape defined by the neurones better resembles the shape of the human head as presented in the MR image of figure 4.

Figure 8 Segmentation obtained by SOMs with different numbers of neurones (10, 20, 40 and 80 neurones)

The SOM with 80 neurones is evidently better than the one with 10, but the difference between 80 and 40 neurones is not so clear; therefore, an analytic definition of the quality should be used. As the network adaptation gets closer to the input signal and the number of neurones increases, the distance between neurones will decrease. This distance between adjacent neurones will be used as a quality parameter:

    d_adj_max = max_i ||m_i(t) - m_(i+1)(t)||                           (3)

Three different MR images were used as input signals and equation (3) was applied to the segmentations obtained with SOMs with different numbers of neurones. The experiment was repeated numerous times to obtain an average measurement for each image. For all the experiments the parameters iterations = 15000, α(0) = 0.2 and N_c = 0.4n were kept constant. In the three cases, the maximum distance between adjacent neurones tended to decrease asymptotically as the number of neurones increased. The final number of neurones will depend on the particular image, but in all cases a limit value seemed to be reached close to 300 neurones; beyond this region, the number of neurones makes no difference in the quality of the segmentation.
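The quality measure of equation (3) is straightforward to compute for an annular map, where the last neurone is adjacent to the first. A sketch (function name is illustrative):

```python
import numpy as np

def max_adjacent_distance(m):
    """Equation (3): maximum Euclidean distance between adjacent
    neurones of an annular SOM (the ring wraps, so the last neurone
    is adjacent to the first)."""
    diffs = m - np.roll(m, -1, axis=0)   # m_i - m_(i+1), wrapping around
    return float(np.linalg.norm(diffs, axis=1).max())
```

Plotting this value against the number of neurones, as in Figure 9, is what reveals the asymptotic limit near 300 neurones.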
Figure 9 Maximum distance between adjacent neurones as a function of the number of neurones, for Images 1, 2 and 3

5. Compression Rate

The process of segmentation of a certain structure of the image, such as the external shape of a human head in the previous examples, implies a loss of information. The segmentation deliberately loses the information that corresponds to all the internal structures, such as the brain, cerebellum and ventricles, and retains the position of the external contour. Nevertheless, through this process, the amount of information corresponding to the (x, y, z) positions of the neurones is considerably less than that of the original image.

Figure 10 Compression rate vs. neurones per slice

The images consist of 512*512 pixels, each with 256 levels of grey, or 8 bits/pixel. Each slice of the MR set therefore implies:

    I_image = 512*512*8 = 2,097,152 [bits].
And the whole set of 54 MR images of a human head:

    I_head = 512*512*8*54 = 113,246,208 [bits].

The resulting SOM will in turn require the (x, y, z) position of each neurone, or 9*3 = 27 bits/neurone. Figure 6 has 54 SOMs, each with 58 neurones, i.e. 84,564 [bits], a compression rate close to 1340:1. Evidently, the original set of images contains the whole head but, at this rate, another 1300 similar structures could be segmented and treated as a single database with the same amount of information as the original set of images. Figure 10 presents the relationship between the compression rate and the number of neurones. If 310 neurones is considered the optimum for the adjacent distance, the compression rate would be 250:1.

6. Conclusions

The Kohonen Self-Organising Algorithm was programmed to use medical images as input signals. The use of an annular Self-Organising Map allowed segmenting the external shape of a human head out of a database of Magnetic Resonance images. Through this process of segmentation, the information describing certain structures of the image is discarded, thus compressing the information required to describe the segmented structure. Numerous experiments were conducted over different images to determine the quality of the segmentation, measured as the maximum distance between adjacent neurones of a SOM. These experiments showed that as the number of neurones increased, the distance decreased up to a certain level, after which it tended to remain constant. The optimum number of neurones will depend on the particular type of image to be processed. As the number of neurones increases, so does the amount of information comprised by the positions of the SOM, and therefore the compression rate decreases. The final compression rate will be determined by the complexity of the particular image to be segmented and the quality desired but, even with 310 neurones, the compression rate would be 250:1. The results encourage the use of SOMs for image segmentation.

7. Acknowledgements

Keith A. Johnson and J.
Alex Becker from Brigham and Women's Hospital, Harvard Medical School, provided the Magnetic Resonance images through The Whole Brain Atlas. The authors are grateful to them. This work was supported by CONACYT.

8. References

[1] Kohonen T (1988). Self-Organization and Associative Memory, Springer-Verlag, Heidelberg.
[2] Reyes-Aldasoro CC (1998). A Non-linear Decrease Rate to Optimise the Convergence of the Kohonen Neural Network Self-Organising Algorithm, ROCC99, Acapulco, Mexico.
[3] Kapur T (1999). Model-based Three-dimensional Medical Image Segmentation, Ph.D. Thesis, Artificial Intelligence Laboratory, Massachusetts Institute of Technology.
[4] Reyes-Aldasoro CC, Algorri Guzmán ME (2000). "A Combined Algorithm for Image Segmentation using Neural Networks and 3D Surface Reconstruction using Dynamic Meshes", V Ibero-American Symposium on Pattern Recognition, Lisbon, Portugal, September.
[5] Algorri ME, Schmitt F (1996). "Surface Reconstruction from Unstructured 3D Data", Computer Graphics Forum, 15(1).
Computatonal Flud Dynamcs If you want to learn a bt more of the math behnd flud dynamcs, read my prevous post about the Naver- Stokes equatons and Newtonan fluds. The equatons derved n the post are the
More informationExperience with Automatic Generation Control (AGC) Dynamic Simulation in PSS E
Semens Industry, Inc. Power Technology Issue 113 Experence wth Automatc Generaton Control (AGC) Dynamc Smulaton n PSS E Lu Wang, Ph.D. Staff Software Engneer lu_wang@semens.com Dngguo Chen, Ph.D. Staff
More informationTemperature. Chapter Heat Engine
Chapter 3 Temperature In prevous chapters of these notes we ntroduced the Prncple of Maxmum ntropy as a technque for estmatng probablty dstrbutons consstent wth constrants. In Chapter 9 we dscussed the
More informationThe internal structure of natural numbers and one method for the definition of large prime numbers
The nternal structure of natural numbers and one method for the defnton of large prme numbers Emmanul Manousos APM Insttute for the Advancement of Physcs and Mathematcs 3 Poulou str. 53 Athens Greece Abstract
More informationText S1: Detailed proofs for The time scale of evolutionary innovation
Text S: Detaled proofs for The tme scale of evolutonary nnovaton Krshnendu Chatterjee Andreas Pavloganns Ben Adlam Martn A. Nowak. Overvew and Organzaton We wll present detaled proofs of all our results.
More informationLinear Feature Engineering 11
Lnear Feature Engneerng 11 2 Least-Squares 2.1 Smple least-squares Consder the followng dataset. We have a bunch of nputs x and correspondng outputs y. The partcular values n ths dataset are x y 0.23 0.19
More informationMODELING TRAFFIC LIGHTS IN INTERSECTION USING PETRI NETS
The 3 rd Internatonal Conference on Mathematcs and Statstcs (ICoMS-3) Insttut Pertanan Bogor, Indonesa, 5-6 August 28 MODELING TRAFFIC LIGHTS IN INTERSECTION USING PETRI NETS 1 Deky Adzkya and 2 Subono
More informationSimulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests
Smulated of the Cramér-von Mses Goodness-of-Ft Tests Steele, M., Chaselng, J. and 3 Hurst, C. School of Mathematcal and Physcal Scences, James Cook Unversty, Australan School of Envronmental Studes, Grffth
More informationOn the Interval Zoro Symmetric Single-step Procedure for Simultaneous Finding of Polynomial Zeros
Appled Mathematcal Scences, Vol. 5, 2011, no. 75, 3693-3706 On the Interval Zoro Symmetrc Sngle-step Procedure for Smultaneous Fndng of Polynomal Zeros S. F. M. Rusl, M. Mons, M. A. Hassan and W. J. Leong
More informationGeneralized Linear Methods
Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set
More informationThe Second Anti-Mathima on Game Theory
The Second Ant-Mathma on Game Theory Ath. Kehagas December 1 2006 1 Introducton In ths note we wll examne the noton of game equlbrum for three types of games 1. 2-player 2-acton zero-sum games 2. 2-player
More informationA particle in a state of uniform motion remain in that state of motion unless acted upon by external force.
The fundamental prncples of classcal mechancs were lad down by Galleo and Newton n the 16th and 17th centures. In 1686, Newton wrote the Prncpa where he gave us three laws of moton, one law of gravty,
More informationLecture Note 3. Eshelby s Inclusion II
ME340B Elastcty of Mcroscopc Structures Stanford Unversty Wnter 004 Lecture Note 3. Eshelby s Incluson II Chrs Wenberger and We Ca c All rghts reserved January 6, 004 Contents 1 Incluson energy n an nfnte
More informationComputing Correlated Equilibria in Multi-Player Games
Computng Correlated Equlbra n Mult-Player Games Chrstos H. Papadmtrou Presented by Zhanxang Huang December 7th, 2005 1 The Author Dr. Chrstos H. Papadmtrou CS professor at UC Berkley (taught at Harvard,
More informationCHALMERS, GÖTEBORGS UNIVERSITET. SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS. COURSE CODES: FFR 135, FIM 720 GU, PhD
CHALMERS, GÖTEBORGS UNIVERSITET SOLUTIONS to RE-EXAM for ARTIFICIAL NEURAL NETWORKS COURSE CODES: FFR 35, FIM 72 GU, PhD Tme: Place: Teachers: Allowed materal: Not allowed: January 2, 28, at 8 3 2 3 SB
More informationOnline Appendix to: Axiomatization and measurement of Quasi-hyperbolic Discounting
Onlne Appendx to: Axomatzaton and measurement of Quas-hyperbolc Dscountng José Lus Montel Olea Tomasz Strzaleck 1 Sample Selecton As dscussed before our ntal sample conssts of two groups of subjects. Group
More informationCHAPTER-5 INFORMATION MEASURE OF FUZZY MATRIX AND FUZZY BINARY RELATION
CAPTER- INFORMATION MEASURE OF FUZZY MATRI AN FUZZY BINARY RELATION Introducton The basc concept of the fuzz matr theor s ver smple and can be appled to socal and natural stuatons A branch of fuzz matr
More informationUnsupervised Learning
Unsupervsed Learnng Kevn Swngler What s Unsupervsed Learnng? Most smply, t can be thought of as learnng to recognse and recall thngs Recognton I ve seen that before Recall I ve seen that before and I can
More informationConsistency & Convergence
/9/007 CHE 374 Computatonal Methods n Engneerng Ordnary Dfferental Equatons Consstency, Convergence, Stablty, Stffness and Adaptve and Implct Methods ODE s n MATLAB, etc Consstency & Convergence Consstency
More informationLOW BIAS INTEGRATED PATH ESTIMATORS. James M. Calvin
Proceedngs of the 007 Wnter Smulaton Conference S G Henderson, B Bller, M-H Hseh, J Shortle, J D Tew, and R R Barton, eds LOW BIAS INTEGRATED PATH ESTIMATORS James M Calvn Department of Computer Scence
More information1 Matrix representations of canonical matrices
1 Matrx representatons of canoncal matrces 2-d rotaton around the orgn: ( ) cos θ sn θ R 0 = sn θ cos θ 3-d rotaton around the x-axs: R x = 1 0 0 0 cos θ sn θ 0 sn θ cos θ 3-d rotaton around the y-axs:
More informationn α j x j = 0 j=1 has a nontrivial solution. Here A is the n k matrix whose jth column is the vector for all t j=0
MODULE 2 Topcs: Lnear ndependence, bass and dmenson We have seen that f n a set of vectors one vector s a lnear combnaton of the remanng vectors n the set then the span of the set s unchanged f that vector
More informationLinear Regression Analysis: Terminology and Notation
ECON 35* -- Secton : Basc Concepts of Regresson Analyss (Page ) Lnear Regresson Analyss: Termnology and Notaton Consder the generc verson of the smple (two-varable) lnear regresson model. It s represented
More informationCredit Card Pricing and Impact of Adverse Selection
Credt Card Prcng and Impact of Adverse Selecton Bo Huang and Lyn C. Thomas Unversty of Southampton Contents Background Aucton model of credt card solctaton - Errors n probablty of beng Good - Errors n
More informationLossy Compression. Compromise accuracy of reconstruction for increased compression.
Lossy Compresson Compromse accuracy of reconstructon for ncreased compresson. The reconstructon s usually vsbly ndstngushable from the orgnal mage. Typcally, one can get up to 0:1 compresson wth almost
More informationPhysics 5153 Classical Mechanics. Principle of Virtual Work-1
P. Guterrez 1 Introducton Physcs 5153 Classcal Mechancs Prncple of Vrtual Work The frst varatonal prncple we encounter n mechancs s the prncple of vrtual work. It establshes the equlbrum condton of a mechancal
More information/ n ) are compared. The logic is: if the two
STAT C141, Sprng 2005 Lecture 13 Two sample tests One sample tests: examples of goodness of ft tests, where we are testng whether our data supports predctons. Two sample tests: called as tests of ndependence
More informationOutline. Communication. Bellman Ford Algorithm. Bellman Ford Example. Bellman Ford Shortest Path [1]
DYNAMIC SHORTEST PATH SEARCH AND SYNCHRONIZED TASK SWITCHING Jay Wagenpfel, Adran Trachte 2 Outlne Shortest Communcaton Path Searchng Bellmann Ford algorthm Algorthm for dynamc case Modfcatons to our algorthm
More informationMA 323 Geometric Modelling Course Notes: Day 13 Bezier Curves & Bernstein Polynomials
MA 323 Geometrc Modellng Course Notes: Day 13 Bezer Curves & Bernsten Polynomals Davd L. Fnn Over the past few days, we have looked at de Casteljau s algorthm for generatng a polynomal curve, and we have
More informationP R. Lecture 4. Theory and Applications of Pattern Recognition. Dept. of Electrical and Computer Engineering /
Theory and Applcatons of Pattern Recognton 003, Rob Polkar, Rowan Unversty, Glassboro, NJ Lecture 4 Bayes Classfcaton Rule Dept. of Electrcal and Computer Engneerng 0909.40.0 / 0909.504.04 Theory & Applcatons
More informationDifferentiating Gaussian Processes
Dfferentatng Gaussan Processes Andrew McHutchon Aprl 17, 013 1 Frst Order Dervatve of the Posteror Mean The posteror mean of a GP s gven by, f = x, X KX, X 1 y x, X α 1 Only the x, X term depends on the
More information