Big Data Analytics! Special Topics for Computer Science (CSE), Mar 31
1 Big Data Analytics! Special Topics for Computer Science (CSE), Mar 31. Fei Wang, Associate Professor, Department of Computer Science and Engineering, fei_wang@uconn.edu
2 Introduction to Deep Learning
3 Perceptron. In machine learning, the perceptron is an algorithm for supervised classification of an input into one of several possible non-binary outputs. It is a type of linear classifier, i.e. a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector, followed by an activation function.
4 Activation Function: Rectifier Function, Sigmoid Function, Hyperbolic Tangent Function, Step Function
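The four activation functions listed on this slide can be sketched in plain Python (a minimal illustration; the rectifier is what later literature calls ReLU):

```python
import math

def step(net):
    # Step function: fires (1) once the input reaches the threshold 0.
    return 1.0 if net >= 0 else 0.0

def sigmoid(net):
    # Sigmoid (logistic) function: squashes any input into (0, 1).
    return 1.0 / (1.0 + math.exp(-net))

def tanh(net):
    # Hyperbolic tangent: squashes any input into (-1, 1).
    return math.tanh(net)

def rectifier(net):
    # Rectifier (ReLU): passes positive inputs, clips negatives to 0.
    return max(0.0, net)
```

The sigmoid and tanh are smooth, which is what makes the gradient computations of the back-propagation slides below possible; the step function is the classical perceptron choice.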
5 Perceptron Learning. 1. Initialize the weights and threshold to small random numbers. 2. Present a vector x to the neuron inputs and calculate the output. 3. Update the weights according to $w_i(t+1) = w_i(t) + \eta\,(d - y(t))\,x_i$, where $d$ is the desired output, $t$ is the iteration number, and $\eta$ (eta) is the gain or step size, with $0.0 < \eta < 1.0$. 4. Repeat steps 2 and 3 until the iteration error is less than a user-specified error threshold or a predetermined number of iterations have been completed.
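The four steps above can be sketched as a small training loop (a minimal illustration in plain Python; the AND data set, the seed, and the step-function output are choices made here, not from the slides):

```python
import random

def train_perceptron(samples, eta=0.1, max_iters=100, err_threshold=0.0):
    """samples: list of (x_vector, desired_output) with desired in {0, 1}."""
    n = len(samples[0][0])
    random.seed(0)
    # Step 1: initialize the weights and threshold (bias) to small random numbers.
    w = [random.uniform(-0.5, 0.5) for _ in range(n)]
    b = random.uniform(-0.5, 0.5)
    for t in range(max_iters):
        total_err = 0.0
        for x, d in samples:
            # Step 2: present the vector x and calculate the output (step activation).
            y = 1.0 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0.0
            # Step 3: update the weights: w_i <- w_i + eta * (d - y) * x_i
            w = [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]
            b = b + eta * (d - y)
            total_err += abs(d - y)
        # Step 4: stop once the iteration error falls below the threshold.
        if total_err <= err_threshold:
            break
    return w, b

# AND is linearly separable, so the perceptron rule converges on it.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
```

After training, the learned weights and bias classify all four AND patterns correctly; the same loop would never converge on XOR, which is the limitation the multi-layer perceptron below addresses.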
6 Multi-Layer Perceptron
7 Back-Propagation. How to adjust the weights? For an output neuron the desired (target) output is known, so the adjustment is simple. For hidden neurons it is not that obvious! Intuitively: if a hidden neuron is connected to an output with large error, adjust its weights a lot; otherwise, don't alter the weights too much. Mathematically: the weights of a hidden neuron are adjusted in direct proportion to the error in the neuron it is connected to.
8 Back-Propagation. A neural network with one hidden layer; indices: $i$ over output neurons, $j$ over hidden neurons, $k$ over inputs. MSE (over all output neurons, over all patterns $p$):
$E = \frac{1}{2} \sum_p \sum_i (d_i^p - o_i^p)^2$
where $d_i^p$ is the target output of neuron $i$ for input pattern $p$ and $o_i^p$ is the actual output of neuron $i$ for input pattern $p$. Express $E$ in terms of the weights and input signals over the input patterns:
1. Input for hidden neuron $j$: $net_j = \sum_k w_{jk} x_k + b_j$
2. Activation of neuron $j$ as a function of its input: $o_j = f(net_j) = f(\sum_k w_{jk} x_k + b_j)$
9 Back-Propagation.
3. Input for output neuron $i$: $net_i = \sum_j w_{ij} o_j + b_i = \sum_j w_{ij} f(\sum_k w_{jk} x_k + b_j) + b_i$
4. Output of output neuron $i$: $o_i = f(net_i) = f(\sum_j w_{ij} f(\sum_k w_{jk} x_k + b_j) + b_i)$
5. Substituting 4 into $E$: $E = \frac{1}{2} \sum_p \sum_i \big( d_i - f(\sum_j w_{ij} f(\sum_k w_{jk} x_k + b_j) + b_i) \big)^2$
6. Steepest gradient descent: adjust the weights so that the change moves the system down the error surface in the direction of the locally steepest descent, given by the negative of the gradient: $\Delta w_{ij} = -\eta \, \partial E / \partial w_{ij}$
7. For an output neuron: $\Delta w_{ij} = \eta\, (d_i - o_i) f'(net_i)\, o_j = \eta\, \delta_i o_j$, with $\delta_i = (d_i - o_i) f'(net_i)$.
(COMP4302/5322 Neural Networks, w4)
10 Back-Propagation.
8. For a hidden neuron, calculating the derivatives using the chain rule: $\Delta w_{jk} = -\eta \, \partial E / \partial w_{jk} = \eta\, \delta_j x_k$, where $\delta_j = f'(net_j) \sum_i \delta_i w_{ij}$.
9. In general, for a connection from $p$ to $q$: $\Delta w_{pq} = \eta\, \delta_q o_p$ and $w_{pq}^{new} = w_{pq}^{old} + \Delta w_{pq}$, where $o_p$ is the activation of an input or hidden neuron and $\delta_q$ is given either by eq. 7 (output neuron) or eq. 8 (hidden neuron).
11 Back-Propagation.
10. From the formulas for $\delta$ it follows that we must be able to calculate the derivative of $f$. For a sigmoid transfer function: $f(net_l) = \frac{1}{1 + e^{-net_l}}$, so $f'(net_l) = \frac{e^{-net_l}}{(1 + e^{-net_l})^2} = f(net_l)\,(1 - f(net_l)) = o_l (1 - o_l)$.
Back-propagation rule for the sigmoid transfer function:
output neuron: $\delta_i = (d_i - o_i)\, o_i (1 - o_i)$
hidden neuron: $\delta_j = o_j (1 - o_j) \sum_i \delta_i w_{ij}$
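The sigmoid derivative identity and the two delta rules for the sigmoid transfer function can be written directly (a minimal sketch; the helper names are illustrative):

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def sigmoid_deriv_from_output(o):
    # For the logistic function, f'(net) = f(net) * (1 - f(net)) = o * (1 - o),
    # so the derivative can be computed from the activation alone.
    return o * (1.0 - o)

def delta_output(d, o):
    # Output neuron: delta_i = (d_i - o_i) * o_i * (1 - o_i)
    return (d - o) * sigmoid_deriv_from_output(o)

def delta_hidden(o_j, deltas_above, w_to_above):
    # Hidden neuron: delta_j = o_j * (1 - o_j) * sum_i delta_i * w_ij
    return sigmoid_deriv_from_output(o_j) * sum(
        d_i * w for d_i, w in zip(deltas_above, w_to_above))
```

This identity is why the sigmoid was historically popular: the backward pass needs no extra function evaluations, only the activations already computed in the forward pass.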
12 Back-Propagation.
1. Determine the architecture: how many input and output neurons; what output encoding; how many hidden neurons and layers.
2. Initialize all weights and biases to small random values, typically in [-1, 1].
3. Repeat until the termination criterion is satisfied:
- Present a training example and propagate it through the network (forward pass)
- Calculate the actual output
- Adapt the weights starting from the output layer and working backwards (backward pass): $w_{pq}(t+1) = w_{pq}(t) + \Delta w_{pq}$, where $w_{pq}(t)$ is the weight from node $p$ to node $q$ at time $t$ and $\Delta w_{pq} = \eta\, \delta_q o_p$ is the weight change, with $\delta_q = o_q (1 - o_q)(d_q - o_q)$ for an output neuron and $\delta_q = o_q (1 - o_q) \sum_i \delta_i w_{qi}$ for a hidden neuron (the sum is over the nodes $i$ in the layer above node $q$).
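One forward pass and one backward pass of the algorithm above, for a small 2-2-1 sigmoid network, can be sketched as follows (a minimal illustration; the network size, seed, and list-based layout are choices made here, not from the slides):

```python
import math
import random

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

def forward(x, W_h, b_h, W_o, b_o):
    # Forward pass: hidden activations h_j, then output activations o_i.
    h = [sigmoid(sum(w * xk for w, xk in zip(ws, x)) + b)
         for ws, b in zip(W_h, b_h)]
    o = [sigmoid(sum(w * hj for w, hj in zip(ws, h)) + b)
         for ws, b in zip(W_o, b_o)]
    return h, o

def backward_update(x, d, W_h, b_h, W_o, b_o, eta):
    # Backward pass for one example: compute the deltas, then apply
    # w_pq(t+1) = w_pq(t) + eta * delta_q * o_p (updates in place).
    h, o = forward(x, W_h, b_h, W_o, b_o)
    # Output neurons: delta_i = o_i * (1 - o_i) * (d_i - o_i)
    d_out = [oi * (1 - oi) * (di - oi) for oi, di in zip(o, d)]
    # Hidden neurons: delta_j = h_j * (1 - h_j) * sum_i delta_i * w_ij
    d_hid = [hj * (1 - hj) * sum(d_out[i] * W_o[i][j]
                                 for i in range(len(d_out)))
             for j, hj in enumerate(h)]
    for i, di in enumerate(d_out):
        for j, hj in enumerate(h):
            W_o[i][j] += eta * di * hj
        b_o[i] += eta * di
    for j, dj in enumerate(d_hid):
        for k, xk in enumerate(x):
            W_h[j][k] += eta * dj * xk
        b_h[j] += eta * dj

# Step 2 of the algorithm: small random weights and biases in [-1, 1].
random.seed(1)
W_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [random.uniform(-1, 1) for _ in range(2)]
W_o = [[random.uniform(-1, 1) for _ in range(2)]]
b_o = [random.uniform(-1, 1)]
```

Because each update moves the weights along the negative gradient, a single `backward_update` with a small `eta` reduces the squared error on the presented example.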
13 Back-Propagation. The stopping criteria are checked at the end of each epoch:
- The error (mean absolute or mean square) at the end of an epoch is below a threshold: all training examples are propagated and the mean (absolute or square) error is calculated; the threshold is determined heuristically, e.g. 0.3
- A maximum number of epochs is reached
- Early stopping using a validation set (TTS)
It typically takes hundreds or thousands of epochs for an NN to converge.
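The three stopping criteria above can be sketched as one epoch loop (a skeleton only; `run_epoch` and `val_error` are hypothetical callbacks standing in for one full training pass and a validation-set evaluation, and `patience` is an assumed early-stopping parameter):

```python
def train(epochs_max, err_threshold, patience, run_epoch, val_error):
    """run_epoch(): one forward+backward pass over the training set,
    returning the mean error; val_error(): error on a validation set."""
    best_val, since_best = float("inf"), 0
    for epoch in range(epochs_max):
        train_err = run_epoch()
        # Criterion 1: mean error at the end of the epoch below a threshold.
        if train_err < err_threshold:
            return ("error-threshold", epoch)
        # Criterion 3: early stopping when the validation error stops improving.
        v = val_error()
        if v < best_val:
            best_val, since_best = v, 0
        else:
            since_best += 1
            if since_best >= patience:
                return ("early-stopping", epoch)
    # Criterion 2: the maximum number of epochs is reached.
    return ("max-epochs", epochs_max)
```

The validation check guards against the failure mode the threshold alone misses: training error keeps falling while the network starts to overfit.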
14 Neural network / Back-propagation (Nature, 1986). [Figure: a single neuron; inputs $x_1, x_2, x_3$ with weights $w_1, w_2, w_3$ are combined into $net$, and the unit outputs $g(x) = f(net)$.]
15 Neural network / Back-propagation (Nature, 1986): solves general learning problems; tied with biological systems. But it was given up: hard to train; insufficient computational resources; small training sets; does not work well.
16 Neural network / Back-propagation (Nature, 1986) → SVM, Boosting, Decision trees, KNN, ...: flat structures; loose ties with biological systems; specific methods for specific tasks; hand-crafted features (GMM-HMM, SIFT, LBP, HOG, ...). (Kruger et al., TPAMI '13)
17 Neural network / Back-propagation (Nature, 1986) → Deep belief net (Science, 2006): unsupervised & layer-wise pre-training; better designs for modeling and training (normalization, nonlinearity, dropout); new developments in computer architectures: GPUs, multi-core computer systems; large-scale databases.
18 Neural network / Back-propagation (Nature, 1986) → Deep belief net (Science, 2006) → Speech: deep learning results.
19 Neural network / Back-propagation → Deep belief net → Speech → ImageNet 2012 image classification challenge: object recognition over 1,000,000 images and 1,000 categories (2 GPUs).
Rank 1: U. Toronto (deep learning)
Rank 2: U. Tokyo; Rank 3: U. Oxford; Rank 4: Xerox/INRIA (hand-crafted features and learning models; bottleneck)
A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet Classification with Deep Convolutional Neural Networks", NIPS, 2012.
20 Neural network / Back-propagation → Deep belief net → Speech → ImageNet 2013 image classification challenge:
Rank 1: NYU (deep learning); Rank 2: NUS (deep learning); Rank 3: Oxford (deep learning)
MSRA, IBM, Adobe, NEC, Clarifai, Berkeley, U. Tokyo, UCLA, UIUC, Toronto: the top 20 groups all used deep learning.
ImageNet 2013 object detection challenge (ranked by mean average precision):
Rank 1: UvA-Euvision (hand-crafted features); Rank 2: NEC-MU (hand-crafted features); Rank 3: NYU (deep learning)
21 Neural network / Back-propagation → Deep belief net → Speech → ImageNet 2014 image classification challenge:
Rank 1: Google (deep learning); Rank 2: Oxford (deep learning); Rank 3: MSRA (deep learning)
ImageNet 2014 object detection challenge (ranked by mean average precision):
Rank 1: Google; Rank 2: CUHK; Rank 3: DeepInsight; Rank 4: UvA-Euvision; Rank 5: Berkeley Vision (all deep learning)
22 Google and Baidu announced their deep-learning-based visual search engines (2013). Google: "On our test set we saw double the average precision when compared to other approaches we had tried. We acquired the rights to the technology and went full speed ahead adapting it to run at large scale on Google's computers. We took cutting-edge research straight out of an academic research lab and launched it in just a little over six months." Baidu.
24 Convolutional Neural Networks (CNN). First proposed by Fukushima in 1980; improved by LeCun, Bottou, Bengio and Haffner in 1998. Key components: convolution, pooling, learned filters.
27 Convolution
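The convolution step of a CNN can be sketched as a "valid" 2-D convolution over a small image (as in CNN practice, this is implemented as cross-correlation; the function and the edge filter below are illustrative, not from the slides):

```python
def conv2d(image, kernel):
    """'Valid' 2-D convolution (cross-correlation, as used in CNNs)
    of a 2-D list `image` with a 2-D list `kernel`."""
    H, W = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(H - kh + 1):
        row = []
        for j in range(W - kw + 1):
            # Sum of elementwise products of the kernel and the patch under it.
            s = sum(image[i + u][j + v] * kernel[u][v]
                    for u in range(kh) for v in range(kw))
            row.append(s)
        out.append(row)
    return out

# A vertical-edge filter responds where intensity changes left to right.
img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
edge = [[1, -1]]
```

In a CNN the kernel entries are not hand-chosen like `edge` here; they are the "learned filters" of the previous slide, trained by back-propagation.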
28 Pooling
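The pooling step can be sketched as non-overlapping max pooling over a feature map (a minimal illustration; the 2x2 window and stride 2 are the common defaults, assumed here rather than taken from the slides):

```python
def max_pool2d(fmap, size=2, stride=2):
    """Max pooling over a 2-D list `fmap`: each output cell keeps the
    largest value in its size x size window, downsampling the map."""
    H, W = len(fmap), len(fmap[0])
    return [[max(fmap[i + u][j + v]
                 for u in range(size) for v in range(size))
             for j in range(0, W - size + 1, stride)]
            for i in range(0, H - size + 1, stride)]

fmap = [[1, 3, 2, 0],
        [4, 2, 1, 1],
        [0, 1, 5, 2],
        [2, 0, 1, 3]]
```

Pooling shrinks the feature map (here from 4x4 to 2x2) while keeping the strongest filter responses, which gives the network a degree of translation invariance.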
29 Estimate the output: $o_1 = L_1(x)$, $o_2 = L_2(L_1(x))$, ..., $o_5 = L_5(L_4(L_3(L_2(L_1(x)))))$. Compute the loss function: $C = Loss(o_5, y)$. Compute the gradient, flowing back through $L_5, L_4, L_3, L_2, L_1$ to $x$.
30 Estimate the output (forward propagation): $o_5 = L_5(L_4(L_3(L_2(L_1(x)))))$. Compute the gradient (backward propagation).
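The forward/backward structure of a layered network can be sketched by storing each layer as a (function, derivative) pair and applying the chain rule from the top down; the three scalar layers chosen below are arbitrary illustrations, not the slides' layers:

```python
import math

# Each layer is a pair (f, f_prime); the network is their composition.
layers = [
    (lambda x: 2 * x,        lambda x: 2.0),                    # L1
    (lambda x: x + 1,        lambda x: 1.0),                    # L2
    (lambda x: math.tanh(x), lambda x: 1 - math.tanh(x) ** 2),  # L3
]

def forward(x):
    # Forward propagation: o_k = L_k(o_{k-1}), keeping every intermediate.
    outs = [x]
    for f, _ in layers:
        outs.append(f(outs[-1]))
    return outs

def backward(outs):
    # Backward propagation: the chain rule multiplies the local derivatives
    # f'_k evaluated at each layer's input, from the last layer back to x.
    grad = 1.0  # dC/do_n = 1 for the identity "loss" C = o_n (illustration)
    for (_, fp), o_in in zip(reversed(layers), reversed(outs[:-1])):
        grad *= fp(o_in)
    return grad  # dC/dx
```

The cached `outs` list is exactly why forward propagation stores intermediates: the backward pass reuses them instead of recomputing each layer.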
More information