Computational and Statistical Learning Theory Assignment 4
Due: March 2nd
Email solutions to: karthik at ttic dot edu

Notations/Definitions

Recall the definition of the sample-based Rademacher complexity:

\[ \hat{R}_S(F) := \mathbb{E}_{\epsilon \sim \{\pm1\}^n}\left[\, \sup_{f \in F} \frac{1}{n} \sum_{i=1}^{n} \epsilon_i f(x_i) \,\right] \]

Definition 1. Given a sample S = \{x_1, \ldots, x_n\} and any \alpha > 0, a set V \subset \mathbb{R}^n is said to be an \alpha-cover (in \ell_p) of a function class F on the sample S if

\[ \forall f \in F,\ \exists v \in V \ \text{s.t.}\ \left( \frac{1}{n} \sum_{i=1}^{n} |f(x_i) - v_i|^p \right)^{1/p} \le \alpha . \]

Specifically, for p = \infty, \left( \frac{1}{n} \sum_{i=1}^{n} |f(x_i) - v_i|^p \right)^{1/p} is replaced by \max_{i \in [n]} |f(x_i) - v_i|. Also define

\[ N_p(F, \alpha, S) := \min\left\{ |V| : V \text{ is an } \alpha\text{-cover (in } \ell_p \text{) of } F \text{ on the sample } S \right\} \]

and

\[ N_p(F, \alpha, n) := \sup_{x_1, \ldots, x_n} N_p(F, \alpha, \{x_1, \ldots, x_n\}) . \]

Definition 2. A function class F is said to \alpha-shatter a sample S = \{x_1, \ldots, x_n\} if there exists a sequence of thresholds s_1, \ldots, s_n \in \mathbb{R} such that

\[ \forall \epsilon \in \{\pm1\}^n,\ \exists f \in F \ \text{s.t.}\ \forall i \in [n],\ \epsilon_i \left( f(x_i) - s_i \right) \ge \alpha/2 . \]

The fat-shattering dimension \mathrm{fat}_\alpha(F) is the size of the largest sample that F \alpha-shatters.
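The expectation over sign vectors in the definition above can be approximated by plain Monte Carlo once the class is restricted to finitely many functions. The sketch below is an illustration, not part of the assignment; the sample, the four value vectors standing in for F, and the number of sign draws are all arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
S = rng.uniform(-1, 1, size=n)  # a fixed sample of n points

# A small finite stand-in for F: each row holds (f(x_1), ..., f(x_n)) for one f.
F_vals = np.stack([S, -S, np.sign(S), np.full(n, 0.5)])

def empirical_rademacher(F_vals, n_draws=5000, rng=rng):
    """Monte Carlo estimate of E_eps[ sup_f (1/n) sum_i eps_i f(x_i) ]."""
    n = F_vals.shape[1]
    eps = rng.choice([-1.0, 1.0], size=(n_draws, n))
    # (n_draws, |F|): correlation of each sign pattern with each function
    corr = eps @ F_vals.T / n
    return corr.max(axis=1).mean()

est = empirical_rademacher(F_vals)
print(f"estimated R_S(F) ~ {est:.3f}")
```

For a finite class, Massart's lemma bounds the same quantity by \max_f \|f\|_2 \sqrt{2 \log |F|} / n, so the estimate should come out well below that value here.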
Problems

1. VC Lemma for Real-valued Function Classes: We shall prove that for any function class F (assume the functions in F are bounded by 1) and any scale \alpha > 0, the \ell_1 covering number at scale \alpha can be bounded using the fat-shattering dimension at that scale, by proving a statement analogous to the VC lemma. We shall proceed by first extending the statement to finite-valued (specifically \{0, \ldots, k\}-valued) function classes, and then use this to prove the final bound of

\[ N_1(F, \alpha, n) \le \left( \frac{2n}{\alpha} \right)^{\mathrm{fat}_\alpha(F)} \]

a) Let F_k \subseteq \{0, \ldots, k\}^X be a function class with \mathrm{fat}_2(F_k) = d. Show that

\[ N_\infty(F_k, 1/2, n) \le \sum_{i=0}^{d} \binom{n}{i} k^i \]

Show the above statement using induction on n + d (very similar to the first problem on Assignment 2). Hint: In the Assignment 2 problem, where we used H_{+S} and H_{-S}, use instead, for all i \in \{0, \ldots, k\},

\[ F_i = \{ f \in F : f(x_1) = i \} \]

(note that this is a simple multilabel extension, and for k = 1, F_0 and F_1 are identical to H_+ and H_-). Use the notion of 2-shattering instead of shattering for the VC case, and use a 1/2-cover instead of the growth function.

b) Using the idea of \alpha-discretizing the output of the function class F, we shall conclude the required statement. Do the following:

i. Create a \{0, \ldots, k\}-valued class G where k is of order 1/\alpha. Show that covering G at scale 1/2 implies that we can cover F at scale \alpha, and hence conclude that we can bound N_\infty(F, \alpha, n) in terms of the covering number at scale 1/2 for G.

ii. Show that \mathrm{fat}_2(G) \le \mathrm{fat}_\alpha(F). Combine this with the bound on N_\infty(G, 1/2, n) from the previous sub-problem and conclude that

\[ N_1(F, \alpha, n) \le \left( \frac{2n}{\alpha} \right)^{\mathrm{fat}_\alpha(F)} \]

2. Dudley vs. Pollard's Bounds: In class we saw that the Rademacher complexity can be bounded in terms of covering numbers using Pollard's bound, the Dudley integral bound, and a slightly modified version of the Dudley
integral bound, as follows:

\[ \hat{R}_S(F) \le \inf_{\alpha \ge 0} \left\{ \alpha + \sqrt{\frac{2 \log N_1(F, \alpha, n)}{n}} \right\} \quad \text{(Pollard)} \]

\[ \hat{R}_S(F) \le \frac{12}{\sqrt{n}} \int_{0}^{1} \sqrt{\log N_2(F, \tau, n)} \, d\tau \quad \text{(Dudley)} \]

\[ \hat{R}_S(F) \le \inf_{\alpha \ge 0} \left\{ 4\alpha + \frac{12}{\sqrt{n}} \int_{\alpha}^{1} \sqrt{\log N_2(F, \tau, n)} \, d\tau \right\} \quad \text{(Refined Dudley)} \]

In this problem, using some examples, we shall compare these bounds.

a) Class with finite VC-subgraph dimension: Assume that the VC-subgraph dimension of the function class F is bounded by D. In this case, the result in Problem 1 can be used to bound the covering number of F in terms of D. Use this bound on the covering number and compare Pollard's bound with the refined Dudley integral bound by writing down the bounds implied by each one.

b) Linear class with bounded norm: Linear classes in high-dimensional spaces are probably among the most important and most used function classes in machine learning. Consider the specific example where X = \{x : \|x\|_2 \le 1\} and

\[ F = \{ x \mapsto \langle w, x \rangle : \|w\|_2 \le 1 \} \]

In class we saw that for any \epsilon > 0, \mathrm{fat}_\epsilon(F) \le 4/\epsilon^2. Using this with the result in Problem 1, we have that:

\[ N_2(F, \epsilon, n) \le N_\infty(F, \epsilon, n) \le \left( \frac{2en}{\epsilon} \right)^{4/\epsilon^2} \]

Use the above bound on the covering number and write down the bound on the Rademacher complexity implied by Pollard's bound. Then write down the bound on the Rademacher complexity implied by the refined version of the Dudley integral bound.

3. Data-Dependent Bound: Recall the Rademacher complexity bound we proved in class for function classes F bounded by 1: for any \delta > 0, with probability at least 1 - \delta,

\[ \sup_{f \in F} \left( \mathbb{E}[f(x)] - \hat{\mathbb{E}}_S[f(x)] \right) \le 2\, \mathbb{E}_{S \sim D^n}\left[ \hat{R}_S(F) \right] + \sqrt{\frac{2 \log(1/\delta)}{n}} \]

Note that we do not know the distribution D. One way we used the above bound was by providing upper bounds on \hat{R}_S(F) valid for any sample of size n, and using these instead of \mathbb{E}_{S \sim D^n}[\hat{R}_S(F)]. But ideally we would like to get tighter bounds when the distribution we are faced with is nicer. The aim of this problem is to do exactly this. Prove that for any \delta > 0, with probability at least 1 - \delta over the draw of the sample S \sim D^n,

\[ \sup_{f \in F} \left( \mathbb{E}[f(x)] - \hat{\mathbb{E}}_S[f(x)] \right) \le 2 \hat{R}_S(F) + K \sqrt{\frac{\log(2/\delta)}{n}} \]
(Provide an explicit value for the constant K above.) Notice that in the above bound, the expected Rademacher complexity is replaced by the sample-based one, which can be calculated from the training sample. Hint: Use McDiarmid's inequality on the expected Rademacher complexity.

4. Learnability and Fat-Shattering Dimension: Recall the setting of the stochastic optimization problem, where the objective function is a mapping r : H \times Z \to \mathbb{R}. A sample S = \{z_1, \ldots, z_n\} drawn i.i.d. from an unknown distribution D is provided to the learner, and the aim of the learner is to output \hat{h} \in H, based on the sample, that has low expected objective \mathbb{E}_z[r(\hat{h}, z)].

a) Consider the stochastic optimization problem with r bounded by a, i.e. |r(h, z)| \le a < \infty for all h \in H and z \in Z. If the function class F := \{ z \mapsto r(h, z) : h \in H \} has finite \mathrm{fat}_\alpha for all \alpha > 0, then show that the problem is learnable.

b) Conclude that for a supervised learning problem with a bounded hypothesis class H (i.e. \forall x \in X, |h(x)| \le a) and a loss \phi : \hat{Y} \times Y \to \mathbb{R} that is L-Lipschitz (in its first argument), if H has finite \mathrm{fat}_\alpha for all \alpha > 0, then the problem is learnable.

c) Show a stochastic optimization problem that is learnable even though it has infinite \mathrm{fat}_\alpha for all \alpha \le 1 (or any other constant of your choice). Explicitly write down the hypothesis class and the learning rule which learns the class, argue that the problem is learnable, and explain why the fat-shattering dimension is infinite. Hint: You can make the successful learning rule even be ERM.

d) Prove that for a supervised learning problem with the absolute loss \phi(\hat{y}, y) = |\hat{y} - y|, if \mathrm{fat}_\alpha is infinite for some \alpha > 0, then the problem is not learnable. Hint: As with the binary case, for every n, construct a distribution which is concentrated on a set of points that can be fat-shattered.

Challenge Problems

1. We saw that for any distribution D, the expected Rademacher complexity provides an upper bound on the maximal deviation between mean and empirical average, uniformly over the function class; specifically, we saw that

\[ \mathbb{E}_{S \sim D^n}\left[ \sup_{f \in F} \left( \mathbb{E}[f(x)] - \hat{\mathbb{E}}_S[f(x)] \right) \right] \le 2\, \mathbb{E}_{S \sim D^n}\left[ \hat{R}_S(F) \right] \]

Prove the (almost) converse, that

\[ \frac{1}{2}\, \mathbb{E}_{S \sim D^n}\left[ \hat{R}_S(F) \right] \le \mathbb{E}_{S \sim D^n}\left[ \sup_{f \in F} \left( \mathbb{E}[f(x)] - \hat{\mathbb{E}}_S[f(x)] \right) \right] \]
This basically establishes that the Rademacher complexity tightly bounds the uniform maximal deviation for every distribution.

2. The worst-case Rademacher complexity is defined as

\[ R_n(F) = \sup_{S = \{x_1, \ldots, x_n\}} \hat{R}_S(F) \]

(i.e. the supremum over samples of size n).

a) Prove that for any function class F and any \tau > R_n(F), we have that

\[ \mathrm{fat}_\tau(F) \le \frac{4 n R_n(F)^2}{\tau^2} \]

Hint: First prove the statement for the larger sample of size n' = \mathrm{fat}_\tau \cdot \lfloor n / \mathrm{fat}_\tau \rfloor, obtained by taking the \mathrm{fat}_\tau shattered points and repeating each the appropriate number of times. You will need to start with a shattered set, and you will need to use Khintchine's inequality, which states that for any n,

\[ \mathbb{E}_{\epsilon \sim \mathrm{Unif}\{\pm1\}^n}\left[ \left| \sum_{i=1}^{n} \epsilon_i \right| \right] \ge \sqrt{\frac{n}{2}} \]

b) Combine the above with the refined version of the Dudley integral bound to prove that

\[ \inf_{\alpha \ge 0} \left\{ 4\alpha + \frac{12}{\sqrt{n}} \int_{\alpha}^{1} \sqrt{\log N_2(F, \tau, n)} \, d\tau \right\} \le R_n(F) \cdot O(\log^{3/2} n) \]

This shows that the refined Dudley integral bound is tight to within log factors of the Rademacher complexity. Thus we have established that, in the worst case, all the complexity measures for a function class (Rademacher complexity, covering numbers, and the fat-shattering dimension) tightly govern the rate of uniform maximal deviation for the function class, all to within log factors.

3. Bounded Difference Inequality, Stability and Generalization: Recall that a function G : X^n \to \mathbb{R} is said to satisfy the bounded difference property if for all i \in [n] and all x_1, \ldots, x_n, x'_i \in X,

\[ \left| G(x_1, \ldots, x_i, \ldots, x_n) - G(x_1, \ldots, x_{i-1}, x'_i, x_{i+1}, \ldots, x_n) \right| \le c_i \]

for some c_i \ge 0. In this case, McDiarmid's inequality gives us that for any \delta > 0, with probability at least 1 - \delta,

\[ G(x_1, \ldots, x_n) \le \mathbb{E}\left[ G(x_1, \ldots, x_n) \right] + \sqrt{\frac{\left( \sum_{i=1}^{n} c_i^2 \right) \log(1/\delta)}{2}} \]

The bounded difference property turns out to be quite useful for analyzing learning algorithms directly (instead of looking at the uniform deviation over a function class).
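As a quick numerical sanity check of McDiarmid's inequality as stated above (an illustration, not part of the assignment), the sketch below uses the empirical mean of n i.i.d. uniform [0, 1] variables, which satisfies the bounded difference property with c_i = 1/n, and checks that the one-sided deviation bound fails at most a \delta fraction of the time:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials, delta = 100, 20000, 0.05

# G(x_1, ..., x_n) = mean(x): changing one coordinate in [0, 1] moves G by
# at most c_i = 1/n, so the bounded difference property holds.
X = rng.uniform(0.0, 1.0, size=(trials, n))
G_vals = X.mean(axis=1)

EG = 0.5  # E[G] for uniform [0, 1] coordinates
slack = np.sqrt(n * (1.0 / n) ** 2 * np.log(1.0 / delta) / 2.0)

# McDiarmid: P( G > E[G] + sqrt( (sum_i c_i^2) log(1/delta) / 2 ) ) <= delta
violation_rate = float((G_vals > EG + slack).mean())
print(f"violation rate {violation_rate:.5f} <= delta {delta}")
```

The empirical violation rate comes out far below \delta here: McDiarmid is conservative for the mean, where the central limit theorem gives much sharper tails.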
A proper learning algorithm A : \bigcup_{n} X^n \to F is said to be uniformly \beta-stable if for all i \in [n] and any x_1, \ldots, x_n, x'_i \in X,

\[ \sup_{x} \left| A(x_1, \ldots, x_i, \ldots, x_n)(x) - A(x_1, \ldots, x_{i-1}, x'_i, x_{i+1}, \ldots, x_n)(x) \right| \le \beta \]

Assuming the functions in F are bounded by 1, we shall prove that such a learning algorithm generalizes well (its expected loss is close to its empirical loss). Specifically, we shall prove that for any \delta > 0, with probability at least 1 - \delta,

\[ R(A(S)) \le \hat{R}(A(S)) + \beta + (2 n \beta + 2) \sqrt{\frac{\log(1/\delta)}{2n}} \]

where R(A(S)) = \mathbb{E}_x[A(S)(x)] and \hat{R}(A(S)) = \frac{1}{n} \sum_{i=1}^{n} A(S)(x_i).

a) First show that \mathbb{E}_S\left[ R(A(S)) - \hat{R}(A(S)) \right] \le \beta. Hint: Use renaming of variables to first show that for any i \in [n],

\[ \mathbb{E}_S\left[ R(A(S)) \right] = \mathbb{E}_{S, x'_i}\left[ A(x_1, \ldots, x_{i-1}, x'_i, x_{i+1}, \ldots, x_n)(x_i) \right] \]

b) Show that the function G(S) = R(A(S)) - \hat{R}(A(S)) satisfies the bounded difference property with c_i \le 2\beta + 2/n. Conclude the required statement using McDiarmid's inequality.

c) Consider the stochastic convex optimization problem where the sample is z = (x, y), y is real-valued, the x's are from the unit ball of some Hilbert space, and the hypotheses are weight vectors w from the same Hilbert space, with objective

\[ r(w, (x, y)) = \left| \langle w, x \rangle - y \right| + \lambda \|w\|^2 \]

Show that the ERM algorithm is stable for this problem, and thus provide a bound for this algorithm.

4. \ell_1 Neural Network: A k-layer \ell_1-norm neural network is given by the function class F_k, which is defined recursively as follows:

\[ F_1 = \left\{ x \mapsto \sum_{j=1}^{d} w_j x_j \;:\; \|w\|_1 \le B \right\} \]

and further, for each 2 \le i \le k,

\[ F_i = \left\{ x \mapsto \sum_{j=1}^{d_{i-1}} w_j\, \sigma(f_j(x)) \;:\; \forall j \in [d_{i-1}],\ f_j \in F_{i-1},\ \|w\|_1 \le B \right\} \]
where d_i is the number of nodes in the i-th layer of the network. The function \sigma : \mathbb{R} \to [-1, 1] is called the squash function and is generally a smooth, monotonic, non-decreasing function (a typical example is the tanh function). Assume that the input space is X = [0, 1]^d and that \sigma is L-Lipschitz. Prove that

\[ \hat{R}_S(F_k) \le (2B)^k L^{k-1} \sqrt{\frac{2 \log d}{n}} \]

Notice that the d_i's do not appear in the above bound, indicating that the number of nodes in the intermediate layers does not affect the upper bound on the Rademacher complexity. Hint: Prove a bound on the Rademacher complexity of F_i recursively, in terms of the Rademacher complexity of F_{i-1}.
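For the base case k = 1, the supremum of \frac{1}{n} \sum_i \epsilon_i \langle w, x_i \rangle over the \ell_1 ball \|w\|_1 \le B is attained at a signed coordinate vertex w = \pm B e_j, which makes a Monte Carlo estimate easy. The sketch below is an illustration, not part of the assignment (B, n, d, the sample, and the number of sign draws are arbitrary choices); it compares such an estimate against a Massart-style base bound B \sqrt{2 \log(2d) / n}:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, B = 100, 50, 2.0
X = rng.uniform(0.0, 1.0, size=(n, d))  # sample from the input space [0, 1]^d

# sup over {w : ||w||_1 <= B} of (1/n) sum_i eps_i <w, x_i>
# equals B * max_j | (1/n) sum_i eps_i x_{ij} | (the sup sits at +/- B e_j).
draws = 5000
eps = rng.choice([-1.0, 1.0], size=(draws, n))
coord_corr = eps @ X / n                    # (draws, d) per-coordinate correlations
rad_est = B * np.abs(coord_corr).max(axis=1).mean()

# Massart-style bound for the single-layer class F_1 (the k = 1 base case,
# with the 2d signed vertices inside the log)
bound = B * np.sqrt(2.0 * np.log(2 * d) / n)
print(f"estimate {rad_est:.3f} <= bound {bound:.3f}")
```

The recursion for deeper layers follows the same pattern as the hint: passing through the L-Lipschitz squash contracts by a factor L, and re-weighting through a fresh \ell_1 ball costs a factor of order B, which is why the intermediate widths d_i never enter.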
More informationSeries Expansion for L p Hardy Inequalities
Seres Expanson for L p Hardy Inequaltes G. BARBATIS, S.FILIPPAS, & A. TERTIKAS ABSTRACT. We consder a general class of sharp L p Hardy nequaltes n R N nvolvng dstance fro a surface of general codenson
More information1.3 Hence, calculate a formula for the force required to break the bond (i.e. the maximum value of F)
EN40: Dynacs and Vbratons Hoework 4: Work, Energy and Lnear Moentu Due Frday March 6 th School of Engneerng Brown Unversty 1. The Rydberg potental s a sple odel of atoc nteractons. It specfes the potental
More informationThe Second Anti-Mathima on Game Theory
The Second Ant-Mathma on Game Theory Ath. Kehagas December 1 2006 1 Introducton In ths note we wll examne the noton of game equlbrum for three types of games 1. 2-player 2-acton zero-sum games 2. 2-player
More informationAppendix B. Criterion of Riemann-Stieltjes Integrability
Appendx B. Crteron of Remann-Steltes Integrablty Ths note s complementary to [R, Ch. 6] and [T, Sec. 3.5]. The man result of ths note s Theorem B.3, whch provdes the necessary and suffcent condtons for
More informationCHAPTER 7 CONSTRAINED OPTIMIZATION 1: THE KARUSH-KUHN-TUCKER CONDITIONS
CHAPER 7 CONSRAINED OPIMIZAION : HE KARUSH-KUHN-UCKER CONDIIONS 7. Introducton We now begn our dscusson of gradent-based constraned optzaton. Recall that n Chapter 3 we looked at gradent-based unconstraned
More informationOn the number of regions in an m-dimensional space cut by n hyperplanes
6 On the nuber of regons n an -densonal space cut by n hyperplanes Chungwu Ho and Seth Zeran Abstract In ths note we provde a unfor approach for the nuber of bounded regons cut by n hyperplanes n general
More informationCentroid Uncertainty Bounds for Interval Type-2 Fuzzy Sets: Forward and Inverse Problems
Centrod Uncertanty Bounds for Interval Type-2 Fuzzy Sets: Forward and Inverse Probles Jerry M. Mendel and Hongwe Wu Sgnal and Iage Processng Insttute Departent of Electrcal Engneerng Unversty of Southern
More informationIntroduction to Random Variables
Introducton to Random Varables Defnton of random varable Defnton of random varable Dscrete and contnuous random varable Probablty functon Dstrbuton functon Densty functon Sometmes, t s not enough to descrbe
More informationGeneral Permanence Conditions for Nonlinear Difference Equations of Higher Order
JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS 213, 496 510 1997 ARTICLE NO. AY975553 General Peranence Condtons for Nonlnear Dfference Equatons of Hgher Order Hassan Sedaghat* Departent of Matheatcal
More informationLimit Cycle Bifurcations in a Class of Cubic System near a Nilpotent Center *
Appled Mateatcs 77-777 ttp://dxdoorg/6/a75 Publsed Onlne July (ttp://wwwscrporg/journal/a) Lt Cycle Bfurcatons n a Class of Cubc Syste near a Nlpotent Center * Jao Jang Departent of Mateatcs Sanga Marte
More informationCOS 511: Theoretical Machine Learning. Lecturer: Rob Schapire Lecture #16 Scribe: Yannan Wang April 3, 2014
COS 511: Theoretcal Machne Learnng Lecturer: Rob Schapre Lecture #16 Scrbe: Yannan Wang Aprl 3, 014 1 Introducton The goal of our onlne learnng scenaro from last class s C comparng wth best expert and
More information