Lecture: Linear Models
Last lecture you learned about:
What is machine learning: supervised learning, unsupervised learning, reinforcement learning.
You have seen an example learning problem and the general process that one goes through to design a learning system, which involves determining:
Types of training experience
Target function
Representation of the learned function
Learning algorithm
Supervised learning
Let's look at the problem of spam filtering. Some example spam emails:
"Now anyone can learn how to earn $100 - $943 per day or More! If you can type (hunt and peck is ok to start) and fill in forms, you can score big! So don't delay waiting around for the next opportunity... it's knocking now! Start here: http://redbluecruse.com//c/38/polohoo/z37957.html"
"Do you Have Poetry that you think should be worth $10,000.00 USD, we do!.. Enter our International Open contest and see if you have what it takes. To see details or to enter your own poem, Click link below. http://e-suscrber.com/ms?e=0saoo4q9s4zyyuoyq&m=79534&l=0"
"View my photos! I invite you to view the following photo album(s): zak-month7"
"Hey have you seen my new pics yet???? Me and my girlfriend would love it if you would come chat with us for a bit.. Well join us if you interested. Join live web cam chat here: http://e-commcenral.com/ms?e=0saoo4q9s4zyyuoyq&m=8534&l=0"
Let's look at the design choices
Learning experience? Past emails and whether they are considered spam or not (you can also choose to use non-spam or spam emails only, but that will require different choices later on)
Target function? Email -> spam or not
Representation of the function??
Learning algorithm?
We will focus mostly on these last two aspects in this class. In some cases, you will also need to pay attention to the first two questions.
Continue with the design choices
Representation of the function (email -> spam or not)?
First of all, how to represent an email? Use bag-of-words to represent an email. This will turn an email into a collection of features, e.g., where each feature describes whether a particular word is present in the email.
This gives us the standard supervised classification problem typically seen in textbooks and papers:
Training set: a set of examples (instances, objects) with class labels, e.g., positive (spam) and negative (non-spam)
Input representation: an example is described by a set of attributes (e.g., whether $ is present, etc.)
Given an unseen email, and its input representation, predict its label
Next question: what function forms to use?
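As a sketch of the idea (the vocabulary and email below are made up for illustration), bag-of-words featurization might look like:

```python
# Bag-of-words: map an email to a binary feature vector, where feature j
# indicates whether vocabulary word j appears in the email.

def bag_of_words(email, vocabulary):
    """Return a 0/1 feature vector over the given vocabulary."""
    words = set(email.lower().split())
    return [1 if w in words else 0 for w in vocabulary]

# Hypothetical vocabulary and email, chosen only for this example.
vocabulary = ["$", "free", "meeting", "click", "report"]
email = "Click here for free $ now"
features = bag_of_words(email, vocabulary)
print(features)  # [1, 1, 0, 1, 0]
```

A real system would build the vocabulary from the training emails; here it is fixed by hand to keep the sketch small.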
Linear Threshold Units (McCulloch & Pitts, 1943)

  o(x) = 1 if sum_{i=1}^{n} w_i x_i > 0, and 0 otherwise

Assume each feature x_j and weight w_j is a real number. The LTU computes the weighted sum and takes a threshold to produce the prediction.
Why a linear model? It is the simplest model, with fewer parameters to learn. It is also visually intuitive: drawing a straight line to separate positive from negative.
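A minimal sketch of an LTU, with hand-picked (not learned) weights chosen purely for illustration:

```python
# Linear threshold unit: output 1 if the weighted sum w.x exceeds 0,
# and 0 otherwise. The weights below are illustrative, not learned.

def ltu_predict(w, x):
    """McCulloch-Pitts style linear threshold unit."""
    s = sum(wj * xj for wj, xj in zip(w, x))
    return 1 if s > 0 else 0

w = [0.5, -1.0, 2.0]              # one weight per feature
print(ltu_predict(w, [1, 0, 1]))  # 0.5 + 2.0 = 2.5 > 0  -> 1
print(ltu_predict(w, [0, 1, 0]))  # -1.0 <= 0            -> 0
```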
Geometric view
[Figure: positive and negative examples in the plane, separated by a straight line. The weight vector w = (w_1, w_2) points to the positive side. The line w·x = 0 is referred to as the decision boundary.]
A Canonical Representation
Given a training example (<x_1, ..., x_m>, y), transform it to (<1, x_1, ..., x_m>, y). The parameter vector will then be w = <w_0, w_1, ..., w_m>.
Given a training set, we need to learn

  g(x, w) = w_0 + w_1 x_1 + ... + w_m x_m = w·x

or equivalently h(x, w) = sign(g(x, w)).
Dot (or inner) product: takes two equal-length vectors, and returns the sum of their component-wise products.
To differentiate the learned function from the true underlying function, it is common to refer to the learned function as a hypothesis (each unique set of parameter values is one hypothesis).
A prediction is correct if y·g(x, w) > 0 (or y·h(x, w) > 0).
Geometrically, using the canonical representation translates to two things: 1. it increases the input space dimension by 1, and 2. the decision boundary now always passes through the origin.
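A small sketch of the canonical representation: prepend a constant feature 1 so the bias w_0 becomes an ordinary weight (the values of w and x are made up for illustration):

```python
# Canonical representation: prepend a constant feature 1 to each example,
# so g(x, w) = w0 + w1*x1 + ... + wm*xm becomes a plain dot product w.x
# in the augmented space, and the boundary passes through the origin there.

def augment(x):
    return [1.0] + list(x)

def g(w, x):
    """g(x, w) = w.x in the augmented space."""
    return sum(wj * xj for wj, xj in zip(w, augment(x)))

w = [-1.0, 2.0, 0.5]   # <w0, w1, w2>
x = [1.0, 2.0]
print(g(w, x))         # -1.0 + 2.0*1.0 + 0.5*2.0 = 2.0
```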
[Figure: geometric view of the canonical representation; in the augmented space the decision boundary passes through the origin.]
How to learn: the perceptron algorithm
The equation w_0 + w_1 x_1 + ... + w_m x_m = 0 defines a linear decision boundary that separates the input space into different decision regions.
The goal of learning is to find a weight vector w such that its decision boundary correctly separates positive examples from negative examples. How can we achieve this?
The perceptron is one approach. It starts with some weight vector and incrementally updates it when it makes a mistake. Let w be the current weight vector, and suppose it makes a mistake on example <x, y>, that is to say, y·(w·x) < 0. The perceptron update rule is:

  w ← w + y·x
Perceptron Algorithm
Let w ← (0, 0, 0, ..., 0)
Repeat
  Accept training example i: (x_i, y_i)
  u ← w·x_i
  if y_i·u <= 0:
    w ← w + y_i·x_i
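A runnable sketch of the online perceptron; the toy dataset (AND-like, linearly separable, with a leading 1 as the bias feature) is made up for illustration:

```python
# Online perceptron: start from the zero vector and add y*x whenever the
# current weights misclassify an example (y * w.x <= 0).

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def perceptron_online(examples, epochs=10):
    w = [0.0] * len(examples[0][0])
    for _ in range(epochs):
        for x, y in examples:
            if y * dot(w, x) <= 0:   # mistake (or tie): update
                w = [wj + y * xj for wj, xj in zip(w, x)]
    return w

# Augmented examples (leading 1 is the bias feature), labels in {-1, +1}.
data = [([1, 0, 0], -1), ([1, 0, 1], -1), ([1, 1, 0], -1), ([1, 1, 1], +1)]
w = perceptron_online(data)
print([y * dot(w, x) > 0 for x, y in data])  # all True once converged
```

Since the data is linearly separable, a few epochs suffice for convergence here.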
Effect of the Perceptron Updating Rule
Mathematically speaking:

  y·(w_new·x) = y·((w + y·x)·x) = y·(w·x) + y^2·(x·x) > y·(w·x)

The updating rule makes y·(w·x) more positive, and thus can potentially correct the mistake.
[Figure: geometrically, each update step moves w toward the misclassified example.]
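A quick numeric check of that inequality (the vectors are toy numbers chosen for illustration): after the update, y·(w·x) grows by exactly ||x||^2.

```python
# After a mistake on (x, y), the update w' = w + y*x gives
# y*(w'.x) = y*(w.x) + ||x||^2, which is strictly larger.

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

w = [1.0, -2.0]
x = [2.0, 1.0]
y = 1                      # misclassified: y * w.x = 0 <= 0
w_new = [wj + y * xj for wj, xj in zip(w, x)]

before = y * dot(w, x)     # 0.0
after = y * dot(w_new, x)  # 0.0 + ||x||^2 = 5.0
print(before, after)
```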
Online vs. Batch
We call the above perceptron algorithm an online algorithm. Online algorithms perform learning each time they receive a training example. In contrast, batch learning algorithms collect a batch of training examples and learn from them all at once.
Batch Perceptron Algorithm
Given: training examples (x_i, y_i), i = 1, ..., N
Let w ← (0, 0, 0, ..., 0)
do
  delta ← (0, 0, 0, ..., 0)
  for i = 1 to N do
    u ← w·x_i
    if y_i·u <= 0 then delta ← delta + y_i·x_i
  delta ← delta / N
  w ← w + η·delta
until ||delta|| < ε
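A runnable sketch of the batch version: accumulate the updates from all misclassified examples, average them, and take one step per pass. The learning rate eta, tolerance eps, and the toy dataset are illustrative choices, not prescribed by the slides.

```python
# Batch perceptron: one averaged update per pass over the whole training set.

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def perceptron_batch(examples, eta=1.0, eps=1e-6, max_iters=1000):
    m = len(examples[0][0])
    w = [0.0] * m
    for _ in range(max_iters):
        delta = [0.0] * m
        for x, y in examples:
            if y * dot(w, x) <= 0:                  # misclassified
                delta = [dj + y * xj for dj, xj in zip(delta, x)]
        delta = [dj / len(examples) for dj in delta]
        w = [wj + eta * dj for wj, dj in zip(w, delta)]
        if sum(dj * dj for dj in delta) ** 0.5 < eps:
            break                                   # ||delta|| < eps: done
    return w

data = [([1, 0, 0], -1), ([1, 0, 1], -1), ([1, 1, 0], -1), ([1, 1, 1], +1)]
w = perceptron_batch(data)
print([y * dot(w, x) > 0 for x, y in data])
```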
Good news
If there is a linear decision boundary that correctly classifies all training examples, this algorithm will find it. Formally speaking, this is the convergence property: for linearly separable data (i.e., there exists a linear decision boundary that perfectly separates positive and negative training examples), the perceptron algorithm converges in a finite number of steps.
Why? If you are mathematically curious, read the following slide; you will find the answer. And how many steps? If you are practically curious, read the following slide; the answer is in there too. The further good news is that you are not required to master this material; it is just for the curious ones.
Proof (sketch)
Let w* be a solution vector and w_k be our weight vector at the k-th update. To show convergence, we just need to show that each update moves the weight vector closer to a solution vector by a lower-bounded amount.
Assume w* classifies all examples with a margin γ, i.e., y_i·(w*·x_i) > γ for all i. Assume also that the examples are bounded: ||x_i|| < D for all i.
Each update happens on a mistake, i.e., y·(w_k·x) <= 0, and sets w_{k+1} = w_k + y·x. Then:

  w_{k+1}·w* = w_k·w* + y·(x·w*) > w_k·w* + γ,  so after k updates  w_k·w* > k·γ

  ||w_{k+1}||^2 = ||w_k||^2 + 2·y·(w_k·x) + ||x||^2 < ||w_k||^2 + D^2,  so  ||w_k||^2 < k·D^2

Therefore:

  cos(w*, w_k) = (w*·w_k) / (||w*||·||w_k||) > (k·γ) / (||w*||·sqrt(k)·D) = sqrt(k)·γ / (||w*||·D)

Since the cosine is at most 1, we must have sqrt(k)·γ / (||w*||·D) <= 1, i.e.,

  k <= D^2·||w*||^2 / γ^2

so the number of updates is finite, bounded by D^2·||w*||^2 / γ^2.
Margin
γ is referred to as the margin. The bigger the margin, the easier the classification problem is, and the perceptron algorithm will likely find the solution faster!
Side story: the bigger the margin, the more confident we are about our prediction, which makes it desirable to find the boundary that gives the maximum margin. Later in the course this concept will be core to one of the most exciting recent developments in the ML field: support vector machines.
Bad news
What about non-linearly separable cases? In such cases the algorithm will never stop! How to fix it? One possible solution: look for the decision boundary that makes as few mistakes as possible. Unfortunately, that problem is NP-hard (refresh your 35 memory!).
Let w ← (0, 0, 0, ..., 0)
for i = 1, ..., N
  Take training example i: (x_i, y_i)
  u ← w·x_i
  if y_i·u < 0:
    w ← w + y_i·x_i
Voted Perceptron
Let c_0 ← 0, n ← 0, w_0 ← (0, 0, 0, ..., 0)
repeat
  Take example (x, y)
  u ← w_n·x
  if y·u <= 0:
    w_{n+1} ← w_n + y·x
    n ← n + 1
    c_n ← 0
  else:
    c_n ← c_n + 1
Store a collection of linear separators w_0, w_1, ..., along with their survival times c_0, c_1, ...
The c's can be good measures of the reliability of the w's.
For classification, take a weighted vote among all separators:

  sign( sum_n c_n · sign(w_n·x) )
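A runnable sketch of the voted perceptron: keep every intermediate weight vector together with its survival count, then predict by a weighted vote. The epoch count and toy dataset are illustrative choices.

```python
# Voted perceptron: retire the current separator w (with its survival count c)
# on every mistake, and classify new points by a c-weighted vote of signs.

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def voted_perceptron_train(examples, epochs=10):
    m = len(examples[0][0])
    w, c = [0.0] * m, 0
    separators = []                     # list of (w_n, c_n)
    for _ in range(epochs):
        for x, y in examples:
            if y * dot(w, x) <= 0:      # mistake: retire w, start a new one
                separators.append((w, c))
                w = [wj + y * xj for wj, xj in zip(w, x)]
                c = 0                   # new separator, no survivals yet
            else:
                c += 1                  # w survives one more example
    separators.append((w, c))
    return separators

def voted_predict(separators, x):
    vote = sum(c * (1 if dot(w, x) > 0 else -1) for w, c in separators)
    return 1 if vote > 0 else -1

data = [([1, 0, 0], -1), ([1, 0, 1], -1), ([1, 1, 0], -1), ([1, 1, 1], +1)]
seps = voted_perceptron_train(data)
print([voted_predict(seps, x) == y for x, y in data])
```

Separators that survived many examples get large counts and dominate the vote, which is what makes the counts act as reliability measures.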