CS 188: Artificial Intelligence
Particle Filters and Applications of HMMs

Dwinelle. Instructor: Chelsea Finn, University of California, Berkeley
[These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.]

Announcements
- Surveys reminder: mid-semester survey
- Pacman in the News
- HW 8: edX / Gradescope consolidation
- P4 due Monday 4/6 at 11:59pm
- P5 (Ghostbusters) due Friday 4/10 at 5pm

Today
- HMMs: particle filters, demos!
- Most-likely-explanation queries
- Applications of HMMs: speech recognition; "I Know Why You Went to the Clinic: Risks and Realization of HTTPS Traffic Analysis"

Recap: Reasoning Over Time
- Markov models: a chain X1 -> X2 -> X3 -> X4 with transition probabilities (e.g. 0.7 of staying in "rain", 0.3 of switching)
- Hidden Markov models: hidden states X1..X5, each emitting evidence E1..E5
- Emission model P(E | X): P(umbrella | rain) = 0.9, P(no umbrella | rain) = 0.1, P(umbrella | sun) = 0.2, P(no umbrella | sun) = 0.8
[Demo: Ghostbusters Markov Model (L15D1)]
Video of Demo Ghostbusters Markov Model (Reminder)

Recap: Filtering
- Elapse time: compute P(X_t | e_{1:t-1})
- Observe: compute P(X_t | e_{1:t})
- Belief = <P(rain), P(sun)>:
  <0.5, 0.5>   prior on X1
  <0.82, 0.18> observe E1
  <0.63, 0.37> elapse time
  <0.88, 0.12> observe E2
[Demo: Ghostbusters Exact Filtering (L15D2)]

Video of Ghostbusters Exact Filtering (Reminder)

Particle Filtering
- Filtering: an approximate solution
- Sometimes |X| is too big to use exact inference; |X| may be too big to even store B(X), e.g. when X is continuous
- Solution: approximate inference
- Track samples of X, not all values; samples are called particles
- Time per step is linear in the number of particles, but the number needed may be large
- In memory: a list of particles, not states
- This is how robot localization works in practice
- "Particle" is just a new name for "sample"

Particle Filtering Representation: Particles
- Our representation of P(X) is now a list of N particles (samples); generally N << |X|
- Storing a map from X to counts would defeat the point
- P(x) is approximated by the number of particles with value x, so many x may have P(x) = 0!
- More samples, more accuracy
- For now, all particles have weight 1
[Figure: 3x3 grid of particles with approximate cell probabilities 0.0, 0.1, 0.0, 0.0, 0.0, 0.2, 0.0, 0.2, 0.5; example particle (1,2)]
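The filtering recap above (prior <0.5, 0.5>, then observe, elapse, observe on the umbrella model) can be reproduced exactly with a few lines of Python. This is a minimal sketch using the lecture's transition and emission numbers; the dictionary-based representation is an implementation choice, not from the slides:

```python
# Umbrella HMM from the lecture: states rain/sun, evidence umbrella/no-umbrella.
TRANSITION = {"rain": {"rain": 0.7, "sun": 0.3},   # P(X_t | X_{t-1})
              "sun":  {"rain": 0.3, "sun": 0.7}}
EMISSION = {"rain": {"umbrella": 0.9, "no-umbrella": 0.1},  # P(E_t | X_t)
            "sun":  {"umbrella": 0.2, "no-umbrella": 0.8}}

def elapse(belief):
    """Elapse time: P(X_t | e_{1:t-1}) = sum_x P(X_t | x) P(x | e_{1:t-1})."""
    return {x2: sum(TRANSITION[x1][x2] * p for x1, p in belief.items())
            for x2 in belief}

def observe(belief, evidence):
    """Observe: P(X_t | e_{1:t}) is proportional to P(e_t | X_t) P(X_t | e_{1:t-1})."""
    unnorm = {x: EMISSION[x][evidence] * p for x, p in belief.items()}
    z = sum(unnorm.values())
    return {x: p / z for x, p in unnorm.items()}

belief = {"rain": 0.5, "sun": 0.5}        # prior on X1
belief = observe(belief, "umbrella")      # -> about <0.82, 0.18>
belief = elapse(belief)                   # -> about <0.63, 0.37>
belief = observe(belief, "umbrella")      # -> about <0.88, 0.12>
```

Running the three updates reproduces the belief sequence shown on the slide.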
Particle Filtering Algorithm: Elapse Time
- Each particle is moved by sampling its next position from the transition model
- This is like prior sampling: sample frequencies reflect the transition probabilities
- Here, most particles move clockwise, but some move in another direction or stay in place
- This captures the passage of time
- With enough samples, close to the exact values before and after (consistent)
[Figure: particles such as (3,1), (1,3), (2,2), (1,2) before and after the elapse step]

Particle Filtering Algorithm: Observe
- Slightly trickier: don't sample the observation, fix it
- Similar to likelihood weighting: weight samples based on the evidence
- As before, the probabilities don't sum to one, since all have been downweighted (in fact they now sum to N times an approximation of P(e))
[Figure: particles (3,1), (1,3), (2,2), ... with weights w = .9, .2, .9, .4, .4, .9, .1, .2, .9, .4]

Particle Filtering Algorithm: Resample
- Rather than tracking weighted samples, we resample
- N times, we choose from our weighted sample distribution (i.e. draw with replacement)
- This is equivalent to renormalizing the distribution
- Now the update is complete for this time step; continue with the next one
[Figure: weighted particles resampled into new unweighted particles, e.g. (2,2), (1,3)]

Video of Demo Moderate Number of Particles

Recap: Particle Filtering
- Particles: track samples of states rather than an explicit distribution
- Steps: Elapse -> Weight -> Resample
[Demos: Ghostbusters particle filtering (L15D3,4,5)]

Video of Demo One Particle
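The three steps above (elapse, weight by evidence, resample with replacement) can be sketched on the same umbrella model. This is an illustrative implementation, not the course's code; the particle count and seed are arbitrary:

```python
import random

random.seed(1)

TRANSITION = {"rain": {"rain": 0.7, "sun": 0.3},   # P(X_t | X_{t-1})
              "sun":  {"rain": 0.3, "sun": 0.7}}
EMISSION = {"rain": {"umbrella": 0.9, "no-umbrella": 0.1},  # P(E_t | X_t)
            "sun":  {"umbrella": 0.2, "no-umbrella": 0.8}}

def sample(dist):
    """Draw one value from a {value: probability} distribution."""
    return random.choices(list(dist), weights=list(dist.values()))[0]

def elapse_time(particles):
    """Move each particle by sampling its successor from the transition model."""
    return [sample(TRANSITION[x]) for x in particles]

def weight(particles, evidence):
    """Weight each particle by the likelihood of the fixed evidence."""
    return [EMISSION[x][evidence] for x in particles]

def resample(particles, weights):
    """Draw N new particles with replacement, proportional to weight."""
    return random.choices(particles, weights=weights, k=len(particles))

# One full update for one time step:
particles = ["rain"] * 50 + ["sun"] * 50
particles = elapse_time(particles)
weights = weight(particles, "umbrella")
particles = resample(particles, weights)
belief_rain = particles.count("rain") / len(particles)
```

The final belief is just the fraction of particles in each state, which is how the approximate B(X) is read off.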
Video of Demo Huge Number of Particles

Demo Bonanza!

Robot Localization
- In robot localization: we know the map, but not the robot's position
- Observations may be vectors of range finder readings
- State space and readings are typically continuous (works basically like a very fine grid), so we cannot store B(X)
- Particle filtering is a main technique

Particle Filter Localization (Sonar)
[Video: global-sonar-uw-annotated.avi]

Particle Filter Localization (Laser)
[Video: global-floor.gif]

Robot Mapping
- SLAM: Simultaneous Localization And Mapping
- We do not know the map or our location
- State consists of position AND map!
- Main techniques: Kalman filtering (Gaussian HMMs) and particle methods
- DP-SLAM, Ron Parr
[Demo: PARTICLES-SLAM-mapping1-new.avi]
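The localization idea (known map, unknown position, particles weighted by range readings) can be sketched in one dimension. Everything here is an illustrative assumption, not from the slides: a corridor with a wall at position 10.0, a single noisy range reading, and a Gaussian sensor model:

```python
import math
import random

random.seed(0)

WALL = 10.0    # known map: a wall at x = 10 (hypothetical)
SIGMA = 0.5    # assumed range-sensor noise

def range_likelihood(particle_pos, reading):
    """Gaussian likelihood of a range reading given a hypothesized position."""
    expected = WALL - particle_pos
    return math.exp(-(reading - expected) ** 2 / (2 * SIGMA ** 2))

# Particles: guesses about the robot's position, initially uniform over the map.
particles = [random.uniform(0, 10) for _ in range(500)]

reading = 7.0  # sensor says the wall is 7 m away, so the robot is near x = 3
weights = [range_likelihood(p, reading) for p in particles]
particles = random.choices(particles, weights=weights, k=len(particles))

estimate = sum(particles) / len(particles)   # near 3.0
```

After one weight-and-resample step the particle cloud collapses around the position consistent with the reading, which is the essence of the sonar/laser demos.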
Particle Filter SLAM Video 1
[Demo: PARTICLES-SLAM-mapping1-new.avi]

Particle Filter SLAM Video 2
[Demo: PARTICLES-SLAM-fastslam.avi]

Dynamic Bayes Nets (DBNs)
- We want to track multiple variables over time, using multiple sources of evidence
- Idea: repeat a fixed Bayes net structure at each time step
- Variables from time t can condition on those from t-1
- Dynamic Bayes nets are a generalization of HMMs
[Figure: DBN unrolled for t = 1, 2, 3, with ghost variables G1..G3 and evidence E1..E3 at each step]

Video of Demo Pacman Sonar Ghost DBN Model
[Demo: pacman sonar ghost DBN model (L15D6)]

Exact Inference in DBNs
- Variable elimination applies to dynamic Bayes nets
- Procedure: unroll the network for T time steps, then eliminate variables until P(X_T | e_{1:T}) is computed
- Online belief updates: eliminate all variables from the previous time step; store factors for the current time only
[Figure: the same unrolled DBN for t = 1, 2, 3]
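The online belief update described above (eliminate the previous step's variables, keep only current factors) can be sketched for a tiny two-ghost DBN. The 3-position world and the 0.7/0.15 transition and 0.8/0.1 sensor numbers are made up for illustration:

```python
from itertools import product

POS = [0, 1, 2]                                 # toy 1-D world

def T(g_next, g):      # each ghost stays put with prob 0.7, else jumps
    return 0.7 if g_next == g else 0.15

def E(e, g):           # each sensor reports the true position with prob 0.8
    return 0.8 if e == g else 0.1

# Only the current time step's joint factor P(G1_t, G2_t | e_{1:t}) is stored.
belief = {gs: 1 / 9 for gs in product(POS, POS)}

def update(belief, e1, e2):
    """One online step: sum out the previous variables, weight by evidence."""
    new = {}
    for g1n, g2n in product(POS, POS):
        # Eliminate G1_{t-1} and G2_{t-1} (ghosts move independently here).
        p = sum(T(g1n, g1) * T(g2n, g2) * belief[(g1, g2)]
                for g1, g2 in product(POS, POS))
        new[(g1n, g2n)] = E(e1, g1n) * E(e2, g2n) * p
    z = sum(new.values())
    return {k: v / z for k, v in new.items()}

belief = update(belief, e1=2, e2=0)
```

Note that the stored factor grows with the joint state space of the tracked variables, which is exactly why DBN particle filters (next slide) are attractive.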
DBN Particle Filters
- A particle is a complete sample for a time step
- Initialize: generate prior samples for the t=1 Bayes net
  Example particle: G1^a = (3,3), G1^b = (5,3)
- Elapse time: sample a successor for each particle
  Example successor: G2^a = (2,3), G2^b = (6,3)
- Observe: weight each entire sample by the likelihood of the evidence conditioned on the sample
  Likelihood: P(E1^a | G1^a) * P(E1^b | G1^b)
- Resample: select prior samples (tuples of values) in proportion to their likelihood

HMMs: MLE Queries
HMMs are defined by:
- States X
- Observations E
- Initial distribution: P(X_1)
- Transitions: P(X_t | X_{t-1})
- Emissions: P(E_t | X_t)
New query: the most likely explanation, argmax over x_{1:t} of P(x_{1:t} | e_{1:t})
New method: the Viterbi algorithm

State Trellis
- State trellis: graph of states and transitions over time
- Each arc represents some transition; each arc has a weight (transition probability times emission probability)
- Each path is a sequence of states
- The product of weights on a path is that sequence's probability along with the evidence
- The forward algorithm computes sums over paths; Viterbi computes best paths
[Figure: sun/rain state trellis over time]

Forward / Viterbi Algorithms
- Forward Algorithm (Sum)
- Viterbi Algorithm (Max)
[Figure: sun/rain trellis annotated with sum vs. max recursions]
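The Viterbi algorithm described above is the forward algorithm with the sum over predecessor states replaced by a max, plus backpointers to recover the best path. A minimal sketch on the lecture's umbrella model (the evidence sequence below is an illustrative choice):

```python
# Rain/umbrella HMM from the lecture.
PRIOR = {"rain": 0.5, "sun": 0.5}
TRANSITION = {"rain": {"rain": 0.7, "sun": 0.3},
              "sun":  {"rain": 0.3, "sun": 0.7}}
EMISSION = {"rain": {"umbrella": 0.9, "no-umbrella": 0.1},
            "sun":  {"umbrella": 0.2, "no-umbrella": 0.8}}

def viterbi(observations, prior, transition, emission):
    """Most likely state sequence x_{1:T} given evidence e_{1:T}."""
    states = list(prior)
    # best[x]: probability of the best trellis path ending in state x
    best = {x: prior[x] * emission[x][observations[0]] for x in states}
    backpointers = []
    for e in observations[1:]:
        back, nxt = {}, {}
        for x2 in states:
            # Max over predecessors instead of the forward algorithm's sum.
            x1 = max(states, key=lambda x: best[x] * transition[x][x2])
            back[x2] = x1
            nxt[x2] = best[x1] * transition[x1][x2] * emission[x2][e]
        best = nxt
        backpointers.append(back)
    # Recover the path by following backpointers from the best final state.
    path = [max(states, key=lambda x: best[x])]
    for back in reversed(backpointers):
        path.append(back[path[-1]])
    return path[::-1]

mle_path = viterbi(["umbrella", "umbrella", "no-umbrella"],
                   PRIOR, TRANSITION, EMISSION)   # -> ["rain", "rain", "sun"]
```

Changing `max` back to a sum (and dropping the backpointers) turns this into the forward algorithm, mirroring the Sum vs. Max contrast on the slide.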
Speech Recognition in Action
[Video: NLP ASR tvsample.avi (from Lecture 1)]

Digitizing Speech

Speech in an Hour
- Speech input is an acoustic waveform
[Figure: waveform of "s p ee ch l a b", with the "l"-to-"a" transition; Simon Arnfield, http://www.psyc.leeds.ac.uk/research/cogn/speech/tutorial/]

Spectral Analysis
- Frequency gives pitch; amplitude gives volume
- Sampling at ~8 kHz (phone), ~16 kHz (mic) (kHz = 1000 cycles/sec)
- Fourier transform of the wave displayed as a spectrogram: darkness indicates energy at each frequency
- Part of [ae] from "lab": a complex wave repeating nine times, plus a smaller wave that repeats 4x for every large cycle
- Large wave: frequency of 250 Hz (9 times in .036 seconds)
- Small wave: roughly 4 times this, or roughly 1000 Hz
[Human ear figure: depion.blogspot.com]
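The slide's worked example (a wave repeating 9 times in .036 s is about 250 Hz, with a weaker ~1000 Hz component) can be checked with a plain discrete Fourier transform. The synthetic two-tone signal and the 8 kHz telephone sampling rate follow the slide; the rest is an illustrative sketch, not production DSP code:

```python
import math

RATE = 8000          # telephone-quality sampling rate from the slide
N = 800              # 0.1 s of signal; bin resolution = RATE / N = 10 Hz

# Synthetic waveform: a 250 Hz fundamental plus a weaker 1000 Hz component.
signal = [math.sin(2 * math.pi * 250 * t / RATE)
          + 0.5 * math.sin(2 * math.pi * 1000 * t / RATE)
          for t in range(N)]

def dft_magnitude(signal, k):
    """Magnitude of DFT bin k, i.e. energy at frequency k * RATE / N."""
    n_samples = len(signal)
    re = sum(s * math.cos(2 * math.pi * k * n / n_samples)
             for n, s in enumerate(signal))
    im = sum(-s * math.sin(2 * math.pi * k * n / n_samples)
             for n, s in enumerate(signal))
    return math.hypot(re, im)

# Pick the strongest frequency below the Nyquist limit (RATE / 2).
peak_bin = max(range(1, N // 2), key=lambda k: dft_magnitude(signal, k))
peak_hz = peak_bin * RATE / N   # the 250 Hz fundamental dominates
```

A spectrogram is just this magnitude computation repeated over short, overlapping time slices, with darkness encoding the magnitude at each frequency.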
Why These Peaks?
- Articulator process: vocal cord vibrations create harmonics
- The mouth is an amplifier
- Depending on the shape of the mouth, some harmonics are amplified more than others

Resonances of the Vocal Tract
- The human vocal tract as an open tube: closed at the glottal end, open at the lip end; length ~17.5 cm
- Air in a tube of a given length will tend to vibrate at the resonance frequency of the tube
- Constraint: the pressure differential should be maximal at the (closed) glottal end and minimal at the (open) lip end
[Figure: W. Barry Speech Science slides]

Spectrum Shapes
- Vowel [i] sung at successively higher pitches: F#2, A2, C3, F#3, A3, C4 (middle C), A4
[Figure: Mark Liberman; spectra with frequency on the horizontal axis]

Video of Demo Speech Synthesis
[Demo: speech synthesis]

Acoustic Feature Sequence
- Time slices are translated into acoustic feature vectors (~39 real numbers per slice): ..e12 e13 e14 e15 e16..
- These are the observations E; now we need the hidden states X
[Graphs: Ratree Wayland]
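The closed-open tube model above predicts resonances at odd quarter-wavelength multiples, f_n = (2n-1)·c / (4L). Taking the slide's 17.5 cm tract length and a standard 343 m/s speed of sound (an assumption; the slide does not state c), the first resonance lands near 500 Hz, which matches typical first-formant values:

```python
C = 343.0    # speed of sound in air, m/s (assumed, not from the slide)
L = 0.175    # vocal tract length from the slide: 17.5 cm

def resonances(n_harmonics=3):
    """Resonant frequencies of a tube closed at one end and open at the other.

    Pressure must be maximal at the closed (glottal) end and minimal at the
    open (lip) end, so only odd quarter-wavelength modes fit:
        f_n = (2n - 1) * c / (4 * L)
    """
    return [(2 * n - 1) * C / (4 * L) for n in range(1, n_harmonics + 1)]

freqs = resonances()   # -> [490.0, 1470.0, 2450.0] Hz
```

These predicted peaks are the formants: the harmonics from the vocal cords that the tract amplifies most, which is what shapes the spectra in the next slide.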
Speech State Space
- HMM specification:
  - P(E | X) encodes which acoustic vectors are appropriate for each phoneme (each kind of sound)
  - P(X | X') encodes how sounds can be strung together
- State space:
  - We will have one state for each sound in each word
  - Mostly, states advance sound by sound
  - Build a little state graph for each word and chain them together to form the state space X

States in a Word

Transitions with a Bigram Model
Training counts (Figure: Huang et al., p. 618):
  198015222    the first
  194623024    the same
  168504105    the following
  158562063    the world
   14112454    the door
  -----------------------
  23135851162  the *

Decoding
- Finding the words given the acoustics is an HMM inference problem
- Which state sequence x_{1:T} is most likely given the evidence e_{1:T}?
- From the sequence x, we can simply read off the words

AI in the News*
I Know Why You Went to the Clinic: Risks and Realization of HTTPS Traffic Analysis
Brad Miller, Ling Huang, A. D. Joseph, J. D. Tygar (UC Berkeley)

Challenge Setting
- The user we want to spy on uses HTTPS to browse the internet
- Measurements: IP address; sizes of packets coming in
- Goal: infer the browsing sequence of that user, e.g. medical, financial, legal, ...
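The bigram counts above turn into transition probabilities by simple maximum-likelihood division: P(w2 | w1) = count(w1 w2) / count(w1 *). A sketch using the slide's numbers (`bigram_prob` is an illustrative helper name, not from the slides):

```python
# Counts copied from the slide's bigram table (Huang et al., p. 618).
bigram_counts = {
    ("the", "first"): 198015222,
    ("the", "same"): 194623024,
    ("the", "following"): 168504105,
    ("the", "world"): 158562063,
    ("the", "door"): 14112454,
}
TOTAL_THE = 23135851162   # count of "the *" from the same table

def bigram_prob(w1, w2):
    """Maximum-likelihood estimate P(w2 | w1) = count(w1 w2) / count(w1 *)."""
    return bigram_counts[(w1, w2)] / TOTAL_THE

p_first = bigram_prob("the", "first")   # about 0.0086
```

These probabilities become the word-to-word transition weights when the per-word state graphs are chained together into the full decoding state space.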
HMM
- Transition model: a probability distribution over the links on the current page, plus some probability of navigating to any other page on the site
- Observation model: a probability distribution P(packet size | page); noisy due to traffic variations from caching, dynamically generated content, and user-specific content (including cookies)

Results
- BoG = the described approach; the others are prior work
- Session length effect: accuracy improves with the length of the browsing session
[Figure: accuracy (0-100%) vs. length of browsing session (0-70)]

Next Time: Machine Learning!