CS 188: Artificial Intelligence, Spring 2011 Final Review


CS 188: Artificial Intelligence, Spring 2011. Final Review, 5/2/2011. Pieter Abbeel, UC Berkeley.

Probabilistic Reasoning
Probability: random variables; joint and marginal distributions; conditional distributions; inference by enumeration; product rule, chain rule, Bayes rule; independence. Distributions over LARGE numbers of random variables → Bayesian networks: representation; inference (exact: enumeration, variable elimination; approximate: sampling); learning (maximum likelihood parameter estimation, Laplace smoothing, linear interpolation).

Probability recap
Conditional probability: P(x | y) = P(x, y) / P(y). Product rule: P(x, y) = P(x | y) P(y). Chain rule: P(x_1, ..., x_n) = ∏_i P(x_i | x_1, ..., x_{i-1}).
X and Y are independent iff ∀x, y: P(x, y) = P(x) P(y); equivalently, iff ∀x, y: P(x | y) = P(x); equivalently, iff ∀x, y: P(y | x) = P(y).
X and Y are conditionally independent given Z iff ∀x, y, z: P(x, y | z) = P(x | z) P(y | z); equivalently, iff ∀x, y, z: P(x | y, z) = P(x | z); equivalently, iff ∀x, y, z: P(y | x, z) = P(y | z).

Inference by Enumeration
P(sun)? P(sun | winter)? P(sun | winter, hot)?

S       T     W     P
summer  hot   sun   0.30
summer  hot   rain  0.05
summer  cold  sun   0.10
summer  cold  rain  0.05
winter  hot   sun   0.10
winter  hot   rain  0.05
winter  cold  sun   0.15
winter  cold  rain  0.20

Chain Rule → Bayes net
Chain rule: we can always write any joint distribution as an incremental product of conditional distributions: P(x_1, ..., x_n) = ∏_i P(x_i | x_1, ..., x_{i-1}). Bayes nets make conditional independence assumptions of the form P(x_i | x_1, ..., x_{i-1}) = P(x_i | parents(X_i)), giving us P(x_1, ..., x_n) = ∏_i P(x_i | parents(X_i)).

Example: Alarm Network
Burglary (B) and Earthquake (E) are the parents of Alarm (A); John calls (J) and Mary calls (M) each depend only on A. CPTs (values partly reconstructed from the standard alarm-network example): P(+b) = 0.001; P(+e) = 0.002; P(+a | +b, +e) = 0.95, P(+a | +b, -e) = 0.94, P(+a | -b, +e) = 0.29, P(+a | -b, -e) = 0.001; P(+j | +a) = 0.9, P(+j | -a) = 0.05; P(+m | +a) = 0.7, P(+m | -a) = 0.01.
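The three queries above can be answered mechanically from the joint table: sum the entries consistent with the query and evidence, then divide by the total mass of the evidence. A minimal sketch in Python (the helper names are ours, not the course's):

```python
# Inference by enumeration over the joint P(S, T, W) from the table above.
joint = {
    ("summer", "hot",  "sun"):  0.30, ("summer", "hot",  "rain"): 0.05,
    ("summer", "cold", "sun"):  0.10, ("summer", "cold", "rain"): 0.05,
    ("winter", "hot",  "sun"):  0.10, ("winter", "hot",  "rain"): 0.05,
    ("winter", "cold", "sun"):  0.15, ("winter", "cold", "rain"): 0.20,
}
VARS = ("S", "T", "W")  # variable order used by the tuples above

def prob(query, evidence=None):
    """P(query | evidence), each given as a dict like {"W": "sun"}."""
    evidence = evidence or {}
    def total(assignment):
        # Sum all joint entries consistent with the partial assignment.
        return sum(p for vals, p in joint.items()
                   if all(vals[VARS.index(v)] == x for v, x in assignment.items()))
    return total({**query, **evidence}) / total(evidence)

print(prob({"W": "sun"}))                               # P(sun) = 0.65
print(prob({"W": "sun"}, {"S": "winter"}))              # P(sun | winter) ≈ 0.556
print(prob({"W": "sun"}, {"S": "winter", "T": "hot"}))  # P(sun | winter, hot) ≈ 0.667
```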

Size of a Bayes Net
How big is a joint distribution over N Boolean variables? 2^N. Size of the representation if we use the chain rule? Still 2^N. How big is an N-node net if nodes have up to k parents? O(N * 2^(k+1)). Both give you the power to calculate any joint entry, but BNs give huge space savings, make it easier to elicit local CPTs, and are faster to answer queries.

Bayes Nets: Assumptions
Assumptions we are required to make to define the Bayes net when given the graph: P(x_i | x_1, ..., x_{i-1}) = P(x_i | parents(X_i)). Given a Bayes net graph, additional conditional independences can be read off directly from the graph. Question: are two nodes necessarily independent given certain evidence? If no, you can prove it with a counterexample: pick a set of CPTs and show that the independence assumption is violated by the resulting distribution. If yes, you can prove it with algebra (tedious) or with d-separation (analyzes the graph).

D-Separation
Question: are X and Y conditionally independent given evidence variables {Z}? Yes, if X and Y are separated by Z: consider all (undirected) paths from X to Y; no active paths = independence! A path is active if each triple along it is active:
Causal chain A → B → C where B is unobserved (either direction)
Common cause A ← B → C where B is unobserved
Common effect (aka v-structure) A → B ← C where B or one of its descendants is observed
All it takes to block a path is a single inactive segment.

Given a query X_i ⊥ X_j | {X_k1, ..., X_kn}: shade all evidence nodes; for all (undirected!) paths between X_i and X_j, check whether the path is active. If any path is active, the independence is not guaranteed. If all paths have been checked and shown inactive, return X_i ⊥ X_j | {X_k1, ..., X_kn}.

Example: All Conditional Independences
Given a Bayes net structure, you can run d-separation to build a complete list of conditional independences that are necessarily true, each of the form X_i ⊥ X_j | {X_k1, ..., X_kn}. This list determines the set of probability distributions that can be represented by Bayes nets with this graph structure.
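The path-checking recipe above translates almost directly into code. A sketch, assuming the net is given as a dict from each node to its list of parents; enumerating simple paths is exponential in general, but fine for graphs this small:

```python
def d_separated(parents, x, y, evidence):
    """True iff x and y are d-separated given the set `evidence`."""
    children = {n: [] for n in parents}
    for n, ps in parents.items():
        for p in ps:
            children[p].append(n)

    def descendants(n):
        out, stack = set(), [n]
        while stack:
            for ch in children[stack.pop()]:
                if ch not in out:
                    out.add(ch)
                    stack.append(ch)
        return out

    def neighbors(n):
        return set(parents[n]) | set(children[n])

    def paths(cur, visited):
        # All simple undirected paths from cur to y.
        if cur == y:
            yield visited
            return
        for nb in neighbors(cur):
            if nb not in visited:
                yield from paths(nb, visited + [nb])

    def active(a, b, c):
        if a in parents[b] and c in parents[b]:
            # Common effect: active iff b or one of its descendants is observed.
            return b in evidence or bool(descendants(b) & evidence)
        # Causal chain or common cause: active iff b is unobserved.
        return b not in evidence

    return not any(
        all(active(p[i], p[i + 1], p[i + 2]) for i in range(len(p) - 2))
        for p in paths(x, [x])
    )

# The alarm network from the previous page:
alarm = {"B": [], "E": [], "A": ["B", "E"], "J": ["A"], "M": ["A"]}
print(d_separated(alarm, "B", "E", set()))   # True: inactive v-structure at A
print(d_separated(alarm, "B", "E", {"A"}))   # False: observing A activates it
print(d_separated(alarm, "J", "M", {"A"}))   # True: common cause A is observed
```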

Topology Limits Distributions
Given some graph topology G, only certain joint distributions can be encoded. The graph structure guarantees certain (conditional) independences. (There might be more independence.) Adding arcs increases the set of distributions that can be encoded, but has several costs. Full conditioning can encode any distribution. (Figure: the three-node graphs over X, Y, Z, ordered by the sets of independences they guarantee.)

Bayes Nets Status
Representation (done). Inference. Learning Bayes nets from data.

Inference by Enumeration
Given unlimited time, inference in BNs is easy. Recipe: state the marginal probabilities you need; figure out ALL the atomic probabilities you need; calculate and combine them. In this simple method, we only need the BN to synthesize the joint entries. (Example: the alarm network from above.)

Variable Elimination
Why is inference by enumeration so slow? You join up the whole joint distribution before you sum out the hidden variables, so you end up repeating a lot of work! Idea: interleave joining and marginalizing. This is called variable elimination. It is still NP-hard in general, but usually much faster than inference by enumeration.

Variable Elimination Outline
Track objects called factors. The initial factors are the local CPTs (one per node). For the chain R → T → L (values partly reconstructed from the standard example):

P(R):     +r 0.1   -r 0.9
P(T | R): +r +t 0.8   +r -t 0.2   -r +t 0.1   -r -t 0.9
P(L | T): +t +l 0.3   +t -l 0.7   -t +l 0.1   -t -l 0.9

Any known values are selected. E.g. if we know L = +l, the initial factors become P(R), P(T | R), and P(+l | T). VE: alternately join factors and eliminate variables.
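The two operations VE alternates between are a pointwise product of tables and a marginalization over a small table. A minimal sketch for the R → T → L chain above (the factor representation and helper names are ours):

```python
from itertools import product

# Sketch of the two factor operations used by VE. A factor is a (vars, table)
# pair; table maps tuples of Boolean values (aligned with vars) to numbers.

def join(f, g):
    (fv, ft), (gv, gt) = f, g
    out_vars = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for vals in product((True, False), repeat=len(out_vars)):
        a = dict(zip(out_vars, vals))
        table[vals] = ft[tuple(a[v] for v in fv)] * gt[tuple(a[v] for v in gv)]
    return out_vars, table

def sum_out(var, f):
    fv, ft = f
    out_vars = tuple(v for v in fv if v != var)
    table = {}
    for vals, p in ft.items():
        key = tuple(x for x, name in zip(vals, fv) if name != var)
        table[key] = table.get(key, 0.0) + p
    return out_vars, table

P_R = (("R",), {(True,): 0.1, (False,): 0.9})
P_T = (("R", "T"), {(True, True): 0.8, (True, False): 0.2,
                    (False, True): 0.1, (False, False): 0.9})
P_L = (("T", "L"), {(True, True): 0.3, (True, False): 0.7,
                    (False, True): 0.1, (False, False): 0.9})

f = sum_out("R", join(P_R, P_T))   # join on R, then eliminate R -> P(T)
f = sum_out("T", join(f, P_L))     # join on T, then eliminate T -> P(L)
print(f[1][(True,)], f[1][(False,)])   # P(+l) ≈ 0.134, P(-l) ≈ 0.866
```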

Variable Elimination Example
Query P(L) on the chain R → T → L. Join on R: combine P(R) and P(T | R) into a factor P(R, T): +r +t 0.08, +r -t 0.02, -r +t 0.09, -r -t 0.81. Sum out R to get P(T): +t 0.17, -t 0.83. Join on T: combine P(T) and P(L | T) into P(T, L): +t +l 0.051, +t -l 0.119, -t +l 0.083, -t -l 0.747. Sum out T to get P(L): +l 0.134, -l 0.866. (* VE is variable elimination.)

Example
(For the alarm network: choose E, joining and summing it out; then choose A, joining and summing it out; finish with the remaining factors over B; normalize.)

General Variable Elimination
Query: P(Q | e_1, ..., e_k). Start with the initial factors: the local CPTs, instantiated by the evidence. While there are still hidden variables (not Q or evidence): pick a hidden variable H; join all factors mentioning H; eliminate (sum out) H. Finally, join all remaining factors and normalize.

Approximate Inference: Sampling
Basic idea: draw N samples from a sampling distribution S; compute an approximate posterior probability; show this converges to the true probability P. Why? Faster than computing the exact answer.
Prior sampling: sample ALL variables in topological order, as this can be done quickly.
Rejection sampling for a query P(Q | e_1, ..., e_k): like prior sampling, but reject a sample whenever a variable is sampled inconsistently with the query, i.e. when a variable E_i is sampled differently from e_i.
Likelihood weighting for a query P(Q | e_1, ..., e_k): like prior sampling, but the variables E_i are not sampled; when it's their turn, they get set to e_i, and the sample gets weighted by P(e_i | value of parents(E_i) in current sample).
Gibbs sampling: repeatedly samples each non-evidence variable conditioned on all other variables → can incorporate downstream evidence.
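Prior sampling and rejection sampling as described above fit in a few lines. A sketch, assuming the net is given as a topologically ordered list of (variable, parents, cpt) triples, where each cpt maps a tuple of parent values to P(var = true | parents); the example net is the sprinkler network from the next page:

```python
import random

def prior_sample(net):
    # Sample ALL variables in topological order.
    sample = {}
    for var, parents, cpt in net:
        p_true = cpt[tuple(sample[p] for p in parents)]
        sample[var] = random.random() < p_true
    return sample

def rejection_sampling(net, query_var, evidence, n):
    # Keep only the samples consistent with the evidence.
    kept = [s for s in (prior_sample(net) for _ in range(n))
            if all(s[v] == val for v, val in evidence.items())]
    # Fraction of consistent samples in which the query variable is true.
    return sum(s[query_var] for s in kept) / len(kept)

net = [
    ("C", (), {(): 0.5}),
    ("S", ("C",), {(True,): 0.1, (False,): 0.5}),
    ("R", ("C",), {(True,): 0.8, (False,): 0.2}),
    ("W", ("S", "R"), {(True, True): 0.99, (True, False): 0.90,
                       (False, True): 0.90, (False, False): 0.01}),
]
print(rejection_sampling(net, "C", {"W": True}, 100000))  # ≈ P(+c | +w)
```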

Prior Sampling Example
Network: Cloudy (C) is the parent of Sprinkler (S) and Rain (R); S and R are the parents of WetGrass (W). CPTs: P(+c) = 0.5; P(+s | +c) = 0.1, P(+s | -c) = 0.5; P(+r | +c) = 0.8, P(+r | -c) = 0.2; P(+w | +s, +r) = 0.99, P(+w | +s, -r) = 0.90, P(+w | -s, +r) = 0.90, P(+w | -s, -r) = 0.01 (WetGrass values reconstructed from the standard example). Samples: +c, -s, +r, +w; -c, +s, -r, +w.

We'll get a bunch of samples from the BN: +c, -s, +r, +w; +c, +s, +r, +w; -c, +s, +r, -w; +c, -s, +r, +w; -c, -s, -r, +w. If we want to know P(W): we have counts <+w: 4, -w: 1>; normalize to get P(W) = <+w: 0.8, -w: 0.2>. This will get closer to the true distribution with more samples. Can estimate anything else, too: what about P(C | +w)? P(C | +r, +w)? P(C | -r, -w)? Fast: can use fewer samples if there is less time.

Likelihood Weighting
Same network. Sampling distribution if z is sampled and e is fixed evidence: S_WS(z, e) = ∏_i P(z_i | parents(Z_i)). Now samples have weights: w(z, e) = ∏_i P(e_i | parents(E_i)). Together, the weighted sampling distribution is consistent: S_WS(z, e) · w(z, e) = ∏ P(z_i | parents(Z_i)) · ∏ P(e_i | parents(E_i)) = P(z, e). Sample: +c, +s, +r, +w.

Gibbs Sampling
Idea: instead of sampling from scratch, create samples that are each like the last one. Procedure: resample one variable at a time, conditioned on all the rest, but keep the evidence fixed. Properties: the samples are no longer independent (in fact they're nearly identical), but sample averages are still consistent estimators! What's the point: both upstream and downstream variables condition on evidence.

Markov Models
A Markov model is a chain-structured BN. Each node is identically distributed (stationarity). The value of X at a given time is called the state. As a BN: P(X_1, ..., X_T) = P(X_1) ∏_t P(X_t | X_{t-1}). The chain is just a (growing) BN; we can always use generic BN reasoning on it if we truncate the chain at a fixed length.
Stationary distributions: for most chains, the distribution we end up in is independent of the initial distribution. It is called the stationary distribution of the chain: P_∞(X) = Σ_x P(X | x) P_∞(x). Example applications: web link analysis (PageRank) and Gibbs sampling.
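Likelihood weighting changes only the treatment of evidence variables: they are clamped rather than sampled, and the sample's weight picks up the corresponding CPT entry. A sketch using the same (variable, parents, cpt) net representation as in the prior-sampling sketch above:

```python
import random

# Sketch of likelihood weighting on the sprinkler network.
net = [
    ("C", (), {(): 0.5}),
    ("S", ("C",), {(True,): 0.1, (False,): 0.5}),
    ("R", ("C",), {(True,): 0.8, (False,): 0.2}),
    ("W", ("S", "R"), {(True, True): 0.99, (True, False): 0.90,
                       (False, True): 0.90, (False, False): 0.01}),
]

def weighted_sample(net, evidence):
    weight, sample = 1.0, {}
    for var, parents, cpt in net:
        p_true = cpt[tuple(sample[p] for p in parents)]
        if var in evidence:
            sample[var] = evidence[var]      # evidence is clamped, not sampled
            weight *= p_true if evidence[var] else 1.0 - p_true
        else:
            sample[var] = random.random() < p_true
    return sample, weight

def likelihood_weighting(net, query_var, evidence, n):
    num = den = 0.0
    for _ in range(n):
        s, w = weighted_sample(net, evidence)
        den += w
        num += w if s[query_var] else 0.0
    return num / den     # estimate of P(query_var = True | evidence)

print(likelihood_weighting(net, "C", {"W": True}, 100000))  # ≈ P(+c | +w)
```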

Hidden Markov Models
An underlying Markov chain over states S; you observe outputs (effects) at each time step. Speech recognition HMMs: the X_i are specific positions in specific words, the E_i are acoustic signals. Machine translation HMMs: the X_i are translation options, the E_i (observations) are words. Robot tracking: the X_i are positions on a map, the E_i are range readings.

Online Belief Updates
Every time step, we start with the current P(X | evidence). We update for time: P(x_t | e_{1:t-1}) = Σ_{x_{t-1}} P(x_{t-1} | e_{1:t-1}) P(x_t | x_{t-1}). We update for evidence: P(x_t | e_{1:t}) ∝ P(x_t | e_{1:t-1}) P(e_t | x_t). The forward algorithm does both at once (and doesn't normalize).

Particle Filtering
Particle filtering = likelihood weighting + resampling at each time slice. Why: sometimes |X| is too big for exact inference. Elapse time: each particle is moved by sampling its next position from the transition model. ("Particle" is just a new name for "sample".) Observe: we don't sample the observation; we fix it and downweight our samples based on the evidence, just as in likelihood weighting. Resample: rather than tracking weighted samples, we resample: N times, we choose from our weighted sample distribution.

Dynamic Bayes Nets (DBNs)
We want to track multiple variables over time, using multiple sources of evidence. Idea: repeat a fixed Bayes net structure at each time slice; variables from time t can condition on those from t-1. (Figure: state variables G_t^a, G_t^b with evidence E_t^a, E_t^b for t = 1, 2, 3.) Discrete-valued dynamic Bayes nets are also HMMs.

Bayes Nets Status
Representation (done). Inference (done). Learning Bayes nets from data.
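Both the exact belief update and the corresponding particle-filter steps are short. A sketch, assuming a transition model T[x][x'] = P(X' = x' | X = x) and an emission model Em[x][e] = P(e | x) given as nested dicts; the two-state weather/umbrella numbers at the bottom are illustrative only:

```python
import random

def elapse_time(belief, T):
    # Exact time update: push the belief through the transition model.
    return {x2: sum(belief[x] * T[x][x2] for x in belief) for x2 in T}

def observe(belief, Em, e):
    # Exact evidence update: weight by P(e | x), then normalize.
    unnorm = {x: belief[x] * Em[x][e] for x in belief}
    z = sum(unnorm.values())
    return {x: p / z for x, p in unnorm.items()}

def elapse_particles(particles, T):
    # Move each particle by sampling its successor from the transition model.
    return [random.choices(list(T[x]), weights=list(T[x].values()))[0]
            for x in particles]

def observe_particles(particles, Em, e):
    # Fix the evidence, weight each particle by it, then resample N particles.
    weights = [Em[x][e] for x in particles]
    return random.choices(particles, weights=weights, k=len(particles))

T  = {"rain": {"rain": 0.7, "sun": 0.3}, "sun": {"rain": 0.3, "sun": 0.7}}
Em = {"rain": {"umbrella": 0.9, "none": 0.1},
      "sun":  {"umbrella": 0.2, "none": 0.8}}
belief = {"rain": 0.5, "sun": 0.5}
belief = observe(elapse_time(belief, T), Em, "umbrella")
print(belief)   # exact posterior after one elapse/observe step
```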

Parameter Estimation
Estimating the distribution of random variables like X or X | Y. Empirically: use training data. For each outcome x, look at the empirical rate of that value: P_ML(x) = count(x) / N. E.g. for the observed sequence r, g, g: P_ML(r) = 1/3. This is the estimate that maximizes the likelihood of the data.
Laplace smoothing: pretend you saw every outcome k extra times: P_LAP,k(x) = (count(x) + k) / (N + k|X|). Smooth each condition independently: P_LAP,k(x | y) = (count(x, y) + k) / (count(y) + k|X|).

Bayes Nets Status
Representation (done). Inference (done). Learning Bayes nets from data (done).

Classification: Feature Vectors
Example: "Hello, do you want free printer cartridges? Why pay more when you can get them ABSOLUTELY FREE! Just ..." becomes a feature vector (# free: 2, YOUR_NAME: 0, MISSPELLED: 2, FROM_FRIEND: 0, ...) classified as SPAM or ham; a digit image becomes (PIXEL-7,12: 1, PIXEL-7,13: 0, ..., NUM_LOOPS: 1, ...) classified as, e.g., "2".

Classification Overview
Naive Bayes: builds a model of the training data; gives prediction probabilities; strong assumptions about feature independence; one pass through the data (counting).
Perceptron: makes fewer assumptions about the data; mistake-driven learning; multiple passes through the data (prediction); often more accurate.
MIRA: like the perceptron, but with adaptive scaling of the size of the update.
SVM: properties similar to the perceptron; convex optimization formulation.
Nearest-neighbor: non-parametric: more expressive with more training data.
Kernels: an efficient way to make linear learning architectures into nonlinear ones.

Bayes Nets for Classification
One method of classification: use a probabilistic model! Features are observed random variables F_i; Y is the query variable. Use probabilistic inference to compute the most likely Y: y = argmax_y P(y | f_1, ..., f_n). You already know how to do this inference.

General Naive Bayes
A general naive Bayes model: P(Y, F_1, ..., F_n) = P(Y) ∏_i P(F_i | Y). The full joint needs |Y| × |F|^n parameters; naive Bayes needs |Y| parameters for the prior plus n × |F| × |Y| for the conditionals. We only specify how each feature depends on the class, so the total number of parameters is linear in n. Our running example: digits.
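Both estimators above are one-liners over counts. A small sketch matching the r, g, g example:

```python
from collections import Counter

# Maximum likelihood estimation and Laplace smoothing for a discrete variable.

def mle(samples):
    counts, n = Counter(samples), len(samples)
    return {x: c / n for x, c in counts.items()}

def laplace(samples, domain, k=1):
    # Pretend we saw every outcome in the domain k extra times.
    counts, n = Counter(samples), len(samples)
    return {x: (counts[x] + k) / (n + k * len(domain)) for x in domain}

print(mle(["r", "g", "g"]))                  # {'r': 0.333..., 'g': 0.666...}
print(laplace(["r", "g", "g"], ["r", "g"]))  # {'r': 0.4, 'g': 0.6}
```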

Bag-of-Words Naive Bayes
Generative model: P(C, W_1, ..., W_n) = P(C) ∏_i P(W_i | C). W_i is the word at position i, not the i-th word in the dictionary! Bag-of-words: each position is identically distributed; all positions share the same conditional probabilities P(W | C) → when learning the parameters, data is shared over all positions in the document, rather than learning a separate distribution for each position. Our running example: spam vs. ham.

Linear Classifier
Binary linear classifier: classify as positive if w · f(x) ≥ 0, negative otherwise. Multiclass linear classifier: a weight vector w_y for each class; the score (activation) of class y is w_y · f(x); prediction: the highest score wins, y = argmax_y w_y · f(x). Binary is just the multiclass case where the negative class has weight zero.

Perceptron
The perceptron is an algorithm to learn the weights w. Start with zero weights. Pick up training instances one by one and classify with the current weights. If correct, no change! If wrong: lower the score of the wrong answer and raise the score of the right answer: w_y += f(x), w_ŷ -= f(x).

Problems with the Perceptron
Noise: if the data isn't separable, the weights might thrash; averaging weight vectors over time can help (averaged perceptron). Mediocre generalization: it finds a barely separating solution. Overtraining: test / held-out accuracy usually rises, then falls; overtraining is a kind of overfitting.

Fixing the Perceptron: MIRA
Choose an update size that fixes the current mistake and also minimizes the change to w: update w by solving a small optimization, minimizing ||w' - w||² subject to the correct class now outscoring the wrongly predicted one.

Support Vector Machines
Maximizing the margin: good according to intuition, theory, and practice. Support vector machines (SVMs) find the separator with the maximum margin. Basically, SVMs are MIRA where you optimize over all examples at once, with slack penalized by a constant C.
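The mistake-driven update above is the whole algorithm. A sketch of the multiclass perceptron with sparse dict features (the toy data and feature names are hypothetical):

```python
# Multiclass perceptron: one weight dict per class, highest score wins.

def score(w, f):
    return sum(w.get(k, 0.0) * v for k, v in f.items())

def perceptron_train(data, classes, passes=5):
    w = {y: {} for y in classes}                           # start with zero weights
    for _ in range(passes):
        for f, y in data:                                  # one instance at a time
            y_hat = max(classes, key=lambda c: score(w[c], f))
            if y_hat != y:                                 # if wrong:
                for k, v in f.items():
                    w[y][k] = w[y].get(k, 0.0) + v         # raise the right answer
                    w[y_hat][k] = w[y_hat].get(k, 0.0) - v # lower the wrong answer
    return w

data = [({"free": 2, "meeting": 0}, "spam"),
        ({"free": 0, "meeting": 1}, "ham")]
w = perceptron_train(data, ["spam", "ham"])
print(max(["spam", "ham"], key=lambda c: score(w[c], {"free": 1})))  # "spam"
```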

Non-Linear Separators
Data that is linearly separable (with some noise) works out great. But what are we going to do if the dataset is just too hard? How about mapping the data to a higher-dimensional space, e.g. from x to (x, x²)? General idea: the original feature space can always be mapped to some higher-dimensional feature space where the training set is separable: Φ: x → φ(x). (This and the next few slides adapted from Ray Mooney, UT.)

Some Kernels
Kernels implicitly map original vectors to higher-dimensional spaces, take the dot product there, and hand the result back. Linear kernel: K(x, x') = x · x', i.e. φ(x) = x. Quadratic kernel: K(x, x') = (x · x' + 1)², whose implicit feature space contains the single features and all pairwise products. Polynomial kernel: K(x, x') = (x · x' + 1)^d.

Why and When Kernels?
Can't you just add these features on your own (e.g. add all pairs of features instead of using the quadratic kernel)? Yes, in principle: just compute them; no need to modify any algorithms. But the number of features can get large (or infinite). Kernels let us compute with these features implicitly. Example: the implicit dot product in the polynomial, Gaussian, and string kernels takes much less space and time per dot product. When can we use kernels? When our learning algorithm can be reformulated in terms of only inner products between feature vectors. Examples: perceptron, support vector machine.

K-Nearest Neighbors
1-NN: copy the label of the most similar data point. K-NN: let the k nearest neighbors vote (you have to devise a weighting scheme). (Figure: fits with 2, 10, and 100 examples versus the truth.) Parametric models: a fixed set of parameters; more data means better settings. Non-parametric models: the complexity of the classifier increases with the data; better in the limit, often worse in the non-limit. (K)NN is non-parametric.
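To see what "implicitly" means, compare the homogeneous quadratic kernel (x · z)² with an explicit dot product in the space of all pairwise feature products; both give the same number, but the kernel never builds the expanded vectors:

```python
from itertools import product

def dot(x, z):
    return sum(a * b for a, b in zip(x, z))

def phi(x):
    # Explicit quadratic feature map: all ordered pairs x_i * x_j.
    return [a * b for a, b in product(x, repeat=2)]

x, z = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
print(dot(x, z) ** 2)        # kernel trick:  1024.0, O(n) time and space
print(dot(phi(x), phi(z)))   # explicit map:  1024.0, O(n^2) time and space
```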

Basic Similarity
Many similarities are based on feature dot products: sim(x, x') = f(x) · f(x'). If the features are just the pixels, this is the pixel-wise dot product of the two images. Note: not all similarities are of this form.

Important Concepts
Data: labeled instances, e.g. emails marked spam/ham, split into a training set, a held-out set, and a test set. Features: attribute-value pairs which characterize each x. Experimentation cycle: learn parameters (e.g. model probabilities) on the training set; tune hyperparameters on the held-out set; compute accuracy on the test set. Very important: never "peek" at the test set! Evaluation: accuracy is the fraction of instances predicted correctly. Overfitting and generalization: we want a classifier which does well on test data; overfitting means fitting the training data very closely but not generalizing well. We'll investigate overfitting and generalization formally in a few lectures.

Tuning on Held-Out Data
Now we've got two kinds of unknowns. Parameters: the probabilities P(X | Y), P(Y). Hyperparameters: the amount of smoothing to do, k or α (naive Bayes); the number of passes over the training data (perceptron). Where to learn? Learn parameters from the training data. Hyperparameters must be tuned on different data: for each value of the hyperparameters, train on the training data and evaluate on the held-out data; choose the best value and do a final test on the test data.

Extension: Web Search
Information retrieval: given information needs, produce information. Includes, e.g., web search, question answering, and classic IR. Web search is not exactly classification, but rather ranking.

Feature-Based Ranking
For a query x = "Apple Computers" and candidate results y, compute feature vectors f(x, y).

Perceptron for Ranking
Inputs x; candidates y. Many feature vectors f(x, y); one weight vector w. Prediction: y = argmax_y w · f(x, y). Update (if wrong): w = w + f(x, y*) - f(x, ŷ), where y* is the correct candidate and ŷ the predicted one.
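The tuning cycle above is a simple loop. A sketch where `train` and `accuracy` stand in for any learner and metric (both are hypothetical placeholders, and the trailing demo is a toy):

```python
# Tuning on held-out data: for each hyperparameter value, learn parameters on
# the training set, score on the held-out set, keep the best value, and only
# then report a single accuracy on the test set.

def tune(hyper_values, train, accuracy, train_data, held_out, test):
    best = max(hyper_values,
               key=lambda k: accuracy(train(train_data, k), held_out))
    model = train(train_data, best)
    return best, accuracy(model, test)   # look at the test set only once

# Toy demo with a trivially simple "learner" (illustrative only):
train = lambda data, k: k                          # the "model" is just k
accuracy = lambda model, data: -abs(model - data)  # peaks when model == data
best_k, test_acc = tune([0.1, 1, 10], train, accuracy, None, 5, 7)
print(best_k, test_acc)   # picks the k that scored best on the held-out "data"
```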
