Randomized Weighted Majority Algorithm And Classification Problem
CSL 758: Advanced Algorithms
Scribe: Sandeep Goyal, Saurabh Agrawal
Lecturer: Naveen Garg, Kavitha Telikepalli
Date: March 31, 2008

1 Randomized Weighted Majority Algorithm

1.1 Introduction

Given: We have 2 horses A and B to bet on and N experts E_1, E_2, ..., E_N to take advice from. W_1, W_2, ..., W_N are the corresponding weights of the experts.

Aim: To bound the number of mistakes we make as compared to the best expert.

The algorithm we discussed in the last lecture, the Weighted Majority algorithm, gives the following bound:

    m < (M log(1/(1-ε)) + log N) / log(2/(2-ε))

where
    m is the total number of mistakes we make,
    M is the number of mistakes made by the best expert,
    ε is the factor by which we reduce the weights of the experts who predicted wrongly (each such weight is multiplied by (1-ε)).

Special Case 1: ε = 1/2:   m < (M log 2 + log N) / log(4/3) ≈ 2.4 (M + log_2 N)

Special Case 2: ε very small:   m ≈ 2M + (2/ε) log N

In this lecture we will look at the Randomized Weighted Majority Algorithm, which improves the bound on m.
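For intuition, here is a quick numerical instance of Special Case 1 (the numbers N = 100 and M = 20 are illustrative choices, not from the lecture). With N = 100 experts and a best expert that makes M = 20 mistakes, using natural logarithms,

    m < (20 log 2 + log 100) / log(4/3) ≈ (13.9 + 4.6) / 0.29 ≈ 64,

so the deterministic algorithm is only guaranteed to make at most roughly three times as many mistakes as the best expert. The randomized variant below reduces the factor in front of M from about 2 to (1 + ε).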
1.2 Algorithm

Let us define:
    W_A : total weight of the experts predicting that horse A will win,
    W_B : total weight of the experts predicting that horse B will win,
    W_t : total weight of all experts on day t,
    W_i : total weight of all experts initially, which is N,
    W_f : total weight of all experts finally,
    F_t : total weight of the experts who predict wrongly on day t,
    ε   : factor by which we reduce F_t.

On each day we bet on horse A with probability W_A / (W_A + W_B), so the probability that we make a mistake on day t is exactly F_t / W_t.

Each day we reduce F_t by a factor of ε, so W_{t+1} can be computed as:

    W_{t+1} = W_t - ε F_t = W_t (1 - ε F_t / W_t)

    W_f = W_i ∏_t (1 - ε F_t / W_t) = N ∏_t (1 - ε F_t / W_t)

    log W_f = log N + Σ_t log(1 - ε F_t / W_t)                          (1)

Using the approximation

    log(1 + x) = x - x²/2 + ...  ⇒  log(1 + x) ≈ x,  if x << 1           (2)

(in fact log(1 + x) ≤ x, so the next step also holds as an inequality), from (1) and (2) we have

    log W_f ≈ log N - Σ_t (ε F_t / W_t)

    Σ_t (ε F_t / W_t) ≈ log N - log W_f                                  (3)

Now, if the best expert makes M mistakes, then

    W_f ≥ (1 - ε)^M
    log W_f ≥ M log(1 - ε)                                               (4)

The expected number of mistakes we make is therefore

    E[m] = Σ_t Pr[mistake on day t] · 1 = Σ_t (F_t / W_t)
         = (1/ε) Σ_t (ε F_t / W_t)
         ≤ (log N - log W_f) / ε
         ≤ (log N - M log(1 - ε)) / ε
         ≈ (1 + ε) M + (log N) / ε                                       (5)

using -log(1 - ε) ≈ ε + ε² for small ε. This new bound is tighter than the earlier bound: the factor multiplying M drops from about 2 to (1 + ε).
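The procedure itself is short. Below is a minimal Python sketch (not part of the original notes; the function name run_rwm and the input representation are assumptions made for illustration). Each day it bets on A with probability W_A / (W_A + W_B) and then multiplies the weight of every wrong expert by (1 - ε).

```python
import random

def run_rwm(expert_predictions, outcomes, eps=0.1):
    """Randomized Weighted Majority (illustrative sketch).

    expert_predictions[t][i] is expert i's prediction ('A' or 'B') on day t,
    outcomes[t] is the horse that actually won on day t.
    Returns the number of mistakes the algorithm makes on this sequence.
    """
    n = len(expert_predictions[0])
    weights = [1.0] * n          # every expert starts with weight 1, so W_i = N
    mistakes = 0

    for preds, winner in zip(expert_predictions, outcomes):
        w_a = sum(w for w, p in zip(weights, preds) if p == 'A')
        w_b = sum(w for w, p in zip(weights, preds) if p == 'B')
        # Bet on horse A with probability W_A / (W_A + W_B).
        bet = 'A' if random.random() < w_a / (w_a + w_b) else 'B'
        if bet != winner:
            mistakes += 1
        # Every expert that predicted wrongly loses an eps fraction of its
        # weight, i.e. the total weight drops by eps * F_t.
        weights = [w if p == winner else w * (1 - eps)
                   for w, p in zip(weights, preds)]
    return mistakes
```

With eps = 1/2, for example, the guarantee in (5) becomes an expected 1.5 M + 2 log N mistakes.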
2 Classification Problem

2.1 Some Definitions

A labeled example is an example together with a label, e.g. 0 or 1.

A concept is a boolean function over an instance space. For instance, the concept x_1 ∧ x_2 over {0,1}^n is the boolean function that outputs 1 on any input in which the first two bits are set to 1. Sometimes we will think of a concept c as the set of its positive examples, i.e. x ∈ c means that c(x) = 1.

A concept class is a set of concepts, typically with an associated representation. For instance, the class of monotone conjunctions consists of all concepts that can be expressed as a conjunction of variables.

2.2 Consistency Model

We say that algorithm A learns class C in the consistency model if, given any set of labeled examples S, the algorithm produces a concept c ∈ C consistent with S if one exists, and outputs "there is no consistent concept" otherwise.

2.3 Algorithm

Each example is a vector of feature values f_1 f_2 f_3 ... f_n, and we are given a set of such examples E, each labeled positive or negative.

The class of conjunctions is the class of ANDs of literals, where a literal is a variable. E.g. a typical concept might be x_1 ∧ x_4 ∧ x_6. This class is learnable by the following algorithm:

Produce the conjunction of all literals that are satisfied by every positive example (i.e. throw out any literal falsified by some positive example). By definition, this is consistent with all the positive examples. In fact, this is the most specific conjunction consistent with the positive examples, because we only throw out literals when absolutely necessary; therefore, if any conjunction is consistent with all the examples, this one is.

If the above conjunction is also consistent with the negative examples, produce it as output. Otherwise halt with failure.
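A minimal Python sketch of this elimination procedure follows (not from the original notes; the function name learn_conjunction and the encoding of examples as 0/1 tuples are assumptions made for illustration).

```python
def learn_conjunction(positives, negatives):
    """Consistency-model learner for monotone conjunctions (illustrative sketch).

    positives and negatives are lists of 0/1 tuples of length n.
    Returns the set of variable indices in the learned conjunction,
    or None if no consistent monotone conjunction exists.
    """
    n = len(positives[0]) if positives else len(negatives[0])
    # Start with every variable and drop any literal falsified by a positive example.
    literals = set(range(n))
    for x in positives:
        literals = {i for i in literals if x[i] == 1}

    # The surviving conjunction is the most specific one consistent with the
    # positives; it is a valid output only if no negative example satisfies it.
    for x in negatives:
        if all(x[i] == 1 for i in literals):
            return None   # halt with failure: no consistent conjunction exists
    return literals
```

For instance, on positive examples 110 and 111 and negative example 100, the first loop keeps the literals x_1 and x_2 (indices 0 and 1), the negative example fails the check x_2 = 1, and the algorithm outputs x_1 ∧ x_2.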
2.4 Proof of Correctness of the Algorithm

Case 1: The output conjunction is consistent with the negative examples. The conjunction is then consistent with both the positive and the negative examples, so the output is valid.

Case 2: The algorithm halts with failure. We need to show that if the algorithm fails, there does not exist any conjunction consistent with all the examples. Without loss of generality, let x_1 ∧ x_2 ∧ ... ∧ x_i be the conjunction obtained from the positive examples. Since the algorithm fails, some negative example satisfies this conjunction. Let that feature vector be F = f_1 f_2 ... f_i ... f_n, where f_1 = f_2 = ... = f_i = 1.

Suppose there exists a conjunction C that is consistent with all the positive and negative examples. Write C = X_1 ∧ X_2, where X_1 ⊆ {x_1, x_2, ..., x_i} and X_2 ⊆ {x_{i+1}, x_{i+2}, ..., x_n}. X_2 cannot be the empty set, otherwise C would be satisfied by the negative feature vector F itself and hence misclassify it. So C must contain at least one variable from {x_{i+1}, ..., x_n}. But C must also be consistent with all the positive examples. However, each variable in {x_{i+1}, ..., x_n} is zero in at least one positive example (that is exactly why our algorithm dropped it), so C is falsified by some positive example. Thus C does not satisfy all the examples, which is a contradiction. This proves the correctness of our algorithm.

2.5 Extensions

Introduction of negations: New variables {y_1, y_2, ..., y_n} with y_i = ¬x_i are added to the previous set of variables, so {x_1, ..., x_n, y_1, ..., y_n} is the new set of variables. Using the same algorithm over this larger set we can learn conjunctions that may contain negated literals.
2-CNF: We introduce C(n,2) new variables of the form (x_i ∨ x_j), where i ≠ j, and proceed with this set of variables and the same algorithm. A conjunction of these new variables is exactly a 2-CNF formula over the original variables. This can be extended to k-CNF, for any integer k ≥ 2, by introducing one new variable for each possible clause of size at most k.
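As a rough illustration of this reduction (not from the notes; it reuses the hypothetical learn_conjunction sketch above and, like the notes, builds clauses only from un-negated variables):

```python
from itertools import combinations

def expand_2cnf(example):
    """Map an example x in {0,1}^n to the truth values of all clauses (x_i OR x_j), i < j."""
    n = len(example)
    return tuple(example[i] | example[j] for i, j in combinations(range(n), 2))

def learn_2cnf(positives, negatives):
    """Learn a 2-CNF formula by learning a conjunction over the C(n,2) clause-variables."""
    n = len(positives[0])
    clause_ids = list(combinations(range(n), 2))
    kept = learn_conjunction([expand_2cnf(x) for x in positives],
                             [expand_2cnf(x) for x in negatives])
    if kept is None:
        return None                       # no consistent 2-CNF over these clauses
    return [clause_ids[k] for k in kept]  # each pair (i, j) stands for the clause (x_i OR x_j)
```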