Fundamentals of Concept Learning
Aims

09s1: COMP9417 Machine Learning and Data Mining
Fundamentals of Concept Learning
March 2009

Acknowledgement: Material derived from slides for the book Machine Learning, Tom Mitchell, McGraw-Hill, 1997. http://www-2.cs.cmu.edu/~tom/mlbook.html

This lecture aims to develop your understanding of representing and searching hypothesis spaces for concept learning. Following it you should be able to:
- define a representation for concepts
- define a hypothesis space in terms of a generality ordering on concepts
- describe an algorithm to search a hypothesis space
- express the framework of version spaces
- describe an algorithm to search a hypothesis space using the framework of version spaces
- explain the role of inductive bias in concept learning

Overview

Concept Learning: inferring a Boolean-valued function from training examples of its input and output.
- Learning from examples
- General-to-specific ordering over hypotheses
- Version spaces and the candidate elimination algorithm
- Picking new examples
- The need for inductive bias

Training Examples for EnjoySport

  Sky    Temp  Humid   Wind    Water  Forecast  EnjoySport
  Sunny  Warm  Normal  Strong  Warm   Same      Yes
  Sunny  Warm  High    Strong  Warm   Same      Yes
  Rainy  Cold  High    Strong  Warm   Change    No
  Sunny  Warm  High    Strong  Cool   Change    Yes

What is the general concept? Note: this simple approach assumes no noise; it illustrates the key concepts.
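The training set above is small enough to encode directly. A minimal sketch, assuming a tuple encoding in the table's attribute order (the encoding itself is just one convenient choice, not part of the slides):

```python
# EnjoySport training examples from the table above:
# (Sky, Temp, Humid, Wind, Water, Forecast) -> label
TRAINING = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

positives = [x for x, label in TRAINING if label]
print(len(positives))  # 3 positive examples out of 4
```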
Representing Hypotheses

Many possible representations... Here, h is a conjunction of constraints on attributes. Each constraint can be:
- a specific value (e.g., Water = Warm)
- don't care (e.g., Water = ?)
- no value allowed (e.g., Water = ∅)

For example:

  Sky    AirTemp  Humid  Wind    Water  Forecast
  <Sunny    ?       ?    Strong    ?      Same>

The Prototypical Concept Learning Task

Given:
- Instances X: possible days, each described by the attributes

    Attribute  Values
    Sky        Sunny, Cloudy, Rainy
    AirTemp    Warm, Cold
    Humidity   Normal, High
    Wind       Strong, Weak
    Water      Warm, Cool
    Forecast   Same, Change

- Target function c: EnjoySport: X -> {0, 1}
- Hypotheses H: conjunctions of literals, e.g. <?, Cold, High, ?, ?, ?>
- Training examples D: positive and negative examples of the target function <x1, c(x1)>, ..., <xm, c(xm)>

Determine: a hypothesis h in H such that h(x) = c(x) for all x in D (usually called the target hypothesis).

The Inductive Learning Hypothesis

Any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples.
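The three constraint types can be checked mechanically. A minimal sketch, assuming tuples in the attribute order above, with `'?'` for don't care and `None` standing in for the no-value-allowed constraint ∅:

```python
def satisfies(h, x):
    """True when instance x meets every constraint of hypothesis h.
    '?' matches anything; a specific value matches only itself;
    None (the empty-set constraint) matches nothing."""
    return all(c == "?" or c == v for c, v in zip(h, x))

h = ("Sunny", "?", "?", "Strong", "?", "Same")
print(satisfies(h, ("Sunny", "Warm", "Normal", "Strong", "Warm", "Same")))  # True
print(satisfies(h, ("Rainy", "Cold", "High", "Strong", "Warm", "Change")))  # False
```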
Concept Learning as Search

Question: What can be learned?
Answer: (only) what is in the hypothesis space.

How big is the hypothesis space for EnjoySport?

Instance space: |Sky| x |AirTemp| x ... x |Forecast| = 3 * 2 * 2 * 2 * 2 * 2 = 96

Hypothesis space:
- syntactically distinct: 5 * 4 * 4 * 4 * 4 * 4 = 5120
- semantically distinct only: 1 + (4 * 3 * 3 * 3 * 3 * 3) = 973
  (any hypothesis with an ∅ constraint covers no instances, hence all such hypotheses are semantically equivalent)

The learning problem = searching a hypothesis space. How?

Instances, Hypotheses, and More-General-Than

Instances X:
  x1 = <Sunny, Warm, High, Strong, Cool, Same>
  x2 = <Sunny, Warm, High, Light, Warm, Same>

Hypotheses H (specific to general):
  h1 = <Sunny, ?, ?, Strong, ?, ?>
  h2 = <Sunny, ?, ?, ?, ?, ?>
  h3 = <Sunny, ?, ?, ?, Cool, ?>

A generality order on hypotheses. Definition: let hj and hk be Boolean-valued functions defined over instances X. Then hj is more general than or equal to hk (written hj >=_g hk) if and only if

  (for all x in X) [ (hk(x) = 1) -> (hj(x) = 1) ]

Intuitively, hj is more general than or equal to hk if any instance satisfying hk also satisfies hj. hj is (strictly) more general than hk (written hj >_g hk) if and only if (hj >=_g hk) and not (hk >=_g hj). hj is more specific than hk when hk is more general than hj.
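Because the instance space has only 96 members, the >=_g relation can be tested directly from its definition by enumeration. A sketch, assuming the attribute domains listed earlier (the coverage check is repeated so the snippet stands alone):

```python
from itertools import product

# Attribute value sets for EnjoySport, in slide order.
DOMAINS = [("Sunny", "Cloudy", "Rainy"), ("Warm", "Cold"), ("Normal", "High"),
           ("Strong", "Weak"), ("Warm", "Cool"), ("Same", "Change")]

def covers(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

def more_general_or_equal(hj, hk):
    """hj >=_g hk: every instance satisfying hk also satisfies hj."""
    return all(covers(hj, x) for x in product(*DOMAINS) if covers(hk, x))

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")
print(more_general_or_equal(h2, h1))  # True: h2 is more general than h1
print(more_general_or_equal(h1, h2))  # False: the ordering here is strict
```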
The Find-S Algorithm

1. Initialize h to the most specific hypothesis in H
2. For each positive training instance x:
     For each attribute constraint a_i in h:
       If the constraint a_i in h is satisfied by x
       Then do nothing
       Else replace a_i in h by the next more general constraint that is satisfied by x

Hypothesis Space Search by Find-S

Training instances:
  x1 = <Sunny Warm Normal Strong Warm Same>, +
  x2 = <Sunny Warm High Strong Warm Same>, +
  x3 = <Rainy Cold High Strong Warm Change>, -
  x4 = <Sunny Warm High Strong Cool Change>, +

Hypotheses (specific to general):
  h0 = <∅, ∅, ∅, ∅, ∅, ∅>
  h1 = <Sunny Warm Normal Strong Warm Same>
  h2 = <Sunny Warm ? Strong Warm Same>
  h3 = <Sunny Warm ? Strong Warm Same>    (unchanged: x3 is negative, so it is ignored)
  h4 = <Sunny Warm ? Strong ? ?>

Find-S - does it work?

Assume: a hypothesis hc in H describes the target function c, and the training data is error-free.
- By definition, hc is consistent with all positive training examples and can never cover a negative example.
- For each h generated by Find-S, hc is more general than or equal to h.
- So h can never cover a negative example.

Complaints about Find-S

- Can't tell whether it has learned the concept: the learned hypothesis may not be the only consistent hypothesis
- Can't tell when the training data is inconsistent: cannot handle noisy data
- Picks a maximally specific h (why?) - might require maximally general
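The Find-S loop above is short enough to run on the EnjoySport data. A minimal sketch, assuming the tuple encoding of the training table (the `None` start encodes the all-∅ hypothesis):

```python
TRAINING = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

def find_s(examples):
    h = [None] * 6                  # most specific hypothesis: all-empty constraints
    for x, positive in examples:
        if not positive:
            continue                # Find-S ignores negative examples
        for i, v in enumerate(x):
            if h[i] is None:
                h[i] = v            # first positive example: adopt its value
            elif h[i] != v:
                h[i] = "?"          # next more general constraint covering v
    return tuple(h)

print(find_s(TRAINING))  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
```

This reproduces the h4 reached in the slide's trace.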
Version Spaces

A hypothesis h is consistent with a set of training examples D of target concept c if and only if h(x) = c(x) for each training example <x, c(x)> in D:

  Consistent(h, D) ≡ (for all <x, c(x)> in D) h(x) = c(x)

The version space, VS_{H,D}, with respect to hypothesis space H and training examples D, is the subset of hypotheses from H consistent with all training examples in D:

  VS_{H,D} ≡ { h in H | Consistent(h, D) }

The List-Then-Eliminate Algorithm

1. VersionSpace <- a list containing every hypothesis in H
2. For each training example <x, c(x)>:
     remove from VersionSpace any hypothesis h for which h(x) ≠ c(x)
3. Output the list of hypotheses in VersionSpace

Example Version Space

  S: { <Sunny, Warm, ?, Strong, ?, ?> }
  G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

Representing Version Spaces

The General boundary, G, of version space VS_{H,D} is the set of its maximally general members.
The Specific boundary, S, of version space VS_{H,D} is the set of its maximally specific members.
Every member of the version space lies between these boundaries:

  VS_{H,D} = { h in H | (exists s in S)(exists g in G) (g >=_g h >=_g s) }

where x >=_g y means x is more general than or equal to y.
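List-Then-Eliminate is only feasible here because H is tiny. A sketch that enumerates every conjunctive hypothesis without ∅ constraints (those cannot be consistent with any positive example anyway) and filters by consistency with the four EnjoySport examples:

```python
from itertools import product

DOMAINS = [("Sunny", "Cloudy", "Rainy"), ("Warm", "Cold"), ("Normal", "High"),
           ("Strong", "Weak"), ("Warm", "Cool"), ("Same", "Change")]

TRAINING = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

def covers(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

# Each attribute is either '?' or one specific value: 4*3*3*3*3*3 = 972 candidates.
H = list(product(*((("?",) + d) for d in DOMAINS)))
version_space = [h for h in H
                 if all(covers(h, x) == label for x, label in TRAINING)]
print(len(version_space))  # 6 hypotheses survive all four examples
```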
The Candidate Elimination Algorithm

G <- maximally general hypotheses in H
S <- maximally specific hypotheses in H

For each training example d, do:

If d is a positive example:
- Remove from G any hypothesis inconsistent with d
- For each hypothesis s in S that is not consistent with d:
  - Remove s from S
  - Add to S all minimal generalizations h of s such that
    1. h is consistent with d, and
    2. some member of G is more general than h
  - Remove from S any hypothesis that is more general than another hypothesis in S

If d is a negative example:
- Remove from S any hypothesis inconsistent with d
- For each hypothesis g in G that is not consistent with d:
  - Remove g from G
  - Add to G all minimal specializations h of g such that
    1. h is consistent with d, and
    2. some member of S is more specific than h
  - Remove from G any hypothesis that is less general than another hypothesis in G

Example Trace

  S0: { <∅, ∅, ∅, ∅, ∅, ∅> }
  G0: { <?, ?, ?, ?, ?, ?> }

Training example 1: <Sunny, Warm, Normal, Strong, Warm, Same>, EnjoySport = Yes

  S1: { <Sunny, Warm, Normal, Strong, Warm, Same> }
  G1 = G0: { <?, ?, ?, ?, ?, ?> }

Training example 2: <Sunny, Warm, High, Strong, Warm, Same>, EnjoySport = Yes

  S2: { <Sunny, Warm, ?, Strong, Warm, Same> }
  G2 = G0: { <?, ?, ?, ?, ?, ?> }
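For conjunctive hypotheses the two update rules are concrete: the minimal generalization replaces each mismatching constraint with '?', and a minimal specialization fills one '?' with a value that excludes the negative example. A sketch under those assumptions (the syntactic `ge` test of >=_g is valid only for this representation):

```python
DOMAINS = [("Sunny", "Cloudy", "Rainy"), ("Warm", "Cold"), ("Normal", "High"),
           ("Strong", "Weak"), ("Warm", "Cool"), ("Same", "Change")]

TRAINING = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

def covers(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

def ge(h1, h2):
    """Syntactic test of h1 >=_g h2 for conjunctive hypotheses."""
    if any(c is None for c in h2):
        return True                           # h2 covers no instances at all
    return all(a == "?" or a == b for a, b in zip(h1, h2))

def generalize(s, x):
    """The minimal generalization of s that covers positive instance x."""
    return tuple(v if c is None else (c if c == v else "?") for c, v in zip(s, x))

def specializations(g, x):
    """All minimal specializations of g that exclude negative instance x."""
    return [g[:i] + (v,) + g[i + 1:]
            for i, c in enumerate(g) if c == "?"
            for v in DOMAINS[i] if v != x[i]]

def candidate_elimination(examples):
    S = {(None,) * 6}                         # most specific boundary
    G = {("?",) * 6}                          # most general boundary
    for x, positive in examples:
        if positive:
            G = {g for g in G if covers(g, x)}
            S = {generalize(s, x) if not covers(s, x) else s for s in S}
            S = {s for s in S if any(ge(g, s) for g in G)}
            S = {s for s in S if not any(t != s and ge(s, t) for t in S)}
        else:
            S = {s for s in S if not covers(s, x)}
            G = {h for g in G
                 for h in ([g] if not covers(g, x) else specializations(g, x))
                 if any(ge(h, s) for s in S)}
            G = {g for g in G if not any(h != g and ge(h, g) for h in G)}
    return S, G

S, G = candidate_elimination(TRAINING)
print(S)  # {('Sunny', 'Warm', '?', 'Strong', '?', '?')}
print(G)  # the two maximally general hypotheses from the slide trace
```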
Example Trace (continued)

Training example 3: <Rainy, Cold, High, Strong, Warm, Change>, EnjoySport = No

  S3 = S2: { <Sunny, Warm, ?, Strong, Warm, Same> }
  G3: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>, <?, ?, ?, ?, ?, Same> }

Training example 4: <Sunny, Warm, High, Strong, Cool, Change>, EnjoySport = Yes

  S4: { <Sunny, Warm, ?, Strong, ?, ?> }
  G4: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }
Which Training Example Is Best To Choose Next?

The final version space:

  S: { <Sunny, Warm, ?, Strong, ?, ?> }
  G: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

For example: <Sunny, Warm, Normal, Light, Warm, Same>

How Should New Instances Be Classified?

  <Sunny, Warm, Normal, Strong, Cool, Change>   (6+ / 0-)
  <Rainy, Cold, Normal, Light, Warm, Same>      (0+ / 6-)
  <Sunny, Warm, Normal, Light, Warm, Same>      (3+ / 3-)

What Justifies this Inductive Leap?

  + <Sunny, Warm, Normal, Strong, Cool, Change>
  + <Sunny, Warm, Normal, Light, Warm, Same>

  S: <Sunny, Warm, Normal, ?, ?, ?>

Why believe we can classify this unseen instance?

  <Sunny, Warm, Normal, Strong, Warm, Same>
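The vote counts quoted above can be reproduced by checking each instance against all six members of the final version space. A sketch, assuming the members enumerated between the S and G boundaries shown in the trace:

```python
VERSION_SPACE = [
    ("Sunny", "Warm", "?", "Strong", "?", "?"),   # S boundary
    ("Sunny", "?",    "?", "Strong", "?", "?"),
    ("Sunny", "Warm", "?", "?",      "?", "?"),
    ("?",     "Warm", "?", "Strong", "?", "?"),
    ("Sunny", "?",    "?", "?",      "?", "?"),   # G boundary
    ("?",     "Warm", "?", "?",      "?", "?"),   # G boundary
]

def covers(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

def vote(x):
    """Count version-space members voting positive vs. negative on x."""
    pos = sum(covers(h, x) for h in VERSION_SPACE)
    return pos, len(VERSION_SPACE) - pos

print(vote(("Sunny", "Warm", "Normal", "Strong", "Cool", "Change")))  # (6, 0)
print(vote(("Rainy", "Cold", "Normal", "Light", "Warm", "Same")))     # (0, 6)
print(vote(("Sunny", "Warm", "Normal", "Light", "Warm", "Same")))     # (3, 3)
```

A unanimous vote classifies the instance with certainty; a split vote means the learner must answer "don't know" (or use the vote as a confidence estimate).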
An UNBiased Learner

Idea: choose H that expresses every teachable concept (i.e. H is the power set of X).

Consider H' = disjunctions, conjunctions, negations over the previous H. E.g.

  <Sunny, Warm, Normal, ?, ?, ?> ∨ ¬<?, ?, ?, ?, ?, Change>

What are S, G in this case?

Inductive Bias

Consider:
- a concept learning algorithm L
- instances X, target concept c
- training examples Dc = { <x, c(x)> }
- let L(xi, Dc) denote the classification assigned to the instance xi by L after training on data Dc.

Definition: the inductive bias of L is any minimal set of assertions B such that for any target concept c and corresponding training examples Dc

  (for all xi in X) [ (B ∧ Dc ∧ xi) |- L(xi, Dc) ]

where A |- B means A logically entails B.

Inductive Systems and Equivalent Deductive Systems

Inductive system: training examples + new instance -> Candidate Elimination Algorithm using hypothesis space H -> classification of new instance, or "don't know".

Equivalent deductive system: training examples + new instance + the assertion "H contains the target concept" -> Theorem Prover -> classification of new instance, or "don't know". The inductive bias is made explicit.

Three Learners with Different Biases

1. Rote learner: store examples, classify x if and only if it matches a previously observed example
2. Version space candidate elimination algorithm
3. Find-S
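The gap between the unbiased H and the conjunctive H is easy to quantify: the power set of the 96-instance space contains 2^96 concepts, against only 973 semantically distinct conjunctions. A quick check of the counts used in this lecture:

```python
n_instances = 3 * 2 ** 5            # |X| = 96 possible days
n_concepts = 2 ** n_instances       # unbiased H = power set of X
n_conjunctive = 1 + 4 * 3 ** 5      # semantically distinct conjunctive hypotheses

print(n_instances)                  # 96
print(n_conjunctive)                # 973
print(n_concepts > 7 * 10 ** 28)    # True: about 7.9e28 teachable concepts
```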
Summary Points

1. Concept learning as search through H
2. General-to-specific ordering over H
3. Version space candidate elimination algorithm
4. S and G boundaries characterize learner's uncertainty
5. Learner can generate useful queries
6. Inductive leaps possible only if learner is biased
7. Inductive learners can be modelled by equivalent deductive systems

[Suggested reading: Mitchell, Chapter 2]
More informationClassification (Categorization) CS 391L: Machine Learning: Inductive Classification. Raymond J. Mooney. Sample Category Learning Problem
Classification (Categorization) CS 9L: Machine Learning: Inductive Classification Raymond J. Mooney University of Texas at Austin Given: A description of an instance, x X, where X is the instance language
More informationLecture 3: Decision Trees
Lecture 3: Decision Trees Cognitive Systems II - Machine Learning SS 2005 Part I: Basic Approaches of Concept Learning ID3, Information Gain, Overfitting, Pruning Lecture 3: Decision Trees p. Decision
More informationChapter 2 Limits and Continuity
4 Section. Capter Limits and Continuity Section. Rates of Cange and Limits (pp. 6) Quick Review.. f () ( ) () 4 0. f () 4( ) 4. f () sin sin 0 4. f (). 4 4 4 6. c c c 7. 8. c d d c d d c d c 9. 8 ( )(
More informationCSCE 478/878 Lecture 6: Bayesian Learning
Bayesian Methods Not all hypotheses are created equal (even if they are all consistent with the training data) Outline CSCE 478/878 Lecture 6: Bayesian Learning Stephen D. Scott (Adapted from Tom Mitchell
More informationConsider a function f we ll specify which assumptions we need to make about it in a minute. Let us reformulate the integral. 1 f(x) dx.
Capter 2 Integrals as sums and derivatives as differences We now switc to te simplest metods for integrating or differentiating a function from its function samples. A careful study of Taylor expansions
More informationMachine Learning 2007: Slides 1. Instructor: Tim van Erven Website: erven/teaching/0708/ml/
Machine 2007: Slides 1 Instructor: Tim van Erven (Tim.van.Erven@cwi.nl) Website: www.cwi.nl/ erven/teaching/0708/ml/ September 6, 2007, updated: September 13, 2007 1 / 37 Overview The Most Important Supervised
More informationLearning Decision Trees
Learning Decision Trees Machine Learning Fall 2018 Some slides from Tom Mitchell, Dan Roth and others 1 Key issues in machine learning Modeling How to formulate your problem as a machine learning problem?
More informationCOMPSCI 514: Algorithms for Data Science
COMPSCI 514: Algoritms for Data Science Arya Mazumdar University of Massacusetts at Amerst Fall 2018 Lecture 11 Locality Sensitive Hasing Midterm exam Average 28.96 out of 35 (82.8%). One exam is still
More informationLecture XVII. Abstract We introduce the concept of directional derivative of a scalar function and discuss its relation with the gradient operator.
Lecture XVII Abstract We introduce te concept of directional derivative of a scalar function and discuss its relation wit te gradient operator. Directional derivative and gradient Te directional derivative
More informationMath 161 (33) - Final exam
Name: Id #: Mat 161 (33) - Final exam Fall Quarter 2015 Wednesday December 9, 2015-10:30am to 12:30am Instructions: Prob. Points Score possible 1 25 2 25 3 25 4 25 TOTAL 75 (BEST 3) Read eac problem carefully.
More information2.1 THE DEFINITION OF DERIVATIVE
2.1 Te Derivative Contemporary Calculus 2.1 THE DEFINITION OF DERIVATIVE 1 Te grapical idea of a slope of a tangent line is very useful, but for some uses we need a more algebraic definition of te derivative
More informationIntroduction to Machine Learning
Introduction to Machine Learning Concept Learning Varun Chandola Computer Science & Engineering State University of New York at Buffalo Buffalo, NY, USA chandola@buffalo.edu Chandola@UB CSE 474/574 1 /
More informationThe derivative function
Roberto s Notes on Differential Calculus Capter : Definition of derivative Section Te derivative function Wat you need to know already: f is at a point on its grap and ow to compute it. Wat te derivative
More informationMAT 1339-S14 Class 2
MAT 1339-S14 Class 2 July 07, 2014 Contents 1 Rate of Cange 1 1.5 Introduction to Derivatives....................... 1 2 Derivatives 5 2.1 Derivative of Polynomial function.................... 5 2.2 Te
More informationDefinition of the Derivative
Te Limit Definition of te Derivative Tis Handout will: Define te limit grapically and algebraically Discuss, in detail, specific features of te definition of te derivative Provide a general strategy of
More informationEECS 349:Machine Learning Bryan Pardo
EECS 349:Machine Learning Bryan Pardo Topic 2: Decision Trees (Includes content provided by: Russel & Norvig, D. Downie, P. Domingos) 1 General Learning Task There is a set of possible examples Each example
More informationIntegral Calculus, dealing with areas and volumes, and approximate areas under and between curves.
Calculus can be divided into two ke areas: Differential Calculus dealing wit its, rates of cange, tangents and normals to curves, curve sketcing, and applications to maima and minima problems Integral
More information1 1. Rationalize the denominator and fully simplify the radical expression 3 3. Solution: = 1 = 3 3 = 2
MTH - Spring 04 Exam Review (Solutions) Exam : February 5t 6:00-7:0 Tis exam review contains questions similar to tose you sould expect to see on Exam. Te questions included in tis review, owever, are
More informationCombining functions: algebraic methods
Combining functions: algebraic metods Functions can be added, subtracted, multiplied, divided, and raised to a power, just like numbers or algebra expressions. If f(x) = x 2 and g(x) = x + 2, clearly f(x)
More informationMAT 145. Type of Calculator Used TI-89 Titanium 100 points Score 100 possible points
MAT 15 Test #2 Name Solution Guide Type of Calculator Used TI-89 Titanium 100 points Score 100 possible points Use te grap of a function sown ere as you respond to questions 1 to 8. 1. lim f (x) 0 2. lim
More informationChapters 19 & 20 Heat and the First Law of Thermodynamics
Capters 19 & 20 Heat and te First Law of Termodynamics Te Zerot Law of Termodynamics Te First Law of Termodynamics Termal Processes Te Second Law of Termodynamics Heat Engines and te Carnot Cycle Refrigerators,
More informationLesson 6: The Derivative
Lesson 6: Te Derivative Def. A difference quotient for a function as te form f(x + ) f(x) (x + ) x f(x + x) f(x) (x + x) x f(a + ) f(a) (a + ) a Notice tat a difference quotient always as te form of cange
More informationAVL trees. AVL trees
Dnamic set DT dnamic set DT is a structure tat stores a set of elements. Eac element as a (unique) ke and satellite data. Te structure supports te following operations. Searc(S, k) Return te element wose
More information