Lecture 23: Dynamic Programming: the algorithm
- Laura Jody Parker
1 Lecture 23: Dynamic Programming: the algorithm. University of Southern California, Linguistics 2 (USC Linguistics). November 1, 201.
2 Shortest Path problem. [Network diagram of cities: SEA, LA, DEN, DAL, DET, CLE, CHI, DC, NY, PHL.]
3 Shortest Path problem. How do we solve this? Solution 1: exhaustive search. How do we do that? For simplicity, we change the names of the cities to A, B, C, etc.
4 Exhaustive Search. [Network diagram: A connects to B and C; B and C each connect to D, E, F; those connect to G, H, I; all of which connect to J.] Example paths: ABDGJ, ABEHJ, ABEGJ, ... Cost(ABDGJ) = the sum of the four edge costs along A-B-D-G-J.
5 Exhaustive Search Algorithm. List every possible path. For each path, add up the costs of its edges. Find the minimal sum. The answer is the path corresponding to the minimum sum.
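The steps above can be sketched in a few lines of Python. The edge costs below are invented for illustration (the slides' numeric costs did not survive transcription); they are chosen so that the best path comes out as A B D G J, the same winner as on the results slide:

```python
from itertools import product

# Hypothetical edge costs (illustration only; the lecture's actual numbers
# differ). Keys are nodes; values map each successor to the edge cost.
edges = {
    "A": {"B": 2, "C": 1},
    "B": {"D": 4, "E": 6, "F": 7},
    "C": {"D": 6, "E": 5, "F": 8},
    "D": {"G": 3, "H": 5, "I": 6},
    "E": {"G": 5, "H": 4, "I": 7},
    "F": {"G": 6, "H": 5, "I": 4},
    "G": {"J": 8},
    "H": {"J": 9},
    "I": {"J": 10},
}

# The states available at each stage of the network.
stages = [["A"], ["B", "C"], ["D", "E", "F"], ["G", "H", "I"], ["J"]]

def exhaustive_shortest_path():
    """List every path, sum its edge costs, and keep the minimum."""
    best = None
    for path in product(*stages):  # every combination of per-stage choices
        cost = sum(edges[u][v] for u, v in zip(path, path[1:]))
        if best is None or cost < best[1]:
            best = ("".join(path), cost)
    return best

print(exhaustive_shortest_path())  # ('ABDGJ', 17) with these made-up costs
```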
6 Exhaustive Search. [The same network diagram, nodes A through J.]
7 Exhaustive Search results.
ABDGJ = 30   ABDHJ = 33   ABDIJ = 3
ABEGJ = 32   ABEHJ = 34   ABEHI = 3
ABFGJ = 3    ABFHJ = 3    ABFHI = 3
ACDGJ = 34   ACDHJ = 3    ACDIJ = 3
ACEGJ = 31   ACEHJ = 33   ACEHI = 34
ACFGJ = 33   ACFHJ = 34   ACFHI = 34
8 Exhaustive Search: So what's the problem? For this small network, there is no problem. But for a general network, it is too slow. We can think of the network as consisting of stages, and at each stage there are several states. At each stage, the decision is between the states in the next stage. The total number of decision combinations = (# states in stage 1) x (# states in stage 2) x (# states in stage 3) x ... As the number of states increases, the number of combinations of decisions to be considered gets to be very large, staggeringly so. For example, if there are 20 stages, each with 10 states (not an unrealistic real-world problem), the number of paths to be considered is 10^20, that is, 1 followed by 20 zeroes. Also, in this algorithm, the cost associated with sub-paths needs to be calculated over and over (e.g., cost(AB) and cost(GJ) are each recomputed once for every path that uses them).
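Both costs of exhaustive search, the combinatorial blow-up and the repeated recomputation of sub-path costs, can be checked directly on the small network. A Python sketch (the stage layout matches the slides; the counts below are a property of the layout alone, not of any particular edge costs):

```python
from itertools import product
from collections import Counter

# Stage layout of the small network from the slides.
stages = [["A"], ["B", "C"], ["D", "E", "F"], ["G", "H", "I"], ["J"]]

paths = list(product(*stages))
# 1 x 2 x 3 x 3 x 1 = 18 paths, matching the 18 results listed on slide 7.
print(len(paths))  # 18

# Count how many times each edge's cost is (re)added by exhaustive search.
edge_uses = Counter()
for path in paths:
    for u, v in zip(path, path[1:]):
        edge_uses[(u, v)] += 1

print(edge_uses[("A", "B")])  # 9: once per path that passes through B
print(edge_uses[("G", "J")])  # 6: once per path that passes through G
```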
9 Greedy Search (Suckers!). Alternative strategy: at every choice point, choose the lowest-cost path. Greedy, short-sighted, doesn't consider possible future paths. It seems like it should work, but it doesn't.
10 Greedy Search (Suckers!). [Network diagram, nodes A through J.] Greedy result: A C E H J = 33.
11 Greedy Search (Suckers!). So it doesn't work. The optimal solution (ABDGJ) has a cost of 30. Why doesn't it work? Selecting a path at an early stage may eliminate cheap paths at later stages.
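Greedy search is easy to code, which is part of its appeal. A sketch with invented edge costs, chosen so that, as in the slides, greedy walks A C E H J while the true optimum is A B D G J (the numeric costs themselves are made up):

```python
# Hypothetical edge costs, illustration only.
edges = {
    "A": {"B": 2, "C": 1},
    "B": {"D": 4, "E": 6, "F": 7},
    "C": {"D": 6, "E": 5, "F": 8},
    "D": {"G": 3, "H": 5, "I": 6},
    "E": {"G": 5, "H": 4, "I": 7},
    "F": {"G": 6, "H": 5, "I": 4},
    "G": {"J": 8},
    "H": {"J": 9},
    "I": {"J": 10},
}

def greedy_path(start="A", goal="J"):
    """At every choice point, take the lowest-cost outgoing edge."""
    path, total, node = start, 0, start
    while node != goal:
        nxt = min(edges[node], key=edges[node].get)  # short-sighted choice
        total += edges[node][nxt]
        path += nxt
        node = nxt
    return path, total

print(greedy_path())  # ('ACEHJ', 19) -- but the optimum here is ABDGJ at 17
```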
12 Dynamic Programming Approach. Dynamic Programming is an alternative search strategy that is faster than exhaustive search, slower than greedy search, but gives the optimal solution. View a problem as consisting of subproblems. Aim: solve the main problem. To achieve that aim, you need to solve some subproblems. To achieve the solution to these subproblems, you need to solve a set of subsubproblems. And so on... Dynamic Programming works when the subproblems have similar forms, and when the tiniest subproblems have very easy solutions.
13 Dynamic Programming B D G A C E H J F I Main Problem: Shortest path from A to J: V AJ DP Thinking: Let s say I know the best paths from B to J and C to J: V BJ and V CJ. Then we would add V BJ to the cost from A to B, V CJ to the cost from C to J, compare the two, and pick the least. So we now have 2 subproblems V BJ and V CJ. If we could solve those subproblems, we could solve the main problem V AJ. But now we can think of V BJ as its own problem, and then repeat the thinking: If I knew the solutions to V DJ, V EJ, and V FJ, then we can solve V BJ. Same with V CJ. Continue thinking in this way till we get to: V GJ, V HJ, V IJ, which are easy to solve! Linguistics 2 (USC Linguistics) Lecture 23: Dynamic Programming: the algorithm November 1, / 1
14 Dynamic Programming B D G A C E H J F I V AJ = min + VBJ +V CJ Linguistics 2 (USC Linguistics) Lecture 23: Dynamic Programming: the algorithm November 1, 201 / 1
15 Dynamic Programming.
V_AJ = min( cost(A,B) + V_BJ, cost(A,C) + V_CJ )
V_BJ = min( cost(B,D) + V_DJ, cost(B,E) + V_EJ, cost(B,F) + V_FJ )
V_CJ = min( cost(C,D) + V_DJ, cost(C,E) + V_EJ, cost(C,F) + V_FJ )
16 Dynamic Programming.
V_AJ = min( cost(A,B) + V_BJ, cost(A,C) + V_CJ )
V_BJ = min( cost(B,D) + V_DJ, cost(B,E) + V_EJ, cost(B,F) + V_FJ )
V_CJ = min( cost(C,D) + V_DJ, cost(C,E) + V_EJ, cost(C,F) + V_FJ )
V_DJ = min( cost(D,G) + V_GJ, cost(D,H) + V_HJ, cost(D,I) + V_IJ )
V_EJ = min( cost(E,G) + V_GJ, cost(E,H) + V_HJ, cost(E,I) + V_IJ )
V_FJ = min( cost(F,G) + V_GJ, cost(F,H) + V_HJ, cost(F,I) + V_IJ )
17 Dynamic Programming. The complete set of equations, including the base cases:
V_GJ = cost(G,J)
V_HJ = cost(H,J)
V_IJ = cost(I,J)
V_DJ = min( cost(D,G) + V_GJ, cost(D,H) + V_HJ, cost(D,I) + V_IJ )
V_EJ = min( cost(E,G) + V_GJ, cost(E,H) + V_HJ, cost(E,I) + V_IJ )
V_FJ = min( cost(F,G) + V_GJ, cost(F,H) + V_HJ, cost(F,I) + V_IJ )
V_BJ = min( cost(B,D) + V_DJ, cost(B,E) + V_EJ, cost(B,F) + V_FJ )
V_CJ = min( cost(C,D) + V_DJ, cost(C,E) + V_EJ, cost(C,F) + V_FJ )
V_AJ = min( cost(A,B) + V_BJ, cost(A,C) + V_CJ )
18 Dynamic Programming. Filling in the numbers, working backward: first the base cases V_GJ, V_HJ, V_IJ; then V_DJ, V_EJ, V_FJ; then V_BJ and V_CJ; and finally V_AJ = min( cost(A,B) + V_BJ, cost(A,C) + V_CJ ) = min(30, 31) = 30. The optimal path is A B D G J, matching the exhaustive-search result (ABDGJ = 30).
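The backward pass of slide 18 can be run bottom-up: start from the final edges, fill in each V value exactly once, and record the winning successor at each node so the optimal path can be read off at the end. A sketch with invented edge costs (with the lecture's actual costs the same procedure yields 30 via ABDGJ):

```python
# Hypothetical edge costs, illustration only.
edges = {
    "A": {"B": 2, "C": 1},
    "B": {"D": 4, "E": 6, "F": 7},
    "C": {"D": 6, "E": 5, "F": 8},
    "D": {"G": 3, "H": 5, "I": 6},
    "E": {"G": 5, "H": 4, "I": 7},
    "F": {"G": 6, "H": 5, "I": 4},
    "G": {"J": 8},
    "H": {"J": 9},
    "I": {"J": 10},
}
stages = [["A"], ["B", "C"], ["D", "E", "F"], ["G", "H", "I"], ["J"]]

V = {"J": 0}        # base case: from J, the remaining cost is zero
best_next = {}      # winning successor at each node
for stage in reversed(stages[:-1]):          # stages 4, 3, 2, then 1
    for node in stage:
        nxt = min(edges[node], key=lambda m: edges[node][m] + V[m])
        V[node] = edges[node][nxt] + V[nxt]  # each V computed exactly once
        best_next[node] = nxt

# Read off the optimal path by following the recorded choices from A.
path, node = "A", "A"
while node != "J":
    node = best_next[node]
    path += node
print(path, V["A"])  # ABDGJ 17 with these made-up costs
```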
19 Programming the problem solutions. We are going to learn to write code for the dynamic programming solution to our little shortest path problem. Why? Programming requires explicit understanding of the problem and the solution. Programming allows you to test your understanding of the problem and the solution (you don't need anyone to tell you if you are right; the program will work or not). Programming requires abstract characterization of the problem and the solution. The abstraction sometimes gives you new insight into your problem, and also into how it might be similar to other problems, giving you insight into them as well. We begin by programming the greedy algorithm, because it is simpler.
20 Programming greedy search. [Network diagram, nodes A through J.] The first step is to replace the letter names for the nodes (states) with a numbering scheme (i, j): i is the stage number, and j is the state number within a stage. We do that so we can write our code in a really simple way, and so the code can immediately be generalized to other networks with different numbers of stages and states.
21 Programming Greedy Search. [The network relabeled with (stage, state) pairs: (1,1); (2,1), (2,2); (3,1), (3,2), (3,3); (4,1), (4,2), (4,3); and the final node (5,1).]
22 Defining the network. Create a matrix called costs which has four dimensions: costs(i,j,k,n), where i = stage of the begin node, j = state of the begin node, k = stage of the end node, n = state of the end node.
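In Python, this four-dimensional costs matrix can be sketched as a dictionary keyed by (i, j, k, n) tuples. The numeric values below are invented, since the slide leaves them blank; the set of index tuples matches the edges enumerated on the next slide:

```python
# costs[(i, j, k, n)] = cost of the edge from state j of stage i
#                       to state n of stage k. Values are hypothetical.
costs = {
    (1, 1, 2, 1): 2, (1, 1, 2, 2): 1,
    (2, 1, 3, 1): 4, (2, 1, 3, 2): 6, (2, 1, 3, 3): 7,
    (2, 2, 3, 1): 6, (2, 2, 3, 2): 5, (2, 2, 3, 3): 8,
    (3, 1, 4, 1): 3, (3, 1, 4, 2): 5, (3, 1, 4, 3): 6,
    (3, 2, 4, 1): 5, (3, 2, 4, 2): 4, (3, 2, 4, 3): 7,
    (3, 3, 4, 1): 6, (3, 3, 4, 2): 5, (3, 3, 4, 3): 4,
    (4, 1, 5, 1): 8, (4, 2, 5, 1): 9, (4, 3, 5, 1): 10,
}

def successors(i, j):
    """All (k, n, cost) edges leaving state (i, j)."""
    return [(k, n, c) for (a, b, k, n), c in costs.items() if (a, b) == (i, j)]

print(len(costs))        # 20 edges in the small network
print(successors(1, 1))  # [(2, 1, 2), (2, 2, 1)]
```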
23 Defining the network. [The network diagram with (stage, state) labels.]
costs(1,1,2,1) = ; costs(1,1,2,2) = ;
costs(2,1,3,1) = ; costs(2,1,3,2) = ; costs(2,1,3,3) = ;
costs(2,2,3,1) = ; costs(2,2,3,2) = ; costs(2,2,3,3) = ;
costs(3,1,4,1) = ; costs(3,1,4,2) = ; costs(3,1,4,3) = ;
costs(3,2,4,1) = ; costs(3,2,4,2) = ; costs(3,2,4,3) = ;
costs(3,3,4,1) = ; costs(3,3,4,2) = ; costs(3,3,4,3) = ;
costs(4,1,5,1) = ; costs(4,2,5,1) = ; costs(4,3,5,1) = ;
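With the costs table defined, the greedy algorithm falls out in a few lines: start at (1,1), and at each step take the cheapest edge leaving the current (stage, state). A sketch, again with invented cost values standing in for the slide's blank entries:

```python
# Hypothetical costs table in the slide's (i, j, k, n) indexing.
costs = {
    (1, 1, 2, 1): 2, (1, 1, 2, 2): 1,
    (2, 1, 3, 1): 4, (2, 1, 3, 2): 6, (2, 1, 3, 3): 7,
    (2, 2, 3, 1): 6, (2, 2, 3, 2): 5, (2, 2, 3, 3): 8,
    (3, 1, 4, 1): 3, (3, 1, 4, 2): 5, (3, 1, 4, 3): 6,
    (3, 2, 4, 1): 5, (3, 2, 4, 2): 4, (3, 2, 4, 3): 7,
    (3, 3, 4, 1): 6, (3, 3, 4, 2): 5, (3, 3, 4, 3): 4,
    (4, 1, 5, 1): 8, (4, 2, 5, 1): 9, (4, 3, 5, 1): 10,
}

def greedy(n_stages=5):
    """Follow the cheapest outgoing edge from (1,1) to the final stage."""
    i, j = 1, 1
    path, total = [(1, 1)], 0
    while i < n_stages:
        # Edges leaving the current node (i, j).
        options = {(k, n): c for (a, b, k, n), c in costs.items()
                   if (a, b) == (i, j)}
        (k, n), c = min(options.items(), key=lambda kv: kv[1])
        total += c
        path.append((k, n))
        i, j = k, n
    return path, total

print(greedy())
# ([(1, 1), (2, 2), (3, 2), (4, 2), (5, 1)], 19)  -- i.e., A C E H J
```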
More informationLecture 18: Inner Product, Similarity, and Loops
Lecture 18: Inner Product, Similarity, and Loops University of Southern California Linguistics 285 USC Linguistics October 28, 2015 Linguistics 285 (USC Linguistics) Lecture 18: Inner Product, Similarity,
More informationAutomata Theory CS Complexity Theory I: Polynomial Time
Automata Theory CS411-2015-17 Complexity Theory I: Polynomial Time David Galles Department of Computer Science University of San Francisco 17-0: Tractable vs. Intractable If a problem is recursive, then
More informationP vs. NP. Data Structures and Algorithms CSE AU 1
P vs. NP Data Structures and Algorithms CSE 373-18AU 1 Goals for today Define P, NP, and NP-complete Explain the P vs. NP problem -why it s the biggest open problem in CS. -And what to do when a problem
More informationLecture 7 - Momentum. A Puzzle... Momentum. Basics (1)
Lecture 7 - omentum A Puzzle... An Experiment on Energy The shortest configuration of string joining three given points is the one where all three angles at the point of intersection equal 120. How could
More informationPreface.
This document was written and copyrighted by Paul Dawkins. Use of this document and its online version is governed by the Terms and Conditions of Use located at. The online version of this document is
More informationLinear Programming Duality
Summer 2011 Optimization I Lecture 8 1 Duality recap Linear Programming Duality We motivated the dual of a linear program by thinking about the best possible lower bound on the optimal value we can achieve
More informationMulti-Level Logic Optimization. Technology Independent. Thanks to R. Rudell, S. Malik, R. Rutenbar. University of California, Berkeley, CA
Technology Independent Multi-Level Logic Optimization Prof. Kurt Keutzer Prof. Sanjit Seshia EECS University of California, Berkeley, CA Thanks to R. Rudell, S. Malik, R. Rutenbar 1 Logic Optimization
More information