Problem-Solving via Search Lecture 3


Lecture 3: What is a search problem? How do search algorithms work, and how do we evaluate their performance?

Agenda: an example task, problem formulation, infrastructure for search algorithms, complexity analysis.

A Motivating Task. Start: Arad, Romania. Goal: Bucharest, Romania. Roads lead to Sibiu, Timisoara, and Zerind. What is a rational agent to do?

Wentworth Institute of Technology, COMP3770 Artificial Intelligence, Summer 2017, Derbinsky. Add Geographical Knowledge.

Add Abstraction.

Describe the Task. Observability: full. Certainty: deterministic. Representation: discrete. A priori knowledge: known. Under these conditions we can search for a problem solution: a fixed sequence of actions. Given a perfect model, this can be done open-loop (i.e., ignoring percepts).

Search Problem Formalism. A search problem is defined via the following components: the initial state the agent starts in; a successor/transition function S(x) = {action -> (state, cost)}; a goal test G(x) = true/false, which returns true if a given state is a goal state; and a path cost that assigns a numeric cost to each path, typically assumed to be the sum of action costs. A solution is a sequence of actions leading from the initial state to a goal state (optimal = lowest path cost). Together, the initial state and successor function implicitly define the state space: the set of all reachable states.
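The four components above can be sketched as a minimal Python interface. This is illustrative only: the class and field names (`SearchProblem`, `successor`, `path_cost`) are assumptions, not part of the slides.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class SearchProblem:
    """Minimal search-problem formalism: an initial state, a successor
    function, and a goal test. Path cost is the sum of action costs."""
    initial_state: str
    # successor(state) -> {action: (next_state, cost)}
    successor: Callable[[str], Dict[str, Tuple[str, float]]]
    goal_test: Callable[[str], bool]

    def path_cost(self, state: str, actions) -> float:
        """Cost of following `actions` from `state`: sum of step costs."""
        total = 0.0
        for a in actions:
            state, step = self.successor(state)[a]
            total += step
        return total
```

A solution is then any action list for which applying the actions from `initial_state` ends in a state passing `goal_test`; the optimal solution minimizes `path_cost`.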

Example: Romanian Travel. Initial state: Arad. Successor function: adjacency, cost = road distance. Goal test: city = Bucharest? State space: cities.
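The Romania example can be written down directly as data plus two small functions. The distances below are a fragment of the standard Russell & Norvig road map (an assumption; the slide itself shows no numbers):

```python
# A fragment of the Romania road map (distances in km, taken from the
# standard Russell & Norvig example; only a few cities are shown).
roads = {
    "Arad":           {"Sibiu": 140, "Timisoara": 118, "Zerind": 75},
    "Sibiu":          {"Arad": 140, "Fagaras": 99, "Rimnicu Vilcea": 80},
    "Fagaras":        {"Sibiu": 99, "Bucharest": 211},
    "Rimnicu Vilcea": {"Sibiu": 80, "Pitesti": 97},
    "Pitesti":        {"Rimnicu Vilcea": 97, "Bucharest": 101},
    "Bucharest":      {"Fagaras": 211, "Pitesti": 101},
}

initial_state = "Arad"

def successors(city):
    """Adjacency with cost = road distance."""
    return roads[city]

def goal_test(city):
    """City == Bucharest?"""
    return city == "Bucharest"

def path_cost(path):
    """Sum of road distances along a list of cities."""
    return sum(roads[a][b] for a, b in zip(path, path[1:]))
```

For instance, Arad-Sibiu-Fagaras-Bucharest costs 450, while the route through Rimnicu Vilcea and Pitesti costs 418.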

Example: Pacman. Initial state: the starting game configuration. Successor function: actions such as N or E, each with cost 1.0. State space: all game configurations. Goal test: no more food. (Figure: Pacman game states.)

State Abstraction. Often world states are absurdly complex. To make a particular problem tractable, we abstract the state to represent only the details necessary to solve the problem.

Abstraction is Necessary. World state: agent positions: 120; food count: 30; ghost positions: 12; agent facing: NSEW. How many world states? 120 x 2^30 x 12^2 x 4. States for path planning? 120. States for eat-all-dots? 120 x 2^30.
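The counting on this slide is just arithmetic, and working it out makes the point concrete: abstracting away irrelevant detail shrinks the state space by many orders of magnitude.

```python
positions   = 120      # possible agent (x, y) locations
food_cells  = 30       # each dot present or eaten: 2**30 configurations
ghost_cells = 12       # two ghosts, each in one of 12 cells
facings     = 4        # agent facing: N, S, E, W

# Full world state: every detail, whether or not the task needs it.
world_states = positions * 2**food_cells * ghost_cells**2 * facings

# Path planning only needs the agent's position.
path_planning_states = positions

# Eat-all-dots needs position plus the food grid.
eat_all_dots_states = positions * 2**food_cells

print(world_states)    # 74,217,034,874,880 full world states
```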

Exercise: Abstractions. Path planning: states (x, y); actions NSEW; successor (x', y'); goal test (x, y) = end. Eat all the dots: states ((x, y), T/F food grid); actions NSEW; successor (x', y'), possibly with a T/F change in the grid; goal test: grid is all false.

Exercise. Define the vacuum-cleaner search problem (two locations, A and B): world state representation, search state representation, transition model, state space, goal test.

Solution: State Space Graph. (Figure: the eight vacuum-world states, connected by the actions L (move left), R (move right), and S (suck).)

State Space Graph. A state space graph is a mathematical representation of a search problem: nodes are (abstracted) world configurations; arcs represent successors (action results); the goal test is a set of goal node(s). In a search graph, each state occurs only once! We can rarely build this full graph in memory (it is too big), but it is a useful idea. (Figure: a small example graph with start state S, intermediate states, and goal G.)

Search Tree. A "what if" tree of plans and their outcomes: the start state is the root node; children correspond to successors; nodes show states, but correspond to the PLANS that achieve those states. For most problems, we can never actually build the whole tree.

State Space Graph vs. Search Tree. Each NODE in the search tree corresponds to an entire PATH in the state space graph. We construct both on demand, and we construct as little as possible. (Figure: the example graph from the previous slide, side by side with its search tree rooted at S.)

Exercise. Consider the following 4-state state space graph (states S, a, b, and G). How big is its search tree (from S)?

Searching for Solutions. Basic idea: incrementally build a search tree until a goal state is found. The root is the initial state; expand via the transition function to create new nodes. Nodes that have not been expanded are leaf nodes and form the frontier (open list). Different search strategies (next lecture) choose the next node to expand (as few as possible!). Use a closed list to avoid expanding the same state more than once.

General Algorithm. The frontier can be managed as a queue (FIFO), a stack (LIFO), or a priority queue.
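The two slides above can be combined into one sketch: a generic search loop where only the frontier data structure changes between strategies. This is a minimal illustration, not the course's reference implementation; the method names on `problem` are assumptions.

```python
import heapq

def graph_search(problem, strategy="bfs"):
    """Generic search with a closed list. The frontier data structure
    picks the strategy: FIFO queue -> breadth-first, LIFO stack ->
    depth-first, priority queue keyed on path cost -> uniform cost.
    `problem` needs .initial, .successors(state) -> [(action, next, cost)],
    and .is_goal(state) (illustrative names)."""
    frontier = [(0.0, problem.initial, [])]   # (path cost, state, plan)
    closed = set()                            # states already expanded
    while frontier:
        if strategy == "bfs":
            cost, state, plan = frontier.pop(0)     # FIFO queue
        elif strategy == "dfs":
            cost, state, plan = frontier.pop()      # LIFO stack
        else:                                       # "ucs"
            cost, state, plan = heapq.heappop(frontier)
        if problem.is_goal(state):
            return plan, cost
        if state in closed:                   # closed list: skip repeats
            continue
        closed.add(state)
        for action, nxt, step in problem.successors(state):
            child = (cost + step, nxt, plan + [action])
            if strategy == "ucs":
                heapq.heappush(frontier, child)
            else:
                frontier.append(child)
    return None

# Tiny example: A -> B (cost 1) -> C (cost 1), or A -> C directly (cost 5).
class Tiny:
    initial = "A"
    def successors(self, s):
        return {"A": [("ab", "B", 1), ("ac", "C", 5)],
                "B": [("bc", "C", 1)], "C": []}[s]
    def is_goal(self, s):
        return s == "C"
```

On `Tiny`, uniform-cost search returns the cheap two-step plan, while breadth-first returns the direct (but costlier) single action: a preview of the completeness/optimality trade-offs evaluated on the next slide.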

Evaluating a Search Strategy. Solution quality: completeness (does it always find a solution if one exists?) and optimality (does it always find a least-cost solution?). Efficiency: time complexity (number of nodes generated/expanded) and space complexity (maximum number of nodes in memory).

Computational Complexity (A.1). We are going to be comparing several algorithms; how do we tell if one is faster/leaner than another? Benchmarking involves running the algorithm on a computer and measuring performance (e.g., time in seconds, memory in bytes). This is unsatisfactory: results are specific to the machine, implementation, compiler, inputs, etc. Complexity analysis is a mathematical approach that abstracts away from these details.

Asymptotic Analysis. Basic idea: get a sense of the rate of growth of an algorithm, which tells us how bad things will get as the problem size grows. Example:

    def summation(l):
        sum = 0
        for n in l:
            sum += n
        return sum

    >>> summation(range(5))
    10

Step 1: Identify Size Parameter. We abstract over the input and identify the parameter that characterizes its size. For summation, what matters is the length of the input list; we will refer to this as n.

Step 2: Identify Performance Measure. Again, abstract over the implementation: find a measure that reflects running time (or memory usage) without being tied to a particular computer. For summation it could be lines executed, or operations (additions, assignments) performed; call this f(n). If f(n) counts lines executed, then f(n) = 2n + 2.
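One way to see where f(n) = 2n + 2 comes from is to instrument the function with an explicit counter. The counting convention here follows the slide: one line for the initialization, two per element (loop line plus addition), one for the return.

```python
def summation_counted(l):
    """summation(l), instrumented to count 'lines executed':
    1 for `sum = 0`, 2 per element, 1 for `return` => f(n) = 2n + 2."""
    count = 1           # sum = 0
    total = 0
    for n in l:
        count += 2      # loop line + `total += n`
        total += n
    count += 1          # return
    return total, count

result, steps = summation_counted(range(5))
# For n = 5 elements: f(5) = 2*5 + 2 = 12 lines executed.
```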

Step 3: Identify Comparison Metric. A size parameter typically cannot characterize performance perfectly (different inputs of the same size may behave differently), so we settle for a representative metric. The most common is the worst case; sometimes the best case or average case is used.

Step 4: Approximation. Typically it is hard to compute f(n) exactly, so we settle for an approximation. For the worst case, Big-O notation, O(), yields this formal asymptotic analysis: f(n) = O(g(n)) as n -> infinity iff there exist c, k in N such that f(n) <= c * g(n) for all n > k.
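The definition can be spot-checked numerically: given candidate witnesses c and k, test f(n) <= c * g(n) over a range of n. The helper name `bounded_above` is an invention for this sketch, and passing the check over a finite range is evidence, not a proof.

```python
def bounded_above(f, g, c, k, n_max=10_000):
    """Spot-check the Big-O definition: f(n) <= c * g(n) for all
    n with k < n <= n_max. A finite check, not a proof."""
    return all(f(n) <= c * g(n) for n in range(k + 1, n_max + 1))

f = lambda n: 2 * n + 2     # line count of summation
g = lambda n: n

print(bounded_above(f, g, c=3, k=2))   # True: 2n + 2 <= 3n once n > 2
```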

Big-O Definition, Visually. (Figure: f(n) eventually bounded above by c * g(n) for all n > k.)

Example. Since f(n) = 2n + 2, we can show that this function is O(n) with c = 3 and k = 2. (Figure: plot of 2n + 2 against 3n for n from 0 to 15; the 3n line dominates once n > 2.)

Exercise. Prove: 5n^2 + 3n + 9 = O(n^2).

Solution. Find c and k such that c * n^2 > 5n^2 + 3n + 9 for all n > k. 1. Set c * n^2 = 5n^2 + 3n + 9. 2. Let n = k and solve: c = 5 + 3/k + 9/k^2; if k = 3, then c = 7. 3. So 7n^2 > 5n^2 + 3n + 9 for all n > 3, and thus 5n^2 + 3n + 9 = O(n^2).
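As a quick sanity check, the witnesses c = 7, k = 3 can be confirmed numerically over a finite range (a spot check on top of the proof, not a replacement for it):

```python
# 7n^2 > 5n^2 + 3n + 9 should hold for every n > 3 ...
assert all(7 * n**2 > 5 * n**2 + 3 * n + 9 for n in range(4, 10_000))

# ... and at n = 3 the two sides are exactly equal: the crossover point,
# which is why k = 3 (strictly greater than) works with c = 7.
assert 7 * 3**2 == 5 * 3**2 + 3 * 3 + 9   # 63 == 63
```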

Order of Complexity. Sequencing: O(A) + O(B) = max(O(A), O(B)); the slower part of an algorithm dominates the faster part. Nesting: O(A) * O(B) = O(A * B).

Exercise. Prove: (x^2 + 1) / (x + 1) = O(x).

Solution. O((x^2 + 1) / (x + 1)) = O(x^2 + 1) / O(x + 1) = O(x^2) / O(x) = O(x^2 / x) = O(x).

Big-O Numerically.

    Big-O       Term                        Cost for n=10    Cost for n=100
    O(1)        Constant                    1                1
    O(log n)    Logarithmic                 3                7
    O(n)        Linear                      10               100
    O(n log n)  Log-linear (linearithmic)   33               664
    O(n^2)      Quadratic                   100              10,000
    O(2^n)      Exponential                 1,024            1.27E30
    O(n!)       Factorial                   3,628,800        9.33E157

It is important to know this ranking of growth!
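The table entries can be regenerated in a few lines, which is a useful exercise in its own right. The rounding convention is an assumption (base-2 logarithm, rounded to the nearest integer), chosen because it reproduces the table's values.

```python
import math

def cost(term, n):
    """Evaluate one growth term at n, rounding logs to nearest integer."""
    return {
        "1":       1,
        "log n":   round(math.log2(n)),
        "n":       n,
        "n log n": round(n * math.log2(n)),
        "n^2":     n * n,
        "2^n":     2 ** n,
        "n!":      math.factorial(n),
    }[term]

for term in ("1", "log n", "n", "n log n", "n^2", "2^n", "n!"):
    print(f"{term:8} n=10: {cost(term, 10)}")
```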

Asymptotic Visual. (Figure: growth curves, fastest-growing to slowest: O(n!), O(2^n), O(n^2), O(n log n), O(n), O(log n), O(1).)

Asymptotic Visual (zoomed). (Figure: the slower-growing curves: O(n log n), O(n), O(log n), O(1).)

Exercise. Choose all g(n) for which the following statement is true: 10n^2 + 14n + 12 = O(g(n)). Candidates: 1, log(n), n, n log n, n^2, 2^n, n!.

Answer. The statement is true for g(n) = n^2, 2^n, and n!. Big-O only requires an upper bound, so any g(n) that grows at least as fast as n^2 works.

Example: O(1). Cost stays constant regardless of problem size. Examples: checking even/odd, hash computation, array indexing.

Example: O(log n). The inverse of exponential: as you double the problem size, resource consumption increases by a constant. Examples: binary search, balanced tree search.
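Binary search is the canonical O(log n) algorithm: each comparison halves the remaining candidate range, so a list twice as long costs only one extra step. A minimal sketch:

```python
def binary_search(sorted_list, target):
    """O(log n) search over a sorted list: halve the candidate range
    each step. Returns an index of `target`, or None if absent."""
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            lo = mid + 1      # target is in the upper half
        else:
            hi = mid - 1      # target is in the lower half
    return None

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
```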

Example: O(n log n). Performing an O(log n) operation for each item in the input; typical of efficient sorting.

Example: O(n^2). For each item, perform an operation with every other item. Examples: duplicate detection, pairwise comparison, bubble sort.
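Duplicate detection by pairwise comparison makes the quadratic pattern explicit: two nested loops over the input, comparing each item with every item after it.

```python
def has_duplicate(items):
    """O(n^2) duplicate detection: compare every pair of items
    (roughly n*(n-1)/2 comparisons in the worst case)."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):   # each later item
            if items[i] == items[j]:
                return True
    return False

print(has_duplicate([3, 1, 4, 1, 5]))   # True (1 appears twice)
print(has_duplicate([3, 1, 4]))         # False
```

(A hash set would bring this down to expected O(n), which is exactly the kind of improvement asymptotic analysis makes visible.)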

Example: O(2^n). For every added element, resource consumption doubles. Examples: hardware verification, cryptography.

Complexity Analysis. Thus far we have analyzed algorithms, but complexity analysis focuses on problems and classes of problems. Problems that can be solved in polynomial time, O(n^k), form the class P; these are generally considered easy (though the constants can be large). Problems for which a solution can be verified in polynomial time form NP; the hardest problems in NP are NP-complete. Open question: P = NP? Most computer scientists assume not; if they are correct, there can be no algorithm that solves all NP-complete problems in polynomial time. AI is interested in developing algorithms that perform efficiently on typical problems drawn from a pre-determined distribution.

Complex[ity] Humor (TSP). (Cartoon slide.)

Complex[ity] Humor (AI). (Cartoon slide.)

Summary. We can represent deterministic, fully observable, discrete, known tasks as search problems: initial state, transition function, goal test, path cost. The state space is the set of all states reachable from the initial state; a solution is an action sequence from the initial state to a goal state, and an optimal solution has the least path cost. We abstract the search state representation, depending on the search problem, for computational tractability. Once formulated, we solve a search problem by incrementally building a search tree until a goal state is found. We evaluate algorithms with respect to solution completeness/optimality and time/space complexity. More next lecture!