The Knapsack Problem. COSC 3101A - Design and Analysis of Algorithms, Lecture 9: Fractional Knapsack Problem.


Knapsack Problem, Huffman Codes, Introduction to Graphs. Many of these slides are taken from Monica Nicolescu, Univ. of Nevada, Reno, monica@cs.unr.edu.

The 0-1 knapsack problem
A thief robbing a store finds n items: the i-th item is worth v_i dollars and weighs w_i pounds (v_i, w_i integers). The thief can only carry W pounds in his knapsack. Items must be taken entirely or left behind. Which items should the thief take to maximize the value of his load?

The fractional knapsack problem
Similar to the above, except that the thief can take fractions of items.

Fractional Knapsack Problem
Knapsack capacity: W. There are n items: the i-th item has value v_i and weight w_i. Goal: find x_i, with 0 <= x_i <= 1 for i = 1, 2, ..., n, such that Σ w_i x_i <= W and Σ x_i v_i is maximum.

Greedy strategy 1: pick the item with the maximum value.
E.g.: W = 1; w_1 = 100, v_1 = 2; w_2 = 1, v_2 = 1. Taking from the item with the maximum value, the total value taken is v_1/w_1 = 2/100, smaller than what the thief can take by choosing the other item: total value (choose item 2) = v_2/w_2 = 1.

Greedy strategy 2: pick the item with the maximum value per pound v_i/w_i.
If the supply of that item is exhausted and the thief can carry more, take as much as possible from the item with the next greatest value per pound. It is convenient to order the items by value per pound:
v_1/w_1 >= v_2/w_2 >= ... >= v_n/w_n

Alg.: Fractional-Knapsack(W, v[n], w[n])
1. while w > 0 and there are items remaining
2.     pick the item i with maximum v_i/w_i
3.     x_i <- min(1, w/w_i)
4.     remove item i from the list
5.     w <- w - x_i w_i
Here w is the amount of space remaining in the knapsack (initially w = W).
Running time: Θ(n) if the items are already ordered; else Θ(n lg n).
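The Fractional-Knapsack algorithm above translates directly into Python. This is a minimal sketch, not from the slides: the function name and the (value, weight) pair format are my own choices, and sorting once up front gives the Θ(n lg n) case.

```python
def fractional_knapsack(W, items):
    """items: list of (value, weight) pairs; returns the maximum total value."""
    # Order items by value per pound, best first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    remaining = W        # space left in the knapsack (w in the slides)
    total = 0.0
    for v, w in items:
        if remaining <= 0:
            break
        x = min(1.0, remaining / w)   # fraction of item i taken
        total += x * v
        remaining -= x * w
    return total
```

On the slides' example (capacity 50; items worth $60/10 lb, $100/20 lb, $120/30 lb) this takes items 1 and 2 whole plus 20 of the 30 pounds of item 3, for $240.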

Fractional Knapsack - Example
E.g.: W = 50. Item 1: 10 pounds, $60 ($6/pound). Item 2: 20 pounds, $100 ($5/pound). Item 3: 30 pounds, $120 ($4/pound). Take all of item 1 and all of item 2 ($60 + $100), plus 20 of the 30 pounds of item 3 ($80): total value $240.

Greedy Choice
Items: 1 <= j <= n. Optimal solution: x_1, x_2, ..., x_n. Greedy solution: x_1', x_2', ..., x_n'. We know that x_1' >= x_1: the greedy choice takes as much as possible from item 1. Modify the optimal solution to take x_1' of item 1. We then have to decrease the quantity taken from some item j: the new x_j is decreased by (x_1' - x_1) w_1 / w_j.
Increase in profit: (x_1' - x_1) v_1
Decrease in profit: (x_1' - x_1) w_1 v_j / w_j
We need (x_1' - x_1) v_1 >= (x_1' - x_1) w_1 v_j / w_j, i.e. v_1 >= (w_1 / w_j) v_j, i.e. v_1 / w_1 >= v_j / w_j. True, since item 1 had the best value/pound ratio.

Optimal Substructure
Consider the most valuable load that weighs at most W pounds. If we remove weight w of item j from the optimal load, the remaining load must be the most valuable load weighing at most W - w that can be taken from the remaining n - 1 items plus w_j - w pounds of item j.

The 0-1 Knapsack Problem
The thief has a knapsack of capacity W. There are n items: the i-th item has value v_i and weight w_i. Goal: find x_i such that x_i ∈ {0, 1} for i = 1, 2, ..., n, Σ w_i x_i <= W and Σ x_i v_i is maximum.

0-1 Knapsack - Greedy Strategy
E.g.: W = 50. Item 1: 10 pounds, $60 ($6/pound). Item 2: 20 pounds, $100 ($5/pound). Item 3: 30 pounds, $120 ($4/pound). The greedy choice picks item 1, then item 2: $60 + $100 = $160. But taking items 2 and 3 gives $100 + $120 = $220. None of the solutions involving the greedy choice (item 1) leads to an optimal solution: the greedy choice property does not hold.
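The 0-1 counterexample can be checked mechanically. This brute-force comparison is my own sketch, not part of the slides; it confirms that greedy-by-ratio yields $160 on this instance while the true optimum is $220.

```python
from itertools import combinations

items = [(60, 10), (100, 20), (120, 30)]   # (value, weight), as on the slides
W = 50

def greedy_01(items, W):
    """Take whole items in decreasing value-per-pound order while they fit."""
    total = weight = 0
    for v, w in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        if weight + w <= W:
            total += v
            weight += w
    return total

def optimal_01(items, W):
    """Exhaustively try every subset of items (fine for tiny instances)."""
    best = 0
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            if sum(w for _, w in combo) <= W:
                best = max(best, sum(v for v, _ in combo))
    return best
```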

0-1 Knapsack - Dynamic Programming
P(i, w) = the maximum profit that can be obtained from items 1 to i, if the knapsack has capacity w.
Case 1: the thief takes item i: P(i, w) = v_i + P(i-1, w - w_i)
Case 2: the thief does not take item i: P(i, w) = P(i-1, w)
So, for 0 <= w <= W and 1 <= i <= n:
P(i, w) = max {v_i + P(i-1, w - w_i), P(i-1, w)}
(first term: item i was taken; second term: item i was not taken)

Example: W = 5.
Item  Weight  Value
1     2       12
2     1       10
3     3       20
4     2       15

P(i, w) table (rows i = 0..4, columns w = 0..5):
i=0:  0   0   0   0   0   0
i=1:  0   0  12  12  12  12
i=2:  0  10  12  22  22  22
i=3:  0  10  12  22  30  32
i=4:  0  10  15  25  30  37

Sample computations:
P(1, 2) = max {12 + 0, 0} = 12
P(2, 3) = max {10 + 12, 12} = 22
P(3, 4) = max {20 + 10, 22} = 30
P(4, 5) = max {15 + 22, 32} = 37

Reconstructing the Optimal Solution
Start at P(n, W). When you go "left-up" (diagonally), item i has been taken; when you go straight up, item i has not been taken. Here the optimal load is item 4, item 2, and item 1.

Optimal Substructure
Consider the most valuable load that weighs at most W pounds. If we remove item j from this load, the remaining load must be the most valuable load weighing at most W - w_j that can be taken from the remaining n - 1 items.

Overlapping Subproblems
E.g.: all the subproblems shown in grey may depend on P(i-1, w); the table stores each subproblem's answer so it is computed only once.
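The recurrence and the table walk-back can be sketched as bottom-up Python. The function name and return format are my own; it fills the same P(i, w) table as the slides and reconstructs which items were taken.

```python
def knapsack_01(values, weights, W):
    """Returns (max profit, sorted 1-based indices of items taken)."""
    n = len(values)
    P = [[0] * (W + 1) for _ in range(n + 1)]   # P[i][w], row 0 is all zeros
    for i in range(1, n + 1):
        v, w = values[i - 1], weights[i - 1]
        for cap in range(W + 1):
            P[i][cap] = P[i - 1][cap]           # case 2: item i not taken
            if w <= cap:                        # case 1: item i taken
                P[i][cap] = max(P[i][cap], v + P[i - 1][cap - w])
    # Walk back from P(n, W): a change versus the row above means item i was taken.
    taken, cap = [], W
    for i in range(n, 0, -1):
        if P[i][cap] != P[i - 1][cap]:
            taken.append(i)
            cap -= weights[i - 1]
    return P[n][W], sorted(taken)
```

On the slides' example (W = 5, weights 2, 1, 3, 2, values 12, 10, 20, 15) this yields profit 37 with items 1, 2, and 4.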

Huffman Codes
A widely used technique for data compression. Assume the data is a sequence of characters; we are looking for an effective way of storing it. Idea: use the frequencies of occurrence of the characters to build an optimal way of representing each character. A binary character code uniquely represents a character by a binary string.

E.g.: a data file containing 100,000 characters, with frequencies (in thousands):
a: 45, b: 13, c: 12, d: 16, e: 9, f: 5

Fixed-Length Codes
3 bits are needed per character: a = 000, b = 001, c = 010, d = 011, e = 100, f = 101. Requires 3 x 100,000 = 300,000 bits.

Variable-Length Codes
Assign short codewords to frequent characters and long codewords to infrequent characters:
a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100
(45·1 + 13·3 + 12·3 + 16·3 + 9·4 + 5·4) x 1,000 = 224,000 bits

Prefix Codes
Prefix codes: codes for which no codeword is also a prefix of some other codeword. (A better name would be "prefix-free codes.") We can achieve optimal data compression using prefix codes, so we will restrict our attention to them.

Encoding with Binary Character Codes
Encoding: concatenate the codewords representing each character of the file. E.g., with the variable-length code above: abc = 0 · 101 · 100 = 0101100.
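Encoding and decoding with the variable-length prefix code above can be sketched in a few lines of Python (my own sketch; the prefix-free property is exactly what lets the greedy left-to-right decoder work without lookahead):

```python
# The slides' variable-length prefix code.
CODE = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}

def encode(text):
    """Concatenate the codeword of each character."""
    return ''.join(CODE[ch] for ch in text)

def decode(bits):
    """Scan left to right; a buffer that matches a codeword is unambiguous."""
    inverse = {v: k for k, v in CODE.items()}
    out, buf = [], ''
    for bit in bits:
        buf += bit
        if buf in inverse:          # a full codeword has been read
            out.append(inverse[buf])
            buf = ''
    return ''.join(out)
```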

Decoding with Binary Character Codes
Prefix codes simplify decoding: since no codeword is a prefix of another, the codeword that begins an encoded file is unambiguous. Approach: identify the initial codeword, translate it back to the original character, and repeat the process on the remainder of the file. E.g.: 001011101 = 0 · 0 · 101 · 1101 = aabe.

Prefix Code Representation
A prefix code can be represented by a binary tree whose leaves are the given characters. The binary codeword for a character is the path from the root to that character, where 0 means "go to the left child" and 1 means "go to the right child". The length of the codeword is the length of the path from the root to the character's leaf (the depth of the node). For the example alphabet, the optimal tree has root 100 with children a:45 and an internal node 55; 55 splits into 25 (children c:12 and b:13) and 30 (children 14 and d:16); 14 has children f:5 and e:9.

Optimal Codes
An optimal code is always represented by a full binary tree: every non-leaf has two children. The fixed-length code above is not optimal; the variable-length one is. How many bits are required to encode a file? Let C be the alphabet of characters, f(c) the frequency of character c, and d_T(c) the depth of c's leaf in the tree T corresponding to a prefix code. Then the cost of the tree T is
B(T) = Σ_{c ∈ C} f(c) d_T(c)

Constructing a Huffman Code
A greedy algorithm constructs an optimal prefix code called a Huffman code. Assume C is a set of n characters and each character c ∈ C has frequency f(c). The tree T is built in a bottom-up manner. Idea: start with a set of |C| leaves; at each step, merge the two least frequent objects, the frequency of the new node being the sum of the two frequencies. Use a min-priority queue Q, keyed on f, to identify the two least frequent objects.

Example (f: 5, e: 9, c: 12, b: 13, d: 16, a: 45): merge f and e into 14; merge c and b into 25; merge 14 and d into 30; merge 25 and 30 into 55; merge a and 55 into the root, 100.

Alg.: HUFFMAN(C)
1. n <- |C|
2. Q <- C                                O(n)
3. for i <- 1 to n - 1
4.     do allocate a new node z
5.        left[z] <- x <- EXTRACT-MIN(Q)
6.        right[z] <- y <- EXTRACT-MIN(Q)
7.        f[z] <- f[x] + f[y]
8.        INSERT(Q, z)
9. return EXTRACT-MIN(Q)
Lines 3-8 run n - 1 times with O(lg n) heap operations each: running time O(n lg n).
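The cost B(T) of the Huffman tree can be computed without building the tree at all, using the fact that each merge contributes its combined frequency to the total (every character's frequency is counted once per ancestor). This is a sketch of that observation, not the slides' HUFFMAN pseudocode itself, using Python's heapq as the min-priority queue:

```python
import heapq

def huffman_cost(freqs):
    """freqs: dict of character -> frequency; returns B(T) for the Huffman tree."""
    heap = list(freqs.values())
    heapq.heapify(heap)             # min-priority queue keyed on frequency
    cost = 0
    while len(heap) > 1:
        x = heapq.heappop(heap)     # the two least frequent objects
        y = heapq.heappop(heap)
        z = x + y                   # frequency of the merged node
        cost += z                   # each merge adds its frequency to B(T)
        heapq.heappush(heap, z)
    return cost
```

On the running example this reproduces the 224 (thousand bits) computed for the variable-length code, confirming that code is optimal.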

Greedy Choice Property
Lemma: Let C be an alphabet in which each character c ∈ C has frequency f[c]. Let x and y be two characters in C having the lowest frequencies. Then there exists an optimal prefix code for C in which the codewords for x and y have the same length and differ only in the last bit.

Proof of the Greedy Choice
Idea: consider a tree T representing an arbitrary optimal prefix code. Modify T to make a tree representing another optimal prefix code in which x and y appear as sibling leaves of maximum depth. The codes of x and y will then have the same length and differ only in the last bit.
Let a and b be two characters that are sibling leaves of maximum depth in T. Assume f[a] <= f[b] and f[x] <= f[y]. Since f[x] and f[y] are the two lowest leaf frequencies, in order, f[x] <= f[a] and f[y] <= f[b]. Exchange the positions of a and x (giving a tree T'), and then of b and y (giving a tree T'').
B(T) - B(T') = Σ_{c ∈ C} f(c) d_T(c) - Σ_{c ∈ C} f(c) d_{T'}(c)
             = f[x] d_T(x) + f[a] d_T(a) - f[x] d_{T'}(x) - f[a] d_{T'}(a)
             = f[x] d_T(x) + f[a] d_T(a) - f[x] d_T(a) - f[a] d_T(x)
             = (f[a] - f[x]) (d_T(a) - d_T(x)) >= 0
since x is a minimum-frequency leaf and a is a leaf of maximum depth. Similarly, exchanging y and b does not increase the cost: B(T') - B(T'') >= 0. Therefore B(T'') <= B(T), and since T is optimal, also B(T) <= B(T''), so B(T) = B(T''). Thus T'' is an optimal tree in which x and y are sibling leaves of maximum depth.

Discussion
Greedy choice property: building an optimal tree by mergers can begin with the greedy choice, merging the two characters with the lowest frequencies. The cost of each merger is the sum of the frequencies of the two items being merged. Of all possible mergers, HUFFMAN chooses the one that incurs the least cost.

Graphs
Graphs model applications that involve not only a set of items, but also the connections between them: maps, hypertexts, circuits, schedules, transactions, matching, computer networks.

Graphs - Background
A graph is a set of nodes (vertices) with edges (links) between them. Notation: G = (V, E), where V is the set of vertices (|V| = n) and E is the set of edges (|E| = m). Graphs may be directed or undirected, and may be acyclic.

Other Types of Graphs
A graph is connected if there is a path between every two vertices. A bipartite graph is an undirected graph G = (V, E) in which V = V_1 + V_2 and there are edges only between vertices in V_1 and V_2.

Graph Representation: Adjacency Lists
Adjacency-list representation of G = (V, E): an array of |V| lists, one for each vertex in V. Each list Adj[u] contains all the vertices v such that there is an edge between u and v; that is, Adj[u] contains the vertices adjacent to u (in arbitrary order). It can be used for both directed and undirected graphs.

Properties of the Adjacency-List Representation
Sum of the lengths of all the adjacency lists: for a directed graph, |E| (edge (u, v) appears only once, in u's list); for an undirected graph, 2|E| (u and v appear in each other's adjacency lists, so edge (u, v) appears twice).
Memory required: Θ(V + E). Preferred when the graph is sparse: |E| << |V|^2. Disadvantage: there is no quick way to determine whether there is an edge between vertices u and v. Time to list all vertices adjacent to u: Θ(degree(u)). Time to determine whether (u, v) ∈ E: O(degree(u)).
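As a minimal sketch (names of my own choosing), an adjacency list is just a list per vertex; for an undirected graph each edge is recorded in both endpoints' lists, which is why the lengths sum to 2|E|:

```python
from collections import defaultdict

def build_adj_list(n, edges, directed=False):
    """edges: list of (u, v) pairs; returns a dict of adjacency lists."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        if not directed:
            adj[v].append(u)    # undirected: the edge appears in both lists
    return adj
```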

Graph Representation: Adjacency Matrix
Adjacency-matrix representation of G = (V, E): assume the vertices are numbered 1, 2, ..., |V|. The representation is a |V| x |V| matrix A with a_ij = 1 if (i, j) ∈ E, and 0 otherwise. For an undirected graph the matrix A is symmetric: a_ij = a_ji, i.e. A = A^T.

Properties of the Adjacency-Matrix Representation
Memory required: Θ(V^2), independent of the number of edges in G. Preferred when the graph is dense (|E| is close to |V|^2), or when we need to determine quickly whether there is an edge between two vertices. Time to list all vertices adjacent to u: Θ(V). Time to determine whether (u, v) ∈ E: Θ(1).

Weighted Graphs
Weighted graphs are graphs in which each edge has an associated weight w(u, v), given by a weight function w: E -> R. Storing the weights of a graph: with adjacency lists, store w(u, v) along with vertex v in u's adjacency list; with an adjacency matrix, store w(u, v) at location (u, v) of the matrix.

Searching in a Graph
Graph searching means systematically following the edges of the graph so as to visit its vertices. There are two basic graph-searching algorithms: breadth-first search and depth-first search. The difference between them is the order in which they explore the unvisited edges of the graph. Graph algorithms are typically elaborations of these basic graph-searching algorithms.

Breadth-First Search (BFS)
Input: a graph G = (V, E) (directed or undirected) and a source vertex s ∈ V.
Goal: explore the edges of G to discover every vertex reachable from s, taking the vertices closest to s first.
Output: d[v], the distance (smallest number of edges) from s to v, for all v ∈ V, and a breadth-first tree rooted at s that contains all reachable vertices.
BFS discovers vertices in increasing order of distance from the source s; it searches in breadth, not depth: first all vertices at 1 edge from s, then all vertices at 2 edges from s, and so on.
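The adjacency matrix can be sketched the same way (again my own names; vertices are 0-based here rather than the slides' 1-based numbering). Edge lookup is Θ(1), while listing the neighbors of u means scanning a whole row, Θ(V):

```python
def build_adj_matrix(n, edges, directed=False):
    """edges: list of (u, v) pairs with 0 <= u, v < n; returns an n x n 0/1 matrix."""
    A = [[0] * n for _ in range(n)]
    for u, v in edges:
        A[u][v] = 1
        if not directed:
            A[v][u] = 1         # undirected: A is symmetric, A = A^T
    return A
```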

Breadth-First Search (cont.)
Keeping track of progress: color each vertex white, gray, or black. Initially all vertices are white. When a vertex is being discovered it becomes gray; after all its adjacent vertices have been discovered, the vertex becomes black. A FIFO queue Q maintains the set of gray vertices.

Breadth-First Tree
BFS constructs a breadth-first tree, which initially contains only the root (the source vertex s). When a vertex v is discovered while scanning the adjacency list of a vertex u, vertex v and edge (u, v) are added to the tree, and u becomes the predecessor (parent) of v in the breadth-first tree. A vertex is discovered only once, so it has at most one parent.

BFS Additional Data Structures
G = (V, E) is represented using adjacency lists. For each vertex u ∈ V: color[u] is the color of the vertex; π[u] is the predecessor of u (π[u] = NIL if u = s or u has not yet been discovered); d[u] is the distance from the source s to vertex u. A FIFO queue Q maintains the set of gray vertices.

BFS(G, s)
 1. for each u ∈ V[G] - {s}
 2.     do color[u] <- WHITE
 3.        d[u] <- ∞
 4.        π[u] <- NIL
 5. color[s] <- GRAY
 6. d[s] <- 0
 7. π[s] <- NIL
 8. Q <- ∅
 9. ENQUEUE(Q, s)
10. while Q ≠ ∅
11.     do u <- DEQUEUE(Q)
12.        for each v ∈ Adj[u]
13.            do if color[v] = WHITE
14.               then color[v] <- GRAY
15.                    d[v] <- d[u] + 1
16.                    π[v] <- u
17.                    ENQUEUE(Q, v)
18.        color[u] <- BLACK

Example
On the eight-vertex example graph from the slides, the queue of gray vertices evolves as: s; w, r; r, t, x; t, x, v; x, v, u; v, u, y; u, y; y; empty.
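The pseudocode maps almost line for line onto Python, with collections.deque as the FIFO queue. This sketch (the dict-based adjacency list and vertex names are my own, chosen to resemble the slides' example graph) returns the d and π tables:

```python
from collections import deque

def bfs(adj, s):
    """adj: dict mapping each vertex to its adjacency list; s: source vertex."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {u: WHITE for u in adj}
    d = {u: float('inf') for u in adj}
    pi = {u: None for u in adj}             # None plays the role of NIL
    color[s], d[s] = GRAY, 0
    Q = deque([s])                          # FIFO queue of gray vertices
    while Q:
        u = Q.popleft()
        for v in adj[u]:
            if color[v] == WHITE:           # v is discovered: it turns gray
                color[v] = GRAY
                d[v] = d[u] + 1
                pi[v] = u
                Q.append(v)
        color[u] = BLACK                    # all of u's neighbors are discovered
    return d, pi
```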

Analysis of BFS
The initialization (lines 1-9) takes O(V). In the main loop, each vertex is enqueued and dequeued at most once, and each queue operation takes Θ(1), so the queue operations take O(V) in total. The adjacency list Adj[u] is scanned only when u is dequeued, so each adjacency list is scanned at most once; since the sum of the lengths of all the adjacency lists is Θ(E), the scanning operations take O(E). Total running time for BFS: O(V + E).

Shortest Paths Property
BFS finds the shortest-path distance from the source vertex s ∈ V to each node in the graph: the shortest-path distance δ(s, u) is the minimum number of edges in any path from s to u.

Readings
Chapter 16 and Chapter 22.