Mathematical Foundations
CSE 6331 Algorithms
Steve Lai
Complexity of Algorithms

Analysis of an algorithm: to predict the running time required by the algorithm.

Elementary operations:
- arithmetic & boolean operations: +, -, *, /, mod, div, and, or
- comparisons: if a < b, if a = b, etc.
- branching: go to
- assignment: a := b
- and so on
The running time of an algorithm is the number of elementary operations required by the algorithm. It depends on the size of the input and on the data themselves.

The worst-case time complexity of an algorithm is a function of the input size n:
    T(n) = the worst-case running time over all instances of size n.

The worst-case asymptotic time complexity is the worst-case time complexity expressed in O, Ω, or Θ. The word "asymptotic" is often omitted.
O-Notation

Note: Unless otherwise stated, all functions considered in this class are assumed to be nonnegative.

Conventional definition: We say f(n) = O(g(n)), or f(n) is O(g(n)), if there exist positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.

More abstract definition: O(g(n)) = { f(n) : f(n) = O(g(n)) in the conventional meaning }, i.e., the set of all functions that are O(g(n)) in the conventional meaning.
These expressions all mean the same thing:
    4n + 3 = O(n)
    4n + 3 is O(n)
    4n + 3 ∈ O(n)
    4n + 3 is in O(n).

Sometimes O(g(n)) is used to mean some function in the set O(g(n)) which we don't care to specify. Example: we may write: Let f(n) = 3n + O(log n).
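As a quick illustration of the definition (this numeric check is ours, not part of the slides, and a check is of course not a proof): for 4n + 3 = O(n), the witnesses c = 5 and n₀ = 3 work, since 4n + 3 ≤ 5n whenever n ≥ 3.

```python
# Sanity check of 4n + 3 = O(n) with witnesses c = 5, n0 = 3:
# verify 4n + 3 <= c*n on a sample of values n >= n0.
c, n0 = 5, 3
assert all(4 * n + 3 <= c * n for n in range(n0, 10**6, 997))
print("4n + 3 <= 5n holds for all sampled n >= 3")
```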
Theorem 1: If f₁(n) = O(g₁(n)) and f₂(n) = O(g₂(n)), then
1. f₁(n) + f₂(n) = O(g₁(n) + g₂(n))
2. f₁(n) + f₂(n) = O(max(g₁(n), g₂(n)))
3. f₁(n)·f₂(n) = O(g₁(n)·g₂(n))

Proof. There exist positive constants c₁, c₂ such that f₁(n) ≤ c₁g₁(n) and f₂(n) ≤ c₂g₂(n) for sufficiently large n. Thus,
    f₁(n) + f₂(n) ≤ c₁g₁(n) + c₂g₂(n)
                 ≤ (c₁ + c₂)(g₁(n) + g₂(n))
                 ≤ 2(c₁ + c₂)·max(g₁(n), g₂(n)).
By definition, 1 and 2 hold. 3 can be proved similarly.
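The bounds in the proof above can be watched numerically. A minimal sketch (the particular functions and constants below are our own choices, not from the slides): take f₁(n) = 3n + 7 = O(n) with c₁ = 4 and f₂(n) = 2n² + n = O(n²) with c₂ = 3, and check both bounds from the proof on a range of n.

```python
# Numeric illustration of Theorem 1 (a check, not a proof).
# With f1(n) <= c1*g1(n) and f2(n) <= c2*g2(n), the proof bounds the sum by
# (c1 + c2)*(g1(n) + g2(n)) and by 2*(c1 + c2)*max(g1(n), g2(n)).
def f1(n): return 3 * n + 7        # O(n):   f1(n) <= 4*n  for n >= 7
def f2(n): return 2 * n * n + n    # O(n^2): f2(n) <= 3*n^2 for n >= 1
def g1(n): return n
def g2(n): return n * n

c1, c2 = 4, 3
for n in range(7, 2000):
    s = f1(n) + f2(n)
    assert s <= (c1 + c2) * (g1(n) + g2(n))
    assert s <= 2 * (c1 + c2) * max(g1(n), g2(n))
print("bounds from the proof of Theorem 1 hold on the sampled range")
```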
Ω-Notation

Conventional definition: We say f(n) = Ω(g(n)) if there exist positive constants c and n₀ such that f(n) ≥ c·g(n) for all n ≥ n₀.

Or define Ω(g(n)) as a set:
    Ω(g(n)) = { f(n) : f(n) = Ω(g(n)) in the conventional meaning }.

Theorem 2: If f₁(n) = Ω(g₁(n)) and f₂(n) = Ω(g₂(n)), then
1. f₁(n) + f₂(n) = Ω(g₁(n) + g₂(n))
2. f₁(n) + f₂(n) = Ω(max(g₁(n), g₂(n)))
3. f₁(n)·f₂(n) = Ω(g₁(n)·g₂(n))
Θ-Notation

Conventional definition: We say f(n) = Θ(g(n)) if there exist positive constants c₁, c₂, and n₀ such that c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀. That is,
    f(n) = Θ(g(n))  ⇔  [f(n) = O(g(n)) and f(n) = Ω(g(n))].

In terms of sets: Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).

Theorem 3: If f₁(n) = Θ(g₁(n)) and f₂(n) = Θ(g₂(n)), then
1. f₁(n) + f₂(n) = Θ(g₁(n) + g₂(n))
2. f₁(n) + f₂(n) = Θ(max(g₁(n), g₂(n)))
3. f₁(n)·f₂(n) = Θ(g₁(n)·g₂(n))
o-Notation, ω-Notation

Definition of o: f(n) = o(g(n)) iff lim_{n→∞} f(n)/g(n) = 0.

Definition of ω: f(n) = ω(g(n)) iff lim_{n→∞} f(n)/g(n) = ∞.
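A quick numeric look at the o-notation limit (our own example, not from the slides): for f(n) = n and g(n) = n², the ratio f(n)/g(n) = 1/n shrinks toward 0, so f(n) = o(g(n)).

```python
# The ratio f(n)/g(n) = n / n^2 = 1/n tends to 0 as n grows,
# which is exactly the condition f(n) = o(g(n)).
ratios = [n / (n * n) for n in (10, 1000, 100000)]
print(ratios)  # each ratio is 1/n, strictly decreasing toward 0
assert ratios[0] > ratios[1] > ratios[2] > 0
```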
Some Properties of Asymptotic Notation

Transitive property: If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)). The transitive property also holds for Ω and Θ.

f(n) = O(g(n))  ⇔  g(n) = Ω(f(n)).

See the textbook for many others.
Asymptotic Notation with Multiple Parameters

Definition: We say that f(m, n) = O(g(m, n)) iff there are positive constants c, m₀, n₀ such that f(m, n) ≤ c·g(m, n) for all m ≥ m₀ and n ≥ n₀.

Again, we can define O(g(m, n)) as a set. Ω(g(m, n)) and Θ(g(m, n)) can be similarly defined.
Conditional Asymptotic Notation

Let P(n) be a predicate. We write T(n) = O(f(n) | P(n)) iff there exist positive constants c, n₀ such that T(n) ≤ c·f(n) for all n ≥ n₀ for which P(n) is true.

Can similarly define Ω(f(n) | P(n)) and Θ(f(n) | P(n)).

Example: Suppose for n ≥ n₀,
    T(n) = 3^n if n is odd, 4^n if n is even.
Then T(n) = Θ(4^n | n is even).
Smooth Functions

A function f(n) is smooth iff f(n) is asymptotically nondecreasing and f(2n) = O(f(n)).

Thus, a smooth function does not grow very fast.

Example: log n, n log n, n² are all smooth. What about 2ⁿ?
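Both parts of the definition can be checked empirically for a concrete function. A minimal sketch (the choice f(n) = n·log₂ n and the constant c = 4 are ours): verify that f is nondecreasing and that f(2n) ≤ c·f(n) holds with a fixed constant.

```python
import math

# Empirical check that f(n) = n * log2(n) is smooth:
# nondecreasing, and f(2n) <= c * f(n) for the fixed constant c = 4
# (f(2n) = 2n(1 + log2 n) <= 4 n log2 n as soon as log2 n >= 1, i.e. n >= 2).
def f(n): return n * math.log2(n)

c = 4
for n in range(2, 10000):
    assert f(n) <= f(n + 1)        # nondecreasing
    assert f(2 * n) <= c * f(n)    # f(2n) = O(f(n))
# Contrast: f(n) = 2^n is NOT smooth, since 2^(2n) / 2^n = 2^n is unbounded.
print("n*log2(n) passes both smoothness checks")
```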
Theorem 4: If f(n) is smooth, then f(bn) = O(f(n)) for any fixed positive integer b.

Proof. By induction on b.

Induction base: For b = 1, 2, obviously f(bn) = O(f(n)).

Induction hypothesis: Assume f((b−1)n) = O(f(n)), where b > 2.

Induction step: Need to show f(bn) = O(f(n)). Since bn ≤ 2(b−1)n for b ≥ 2 and f is asymptotically nondecreasing, we have
    f(bn) ≤ f(2(b−1)n) = O(f((b−1)n)) = O(f(n))
(i.e., f(bn) ≤ f(2(b−1)n) ≤ c₁f((b−1)n) ≤ c₁c₂f(n) for some constants c₁, c₂ and sufficiently large n).
The theorem is proved.
Theorem 5: If T(n) = O(f(n) | n a power of b), where b is a constant, T(n) is asymptotically nondecreasing, and f(n) is smooth, then T(n) = O(f(n)).

Proof. From the given conditions, we know:
1. T(n) is asymptotically nondecreasing.
2. T(n) ≤ c₁f(n) for sufficiently large n that is a power of b.
3. f(bn) ≤ c₂f(n) for sufficiently large n.
For any n, there is a k such that bᵏ ≤ n < bᵏ⁺¹. When n is sufficiently large, we have
    T(n) ≤ T(bᵏ⁺¹) ≤ c₁f(bᵏ⁺¹) = c₁f(b·bᵏ) ≤ c₁c₂f(bᵏ) ≤ c₁c₂f(n).
By definition, T(n) = O(f(n)).
(Diagram: a number line showing bᵏ ≤ n < bᵏ⁺¹, illustrating the chain T(n) ≤ T(bᵏ⁺¹) ≤ c₁f(bᵏ⁺¹) ≤ c₁c₂f(bᵏ) ≤ c₁c₂f(n).)
Theorem 6: If T(n) = Ω(f(n) | n a power of b), where b is a constant, T(n) is asymptotically nondecreasing, and f(n) is smooth, then T(n) = Ω(f(n)).

Theorem 7: If T(n) = Θ(f(n) | n a power of b), where b is a constant, T(n) is asymptotically nondecreasing, and f(n) is smooth, then T(n) = Θ(f(n)).

Application: In order to show T(n) = O(n log n), we only have to establish T(n) = O(n log n | n a power of 2), provided that T(n) is asymptotically nondecreasing.
Some Notations, Functions, Formulas

⌊x⌋ = the floor of x.  ⌈x⌉ = the ceiling of x.

log n = log₂ n. (Or lg n.)

1 + 2 + ⋯ + n = n(n+1)/2 = Θ(n²).

For constant k > 0: 1ᵏ + 2ᵏ + 3ᵏ + ⋯ + nᵏ = Θ(nᵏ⁺¹).

If a ≠ 1, then 1 + a + a² + ⋯ + aⁿ = (aⁿ⁺¹ − 1)/(a − 1).

If a > 1, then f(n) = 1 + a + a² + ⋯ + aⁿ = Θ(aⁿ).

If a < 1, then f(n) = 1 + a + a² + ⋯ + aⁿ = Θ(1).
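The geometric-series closed form above is easy to confirm exactly with rational arithmetic (the sample value a = 3 is our choice):

```python
from fractions import Fraction

# Check 1 + a + a^2 + ... + a^n = (a^(n+1) - 1) / (a - 1) exactly, for a = 3.
a = Fraction(3)
for n in range(0, 20):
    direct = sum(a**k for k in range(n + 1))
    closed = (a**(n + 1) - 1) / (a - 1)
    assert direct == closed
print("geometric-series closed form verified for a = 3, n = 0..19")
```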
Approximating Summations by Integration

Suppose the function f is monotone (increasing or decreasing). Then:

    f increasing:  ∫_{m−1}^{n} f(x)dx  ≤  Σ_{i=m}^{n} f(i)  ≤  ∫_{m}^{n+1} f(x)dx
    f decreasing:  ∫_{m}^{n+1} f(x)dx  ≤  Σ_{i=m}^{n} f(i)  ≤  ∫_{m−1}^{n} f(x)dx

So, if f is increasing and ∫_{m}^{n} f(x)dx = Ω(f(n)), then Σ_{i=m}^{n} f(i) = Θ(∫_{m}^{n} f(x)dx).

Similarly, if f is decreasing and ∫_{m}^{n} f(x)dx = Ω(f(m)), then Σ_{i=m}^{n} f(i) = Θ(∫_{m}^{n} f(x)dx).

Example: Σ_{i=m}^{n} 1/i = Θ(∫_{m}^{n} dx/x) = Θ(ln n − ln m) = Θ(lg n − lg m).
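The harmonic-sum example can be seen numerically; a small sketch (the endpoints m = 2, n = 100000 are our choice) that checks the decreasing-function sandwich for f(x) = 1/x:

```python
import math

# For decreasing f(x) = 1/x:
#   integral over [m, n+1]  <=  sum_{i=m}^{n} 1/i  <=  integral over [m-1, n],
# so the sum is Theta(ln n - ln m).
m, n = 2, 100000
s = sum(1.0 / i for i in range(m, n + 1))
lower = math.log(n + 1) - math.log(m)      # ∫ 1/x dx over [m, n+1]
upper = math.log(n) - math.log(m - 1)      # ∫ 1/x dx over [m-1, n]
assert lower <= s <= upper
print(f"{lower:.4f} <= {s:.4f} <= {upper:.4f}")
```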
Analysis of Algorithm: Example

Procedure BinarySearch(A, x, i, j)
    if i > j then return(0)
    m := ⌊(i + j)/2⌋
    case
        A[m] = x: return(m)
        A[m] < x: return BinarySearch(A, x, m+1, j)
        A[m] > x: return BinarySearch(A, x, i, m−1)
    end
Analysis of Binary Search

Let T(n) denote the worst-case running time. T(n) satisfies the recurrence:
    T(n) = T(⌊n/2⌋) + c.
Solving the recurrence yields: T(n) = Θ(log n).
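A runnable translation of the BinarySearch procedure (our Python rendering, not part of the slides; we return None rather than 0 for "not found", since Python lists are 0-indexed and 0 is a valid position):

```python
# Search for x in the sorted list A between positions i and j (inclusive).
# Each call does a constant amount of work and recurses on half the range,
# giving the recurrence T(n) = T(n/2) + c, i.e. Theta(log n) time.
def binary_search(A, x, i, j):
    if i > j:
        return None                       # x is not present
    m = (i + j) // 2
    if A[m] == x:
        return m
    elif A[m] < x:
        return binary_search(A, x, m + 1, j)
    else:
        return binary_search(A, x, i, m - 1)

A = [2, 3, 5, 7, 11, 13, 17]
assert binary_search(A, 11, 0, len(A) - 1) == 4
assert binary_search(A, 4, 0, len(A) - 1) is None
```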
Euclid's Algorithm

Find gcd(a, b) for integers a, b ≥ 0, not both zero.

Theorem: If b = 0, gcd(a, b) = a. If b > 0, gcd(a, b) = gcd(b, a mod b).

function Euclid(a, b)
    if b = 0 then return(a)
    else return Euclid(b, a mod b)

The running time is proportional to the number of recursive calls.
Analysis of Euclid's Algorithm

Trace the arguments of the successive recursive calls:

    a₀  b₀  c₀ = a₀ mod b₀
    a₁  b₁  c₁ = a₁ mod b₁
    a₂  b₂  c₂ = a₂ mod b₂
    ...

Observe that aₖ = bₖ₋₁ = cₖ₋₂.

W.l.o.g., assume a₀ ≥ b₀. The values a₀, a₂, a₄, ... decrease by at least one half at each step.
Reason: If c := a mod b with a ≥ b, then c < a/2 (if b ≤ a/2, then c < b ≤ a/2; if b > a/2, then c = a − b < a/2).
So, there are at most O(log a₀) recursive calls.
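The O(log a₀) bound can be watched directly by instrumenting the algorithm (the call counter and the test inputs below are ours; consecutive Fibonacci numbers are the classic worst case for Euclid's algorithm):

```python
import math

# Euclid's algorithm with a counter for the number of recursive calls.
# The analysis says the first argument at least halves every two calls,
# so the count is O(log a).
def euclid(a, b, calls=0):
    if b == 0:
        return a, calls
    return euclid(b, a % b, calls + 1)

# 832040 and 514229 are consecutive Fibonacci numbers (F_30 and F_29).
g, calls = euclid(832040, 514229)
assert g == math.gcd(832040, 514229) == 1
assert calls <= 2 * math.log2(832040) + 2   # the O(log a) bound, with slack
print(g, calls)
```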
Solution to Q4 of Example Analysis

For i ranging over the powers of two 2⁰, 2¹, 2², …, 2^{log n}:

    Σ i² = Σ_{k=0}^{log n} (2ᵏ)² = Σ_{k=0}^{log n} 4ᵏ = Θ(4^{log n}) = Θ(n²).
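Assuming the sum in question is Σ i² over the powers of two i = 1, 2, 4, …, n, the Θ(n²) claim follows from the geometric-series formula, and the exact value can be checked (the closed form (4n² − 1)/3 is just the geometric sum Σ 4ᵏ = (4^{log n + 1} − 1)/3 rewritten):

```python
# For n = 2^e, sum of i^2 over i = 1, 2, 4, ..., n equals
# sum_{k=0}^{e} 4^k = (4^(e+1) - 1)/3 = (4*n^2 - 1)/3, which is Theta(n^2).
for e in range(0, 15):
    n = 2 ** e
    s = sum((2 ** k) ** 2 for k in range(e + 1))
    assert s == (4 * n * n - 1) // 3
print("sum of squares of powers of two matches (4n^2 - 1)/3")
```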