Slides for CIS 675: Huffman Encoding (DPV Chapter 5, Part 2)
Huffman Encoding, 1

EECS slides for CIS 675, DPV Chapter 5, Part 2. Jim Royer, October 13, 2009. (Royer: CIS 675 Slides, 1)

A toy example: Suppose our alphabet is { A, B, C, D }, and suppose T is a text of 130 million characters. What is the shortest binary string representing T?

Encoding 1: A = 00, B = 01, C = 10, D = 11. Total: 260 megabits.

Statistics on T:

Symbol  Frequency
A       70 million
B       3 million
C       20 million
D       37 million

Idea: Use variable-length codes, giving shorter codewords to more frequent characters.

Encoding 2 (a harder question): A = 0, B = 100, C = 101, D = 11. Total: 213 megabits, 17% better.

Questions: How to unambiguously decode? How to come up with the code? How good is the result?

Huffman Encoding, 2

Definition: A prefix-free code is a code in which no codeword is the prefix of another. Prefix-free codes can be represented by full binary trees (i.e., trees in which each non-leaf node has two children).

Figure 5.10: A prefix-free encoding. Frequencies are shown in square brackets.

Symbol  Codeword
A       0
B       100
C       101
D       11

The tree: the root's 0-child is the leaf A [70]; its 1-child is an internal node [60] whose 0-child is an internal node [23] with children B [3] and C [20], and whose 1-child is the leaf D [37].

Question: How do you use such a tree to decode a file?

For our toy example, under the codes of Figure 5.10 the total size of the binary string drops to 213 megabits, a 17% improvement. In general, how do we find the optimal coding tree, given the frequencies f_1, f_2, ..., f_n of the characters?

Huffman Encoding, 3

Goal: Find an optimal coding tree for the frequencies given.

cost of a tree = Σ_{i=1}^{n} f[i] · (depth of the ith symbol in the tree)
              = Σ_{i=1}^{n} f[i] · (number of bits required for the ith symbol)

Assigning frequencies to all tree nodes:
(a) Leaf nodes get the frequency of their character.
(b) Internal nodes get the sum of the frequencies of the leaf nodes below them.
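To see how prefix-freeness makes decoding unambiguous, here is a small sketch in Python (not from the slides) that encodes and decodes with the Figure 5.10 codewords, emitting a character as soon as the buffered bits form a complete codeword:

```python
# Sketch: decoding a prefix-free code bit by bit; codewords from Figure 5.10.
CODE = {"A": "0", "B": "100", "C": "101", "D": "11"}

def encode(text):
    return "".join(CODE[ch] for ch in text)

def decode(bits):
    # Invert the code; prefix-freeness guarantees a unique parse.
    inv = {cw: ch for ch, cw in CODE.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inv:          # a codeword ends exactly here
            out.append(inv[buf])
            buf = ""
    assert buf == "", "input was not a valid codeword sequence"
    return "".join(out)

print(decode(encode("ABACDD")))  # -> ABACDD
```

Because no codeword is a prefix of another, the first complete codeword match is the only possible parse, so no lookahead or separators are needed.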
Huffman Encoding, 4

Observation: The two lowest-frequency characters must be the children of the lowest internal node in an optimal tree. (Why? Try a replacement argument.)

Greedy strategy: Find these two characters, build this node, and repeat, where some nodes are groups of characters as we go along. The latter problem is just a smaller version of the one we started with. So we pull f_1 and f_2 off the list of frequencies, insert (f_1 + f_2), and loop.

Huffman Encoding, 5

procedure Huffman(f)
// Input: An array f[1...n] of frequencies
// Output: An encoding tree with n leaves
let H be a priority queue of integers, ordered by f
for i = 1 to n: insert(H, i)
for k = n + 1 to 2n − 1 do
    i ← deletemin(H); j ← deletemin(H)
    create a node numbered k with children i, j
    f[k] ← f[i] + f[j]; insert(H, k, f[k])

Example: a: 45%, b: 13%, c: 12%, d: 16%, e: 9%, f: 5%.

The resulting algorithm can be described in terms of priority queue operations (as defined on page 120) and takes O(n log n) time if a binary heap (Section 4.5.2) is used. (S. Dasgupta, C.H. Papadimitriou, and U.V. Vazirani, Algorithms, p. 155.)

Huffman Encoding, 6: Runtime Analysis

Initializing H: Θ(n) time.
Number of for-loop iterations: n − 1.

Returning to our toy example: can you tell if the tree of Figure 5.10 is optimal?
Cost of each deletemin and insert: O(log n).
Total: Θ(n) + (n − 1) · O(log n) = O(n log n).

Huffman Encoding, 7: Correctness

Suppose x and y are the two characters with the smallest frequencies, with f[x] ≤ f[y].

Lemma (1): There is an optimal code tree in which x's and y's codewords have the same length and differ only in their last bit.

Proof: Suppose T is an optimal code tree, and let a and b be characters that are max-depth siblings in T, with f[a] ≤ f[b]. Let T′ be the result of swapping a with x and b with y. Writing d_T(c) for the depth of c in T,

cost(T) − cost(T′)
  = f[x]·(d_T(x) − d_T(a)) + f[y]·(d_T(y) − d_T(b)) + f[a]·(d_T(a) − d_T(x)) + f[b]·(d_T(b) − d_T(y))
  = (f[a] − f[x])·(d_T(a) − d_T(x)) + (f[b] − f[y])·(d_T(b) − d_T(y))
  ≥ 0.

So cost(T′) ≤ cost(T), and since T is optimal, so is T′; in T′, x and y are max-depth siblings. □
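The Huffman procedure and its O(n log n) bound can be sketched in Python, with the standard library's heapq as the binary-heap priority queue; the tuple encoding of trees and the tie-breaking counter are my own choices, not the slides':

```python
import heapq

def huffman(freqs):
    """Greedy Huffman construction, following the slides' pseudocode:
    repeatedly merge the two lowest-frequency nodes."""
    # Heap entries are (frequency, tie-breaker, tree); a tree is either
    # a bare symbol (leaf) or a (left, right) pair (internal node).
    heap = [(f, i, sym) for i, (sym, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        fi, _, left = heapq.heappop(heap)    # two smallest frequencies
        fj, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (fi + fj, counter, (left, right)))
        counter += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"      # lone-symbol edge case
    walk(heap[0][2], "")
    return codes

# The slides' example frequencies (in percent):
code = huffman({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
```

On this example the merges are forced (no ties), so the resulting code gives a a 1-bit codeword and costs 45·1 + 13·3 + 12·3 + 16·3 + 9·4 + 5·4 = 224 bits per 100 characters.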
Huffman Encoding, 8: Correctness

Suppose x and y are the two characters with the smallest frequencies, with f[x] ≤ f[y]. Replace x and y by a new character z with frequency f[x] + f[y].

Lemma (2): Suppose T′ is an optimal code tree for the new character set. Then swapping the z-node for a node with children x and y results in an optimal code tree T for the old character set.

Proof: Note that cost(T) = cost(T′) + f[x] + f[y]. Suppose T″ is an optimal code tree for the old character set; WLOG (by Lemma 1), T″ has x and y as siblings of max depth. Replace the subtree rooted at x's and y's parent with a single node for z with frequency f[x] + f[y], and call the result T‴. Then

cost(T‴) = cost(T″) − f[x] − f[y] ≤ cost(T) − f[x] − f[y] = cost(T′).

But since T′ is optimal for the new character set, cost(T′) ≤ cost(T‴), so these are all equalities. Hence cost(T) = cost(T″), and T is optimal. □

Huffman Encoding, 9: Correctness, summarized

Suppose x and y are the two characters with the smallest frequencies, with f[x] ≤ f[y].

Lemma (1: The greedy choice is safe): There is an optimal code tree in which x's and y's codewords have the same length and differ only in their last bit.

Lemma (2: Optimal code trees have optimal substructure): Replace x and y by a new character z with frequency f[x] + f[y]. Suppose T′ is an optimal code tree for the new character set. Then swapping the z-node for a node with children x and y results in an optimal code tree for the old character set.

procedure Huffman(f)
// Input: An array f[1...n] of frequencies
// Output: An encoding tree with n leaves
H ← a priority queue of integers, ordered by f
for k = n + 1 to 2n − 1 do
    i ← deletemin(H); j ← deletemin(H)       // Safe by Lemma 1
    f[k] ← f[i] + f[j]; insert(H, k, f[k])   // Safe by Lemma 2

Propositional Logic

The formulas of propositional logic are given by the grammar:

P ::= Var | ¬P | P ∧ P | P ∨ P | P ⇒ P
Var ::= the syntactic category of variables

A truth assignment is a function I : Variables → { False, True }.
A truth assignment I determines the value of a formula as follows:

I[[x]] = True iff I(x) = True (x a variable)
I[[¬p]] = True iff I[[p]] = False
I[[p ∧ q]] = True iff I[[p]] = I[[q]] = True
I[[p ∨ q]] = True iff I[[p]] = True or I[[q]] = True
I[[p ⇒ q]] = True iff I[[p]] = False or I[[q]] = True

A satisfying assignment for a formula p is an I with I[[p]] = True. Finding satisfying assignments for general propositional formulas seems to be a very hard problem. (See Chapter 8.)

Horn clauses

Definition: A Horn clause is a propositional logic formula of one of two special forms:

Positive implications: (x_1 ∧ ... ∧ x_k) ⇒ x_0
Pure negative clauses: (¬x_1 ∨ ... ∨ ¬x_k)

where the x_i are variables. A Horn formula is the conjunction of a set of Horn clauses.

Examples: toddler ⇒ child, (child ∧ male) ⇒ boy, infant ⇒ child, (child ∧ female) ⇒ girl, (toddler ∧ female) ⇒ girl.
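As a small illustration of these semantic clauses, here is a sketch of a formula evaluator in Python; the nested-tuple representation of formulas is my own assumption, not the slides':

```python
# Sketch: evaluating a propositional formula under a truth assignment I,
# following the clauses for ¬, ∧, ∨, ⇒ above. A formula is either a
# variable name (string) or a tuple (operator, subformula(s)).
def value(formula, I):
    if isinstance(formula, str):              # a variable x
        return I[formula]
    op, *args = formula
    if op == "not":
        return not value(args[0], I)
    if op == "and":
        return value(args[0], I) and value(args[1], I)
    if op == "or":
        return value(args[0], I) or value(args[1], I)
    if op == "implies":                       # p ⇒ q is ¬p ∨ q
        return (not value(args[0], I)) or value(args[1], I)
    raise ValueError("unknown operator: %r" % op)

# (x ∧ y) ⇒ ¬z under I(x) = I(y) = True, I(z) = False
print(value(("implies", ("and", "x", "y"), ("not", "z")),
            {"x": True, "y": True, "z": False}))   # -> True
```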
Satisfying Horn Formulas, 1

Recall: A Horn clause is a propositional logic formula of one of two special forms: positive implications (x_1 ∧ ... ∧ x_k) ⇒ x_0, and pure negative clauses (¬x_1 ∨ ... ∨ ¬x_k). A Horn formula is the conjunction of a set of Horn clauses.

Problem: Finding satisfying assignments for sets of Horn clauses.
Given: A set of Horn clauses { c_1, ..., c_n }.
Find: A truth assignment I that satisfies each of c_1, ..., c_n, or else report that there is no such I.

Observations:
1. The positive implications push us to make things true.
2. The pure negative clauses push us to make things false.

Strategy: Greedily build up a satisfying assignment I for the positive implications, making as few variables True as possible. Then check that I also satisfies the pure negative clauses.

Satisfying Horn Formulas, 2

// Input: H, a Horn formula (i.e., a set of Horn clauses)
// Output: a satisfying assignment, if one exists
T ← ∅ // the set of vars set to True
// Invariant: Each x ∈ T must be set to True in any satisfying assignment.
while (there is an (x_1 ∧ ... ∧ x_k) ⇒ x_0 in H with x_1, ..., x_k ∈ T but x_0 ∉ T) do
    T ← T ∪ { x_0 }
for each pure negative clause (¬x_1 ∨ ... ∨ ¬x_k) in H do
    if x_1, ..., x_k ∈ T then return "No satisfying assignment"
return T

Trace with: (w ∧ y ∧ z) ⇒ x, (x ∧ z) ⇒ w, x ⇒ y, ⇒ x, (x ∧ y) ⇒ w, (¬w ∨ ¬x ∨ ¬y), (¬z)
Satisfying Horn Formulas, 3: Why does this work?

Claim 1: The invariant holds throughout the while-loop.
Claim 2: The while-loop eventually terminates.
Claim 3: When the while-loop terminates, T = the set of variables that must be true in any satisfying assignment for H's positive implications.
Claim 4: The algorithm is correct.

Satisfying Horn Formulas, 4: Runtime Analysis

Let n = the number of characters in the Horn formula. Naïvely, the algorithm runs in O(n²) time.

Note: This is in part a setup for Chapter 8.
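The greedy procedure above can be sketched directly in Python and run on the slides' trace formula; the representation (an implication as a (body, head) pair of a set and a variable, a pure negative clause as a set of the negated variables) is my own, and the naïve rescanning loop matches the quadratic runtime remark:

```python
# Sketch of the greedy Horn-SAT procedure from the slides.
def horn_sat(implications, negatives):
    T = set()
    changed = True
    while changed:                       # repeat until no implication fires
        changed = False
        for body, head in implications:
            if body <= T and head not in T:
                T.add(head)              # head is forced True
                changed = True
    for neg in negatives:
        if neg <= T:                     # every literal in the clause is False
            return None                  # no satisfying assignment
    return T                             # the variables set to True

# The trace formula from the slides:
imps = [({"w", "y", "z"}, "x"), ({"x", "z"}, "w"), ({"x"}, "y"),
        (set(), "x"), ({"x", "y"}, "w")]
negs = [{"w", "x", "y"}, {"z"}]
print(horn_sat(imps, negs))              # -> None (unsatisfiable)
```

On this input T grows {x}, {x, y}, {x, y, w}, and then the pure negative clause (¬w ∨ ¬x ∨ ¬y) is violated, so there is no satisfying assignment.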
Set Cover, 1

Suppose B is a set and S_1, ..., S_m ⊆ B.

Definition:
(a) A set cover of B is a subcollection { S_{i_1}, ..., S_{i_k} } of { S_1, ..., S_m } with B ⊆ S_{i_1} ∪ ... ∪ S_{i_k}.
(b) A minimal set cover of B is a set cover of B using as few of the S_i-sets as possible.

The Set Cover Problem (SCP)
Given: B and S_1, ..., S_m as above.
Find: A minimal set cover of B.

Example: B = { 1, ..., 14 } and

S_1 = { 1, 2 }
S_2 = { 3, 4, 5, 6 }
S_3 = { 7, 8, 9, 10, 11, 12, 13, 14 }
S_4 = { 1, 3, 5, 7, 9, 11, 13 }
S_5 = { 2, 4, 6, 8, 10, 12, 14 }

Set Cover, 2: A Greedy Approximation to the Set Cover Problem

// Input: B and S_1, ..., S_m ⊆ B as above.
// Output: A set cover of B which is close to minimal.
while (some element of B is not yet covered) do
    Pick the S_i with the largest number of uncovered B-elements

Example (continued): On the instance above, the greedy algorithm returns { S_1, S_2, S_3 }, but the solution to SCP is { S_4, S_5 }.
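The greedy approximation can be sketched in Python and run on the example above; the dictionary representation of the S_i is my own choice:

```python
def greedy_set_cover(B, sets):
    """Greedy approximation: repeatedly pick the set covering the most
    still-uncovered elements. Assumes the sets together cover B."""
    uncovered = set(B)
    chosen = []
    while uncovered:
        name = max(sets, key=lambda s: len(sets[s] & uncovered))
        chosen.append(name)
        uncovered -= sets[name]
    return chosen

# The slides' example: greedy picks three sets, though {S4, S5} suffices.
sets = {"S1": {1, 2}, "S2": {3, 4, 5, 6},
        "S3": {7, 8, 9, 10, 11, 12, 13, 14},
        "S4": {1, 3, 5, 7, 9, 11, 13}, "S5": {2, 4, 6, 8, 10, 12, 14}}
print(greedy_set_cover(range(1, 15), sets))  # -> ['S3', 'S2', 'S1']
```

Greedy takes S_3 first (8 uncovered elements, versus 7 for S_4 and S_5), and that first choice already rules out the 2-set optimum.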
Set Cover, 3

Claim: Suppose B contains n elements and the minimal cover has k sets. Then the greedy algorithm will use at most k·ln n sets.

Proof: Let n_t = the number of uncovered elements after t iterations of the while-loop, so n_0 = n. After iteration t there are n_t elements left, and the k sets of a minimal cover cover all of them. So some set must contain at least n_t / k of the uncovered elements, and by the greedy choice,

n_{t+1} ≤ n_t − n_t/k = n_t · (1 − 1/k).

Set Cover, 4

We know: n_{t+1} ≤ n_t · (1 − 1/k).

Fact: 1 − x ≤ e^{−x} for all x, with equality iff x = 0. (Most easily proved with a picture: the line 1 − x lies below the curve e^{−x}, touching it at x = 0.)

Thus

n_t ≤ n_0 · (1 − 1/k)^t < n_0 · (e^{−1/k})^t = n·e^{−t/k}.

At t = k·ln n, therefore, n_t is strictly less than n·e^{−ln n} = 1, which means no elements remain uncovered.
Set Cover, 5

So by t = k·ln n iterations the greedy algorithm has covered all of B, i.e., it uses at most k·ln n sets: the greedy algorithm is optimal to within a factor of ln n.

Fact: If certain widely-held complexity assumptions hold, then no polynomial-time algorithm has a better approximation factor than ln n. (More on this in Chapters 8 and 9.)
More information1 Circuit Complexity. CS 6743 Lecture 15 1 Fall Definitions
CS 6743 Lecture 15 1 Fall 2007 1 Circuit Complexity 1.1 Definitions A Boolean circuit C on n inputs x 1,..., x n is a directed acyclic graph (DAG) with n nodes of in-degree 0 (the inputs x 1,..., x n ),
More informationData Compression. Limit of Information Compression. October, Examples of codes 1
Data Compression Limit of Information Compression Radu Trîmbiţaş October, 202 Outline Contents Eamples of codes 2 Kraft Inequality 4 2. Kraft Inequality............................ 4 2.2 Kraft inequality
More informationCOMP9319 Web Data Compression and Search. Lecture 2: Adaptive Huffman, BWT
COMP9319 Web Data Compression and Search Lecture 2: daptive Huffman, BWT 1 Original readings Login to your cse account:! cd ~cs9319/papers! Original readings of each lecture will be placed there. 2 Course
More informationAnalysis of Algorithms. Outline. Single Source Shortest Path. Andres Mendez-Vazquez. November 9, Notes. Notes
Analysis of Algorithms Single Source Shortest Path Andres Mendez-Vazquez November 9, 01 1 / 108 Outline 1 Introduction Introduction and Similar Problems General Results Optimal Substructure Properties
More informationBinary Search Trees. Motivation
Binary Search Trees Motivation Searching for a particular record in an unordered list takes O(n), too slow for large lists (databases) If the list is ordered, can use an array implementation and use binary
More informationLecture 1: Shannon s Theorem
Lecture 1: Shannon s Theorem Lecturer: Travis Gagie January 13th, 2015 Welcome to Data Compression! I m Travis and I ll be your instructor this week. If you haven t registered yet, don t worry, we ll work
More informationChapter 2. Reductions and NP. 2.1 Reductions Continued The Satisfiability Problem (SAT) SAT 3SAT. CS 573: Algorithms, Fall 2013 August 29, 2013
Chapter 2 Reductions and NP CS 573: Algorithms, Fall 2013 August 29, 2013 2.1 Reductions Continued 2.1.1 The Satisfiability Problem SAT 2.1.1.1 Propositional Formulas Definition 2.1.1. Consider a set of
More informationBandwidth: Communicate large complex & highly detailed 3D models through lowbandwidth connection (e.g. VRML over the Internet)
Compression Motivation Bandwidth: Communicate large complex & highly detailed 3D models through lowbandwidth connection (e.g. VRML over the Internet) Storage: Store large & complex 3D models (e.g. 3D scanner
More informationPS10.3 Logical implications
Warmup: Construct truth tables for these compound statements: 1) p (q r) p q r p q r p (q r) PS10.3 Logical implications Lets check it out: We will be covering Implications, logical equivalence, converse,
More informationGeneral Methods for Algorithm Design
General Methods for Algorithm Design 1. Dynamic Programming Multiplication of matrices Elements of the dynamic programming Optimal triangulation of polygons Longest common subsequence 2. Greedy Methods
More informationData Compression Techniques
Data Compression Techniques Part 2: Text Compression Lecture 5: Context-Based Compression Juha Kärkkäinen 14.11.2017 1 / 19 Text Compression We will now look at techniques for text compression. These techniques
More informationComplexity Theory VU , SS The Polynomial Hierarchy. Reinhard Pichler
Complexity Theory Complexity Theory VU 181.142, SS 2018 6. The Polynomial Hierarchy Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität Wien 15 May, 2018 Reinhard
More informationOutline. Complexity Theory EXACT TSP. The Class DP. Definition. Problem EXACT TSP. Complexity of EXACT TSP. Proposition VU 181.
Complexity Theory Complexity Theory Outline Complexity Theory VU 181.142, SS 2018 6. The Polynomial Hierarchy Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität
More informationCSCI 3110 Assignment 6 Solutions
CSCI 3110 Assignment 6 Solutions December 5, 2012 2.4 (4 pts) Suppose you are choosing between the following three algorithms: 1. Algorithm A solves problems by dividing them into five subproblems of half
More information25. Minimum Spanning Trees
695 25. Minimum Spanning Trees Motivation, Greedy, Algorithm Kruskal, General Rules, ADT Union-Find, Algorithm Jarnik, Prim, Dijkstra, Fibonacci Heaps [Ottman/Widmayer, Kap. 9.6, 6.2, 6.1, Cormen et al,
More informationPropositional Logic: Models and Proofs
Propositional Logic: Models and Proofs C. R. Ramakrishnan CSE 505 1 Syntax 2 Model Theory 3 Proof Theory and Resolution Compiled at 11:51 on 2016/11/02 Computing with Logic Propositional Logic CSE 505
More informationPropositional logic. Programming and Modal Logic
Propositional logic Programming and Modal Logic 2006-2007 4 Contents Syntax of propositional logic Semantics of propositional logic Semantic entailment Natural deduction proof system Soundness and completeness
More information25. Minimum Spanning Trees
Problem Given: Undirected, weighted, connected graph G = (V, E, c). 5. Minimum Spanning Trees Motivation, Greedy, Algorithm Kruskal, General Rules, ADT Union-Find, Algorithm Jarnik, Prim, Dijkstra, Fibonacci
More informationFind: a multiset M { 1,..., n } so that. i M w i W and. i M v i is maximized. Find: a set S { 1,..., n } so that. i S w i W and. i S v i is maximized.
Knapsack gain Slides for IS 675 PV hapter 6: ynamic Programming, Part 2 Jim Royer EES October 28, 2009 The Knapsack Problem (KP) knapsack with weight capacity W. Items 1,..., n where item i has weight
More information3 Propositional Logic
3 Propositional Logic 3.1 Syntax 3.2 Semantics 3.3 Equivalence and Normal Forms 3.4 Proof Procedures 3.5 Properties Propositional Logic (25th October 2007) 1 3.1 Syntax Definition 3.0 An alphabet Σ consists
More informationLecture 3 : Algorithms for source coding. September 30, 2016
Lecture 3 : Algorithms for source coding September 30, 2016 Outline 1. Huffman code ; proof of optimality ; 2. Coding with intervals : Shannon-Fano-Elias code and Shannon code ; 3. Arithmetic coding. 1/39
More informationPart V. Intractable Problems
Part V Intractable Problems 507 Chapter 16 N P-Completeness Up to now, we have focused on developing efficient algorithms for solving problems. The word efficient is somewhat subjective, and the degree
More informationCOS597D: Information Theory in Computer Science October 19, Lecture 10
COS597D: Information Theory in Computer Science October 9, 20 Lecture 0 Lecturer: Mark Braverman Scribe: Andrej Risteski Kolmogorov Complexity In the previous lectures, we became acquainted with the concept
More informationLogic as a Tool Chapter 1: Understanding Propositional Logic 1.1 Propositions and logical connectives. Truth tables and tautologies
Logic as a Tool Chapter 1: Understanding Propositional Logic 1.1 Propositions and logical connectives. Truth tables and tautologies Valentin Stockholm University September 2016 Propositions Proposition:
More informationLecture 6 September 21, 2016
ICS 643: Advanced Parallel Algorithms Fall 2016 Lecture 6 September 21, 2016 Prof. Nodari Sitchinava Scribe: Tiffany Eulalio 1 Overview In the last lecture, we wrote a non-recursive summation program and
More informationNP-Completeness. Andreas Klappenecker. [based on slides by Prof. Welch]
NP-Completeness Andreas Klappenecker [based on slides by Prof. Welch] 1 Prelude: Informal Discussion (Incidentally, we will never get very formal in this course) 2 Polynomial Time Algorithms Most of the
More informationUNIT I INFORMATION THEORY. I k log 2
UNIT I INFORMATION THEORY Claude Shannon 1916-2001 Creator of Information Theory, lays the foundation for implementing logic in digital circuits as part of his Masters Thesis! (1939) and published a paper
More information1 Basic Combinatorics
1 Basic Combinatorics 1.1 Sets and sequences Sets. A set is an unordered collection of distinct objects. The objects are called elements of the set. We use braces to denote a set, for example, the set
More informationCOMP9319 Web Data Compression and Search. Lecture 2: Adaptive Huffman, BWT
COMP9319 Web Data Compression and Search Lecture 2: daptive Huffman, BWT 1 Original readings Login to your cse account: cd ~cs9319/papers Original readings of each lecture will be placed there. 2 Course
More information