Information Theory and Synthetic Steganography


1 Information Theory and Synthetic Steganography CSM25 Secure Information Hiding Dr Hans Georg Schaathun University of Surrey Spring 2009 Week 8 Dr Hans Georg Schaathun Information Theory and Synthetic Steganography Spring 2009 Week 8 1 / 56

2 Learning Outcomes Understand the relationship between steganography and established disciplines such as communications, information theory, data compression, and coding theory. Be familiar with at least one way of doing steganography by cover synthesis.

3 Reading Core Reading Peter Wayner: Disappearing Cryptography Ch. 6-7 Core Reading Cox et al.: Appendix A Suggested Reading Lin & Costello: Error-Control Coding

4 Outline Communications essentials 1 Communications essentials Communications and Redundancy Anderson and Petitcolas 1999 Digital Communications Shannon Entropy Security Prediction 2 Compression 3 Miscellanea

5 Outline Communications essentials Communications and Redundancy 1 Communications essentials Communications and Redundancy Anderson and Petitcolas 1999 Digital Communications Shannon Entropy Security Prediction 2 Compression Huffman Coding Huffman Steganography 3 Miscellanea Synthesis by Grammar Redundancy in Images

6 Communications essentials Communications and Redundancy The communications problem Alice encodes her message m into a codeword c, which crosses a noisy channel; Bob receives r and decodes it into an estimate ˆm. Bob's problem: estimate m, given the (partly) random output of the channel. How much (un)certainty does Bob have about m? This is the subject of information theory and Shannon entropy.

13 Communications essentials Communications and Redundancy Redundancy of English Fact The English language is more than 50% redundant. t** p*oce*s o**hid**g *ata**nsid* o*her**ata. For ex*****, a **xt f*le c**ld*** hid*** "in**de"****im*ge or***s**nd *ile* By look****at t*e im*g***or list***** to th**s**nd,*yo* w*u*d n*t *no**that***ere is *x*ra info******* *r*sent. Even with this much of the message destroyed on the channel, redundancy allows Bob to determine the original m.

16 Communications essentials Communications and Redundancy Redundancy of English Fact The English language is more than 50% redundant. the process of hiding data inside other data. For example, a text file could be hidden "inside" an image or a sound file. By looking at the image, or listening to the sound, you would not know that there is extra information present. Even though parts of the message were destroyed on the channel, redundancy allowed Bob to determine the original m.

17 Communications essentials Communications and Redundancy Benefits of redundancy Cross-word puzzles. Understanding foreigners with imperfect pronunciation. How much would you understand of a lecture without redundancy? Hearing in a noisy environment. Reading bad handwriting. How could I mark exam scripts without redundancy? Cryptanalysis? Steganalysis?

24 Communications essentials Communications and Redundancy What if there were no redundancy? No use for steganography! Any text would be meaningful; in particular, ciphertext would be meaningful, so simple encryption would give a stegogramme indistinguishable from cover-text.

27 Outline Communications essentials Anderson and Petitcolas 1999 1 Communications essentials Communications and Redundancy Anderson and Petitcolas 1999 Digital Communications Shannon Entropy Security Prediction 2 Compression Huffman Coding Huffman Steganography 3 Miscellanea Synthesis by Grammar Redundancy in Images

28 Communications essentials Anderson and Petitcolas 1999 Perfect compression Compression removes redundancy, minimising the average string length (file size) while retaining the information contents. Decompression replaces the redundancy, recovering the original (loss-less compression). Perfect means no redundancy in the compressed string; consequently all strings are used, so a(ny) random string can be decompressed and yields sensible output.

30 Communications essentials Anderson and Petitcolas 1999 Steganography by Perfect Compression Anderson and Petitcolas 1998 Assume a perfect compression scheme and a secure cipher. To embed: Message → Encrypt (with Key) → "adf!haj dgh a" → Decompress → "Once upon a time there was a red herring..." To extract: Compress → Decrypt → Message. Steganography without data hiding.
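
The embed/extract pipeline above can be sketched with a toy stand-in for the "perfect compression scheme": a prefix-free code over a tiny word list, so that any bit string (here standing in for ciphertext) decompresses to cover text and compresses back losslessly. The word list and code are invented for illustration, not taken from the paper.

```python
# Toy sketch of steganography by decompression: treat the (encrypted)
# message bits as a compressed stream and run them through a decompressor
# for a simple cover-text model. CODE is a hypothetical prefix-free code.

CODE = {          # bit pattern -> next word of the cover text
    "0":   "the",
    "10":  "red",
    "110": "herring",
    "111": "swam",
}

def decompress(bits):
    """Map a bit string to cover text (the embedding direction)."""
    words, buf = [], ""
    for b in bits:
        buf += b
        if buf in CODE:               # prefix-freeness makes this unambiguous
            words.append(CODE[buf])
            buf = ""
    assert buf == "", "bit string must end on a codeword boundary"
    return " ".join(words)

def compress(text):
    """Map cover text back to the bit string (the extraction direction)."""
    inv = {w: c for c, w in CODE.items()}
    return "".join(inv[w] for w in text.split())

bits = "0101100111"                   # stand-in for ciphertext bits
cover = decompress(bits)              # "the red herring the swam"
assert compress(cover) == bits        # lossless round trip
```

A real system would drive the decompressor with a language model rather than a fixed word list, but the round-trip property is the whole trick: no cover object is modified, the cover is synthesised from the message.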

32 Outline Communications essentials Digital Communications 1 Communications essentials Communications and Redundancy Anderson and Petitcolas 1999 Digital Communications Shannon Entropy Security Prediction 2 Compression Huffman Coding Huffman Steganography 3 Miscellanea Synthesis by Grammar Redundancy in Images

33 Communications essentials Digital Communications Problems in natural language How efficient is the redundancy? Natural languages are arbitrary: some words/sentences have a lot of redundancy, others have very little. The redundancy is unstructured, which makes correction hard to automate. Structured redundancy is necessary for digital comms: Coding Theory.

40 Communications essentials Digital Communications Coding Channel and source coding Source coding (aka compression): remove redundancy, make a compact representation. Channel coding (aka error-control coding): add mathematically structured redundancy for computationally efficient error-correction, optimised for low error-rate in small space. Two aspects of Information Theory.
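
The simplest possible instance of channel coding, not from the slides but useful to make "structured redundancy" concrete, is a rate-1/3 repetition code: send every bit three times and decode by majority vote, which corrects up to one error per 3-bit block.

```python
# Minimal channel-coding illustration: a 3-fold repetition code.
# The redundancy is structured, so a machine can correct errors.

def encode(bits):
    """Repeat each bit three times."""
    return "".join(b * 3 for b in bits)

def decode(received):
    """Majority vote in each 3-bit block; corrects any single flip per block."""
    blocks = [received[i:i + 3] for i in range(0, len(received), 3)]
    return "".join("1" if blk.count("1") >= 2 else "0" for blk in blocks)

sent = encode("101")          # "111000111"
corrupted = "110000011"       # up to one flipped bit per block
assert decode(corrupted) == "101"
```

Practical codes (Hamming, Reed-Solomon, LDPC) get the same effect at far better rates, but the principle is the same: redundancy placed where the decoder can exploit it.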

46 Communications essentials Digital Communications Channel and Source Coding Message → Compress (remove redundancy) → Encrypt (scramble) → Encode (add redundancy) → Channel → Decode → Decrypt → Decompress → Message.

47 Outline Communications essentials Shannon Entropy 1 Communications essentials Communications and Redundancy Anderson and Petitcolas 1999 Digital Communications Shannon Entropy Security Prediction 2 Compression Huffman Coding Huffman Steganography 3 Miscellanea Synthesis by Grammar Redundancy in Images

48 Communications essentials Shannon Entropy Uncertainty m and r are stochastic variables (drawn at random from a distribution). How much uncertainty is there about the message m? Uncertainty is measured by entropy: H(m) before any message is received, and the conditional entropy H(m | r) after receipt of r. Mutual information is derived from entropy: I(m; r) = H(m) − H(m | r). I(m; r) is the amount of information contained in r about m, and I(m; r) = I(r; m).

51 Communications essentials Shannon Entropy Shannon entropy Definition For a random variable X over an alphabet X, H_q(X) = − Σ_{x ∈ X} Pr(X = x) log_q Pr(X = x). Usually q = 2, giving entropy in bits; q = e (natural logarithm) gives entropy in nats. If Pr(X = x_i) = p_i for x_1, x_2, ... ∈ X, we write H(X) = h(p_1, p_2, ...). Example: one Yes/No question Q with both answers of probability 1/2 has H(Q) = −log_2(1/2) = 1 bit.
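
As a quick check of the definition, here is a direct transcription into Python (a sketch, not from the slides): the fair Yes/No question gives exactly one bit, and any bias gives less.

```python
from math import log

def entropy(probs, q=2):
    """Shannon entropy H_q = -sum p * log_q(p) over a probability vector."""
    return -sum(p * log(p, q) for p in probs if p > 0)

# A fair yes/no question carries exactly one bit:
assert entropy([0.5, 0.5]) == 1.0
# A biased question carries less:
print(round(entropy([0.25, 0.75]), 3))  # 0.811
```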

55 Communications essentials Shannon Entropy Example Alice has a 1-bit message m with both values equally likely, so the entropy (Bob's uncertainty) is H(m) = 1 bit. Binary Symmetric Channel with an error rate of 25%, i.e. a 25% risk that Alice's bit is flipped. The uncertainty about the received bit given the sent one is H(r | m = 1) = H(r | m = 0) = −0.25 log_2 0.25 − 0.75 log_2 0.75 ≈ 0.811, so H(r | m) = 0.5 H(r | m = 0) + 0.5 H(r | m = 1) ≈ 0.811. The information received by Bob is I(m; r) = H(m) − H(m | r) = H(r) − H(r | m) = 1 − 0.811 ≈ 0.189 bits. What if the error rate is 50%? Or 10%?
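
The slide's closing questions can be answered by computing I(m; r) = 1 − h(p) for a uniform 1-bit message over a BSC with flip probability p, where h is the binary entropy function (a sketch, not from the slides):

```python
from math import log2

def h(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_mutual_information(p):
    """I(m; r) = H(r) - H(r|m) for a uniform 1-bit m over a BSC(p).
    H(r) = 1 because m is uniform, so r is uniform too."""
    return 1.0 - h(p)

print(round(bsc_mutual_information(0.25), 3))  # 0.189
print(bsc_mutual_information(0.5))             # 0.0 -- nothing gets through
print(round(bsc_mutual_information(0.10), 3))  # 0.531
```

At 50% error rate the channel conveys nothing at all; at 10% Bob already receives more than half a bit per use.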

56 Communications essentials Shannon Entropy Shannon entropy Properties 1 Additivity: if X and Y are independent, then H(X, Y) = H(X) + H(Y). If you are uncertain about two completely different questions, the entropy is the sum of the uncertainty for each question. 2 If X is uniformly distributed, then H(X) increases when the size of the alphabet increases: the more possibilities, the more uncertainty. 3 Continuity: h(p_1, p_2, ...) is continuous in each p_i. Shannon entropy is a measure in the mathematical sense.

57 Communications essentials Shannon Entropy What it tells us Consider a message X of entropy k = H(X) (in bits). The average size of a file F describing X is at least k bits. If the size of F is exactly k bits on average, then we have found a perfect compression of X, and each message bit contains one bit of information on average.

59 Communications essentials Shannon Entropy A trivial example A single bit may contain more than one bit of information. E.g. image compression: 0: Mona Lisa, 10: Lenna, 110: Baboon, 11100: Peppers, 11110: ..., ...: Che Guevara, ...: other images. However, on average, the maximum information in one bit is one bit (most of the time it is less). The example is based on Huffman coding.
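
The code table above was partly garbled in transcription, so the sketch below uses a hypothetical prefix-free code in the same spirit. It shows the point numerically: under the dyadic probabilities P(x) = 2^−|codeword| that make such a code optimal, each codeword carries exactly |codeword| bits of information, and the average codeword length equals the entropy.

```python
from math import log2

# Hypothetical prefix-free code (image names illustrative, not the slide's):
code = {"Mona Lisa": "0", "Lenna": "10", "Baboon": "110",
        "Peppers": "1110", "other images": "1111"}

# Dyadic probabilities implied by the codeword lengths:
probs = {img: 2.0 ** -len(cw) for img, cw in code.items()}
assert sum(probs.values()) == 1.0        # the prefix code is complete

avg_len = sum(p * len(code[img]) for img, p in probs.items())
entropy = -sum(p * log2(p) for p in probs.values())
print(avg_len, entropy)  # 1.875 1.875 -- average length meets the entropy bound
```

So the single bit "0" carries one full bit of information only because "Mona Lisa" has probability 1/2; for the rarer images each transmitted bit carries exactly one bit too, but more bits are needed.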

67 Outline Communications essentials Security 1 Communications essentials Communications and Redundancy Anderson and Petitcolas 1999 Digital Communications Shannon Entropy Security Prediction 2 Compression Huffman Coding Huffman Steganography 3 Miscellanea Synthesis by Grammar Redundancy in Images

68 Communications essentials Security Cryptography Alice sends ciphertext c to Bob, encrypting the message m. Eve seeks information about m, observing c. If I(m; c) > 0, or if I(k; c) > 0 for the key k, then Eve succeeds in theory. If H(m | c) = H(m), then the system is absolutely secure. These are strong statements: even if Eve has information, I(m; c) > 0, she may be unable to make sense of it.

69 Communications essentials Security Steganalysis Question: does Alice send secret information to Bob? Answer: X ∈ {yes, no}. What is the uncertainty H(X)? Eve intercepts a message S; is there any information I(X; S)? If H(X | S) = H(X), then the system is absolutely secure.

74 Outline Communications essentials Prediction 1 Communications essentials Communications and Redundancy Anderson and Petitcolas 1999 Digital Communications Shannon Entropy Security Prediction 2 Compression Huffman Coding Huffman Steganography 3 Miscellanea Synthesis by Grammar Redundancy in Images

75 Communications essentials Prediction Random sequences Text is a sequence of random samples (letters) (l_1, l_2, l_3, ...), l_i ∈ A = {A, B, ..., Z}. Each letter has a probability distribution P(l), l ∈ A. Statistical dependence (which implies redundancy): P(l_i | l_{i−1}) ≠ P(l_i) and H(l_i | l_{i−1}) < H(l_i), so letter l_{i−1} contains information about l_i; use this information to guess l_i. The more letters l_{i−j}, ..., l_{i−1} we have seen, the more reliably we can predict l_i. Wayner (Ch 6.1) gives examples of first, second, ..., fifth order prediction, using j = 0, 1, 2, 3, 4.
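
The claim H(l_i | l_{i−1}) < H(l_i) can be checked empirically on any text by applying the chain rule to the empirical distribution of letter pairs. A minimal sketch (the sample text is invented):

```python
from collections import Counter
from math import log2

text = "the theory of the thing is that the theme repeats"

def entropy(counts):
    """Entropy in bits of the empirical distribution given by a Counter."""
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

pairs = Counter(zip(text, text[1:]))      # joint distribution of (l_{i-1}, l_i)
firsts = Counter(text[:-1])               # marginal of l_{i-1}
h_cond = entropy(pairs) - entropy(firsts)  # chain rule: H(l_i | l_{i-1})
h_marg = entropy(Counter(text[1:]))        # marginal H(l_i)
print(h_cond < h_marg)  # True: the previous letter reduces uncertainty
```

On real English the gap is substantial, which is exactly the redundancy that prediction-based cover synthesis exploits.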

Communications essentials: Prediction. First-, second-, third-, and fourth-order prediction: example texts from Wayner (shown as figures, not reproduced in this transcription).

Communications essentials: Prediction. Markov models. A Markov source is a sequence M_1, M_2, ... of stochastic (random) variables. An n-th order Markov source is completely described by the probability distributions P[M_1, M_2, ..., M_n] and P[M_i | M_{i-n}, ..., M_{i-1}] (identical for all i). This is a finite-state machine (automaton): the state of the source, i.e. the last n symbols M_{i-n}, ..., M_{i-1}, determines the probability distribution of the next symbol. The random texts from Wayner are generated using 1st-, 2nd-, 3rd-, and 4th-order Markov models.
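The Markov source above can be sketched directly in code: tabulate P(next letter | previous n letters) from a sample text, then sample from those conditional distributions to generate random text in the style of Wayner's examples. The sample string is invented for illustration.

```python
import random
from collections import defaultdict, Counter

def train(text, order):
    """Tabulate P(next letter | previous `order` letters) from sample text."""
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def generate(model, order, length, seed=0):
    """Sample a random text of the given length from the Markov model."""
    rng = random.Random(seed)
    state = rng.choice(list(model))          # start from a random seen state
    out = list(state)
    for _ in range(length - order):
        counts = model.get("".join(out[-order:]))
        if not counts:                        # dead end: restart from a random state
            counts = model[rng.choice(list(model))]
        letters, weights = zip(*counts.items())
        out.append(rng.choices(letters, weights=weights)[0])
    return "".join(out)

sample = "the theory of the thing is that the thing theorises "
print(generate(train(sample, 3), 3, 40))
```

Higher `order` makes the output look more natural but needs far more sample text, since the state space grows as |A|^n.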

Communications essentials: Prediction. A related example. A group of MIT students wrote software that generates random science papers; one random paper was accepted for WMSCI 2005. You can generate your own paper on-line, and the source code is available (SCIgen). If you are brave, modify SCIgen for steganography as a poster topic, or maybe for your dissertation if you have a related topic you can tweak.

Outline: Compression. 1 Communications essentials. 2 Compression: Huffman Coding; Huffman Steganography. 3 Miscellanea.

Outline: Compression, Huffman Coding. 1 Communications essentials: Communications and Redundancy; Anderson and Petitcolas 1999; Digital Communications; Shannon Entropy; Security; Prediction. 2 Compression: Huffman Coding; Huffman Steganography. 3 Miscellanea: Synthesis by Grammar; Redundancy in Images.

Compression: Huffman Coding. Compression. F is the set of binary strings of arbitrary length. Definition: a compression system is a function c : F → F such that E(length(m)) > E(length(c(m))) when m is drawn from F; the compressed string is expected to be shorter than the original. Definition: a compression c is perfect if E(length(c(m))) = H(m). It follows from the definition that the compression is one-to-one: decompress any random string m, and c⁻¹(m) makes sense!

Compression: Huffman Coding. Huffman Coding. Short codewords for frequent quantities; long codewords for unusual quantities. Each symbol (bit) should be equally probable. (The accompanying tree diagram with branch probabilities is not reproduced in this transcription.)

Compression: Huffman Coding. Example. (A worked Huffman-tree construction, shown as a figure with branch probabilities, is not reproduced in this transcription.)

Compression: Huffman Coding. Decoding. Huffman codes are prefix-free: no codeword is the prefix of another. This simplifies the decoding. This is expressed in the Huffman tree: follow edges for each coded bit; only a leaf node resolves to a message symbol. When a message symbol is recovered, start over for the next symbol.

Compression: Huffman Coding. Ideal Huffman code. Each branch equally likely: P(b_i | b_{i-1}, b_{i-2}, ...) = 1/2. Maximum entropy: H(B_i | B_{i-1}, B_{i-2}, ...) = 1; a uniform distribution of compressed files implies perfect compression. In practice, the probabilities are rarely powers of 1/2, hence the Huffman code is imperfect.
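The construction and the prefix-free decoding can be sketched as follows. With the dyadic probabilities below (all powers of 1/2) the code is ideal: the average codeword length equals the entropy, 1.75 bits. The symbol set and probabilities are invented for illustration.

```python
import heapq
import itertools

def huffman_code(probs):
    """Build a Huffman code: frequent symbols get short codewords."""
    counter = itertools.count()  # tie-breaker so heapq never compares subtrees
    heap = [(p, next(counter), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, t0 = heapq.heappop(heap)  # merge the two least probable subtrees
        p1, _, t1 = heapq.heappop(heap)
        heapq.heappush(heap, (p0 + p1, next(counter), (t0, t1)))
    code = {}
    def walk(node, prefix):
        if isinstance(node, tuple):       # internal node: branch on 0 / 1
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                             # leaf: a message symbol
            code[node] = prefix or "0"
    walk(heap[0][2], "")
    return code

def encode(code, msg):
    return "".join(code[s] for s in msg)

def decode(code, bits):
    """Prefix-free: read bits until a full codeword matches, then start over."""
    inverse = {w: s for s, w in code.items()}
    out, word = [], ""
    for b in bits:
        word += b
        if word in inverse:
            out.append(inverse[word])
            word = ""
    return "".join(out)

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
msg = "abacabad"
assert decode(code, encode(code, msg)) == msg
```

With these probabilities the codeword lengths are 1, 2, 3, 3, so each compressed bit is 0 or 1 with probability 1/2, matching the ideal case on the slide.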

Outline: Compression, Huffman Steganography. 1 Communications essentials: Communications and Redundancy; Anderson and Petitcolas 1999; Digital Communications; Shannon Entropy; Security; Prediction. 2 Compression: Huffman Coding; Huffman Steganography. 3 Miscellanea: Synthesis by Grammar; Redundancy in Images.

Compression: Huffman Steganography. Reverse Huffman. Core Reading: Peter Wayner, Disappearing Cryptography, Ch. 6-7. Use a Huffman code for each state in the Markov model. Stegano-encoder: Huffman decompression. Stegano-decoder: Huffman compression. Is this similar to Anderson & Petitcolas' steganography by perfect compression?

Compression: Huffman Steganography. The Steganogram. The steganogram looks like random text: use a probability distribution based on sample text; higher-order statistics make it look natural. Fifth-order statistics is reasonable; higher order will look more natural.

Compression: Huffman Steganography. Example: fifth order. For each 5-tuple of letters A_0, A_1, A_2, A_3, A_4: let l_{i-4}, ..., l_i be consecutive letters in natural text, and tabulate P(l_i = A_0 | l_{i-j} = A_j, j = 1, 2, 3, 4). For each 4-tuple A_1, A_2, A_3, A_4, make an (approximate) Huffman code for A_0; we may omit some values of A_0, or have non-unique codewords. We encode a message by Huffman decompression, using the Huffman code that depends on the last four stegogramme symbols, obtaining a fifth-order random text.

Compression: Huffman Steganography. Example: fifth order. Consider the four preceding letters 'comp'. The next letter may be:

letter        r     e     l     a     o
probability   40%   12%   22%   18%   8%
combined      52% (r/e)    22% (l)    26% (a/o)
rounded       50%          25%        25%

Rounding is to powers of 1/2; combining several letters reduces the rounding error. The example is arbitrary and fictitious.

Compression: Huffman Steganography. Example: the Huffman code. Huffman code based on the fifth-order conditional probabilities: r/e → 0, l → 10, a/o → 11. When two letters are possible, choose at random (according to their probabilities in natural text): decoding (compression) is still unique, but encoding (decompression) is not. This evens out the statistics in the stegogramme.
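The embed/extract pair for this one state can be sketched as below, using the toy code (r/e → 0, l → 10, a/o → 11) and the fictitious letter probabilities from the example; a real system would switch code tables as the four-letter context changes.

```python
import random

# Code table for the single context "comp" (fictitious example numbers):
# codeword -> candidate letters with renormalised natural-text probabilities.
table = {
    "0":  [("r", 40 / 52), ("e", 12 / 52)],
    "10": [("l", 1.0)],
    "11": [("a", 18 / 26), ("o", 8 / 26)],
}
letter_to_bits = {ltr: bits for bits, cands in table.items() for ltr, _ in cands}

def embed(bits, rng=None):
    """Stego-encoding = Huffman *decompression*: message bits in, letters out.
    Where a codeword maps to several letters, pick one at random."""
    rng = rng or random.Random(0)
    out, word = [], ""
    for b in bits:
        word += b
        if word in table:
            letters, weights = zip(*table[word])
            out.append(rng.choices(letters, weights=weights)[0])
            word = ""
    return "".join(out)

def extract(letters):
    """Stego-decoding = Huffman *compression*: deterministic, even though
    the embedding was randomised."""
    return "".join(letter_to_bits[l] for l in letters)

stego = embed("0110010")
assert extract(stego) == "0110010"
```

Note that the randomised choice between r and e (or a and o) does not affect extraction, since both letters carry the same codeword.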

Outline: Miscellanea. 1 Communications essentials. 2 Compression. 3 Miscellanea: Synthesis by Grammar; Redundancy in Images.

Outline: Miscellanea, Synthesis by Grammar. 1 Communications essentials: Communications and Redundancy; Anderson and Petitcolas 1999; Digital Communications; Shannon Entropy; Security; Prediction. 2 Compression: Huffman Coding; Huffman Steganography. 3 Miscellanea: Synthesis by Grammar; Redundancy in Images.

Miscellanea: Synthesis by Grammar. Grammar. A grammar describes the structure of a language. Simple grammar: sentence → noun verb; noun → Mr. Brown | Miss Scarlet; verb → eats | drinks. Each choice can map to a message symbol: 0: Mr. Brown, eats; 1: Miss Scarlet, drinks. Two message symbols can thus be stego-encoded per sentence. No cover-text is input.
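A minimal sketch of the two-choice grammar above: each sentence hides two bits, one in the noun choice and one in the verb choice. Sentences are separated by newlines purely to simplify parsing; that is an implementation convenience, not part of the grammar.

```python
# Grammar from the slide: sentence -> noun verb.
NOUNS = ["Mr. Brown", "Miss Scarlet"]
VERBS = ["eats", "drinks"]

def embed(bits):
    """Each pair of message bits selects one noun and one verb."""
    assert len(bits) % 2 == 0, "two bits per sentence"
    lines = []
    for i in range(0, len(bits), 2):
        lines.append(f"{NOUNS[int(bits[i])]} {VERBS[int(bits[i + 1])]}.")
    return "\n".join(lines)

def extract(text):
    """Recover the bits from the grammar choices in each sentence."""
    bits = []
    for line in text.splitlines():
        noun, verb = line.rstrip(".").rsplit(" ", 1)
        bits.append(str(NOUNS.index(noun)))
        bits.append(str(VERBS.index(verb)))
    return "".join(bits)

stego = embed("0110")
assert stego.splitlines() == ["Mr. Brown drinks.", "Miss Scarlet eats."]
assert extract(stego) == "0110"
```

With only two alternatives per production the capacity is one bit per choice point; richer grammars, like the one on the next slide, hide more bits per sentence.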

Miscellanea: Synthesis by Grammar. More complex grammar:

sentence → noun verb addition
noun     → Mr. Brown | Miss Scarlet | ... | Mrs. White
verb     → eats | drinks | celebrates | ... | cooks
addition → addition term | term
term     → on Monday | in March | with Mr. Green | ... | in Alaska | at home
general  → sentence | question
question → Does noun verb addition?
xgeneral → general | sentence, because sentence

Miscellanea: Synthesis by Grammar. Is this practical? Exercise: choose either the reverse-Huffman or the grammar-based steganography technique, and write a short critique (approx. 1 page) where you answer some of the following questions. How can you do steganalysis? Under what conditions will it be secure? Is the system practical? Useful? Which implementation issues do you foresee? How could it be implemented? Could the technique extend to images?

Outline: Miscellanea, Redundancy in Images. 1 Communications essentials: Communications and Redundancy; Anderson and Petitcolas 1999; Digital Communications; Shannon Entropy; Security; Prediction. 2 Compression: Huffman Coding; Huffman Steganography. 3 Miscellanea: Synthesis by Grammar; Redundancy in Images.

Miscellanea: Redundancy in Images. Returning to images. Communications deals with abstract and discrete data: arbitrary bit strings. How does redundancy apply to images? Some say that the LSB is redundant, e.g. it does not change the semantics; however, the LSB cannot be reconstructed from the other bits, whereas characters removed from English text can be reconstructed. I.e. the LSBs contain little, but still some, information: any value in the LSB would be meaningful/valid.

Miscellanea: Redundancy in Images. Lossy and loss-less compression. The Huffman code is loss-less: decompression restores the original exactly. How does image processing work? Lossy, i.e. information is irrevocably lost: down-sampling (reduced resolution), reduced colour depth (e.g. discarding the LSB), and similar approaches in the transform domain. Loss-less image compression is still possible, but the loss/compression trade-off favours lossy compression.
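The contrast can be demonstrated in a few lines: loss-less compression round-trips exactly, while discarding the LSB (as in a colour-depth reduction) maps two distinct originals to the same output, so no decoder can recover the lost bit.

```python
import zlib

data = bytes(range(256)) * 4

# Loss-less: decompression restores the original exactly.
assert zlib.decompress(zlib.compress(data)) == data

def drop_lsb(d):
    """Lossy 'compression': clear the least significant bit of every byte."""
    return bytes(b & 0xFE for b in d)

# Two distinct originals collide after the LSB is dropped,
# so the operation is not invertible: information is irrevocably lost.
a = bytes([0b10101010])
b = bytes([0b10101011])
assert a != b
assert drop_lsb(a) == drop_lsb(b)
```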

Miscellanea: Redundancy in Images. LSB in English text. Thought experiment: hide in redundant characters in English prose, analogously to LSB embedding. Would it work? Why (not)? The correct character can be predicted, so spelling mistakes would be suspicious. Would it work using spelling mistakes in an MSc dissertation by an overseas student?


More information

CS4800: Algorithms & Data Jonathan Ullman

CS4800: Algorithms & Data Jonathan Ullman CS4800: Algorithms & Data Jonathan Ullman Lecture 22: Greedy Algorithms: Huffman Codes Data Compression and Entropy Apr 5, 2018 Data Compression How do we store strings of text compactly? A (binary) code

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

Problem Set: TT Quantum Information

Problem Set: TT Quantum Information Problem Set: TT Quantum Information Basics of Information Theory 1. Alice can send four messages A, B, C, and D over a classical channel. She chooses A with probability 1/, B with probability 1/4 and C

More information

! Where are we on course map? ! What we did in lab last week. " How it relates to this week. ! Compression. " What is it, examples, classifications

! Where are we on course map? ! What we did in lab last week.  How it relates to this week. ! Compression.  What is it, examples, classifications Lecture #3 Compression! Where are we on course map?! What we did in lab last week " How it relates to this week! Compression " What is it, examples, classifications " Probability based compression # Huffman

More information

Solutions for week 1, Cryptography Course - TDA 352/DIT 250

Solutions for week 1, Cryptography Course - TDA 352/DIT 250 Solutions for week, Cryptography Course - TDA 352/DIT 250 In this weekly exercise sheet: you will use some historical ciphers, the OTP, the definition of semantic security and some combinatorial problems.

More information

10-704: Information Processing and Learning Fall Lecture 10: Oct 3

10-704: Information Processing and Learning Fall Lecture 10: Oct 3 0-704: Information Processing and Learning Fall 206 Lecturer: Aarti Singh Lecture 0: Oct 3 Note: These notes are based on scribed notes from Spring5 offering of this course. LaTeX template courtesy of

More information

Lecture 28: Public-key Cryptography. Public-key Cryptography

Lecture 28: Public-key Cryptography. Public-key Cryptography Lecture 28: Recall In private-key cryptography the secret-key sk is always established ahead of time The secrecy of the private-key cryptography relies on the fact that the adversary does not have access

More information

Coding of memoryless sources 1/35

Coding of memoryless sources 1/35 Coding of memoryless sources 1/35 Outline 1. Morse coding ; 2. Definitions : encoding, encoding efficiency ; 3. fixed length codes, encoding integers ; 4. prefix condition ; 5. Kraft and Mac Millan theorems

More information

Cryptography 2017 Lecture 2

Cryptography 2017 Lecture 2 Cryptography 2017 Lecture 2 One Time Pad - Perfect Secrecy Stream Ciphers November 3, 2017 1 / 39 What have seen? What are we discussing today? Lecture 1 Course Intro Historical Ciphers Lecture 2 One Time

More information

Univ.-Prof. Dr. rer. nat. Rudolf Mathar. Written Examination. Cryptography. Tuesday, August 29, 2017, 01:30 p.m.

Univ.-Prof. Dr. rer. nat. Rudolf Mathar. Written Examination. Cryptography. Tuesday, August 29, 2017, 01:30 p.m. Cryptography Univ.-Prof. Dr. rer. nat. Rudolf Mathar 1 2 3 4 15 15 15 15 60 Written Examination Cryptography Tuesday, August 29, 2017, 01:30 p.m. Name: Matr.-No.: Field of study: Please pay attention to

More information

Outline. CPSC 418/MATH 318 Introduction to Cryptography. Information Theory. Partial Information. Perfect Secrecy, One-Time Pad

Outline. CPSC 418/MATH 318 Introduction to Cryptography. Information Theory. Partial Information. Perfect Secrecy, One-Time Pad Outline CPSC 418/MATH 318 Introduction to Cryptography, One-Time Pad Renate Scheidler Department of Mathematics & Statistics Department of Computer Science University of Calgary Based in part on slides

More information

Information Theory. Week 4 Compressing streams. Iain Murray,

Information Theory. Week 4 Compressing streams. Iain Murray, Information Theory http://www.inf.ed.ac.uk/teaching/courses/it/ Week 4 Compressing streams Iain Murray, 2014 School of Informatics, University of Edinburgh Jensen s inequality For convex functions: E[f(x)]

More information

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Discussion 6A Solution

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Discussion 6A Solution CS 70 Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Discussion 6A Solution 1. Polynomial intersections Find (and prove) an upper-bound on the number of times two distinct degree

More information

3F1 Information Theory, Lecture 3

3F1 Information Theory, Lecture 3 3F1 Information Theory, Lecture 3 Jossy Sayir Department of Engineering Michaelmas 2011, 28 November 2011 Memoryless Sources Arithmetic Coding Sources with Memory 2 / 19 Summary of last lecture Prefix-free

More information

Cryptography. pieces from work by Gordon Royle

Cryptography. pieces from work by Gordon Royle Cryptography pieces from work by Gordon Royle The set-up Cryptography is the mathematics of devising secure communication systems, whereas cryptanalysis is the mathematics of breaking such systems. We

More information

Homework Set #2 Data Compression, Huffman code and AEP

Homework Set #2 Data Compression, Huffman code and AEP Homework Set #2 Data Compression, Huffman code and AEP 1. Huffman coding. Consider the random variable ( x1 x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0.11 0.04 0.04 0.03 0.02 (a Find a binary Huffman code

More information

Quantum Information & Quantum Computation

Quantum Information & Quantum Computation CS90A, Spring 005: Quantum Information & Quantum Computation Wim van Dam Engineering, Room 509 vandam@cs http://www.cs.ucsb.edu/~vandam/teaching/cs90/ Administrative The Final Examination will be: Monday

More information

MATH3302 Cryptography Problem Set 2

MATH3302 Cryptography Problem Set 2 MATH3302 Cryptography Problem Set 2 These questions are based on the material in Section 4: Shannon s Theory, Section 5: Modern Cryptography, Section 6: The Data Encryption Standard, Section 7: International

More information

Lecture 1 : Data Compression and Entropy

Lecture 1 : Data Compression and Entropy CPS290: Algorithmic Foundations of Data Science January 8, 207 Lecture : Data Compression and Entropy Lecturer: Kamesh Munagala Scribe: Kamesh Munagala In this lecture, we will study a simple model for

More information

Shannon s Theory of Secrecy Systems

Shannon s Theory of Secrecy Systems Shannon s Theory of Secrecy Systems See: C. E. Shannon, Communication Theory of Secrecy Systems, Bell Systems Technical Journal, Vol. 28, pp. 656 715, 1948. c Eli Biham - March 1, 2011 59 Shannon s Theory

More information

Information and Entropy. Professor Kevin Gold

Information and Entropy. Professor Kevin Gold Information and Entropy Professor Kevin Gold What s Information? Informally, when I communicate a message to you, that s information. Your grade is 100/100 Information can be encoded as a signal. Words

More information

CODING AND CRYPTOLOGY III CRYPTOLOGY EXERCISES. The questions with a * are extension questions, and will not be included in the assignment.

CODING AND CRYPTOLOGY III CRYPTOLOGY EXERCISES. The questions with a * are extension questions, and will not be included in the assignment. CODING AND CRYPTOLOGY III CRYPTOLOGY EXERCISES A selection of the following questions will be chosen by the lecturer to form the Cryptology Assignment. The Cryptology Assignment is due by 5pm Sunday 1

More information

RSA RSA public key cryptosystem

RSA RSA public key cryptosystem RSA 1 RSA As we have seen, the security of most cipher systems rests on the users keeping secret a special key, for anyone possessing the key can encrypt and/or decrypt the messages sent between them.

More information

( c ) E p s t e i n, C a r t e r a n d B o l l i n g e r C h a p t e r 1 7 : I n f o r m a t i o n S c i e n c e P a g e 1

( c ) E p s t e i n, C a r t e r a n d B o l l i n g e r C h a p t e r 1 7 : I n f o r m a t i o n S c i e n c e P a g e 1 ( c ) E p s t e i n, C a r t e r a n d B o l l i n g e r 2 0 1 6 C h a p t e r 1 7 : I n f o r m a t i o n S c i e n c e P a g e 1 CHAPTER 17: Information Science In this chapter, we learn how data can

More information

Information Hiding and Covert Communication

Information Hiding and Covert Communication Information Hiding and Covert Communication Andrew Ker adk @ comlab.ox.ac.uk Royal Society University Research Fellow Oxford University Computing Laboratory Foundations of Security Analysis and Design

More information

Lecture 1: September 25, A quick reminder about random variables and convexity

Lecture 1: September 25, A quick reminder about random variables and convexity Information and Coding Theory Autumn 207 Lecturer: Madhur Tulsiani Lecture : September 25, 207 Administrivia This course will cover some basic concepts in information and coding theory, and their applications

More information

1. Basics of Information

1. Basics of Information 1. Basics of Information 6.004x Computation Structures Part 1 Digital Circuits Copyright 2015 MIT EECS 6.004 Computation Structures L1: Basics of Information, Slide #1 What is Information? Information,

More information

Noisy-Channel Coding

Noisy-Channel Coding Copyright Cambridge University Press 2003. On-screen viewing permitted. Printing not permitted. http://www.cambridge.org/05264298 Part II Noisy-Channel Coding Copyright Cambridge University Press 2003.

More information

Transmitting and Hiding Quantum Information

Transmitting and Hiding Quantum Information 2018/12/20 @ 4th KIAS WORKSHOP on Quantum Information and Thermodynamics Transmitting and Hiding Quantum Information Seung-Woo Lee Quantum Universe Center Korea Institute for Advanced Study (KIAS) Contents

More information

Information Theory, Statistics, and Decision Trees

Information Theory, Statistics, and Decision Trees Information Theory, Statistics, and Decision Trees Léon Bottou COS 424 4/6/2010 Summary 1. Basic information theory. 2. Decision trees. 3. Information theory and statistics. Léon Bottou 2/31 COS 424 4/6/2010

More information

CSCI 2570 Introduction to Nanocomputing

CSCI 2570 Introduction to Nanocomputing CSCI 2570 Introduction to Nanocomputing Information Theory John E Savage What is Information Theory Introduced by Claude Shannon. See Wikipedia Two foci: a) data compression and b) reliable communication

More information

Introduction to Cryptology. Lecture 2

Introduction to Cryptology. Lecture 2 Introduction to Cryptology Lecture 2 Announcements 2 nd vs. 1 st edition of textbook HW1 due Tuesday 2/9 Readings/quizzes (on Canvas) due Friday 2/12 Agenda Last time Historical ciphers and their cryptanalysis

More information

to mere bit flips) may affect the transmission.

to mere bit flips) may affect the transmission. 5 VII. QUANTUM INFORMATION THEORY to mere bit flips) may affect the transmission. A. Introduction B. A few bits of classical information theory Information theory has developed over the past five or six

More information

Compression and Coding

Compression and Coding Compression and Coding Theory and Applications Part 1: Fundamentals Gloria Menegaz 1 Transmitter (Encoder) What is the problem? Receiver (Decoder) Transformation information unit Channel Ordering (significance)

More information

Chapter 2 : Perfectly-Secret Encryption

Chapter 2 : Perfectly-Secret Encryption COMP547 Claude Crépeau INTRODUCTION TO MODERN CRYPTOGRAPHY _ Second Edition _ Jonathan Katz Yehuda Lindell Chapter 2 : Perfectly-Secret Encryption 1 2.1 Definitions and Basic Properties We refer to probability

More information

Entropy as a measure of surprise

Entropy as a measure of surprise Entropy as a measure of surprise Lecture 5: Sam Roweis September 26, 25 What does information do? It removes uncertainty. Information Conveyed = Uncertainty Removed = Surprise Yielded. How should we quantify

More information

(Classical) Information Theory II: Source coding

(Classical) Information Theory II: Source coding (Classical) Information Theory II: Source coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract The information content of a random variable

More information

5th March Unconditional Security of Quantum Key Distribution With Practical Devices. Hermen Jan Hupkes

5th March Unconditional Security of Quantum Key Distribution With Practical Devices. Hermen Jan Hupkes 5th March 2004 Unconditional Security of Quantum Key Distribution With Practical Devices Hermen Jan Hupkes The setting Alice wants to send a message to Bob. Channel is dangerous and vulnerable to attack.

More information

Computer Science A Cryptography and Data Security. Claude Crépeau

Computer Science A Cryptography and Data Security. Claude Crépeau Computer Science 308-547A Cryptography and Data Security Claude Crépeau These notes are, largely, transcriptions by Anton Stiglic of class notes from the former course Cryptography and Data Security (308-647A)

More information

Lecture 6 I. CHANNEL CODING. X n (m) P Y X

Lecture 6 I. CHANNEL CODING. X n (m) P Y X 6- Introduction to Information Theory Lecture 6 Lecturer: Haim Permuter Scribe: Yoav Eisenberg and Yakov Miron I. CHANNEL CODING We consider the following channel coding problem: m = {,2,..,2 nr} Encoder

More information

Solutions to Set #2 Data Compression, Huffman code and AEP

Solutions to Set #2 Data Compression, Huffman code and AEP Solutions to Set #2 Data Compression, Huffman code and AEP. Huffman coding. Consider the random variable ( ) x x X = 2 x 3 x 4 x 5 x 6 x 7 0.50 0.26 0. 0.04 0.04 0.03 0.02 (a) Find a binary Huffman code

More information

Ch 0 Introduction. 0.1 Overview of Information Theory and Coding

Ch 0 Introduction. 0.1 Overview of Information Theory and Coding Ch 0 Introduction 0.1 Overview of Information Theory and Coding Overview The information theory was founded by Shannon in 1948. This theory is for transmission (communication system) or recording (storage

More information

Channel Coding for Secure Transmissions

Channel Coding for Secure Transmissions Channel Coding for Secure Transmissions March 27, 2017 1 / 51 McEliece Cryptosystem Coding Approach: Noiseless Main Channel Coding Approach: Noisy Main Channel 2 / 51 Outline We present an overiew of linear

More information

Run-length & Entropy Coding. Redundancy Removal. Sampling. Quantization. Perform inverse operations at the receiver EEE

Run-length & Entropy Coding. Redundancy Removal. Sampling. Quantization. Perform inverse operations at the receiver EEE General e Image Coder Structure Motion Video x(s 1,s 2,t) or x(s 1,s 2 ) Natural Image Sampling A form of data compression; usually lossless, but can be lossy Redundancy Removal Lossless compression: predictive

More information

9 Knapsack Cryptography

9 Knapsack Cryptography 9 Knapsack Cryptography In the past four weeks, we ve discussed public-key encryption systems that depend on various problems that we believe to be hard: prime factorization, the discrete logarithm, and

More information

CPSC 467b: Cryptography and Computer Security

CPSC 467b: Cryptography and Computer Security CPSC 467b: Cryptography and Computer Security Michael J. Fischer Lecture 3 January 22, 2013 CPSC 467b, Lecture 3 1/35 Perfect secrecy Caesar cipher Loss of perfection Classical ciphers One-time pad Affine

More information

Digital communication system. Shannon s separation principle

Digital communication system. Shannon s separation principle Digital communication system Representation of the source signal by a stream of (binary) symbols Adaptation to the properties of the transmission channel information source source coder channel coder modulation

More information

An Introduction. Dr Nick Papanikolaou. Seminar on The Future of Cryptography The British Computer Society 17 September 2009

An Introduction. Dr Nick Papanikolaou. Seminar on The Future of Cryptography The British Computer Society 17 September 2009 An Dr Nick Papanikolaou Research Fellow, e-security Group International Digital Laboratory University of Warwick http://go.warwick.ac.uk/nikos Seminar on The Future of Cryptography The British Computer

More information

repetition, part ii Ole-Johan Skrede INF Digital Image Processing

repetition, part ii Ole-Johan Skrede INF Digital Image Processing repetition, part ii Ole-Johan Skrede 24.05.2017 INF2310 - Digital Image Processing Department of Informatics The Faculty of Mathematics and Natural Sciences University of Oslo today s lecture Coding and

More information

Perfectly-Secret Encryption

Perfectly-Secret Encryption Perfectly-Secret Encryption CSE 5351: Introduction to Cryptography Reading assignment: Read Chapter 2 You may sip proofs, but are encouraged to read some of them. 1 Outline Definition of encryption schemes

More information