Topics. Probability Theory. Perfect Secrecy. Information Theory



Some Terms
A cryptosystem is a tuple (P, C, K, E, D): the plaintext space, ciphertext space, key space, encryption rules, and decryption rules.
Computational security: measured by the computational effort required to break the cryptosystem.
Provable security: security is proven relative to another, difficult problem.
Unconditional security: Oscar (the adversary) can do whatever he wants, with as much computing power and time as he wants.

Probability Review, pg. 1
A random variable (event) is an experiment whose outcomes are mapped to real numbers.
Probability: We denote p_X(x) = Pr(X = x). For a subset A, p(A) = \sum_{x \in A} p_X(x).
Joint Probability: Sometimes we want to consider more than one event at the same time, in which case we lump them together into a joint random variable, e.g. Z = (X, Y). The joint probability is p_{X,Y}(x, y) = Pr(X = x, Y = y).
Independence: We say that two events are independent if p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x, y.

Probability Review, pg. 2
Conditional Probability: We will often ask questions about the probability of the event Y = y given that we have observed X = x. In particular, we define the conditional probability of Y = y given X = x by p_{Y|X}(y|x) = p_{X,Y}(x, y) / p_X(x).
Independence: For independent X and Y we immediately get p_{Y|X}(y|x) = p_Y(y).
Bayes's Theorem: If p_X(x) > 0 and p_Y(y) > 0, then p_{X|Y}(x|y) = p_X(x) p_{Y|X}(y|x) / p_Y(y).

Example
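A small numeric illustration of these definitions, using a made-up joint distribution (the values below are chosen for illustration only):

```python
from fractions import Fraction as F

# Hypothetical joint distribution p_{X,Y}(x, y) over X in {0, 1}, Y in {0, 1}.
joint = {(0, 0): F(1, 8), (0, 1): F(3, 8),
         (1, 0): F(1, 4), (1, 1): F(1, 4)}

# Marginals: p_X(x) sums over y; p_Y(y) sums over x.
p_X = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
p_Y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

# Conditional probabilities, straight from the definitions above.
def p_Y_given_X(y, x):
    return joint[(x, y)] / p_X[x]

def p_X_given_Y(x, y):
    return joint[(x, y)] / p_Y[y]

# Bayes's theorem: p_{X|Y}(x|y) = p_X(x) * p_{Y|X}(y|x) / p_Y(y).
for x in (0, 1):
    for y in (0, 1):
        assert p_X_given_Y(x, y) == p_X[x] * p_Y_given_X(y, x) / p_Y[y]
```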

Perfect Secrecy Defined
A cryptosystem (P, C, K, E, D) has perfect secrecy if the ciphertext yields no information about the plaintext.

Perfect Secrecy
p(M): the (a priori) probability that plaintext M is sent.
p(C): the probability that ciphertext C is received.
p(M|C): the probability that plaintext M was sent, given that ciphertext C was received.
p(C|M): the probability that ciphertext C was received, given that plaintext M was sent.
p(K): the probability that key K was chosen.
A cryptosystem provides perfect secrecy if p(M|C) = p(M) for all M and C with p(C) > 0.

Implications
Perfect secrecy means exactly that the random variables M and C are independent: the ciphertext C gives us no information about M. The proof uses Bayes's theorem.
A cryptosystem provides perfect secrecy if and only if p(C|M) = p(C) for all M, C with p(M) > 0 and p(C) > 0.
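The Bayes step, written out (a one-line sketch of the equivalence):

```latex
% For p(M) > 0 and p(C) > 0, Bayes's theorem gives
\[
  p(M \mid C) \;=\; \frac{p(M)\, p(C \mid M)}{p(C)},
\]
% so p(M | C) = p(M) holds if and only if p(C | M) = p(C).
```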

Theorem
If a cryptosystem has perfect secrecy, then |K| >= |M|.
Proof idea: if |K| < |M|, then for a given ciphertext C with p(C) > 0 there is some message M that no key K encrypts to C. Then p(C|M) = 0 but p(C) > 0, contradicting perfect secrecy.
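The counting behind the proof idea, as a sketch (it assumes every plaintext occurs with positive probability):

```latex
% For a ciphertext C with p(C) > 0, the set of plaintexts C can decrypt to is
\[
  S_C \;=\; \{\, D_K(C) : K \in \mathcal{K} \,\}, \qquad |S_C| \le |\mathcal{K}|.
\]
% If |K| < |M|, some plaintext M lies outside S_C, so no key encrypts M to C,
% hence p(C | M) = 0 while p(C) > 0.
```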

One-Time Pad
The one-time pad, which is a provably secure cryptosystem, was developed by Gilbert Vernam in 1918.
The message is represented as a binary string (a sequence of 0s and 1s, using a coding mechanism such as ASCII).
The key is a truly random sequence of 0s and 1s of the same length as the message.
The encryption is done by adding the key to the message modulo 2, bit by bit. This operation is often called exclusive or, and is denoted by XOR; the symbol ⊕ is used.

The exclusive or Operator

a  b  c = a ⊕ b
0  0      0
0  1      1
1  0      1
1  1      0

Example
message = IF, so its ASCII code = (1001001 1000110)
key = (1010110 0110001)
Encryption:
  1001001 1000110   plaintext
⊕ 1010110 0110001   key
  0011111 1110111   ciphertext
Decryption:
  0011111 1110111   ciphertext
⊕ 1010110 0110001   key
  1001001 1000110   plaintext
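The same computation as a minimal Python sketch (the bit-string helper below is illustrative, not part of the slides):

```python
def xor_bits(a: str, b: str) -> str:
    """Bitwise XOR of two equal-length bit strings."""
    assert len(a) == len(b)
    return "".join(str(int(x) ^ int(y)) for x, y in zip(a, b))

plaintext = "10010011000110"   # "IF" in 7-bit ASCII
key       = "10101100110001"   # truly random, same length as the message

ciphertext = xor_bits(plaintext, key)    # encryption
recovered  = xor_bits(ciphertext, key)   # decryption: XOR with the key again
assert recovered == plaintext
```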

OTP Security
The security depends on the randomness of the key. It is hard to define randomness. In a cryptographic context, we seek two fundamental properties in a binary random key sequence:
Unpredictability
Balance (Equal Distribution)

OTP Security
Unpredictability: Independently of the number of bits of the sequence already observed, the probability of guessing the next bit is no better than ½. Therefore, the probability of a given bit being 1 or 0 is exactly ½.
Balance (Equal Distribution): The numbers of 1s and 0s in the sequence should be equal.
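In practice, a key with these properties must come from a cryptographically secure random source; a minimal sketch using Python's standard secrets module:

```python
import secrets

def otp_key(nbits: int) -> str:
    """Return an unpredictable key as a bit string of length nbits.

    secrets draws from the OS CSPRNG; the ordinary random module is
    predictable and unsuitable for one-time pad keys. Note that a truly
    random key is balanced only on average, not exactly."""
    return "".join(str(secrets.randbits(1)) for _ in range(nbits))

key = otp_key(14)  # e.g. a fresh key for the 14-bit "IF" example above
```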

Entropy
We want to be able to measure the uncertainty or information content of a random variable X. Information theory captures the amount of information in a piece of text. How much information or uncertainty is in a cryptosystem?

Entropy and Source Coding Theory
There is a close relationship between entropy and representing information. Entropy captures the notion of how many yes-no questions are needed to accurately identify a piece of information, that is, how many bits are needed!
One of the main focus areas in the field of information theory is source coding: how to efficiently compress information into as few bits as possible. One such technique is Huffman coding.
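A compact sketch of Huffman coding in Python (the heap-based construction and helper name are illustrative choices, not from the slides):

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    """Build a Huffman code (symbol -> bit string) from symbol frequencies."""
    freq = Counter(text)
    # Heap entries: (frequency, tie-breaker, {symbol: code so far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}        # left branch
        merged.update({s: "1" + c for s, c in c2.items()})  # right branch
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

text = "seashells she sells"
codes = huffman_codes(text)
encoded = "".join(codes[ch] for ch in text)
# Frequent symbols get short codes, so len(encoded) < 8 * len(text).
```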

Entropy and Uncertainty
We are concerned with how much uncertainty a random event has, but how do we define or measure uncertainty? We want our measure to have the following properties:
1. To each set of nonnegative numbers p = (p_1, p_2, ..., p_n) with p_1 + p_2 + ... + p_n = 1, we assign an uncertainty H(p).
2. H(p) should be a continuous function: a slight change in p should not drastically change H(p).
3. H(1/n, ..., 1/n) <= H(1/(n+1), ..., 1/(n+1)) for all n > 0: uncertainty increases when there are more equally likely outcomes.

Entropy, pg. 2
We define the entropy of a random variable X by H(X) = -\sum_x p(x) \log_2 p(x).
Example: Consider a fair coin toss. There are two outcomes, with probability ½ each. The entropy is H(X) = -(½) \log_2(½) - (½) \log_2(½) = 1 bit.
Example: Consider a non-fair coin toss X with probability p of getting heads and 1 - p of getting tails. The entropy is H(X) = -p \log_2 p - (1 - p) \log_2(1 - p).
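These formulas are easy to check numerically; a minimal sketch:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p), with 0 * log(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

entropy([0.5, 0.5])          # fair coin: 1.0 bit
entropy([0.9, 0.1])          # biased coin: ~0.47 bits
entropy([0.25, 0.5, 0.25])   # two-coin example on the next slide: 1.5 bits
```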

Entropy, pg. 3
Entropy may be thought of as the number of yes-no questions needed to accurately determine the outcome of a random event.
Example: Flip two coins, and let X be the number of heads. The possibilities are {0, 1, 2} and the probabilities are {1/4, 1/2, 1/4}. The entropy is
(1/4) \log_2 4 + (1/2) \log_2 2 + (1/4) \log_2 4 = 3/2 bits.
So how can we relate this to questions? Half the time you need one question (e.g. "was it exactly one head?"), and half the time you need two, for an average of 3/2 questions.