Fundamentals of Information Theory, Lecture 1: Introduction. Prof. CHEN Jie, Lab. 201, School of EIE, BeiHang University


Teaching Staff
Prof. CHEN Jie, Dean of the Dept. of Information & Communication Engineering, SEIE. Office: F615, New Main Building (NMB). Email: chenjie@buaa.edu.cn
Dr. YU Ze. Office: F617, NMB. Email: yz613@buaa.edu.cn
Dr. SUN Bing. Office: F617, NMB. Email: bingsun@buaa.edu.cn
Dr. HUANG Qin. Office: F509, NMB. Email: qinhuang@buaa.edu.cn

Text Book: Elements of Information Theory, by Thomas M. Cover (1938-2012), Fellow of the IEEE and recipient of the IEEE R. W. Hamming Medal

Lecture Notes

Laboratory Manual

Course Website http://infortheory.buaa.edu.cn/

Outline
1. Introduction and Preview
2. Entropy and Mutual Information
3. Asymptotic Equipartition Property
4. Markov Chains
5. Data Compression
6. Channel Capacity
7. Differential Entropy
8. Gaussian Channel
9. Maximum Entropy and Spectral Estimation
10. Rate Distortion Theory
11. Network Information Theory

1. Introduction to Information
What is the theoretical model of a typical communication system? How do we distinguish information, signal, and message? What is information?

1. Introduction to Information
Information, in its general sense, is "knowledge communicated or received concerning a particular fact or circumstance." Information cannot be predicted and resolves uncertainty. The uncertainty of an event is measured by its probability of occurrence: the less probable an event is, the more uncertain it is, and the more information is required to resolve that uncertainty. The amount of information is measured in bits. Example: a fair one-coin flip carries log2(2/1) = 1 bit of information, whereas a fair two-coin flip carries log2(4/1) = 2 bits. http://en.wikipedia.org/
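The bit counts in the coin-flip example follow from measuring the information of an event of probability p as log2(1/p) bits; a minimal check (an illustration, not part of the slides):

```python
import math

def self_information_bits(p: float) -> float:
    """Information, in bits, gained when an event of probability p occurs."""
    return math.log2(1.0 / p)

# One fair coin flip: 2 equally likely outcomes, each with p = 1/2.
print(self_information_bits(1 / 2))  # 1.0
# Two fair coin flips: 4 equally likely outcomes, each with p = 1/4.
print(self_information_bits(1 / 4))  # 2.0
```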

http://en.wikipedia.org/ 1. Introduction to Information
Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message. Information can be recorded as signs, or transmitted as signals. Information is any kind of event that affects the state of a dynamic system. Information is the message being conveyed. Information is closely related to notions of constraint, communication, control, data, instruction, knowledge, meaning, understanding, mental stimuli, pattern, perception, representation, and entropy.

1.1 Concept of information
What is information? Examples: a fair coin flip, S = {T, F}; whether the operator will receive a call in the next hour

1.1 Concept of information
Variations in the value of a 10 Ω resistor; electromagnetic interference; transmission over a noisy channel

http://en.wikipedia.org/ 1.1 Concept of information
"To be, or not to be, that is the question." From William Shakespeare's play Hamlet

1.1 Concept of information Can you give some more examples?

1.1 Concept of information Can we measure information?

http://en.wikipedia.org/ 1.1 Concept of information
The logarithmic connection between entropy and probability was first stated by L. Boltzmann (1844-1906) in his kinetic theory of gases. The famous formula for entropy S:
S = k ln W
where k = 1.3806505(24) x 10^-23 J/K is Boltzmann's constant, and W is the Wahrscheinlichkeit, the frequency of occurrence of a macrostate, more precisely, the number of possible microstates corresponding to the macroscopic state of a system.

http://en.wikipedia.org/ 1.1 Concept of information
Nyquist's logarithm law (1924). Harry Nyquist (1889-1976)

1.1 Concept of information
Hartley's law (1928). Ralph Hartley (1888-1970) http://en.wikipedia.org/

1.1 Concept of information
The uncertainty measure: Uncertainty(x) = -log p(x)
The average uncertainty, the entropy: H(X) = -Σ_x p(x) log p(x)
Claude Shannon (1916-2001) http://en.wikipedia.org/
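The entropy formula above can be evaluated directly; a small illustrative sketch:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits (0 log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy_bits([0.9, 0.1]))  # biased coin: about 0.47 bits
print(entropy_bits([1.0]))       # certain outcome: no uncertainty, 0 bits
```

Note that entropy is maximized by the uniform distribution and vanishes when one outcome is certain, matching the idea that information resolves uncertainty.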


1.1 Concept of information
What is information? "Information causes change; if it doesn't, it isn't information." Claude Shannon (1916-2001) http://en.wikipedia.org/

1.2 Timeline of information theory

1.3 Information, Message and Signals
Information: the uncertainty of the source transmitted by a communication system; it is contained in the message and remains an abstract concept.
Message: a more concrete concept, taking forms such as language, symbols, and images that can be understood by both sides of a communication system, or acquired/processed/stored by an information system, e.g. remote sensing, GNSS.
Signal: the most physical concept, the carrier of the message: measurable, visible, and physical.

http://s4.sinaimg.cn/mw690/ 1.3 Information, Message and Signals Earth Observation System Configuration

http://www.jpl.nasa.gov/ Example 1.3.1 VHF Band: Apollo 17 / ALSE
The Apollo 17 spacecraft launched by the U.S. (Dec. 1972) carried the first SAR operated in space (VHF radar antenna). This SAR was named the Apollo Lunar Sounder Experiment (ALSE). ALSE was the first application in human history to study the Moon's surface and interior using SAR from a space probe.

Example 1.3.2 VHF Band: Mars Express http://www.jpl.nasa.gov/

Example 1.3.3 S Band: Cassini-Huygens. Radar image: lakes at Titan's north pole. Saturn. http://en.wikipedia.org/

Example 1.3.5 Shuttle Radar Topography Mission
The Shuttle Radar Topography Mission (SRTM, U.S.) used two radar antennas on board the space shuttle to implement single-pass SAR interferometry. Demonstration of SRTM http://www.jpl.nasa.gov/ Demonstration of an interferogram acquired by InSAR processing

Example 1.3.5 Shuttle Radar Topography Mission http://www.jpl.nasa.gov/

Example 1.3.6 SAR image: DEM of the Etna volcano http://www.jpl.nasa.gov/

Example 1.3.9 TerraSAR-X http://www.dlr.de/

Microwave EO Satellites
TerraSAR-X: interferometric tandem, X band, resolution 1 m to 18 m
RADARSAT-2: polarimetric radar, C band, resolution 1 m to 100 m

Example 1.3.10 IKONOS optical satellite
IKONOS is one of the most advanced commercial optical satellites. IKONOS has played an important role in modern warfare and military applications.

Example 1.3.10 IKONOS image of Beijing

Optical EO Satellites
IKONOS: resolution Pan = 1 m, MS(B,G,R,NIR) = 4 m; scale 1:5,000; mono and stereo
GeoEye-1: resolution Pan = 0.41/0.5 m, MS(B,G,R,NIR) = 1.6/2 m; scale 1:2,000; mono and stereo

Optical EO Satellites Space Imaging's IKONOS satellite captured these one-meter resolution colour images of the World Trade Center before and after the terrorist attack

Optical EO Satellites
QuickBird-2: resolution Pan = 0.65 m, MS(B,G,R,NIR) = 2.62 m; scale 1:5,000; mono only
WorldView-1: resolution Pan = 0.5 m; scale 1:2,000; mono and stereo
WorldView-2: resolution Pan = 0.5 m, MS1(B,G,R,NIR) & MS2(CB,Y,RE,NIR2) = 2 m; scale 1:2,000; mono and stereo

Optical EO Satellites
Pléiades-1/2: commercial, June 2012; resolution Pan = 0.7/0.5 m, MS(B,G,R,NIR) = 2.8/2 m; scale 1:2,000; mono and stereo
WorldView-3 (2014): resolution Pan = 0.30/0.5 m; scale 1:2,000; mono and stereo; 16 bands, including 4 additional SWIR bands

Example 1.3.11 Terahertz image

1.3 Communication system model Channel


1.3 Communication system model
Sound of a ringing bell → channel (p = 1/3) → received sound (simulated)
Satellite remote sensing image of the NMB, Beihang University (http://map.google.com) → channel (p = 1/3) → received image (simulated)

1.3 Communication system model
source → binary symmetric channel, p = 0.01
source → binary symmetric channel, p = 0.1
source → binary symmetric channel, p = 0.5
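The three binary-symmetric-channel demos above can be reproduced in a few lines. This sketch (with an assumed random binary source and a fixed seed, purely for illustration) flips each source bit independently with crossover probability p:

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(0)  # fixed seed so the demo is repeatable
source = [rng.randint(0, 1) for _ in range(10_000)]
for p in (0.01, 0.1, 0.5):
    received = bsc(source, p, rng)
    error_rate = sum(s != r for s, r in zip(source, received)) / len(source)
    print(f"p = {p}: observed bit error rate = {error_rate:.3f}")
```

At p = 0.5 the received sequence is independent of the source, which is why that case conveys no information at all.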

1.3 Communication system model Entropy Shannon argued that random processes such as music and speech have an irreducible complexity below which the signal cannot be compressed. This he named the entropy. Claude Shannon (1916-2001)

1.3 Communication system model
Channel capacity: In the early 1940s, it was thought that increasing the transmission rate of information over a communication channel increased the probability of error. Shannon surprised the communication theory community by proving that this was not true as long as the communication rate was below the channel capacity.

http://en.wikipedia.org/ 1.3 Communication system model
Information Theory answers:
What is the bound on data compression? The entropy rate H.
What is the limit on the transmission rate? The channel capacity C.
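For the binary symmetric channel used in the demos, the capacity has the closed form C = 1 - H2(p), where H2 is the binary entropy function; a quick sketch evaluating it at the demo's crossover probabilities:

```python
import math

def binary_entropy(p):
    """H2(p) = -p log2 p - (1 - p) log2 (1 - p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

for p in (0.01, 0.1, 0.5):
    print(f"p = {p}: C = {bsc_capacity(p):.4f} bits per channel use")
# At p = 0.5 the channel output is pure noise, so the capacity is zero.
```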

Comparison of Transmission Performance Before and After Huffman Coding
Huffman coder → BSC (p = 0.01) → decoder
Huffman coder → BSC (p = 0.1) → decoder
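The slide does not show the coder itself; a minimal heap-based Huffman construction (an illustrative sketch, not the demo's actual implementation) can be written as:

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a prefix-free Huffman code for a {symbol: frequency} map."""
    # Heap entries: (total frequency, unique tiebreaker, {symbol: codeword}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # merge the two least frequent
        f2, _, right = heapq.heappop(heap)  # subtrees, as Huffman prescribes
        merged = {s: "0" + w for s, w in left.items()}
        merged.update({s: "1" + w for s, w in right.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

text = "abracadabra"
code = huffman_code(Counter(text))
encoded = "".join(code[ch] for ch in text)
print(code)
print(f"{8 * len(text)} bits as ASCII -> {len(encoded)} bits after Huffman coding")
```

Frequent symbols get short codewords, so the average codeword length approaches the source entropy, which is the compression bound H mentioned above.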

Demonstration of (M, n) Channel Coding
(2,3) channel coder → BSC (p = 0.01) → decoder
(2,3) channel coder → BSC (p = 0.1) → decoder
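Assuming the (2,3) coder denotes the binary repetition code (M = 2 messages mapped to blocks of length n = 3; the slide gives only the parameters, so this is an assumption), encoding and majority-vote decoding can be sketched as:

```python
import random

def encode(bit):
    """(2,3) repetition code: send each message bit three times."""
    return [bit] * 3

def decode(block):
    """Majority vote over a received 3-bit block."""
    return 1 if sum(block) >= 2 else 0

def bsc(bits, p, rng):
    """Flip each bit independently with crossover probability p."""
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(1)
message = [rng.randint(0, 1) for _ in range(10_000)]
for p in (0.01, 0.1):
    decoded = [decode(bsc(encode(b), p, rng)) for b in message]
    error_rate = sum(m != d for m, d in zip(message, decoded)) / len(message)
    print(f"p = {p}: decoded error rate = {error_rate:.4f} (uncoded: about {p})")
```

The decoded error rate, 3p^2(1 - p) + p^3, is far below the raw crossover probability p, at the cost of tripling the number of channel uses.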

1.4 Information theory applications
Information theory intersects:
Physics (statistical mechanics)
Mathematics (probability theory)
Electrical engineering (communication theory)
Computer science (algorithmic complexity)
Neurobiology
Applications: understanding of black holes, invention of the compact disc, Voyager missions to deep space, development of the Internet

Figure 1.1 The relationship of information theory with other fields: physics (AEP, thermodynamics), mathematics (inequalities), statistics (Fisher information, hypothesis testing), and their applications

Thanks