Formal Modeling in Cognitive Science Lecture 29: Noisy Channel Model and Applications


Formal Modeling in Cognitive Science
Lecture 29: Noisy Channel Model and Applications

Frank Keller
School of Informatics, University of Edinburgh
keller@inf.ed.ac.uk
March 2006

The Noisy Channel Model

So far, we have looked at encoding a message efficiently, but what about transmitting the message? The transmission of a message can be modeled using a noisy channel:

- a message W is encoded, resulting in a string X;
- X is transmitted through a channel with the probability distribution f(y|x);
- the resulting string Y is decoded, yielding an estimate of the message Ŵ.

W (message) -> Encoder -> X -> Channel f(y|x) -> Y -> Decoder -> Ŵ (estimate of message)

We are interested in the mathematical properties of the channel used to transmit the message, and in particular in its capacity.

Definition: Discrete Channel
A discrete channel consists of an input alphabet X, an output alphabet Y, and a probability distribution f(y|x) that expresses the probability of observing symbol y given that symbol x is sent.

Definition: Channel Capacity
The channel capacity of a discrete channel is:

    C = max_{f(x)} I(X; Y)

The capacity of a channel is the maximum of the mutual information of X and Y over all input distributions f(x).
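As an illustration of this definition (an editorial addition, not part of the original slides; NumPy is assumed), the following minimal Python sketch approximates the capacity of a binary symmetric channel by maximizing I(X; Y) over a grid of input distributions:

```python
import numpy as np

def mutual_information(px, channel):
    """I(X;Y) in bits, for input distribution px and channel matrix
    channel[x, y] = f(y|x)."""
    joint = px[:, None] * channel            # f(x, y)
    py = joint.sum(axis=0)                   # f(y)
    indep = px[:, None] * py[None, :]        # f(x) f(y)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / indep[mask])).sum())

# Binary symmetric channel with crossover probability p = 0.1
p = 0.1
bsc = np.array([[1 - p, p],
                [p, 1 - p]])

# Maximize I(X;Y) over input distributions f(0) = q, f(1) = 1 - q
qs = np.linspace(0.001, 0.999, 999)
infos = [mutual_information(np.array([q, 1 - q]), bsc) for q in qs]
print(max(infos))                 # ~0.531 bits
print(qs[int(np.argmax(infos))])  # maximum attained at q = 0.5
```

The numerical maximum agrees with the closed-form capacity 1 - H(0.1) ≈ 0.531 bits derived on the next slides.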

Example: Noiseless Binary Channel
Assume a binary channel whose input is reproduced exactly at the output. Each transmitted bit is received without error. The channel capacity of this channel is:

    C = max_{f(x)} I(X; Y) = 1 bit

This maximum is achieved with f(0) = 1/2 and f(1) = 1/2.

Example: Binary Symmetric Channel
Assume a binary channel whose input is flipped (0 transmitted as 1, or 1 transmitted as 0) with probability p. The mutual information of this channel is bounded by:

    I(X; Y) = H(Y) - H(Y|X)
            = H(Y) - Σ_x f(x) H(Y|X = x)
            = H(Y) - Σ_x f(x) H(p)
            = H(Y) - H(p) ≤ 1 - H(p)

The channel capacity is therefore:

    C = max_{f(x)} I(X; Y) = 1 - H(p) bits

[Figure: a binary data sequence transmitted over a binary symmetric channel with crossover probability p = 0.1, shown before and after transmission.]

Theorem: Properties of Channel Capacity
1. C ≥ 0, since I(X; Y) ≥ 0;
2. C ≤ log |X|, since C = max_{f(x)} I(X; Y) ≤ max H(X) = log |X|;
3. C ≤ log |Y|, for the same reason.
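The binary symmetric channel is straightforward to simulate. Here is a small sketch (again an editorial addition, assuming NumPy) that transmits a random bit sequence with p = 0.1 and reports the empirical flip rate alongside the capacity 1 - H(p):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.1                                    # crossover probability

bits = rng.integers(0, 2, size=10_000)     # input bit sequence
flips = rng.random(bits.size) < p          # flip each bit with prob. p
received = bits ^ flips                    # output of the BSC

print((bits != received).mean())           # empirical flip rate, ~0.1

def binary_entropy(p):
    """H(p) in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

print(1 - binary_entropy(p))               # capacity, ~0.531 bits per use
```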

Applications of the Noisy Channel Model

The noisy channel model can be applied to decoding processes involving linguistic information. A typical formulation of such a problem is:

- we start with a linguistic input I;
- I is transmitted through a noisy channel with the probability distribution f(o|i);
- the resulting output O is decoded, yielding an estimate of the input Î.

I (input) -> Noisy Channel f(o|i) -> O (output) -> Decoder -> Î

Application                    Input                 Output                f(i)                          f(o|i)
Machine translation            target language text  source language text  probability of target text    translation model
Optical character recognition  actual text           text with mistakes    probability of text           model of OCR errors
Part of speech tagging         POS tag sequence      word sequence         probability of POS sequence   f(w|t)
Speech recognition             word sequence         speech signal         probability of word sequence  acoustic model

Let's look at machine translation in more detail. Assume that the French text (F) passed through a noisy channel and came out as English (E). We decode it to estimate the original French (F̂):

F -> Noisy Channel f(e|f) -> E -> Decoder -> F̂

We compute F̂ using Bayes' theorem:

    F̂ = argmax_f f(f|e) = argmax_f [f(f) f(e|f) / f(e)] = argmax_f f(f) f(e|f)

Here f(e|f) is the translation model, f(f) is the French language model, and f(e) is the English language model (constant, so it can be dropped).

Example Output: Spanish-English

we all know very well that the current treaties are insufficient and that, in the future, it will be necessary to develop a better structure and different for the european union, a structure more constitutional also make it clear what the competences of the member states and which belong to the union. messages of concern in the first place just before the economic and social problems for the present situation, and in spite of sustained growth, as a result of years of effort on the part of our citizens. the current situation, unsustainable above all for many self-employed drivers and in the area of agriculture, we must improve without doubt. in itself, it is good to reach an agreement on procedures, but we have to ensure that this system is not likely to be used as a weapon policy. now they are also clear rights to be respected. i agree with the signal warning against the return, which some are tempted to the intergovernmental methods. there are many of us that we want a federation of nation states.
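A toy sketch of this decoding rule (an editorial addition; the candidate sentences and probabilities are invented purely for illustration) picks the source sentence that maximizes f(f) f(e|f), computed in log space:

```python
import math

# Toy noisy-channel decoder: choose the source sentence f that
# maximizes f(f) * f(e|f), i.e. language model * channel model.
# All probabilities below are made up for illustration.

language_model = {            # f(f): prior over source sentences
    "le chat dort": 0.6,
    "le chat fort": 0.4,
}
channel_model = {             # f(e|f): translation model
    ("the cat sleeps", "le chat dort"): 0.7,
    ("the cat sleeps", "le chat fort"): 0.2,
}

def decode(e, candidates):
    """Return argmax_f f(f) * f(e|f), summing log probabilities."""
    def score(f):
        return math.log(language_model[f]) + \
               math.log(channel_model.get((e, f), 1e-12))
    return max(candidates, key=score)

print(decode("the cat sleeps", language_model.keys()))  # le chat dort
```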

Example Output: Finnish-English

the rapporteurs have drawn attention to the quality of the debate and also the need to go further: of course, i can only agree with them. we know very well that the current treaties are not enough and that in future, it is necessary to develop a better structure for the union and, therefore perustuslaillisempi structure, which also expressed more clearly what the member states and the union is concerned. first of all, kohtaamiemme economic and social difficulties, there is concern, even if growth is sustainable and the result of the efforts of all, on the part of our citizens. the current situation, which is unacceptable, in particular, for many carriers and responsible for agriculture, is in any case, to be improved. agreement on procedures in itself is a good thing, but there is a need to ensure that the system cannot be used as a political lyomaaseena. they also have a clear picture of the rights of now, in which they have to work. i agree with him when he warned of the consenting to return to intergovernmental methods. many of us want of a federal state of the national member states.

Definition: Kullback-Leibler Divergence
For two probability distributions f(x) and g(x) for a random variable X, the Kullback-Leibler divergence or relative entropy is given as:

    D(f || g) = Σ_{x ∈ X} f(x) log [f(x) / g(x)]

The KL divergence compares the entropy of two distributions over the same random variable. Intuitively, the KL divergence is the number of additional bits required when encoding a random variable with a distribution f(x) using the alternative distribution g(x).

Theorem: Properties of the KL Divergence
1. D(f || g) ≥ 0;
2. D(f || g) = 0 iff f(x) = g(x) for all x ∈ X;
3. D(f || g) ≠ D(g || f) in general;
4. I(X; Y) = D(f(x, y) || f(x)f(y)).

So the mutual information is the KL divergence between f(x, y) and f(x)f(y). It measures how far a distribution is from independence.

Example
For a random variable X = {0, 1} assume two distributions f(x) and g(x) with f(0) = 1 - r, f(1) = r and g(0) = 1 - s, g(1) = s:

    D(f || g) = (1 - r) log [(1 - r)/(1 - s)] + r log (r/s)
    D(g || f) = (1 - s) log [(1 - s)/(1 - r)] + s log (s/r)

If r = s then D(f || g) = D(g || f) = 0. If r = 1/2 and s = 1/4:

    D(f || g) = (1/2) log [(1/2)/(3/4)] + (1/2) log [(1/2)/(1/4)] = 1 - (1/2) log 3 = 0.2075
    D(g || f) = (3/4) log [(3/4)/(1/2)] + (1/4) log [(1/4)/(1/2)] = (3/4) log 3 - 1 = 0.1887
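The asymmetry is easy to verify numerically; here is a minimal sketch (an editorial addition) that computes both divergences for r = 1/2, s = 1/4:

```python
import math

def kl(f, g):
    """D(f || g) in bits, for two distributions given as lists."""
    return sum(p * math.log2(p / q) for p, q in zip(f, g) if p > 0)

r, s = 0.5, 0.25
f = [1 - r, r]
g = [1 - s, s]

print(kl(f, g))   # 0.2075...
print(kl(g, f))   # 0.1887...  (the KL divergence is not symmetric)
```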

Definition: Cross-Entropy
For a random variable X with the probability distribution f(x), the cross-entropy for the probability distribution g(x) is given as:

    H(X, g) = - Σ_{x ∈ X} f(x) log g(x)

The cross-entropy can also be expressed in terms of entropy and KL divergence:

    H(X, g) = H(X) + D(f || g)

Intuitively, the cross-entropy is the total number of bits required when encoding a random variable with a distribution f(x) using the alternative distribution g(x).

Example
In the last lecture, we constructed a code for the following distribution using Huffman coding:

    x           a      e      i      o      u
    f(x)        0.12   0.42   0.09   0.30   0.07
    -log f(x)   3.06   1.25   3.47   1.74   3.84

The entropy of this distribution is H(X) = 1.995. Now compute the distribution g(x) = 2^(-l(x)) associated with the Huffman code (C(x) shows one valid assignment of the Huffman codewords; only the lengths l(x) matter here):

    x      a     e    i     o    u
    C(x)   110   0    1110  10   1111
    l(x)   3     1    4     2    4
    g(x)   1/8   1/2  1/16  1/4  1/16

Then the cross-entropy for g(x) is:

    H(X, g) = - Σ_{x ∈ X} f(x) log g(x)
            = -(0.12 log 1/8 + 0.42 log 1/2 + 0.09 log 1/16 + 0.30 log 1/4 + 0.07 log 1/16)
            = 2.02

The KL divergence is:

    D(f || g) = H(X, g) - H(X) = 2.02 - 1.995 = 0.025

This means we are losing on average 0.025 bits by using the Huffman code rather than the theoretically optimal code given by the Shannon information.

Summary

- The noisy channel model can model the errors and loss when transmitting a message with input X and output Y;
- the capacity of the channel is given by the maximum of the mutual information of X and Y;
- a binary symmetric channel is one where each bit is flipped with probability p;
- the noisy channel model can be applied to linguistic problems, e.g., machine translation;
- the Kullback-Leibler divergence is the distance between two distributions (the cost of encoding f(x) through g(x)).
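To close the loop on the Huffman example above, here is a short sketch (an editorial addition) that rebuilds the Huffman code lengths for f(x) and recomputes H(X), H(X, g), and their gap:

```python
import heapq
import math
from itertools import count

f = {"a": 0.12, "e": 0.42, "i": 0.09, "o": 0.30, "u": 0.07}

# Compute Huffman code lengths: repeatedly merge the two least
# probable groups; every merge adds one bit to each member's code.
tie = count()                              # tie-breaker for equal probs
heap = [(p, next(tie), [x]) for x, p in f.items()]
heapq.heapify(heap)
lengths = dict.fromkeys(f, 0)
while len(heap) > 1:
    p1, _, xs1 = heapq.heappop(heap)
    p2, _, xs2 = heapq.heappop(heap)
    for x in xs1 + xs2:
        lengths[x] += 1
    heapq.heappush(heap, (p1 + p2, next(tie), xs1 + xs2))

g = {x: 2.0 ** -l for x, l in lengths.items()}      # g(x) = 2^(-l(x))
H = -sum(p * math.log2(p) for p in f.values())      # entropy H(X)
H_cross = -sum(f[x] * math.log2(g[x]) for x in f)   # cross-entropy H(X, g)

print(lengths)                  # {'a': 3, 'e': 1, 'i': 4, 'o': 2, 'u': 4}
print(H, H_cross, H_cross - H)  # ~1.995, 2.02, ~0.025
```

Note that H(X, g) here equals the expected code length of the Huffman code, since -log g(x) = l(x).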
