Quantum Information Theory and Measure Concentration
Patrick Hayden (McGill)
AMS Short Course

Overview
- What is information theory? Entropy, compression, noisy coding and beyond.
- What does it have to do with quantum mechanics? Noise in the quantum mechanical formalism: density operators, the partial trace, quantum operations.
- Classical information through quantum channels.
- Entanglement in random subspaces.
- Quantum information through quantum channels.

Information (Shannon) theory
- A practical question: how best to make use of a given communications resource?
- A mathematico-epistemological question: how to quantify uncertainty and information?
- Shannon solved the first by considering the second: "A Mathematical Theory of Communication" [1948].

Quantifying uncertainty
- Entropy: $H(X) = -\sum_x p(x) \log_2 p(x)$.
- Proportional to the entropy of statistical physics.
- Term suggested by von Neumann (more on him later).
- Can arrive at the definition axiomatically: $H(X,Y) = H(X) + H(Y)$ for independent $X, Y$, etc.
- Operational point of view: compression.

Compression
- Source of independent copies of $X$: of the $2^n$ possible strings in $\{0,1\}^n$, only about $2^{nH(X)}$ are typical.
- Can compress $n$ copies of $X$ to a binary string of length $\approx nH(X)$.
- If $X$ is binary: about $np(X{=}0)$ 0's and $np(X{=}1)$ 1's.

Typicality in more detail
- Let $x^n = x_1, x_2, \dots, x_n$ with $x_j \in \mathcal{X}$. We say that $x^n$ is $\varepsilon$-typical with respect to $p(x)$ if
  - for all $a \in \mathcal{X}$ with $p(a) > 0$, $\left|\tfrac{1}{n} N(a|x^n) - p(a)\right| < \varepsilon / |\mathcal{X}|$, and
  - for all $a \in \mathcal{X}$ with $p(a) = 0$, $N(a|x^n) = 0$.
- For any $\varepsilon > 0$, the probability that a random string $X^n$ is $\varepsilon$-typical goes to 1 as $n \to \infty$.
- If $x^n$ is $\varepsilon$-typical, then $2^{-n[H(X)+\varepsilon]} \le p(x^n) \le 2^{-n[H(X)-\varepsilon]}$.
- The number of $\varepsilon$-typical strings is bounded above by $2^{n[H(X)+\varepsilon]}$.
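To make the typicality test concrete, here is a minimal Python sketch (mine, not from the talk; the helper name `is_typical` and the example distribution are illustrative) that applies the frequency condition above and estimates how often a random string is $\varepsilon$-typical:

```python
import numpy as np
from collections import Counter

def is_typical(xs, dist, eps):
    """Frequency test from the slide: |N(a|x^n)/n - p(a)| < eps/|X| when p(a) > 0,
    and N(a|x^n) = 0 whenever p(a) = 0."""
    n = len(xs)
    counts = Counter(xs)
    for a, pa in dist.items():
        if pa > 0 and abs(counts[a] / n - pa) >= eps / len(dist):
            return False
        if pa == 0 and counts[a] > 0:
            return False
    return True

rng = np.random.default_rng(0)
dist = {0: 0.9, 1: 0.1}
for n in [100, 1000, 10000]:
    trials = [rng.choice(list(dist), size=n, p=list(dist.values()))
              for _ in range(200)]
    frac = sum(is_typical(t, dist, eps=0.05) for t in trials) / len(trials)
    print(n, frac)   # fraction of eps-typical samples climbs toward 1 as n grows
```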

Quantifying information
- Entropy diagram: $H(X)$, $H(Y)$ and $H(X,Y)$ decompose into $H(X|Y)$, $I(X;Y)$ and $H(Y|X)$.
- Uncertainty in $X$ when the value of $Y$ is known: $H(X|Y) = H(X,Y) - H(Y) = E_Y\, H(X|Y{=}y)$.
- Information is that which reduces uncertainty: $I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y)$.

Sending information
- Statistical model of a noisy channel $\mathcal{N}$: Alice sends $X^n$, Bob receives $Y^n$.
- Shannon's noisy coding theorem: in the limit of many uses, the optimal rate (in bits per channel use) at which Alice can send messages reliably to Bob through $\mathcal{N}$ is given by the formula $\max_{p(x)} I(X;Y)$ (a worked example follows at the end of this page).

Data processing inequality
- If $Z$ is produced from $X$ by post-processing $p(z|x)$, then $I(Z;Y) \le I(X;Y)$.

Optimality in Shannon's theorem
- Assume there exists a code with rate $R$ and perfect decoding. Let $M$ be the random variable corresponding to the uniform distribution over messages.
- $nR = H(M)$  ($M$ has $nR$ bits of entropy)
  $= I(M;M')$  (perfect decoding: $M = M'$)
  $\le I(M;Y^n)$  (data processing)
  $\le I(X^n;Y^n)$  (data processing)
  $\le \sum_{j=1}^n I(X_j;Y_j)$  (term by term; some fiddling)
  $\le n \max_{p(x)} I(X;Y)$.

Shannon theory provides
- Practically speaking: a holy grail for error-correcting codes.
- Conceptually speaking: an operationally-motivated way of thinking about correlations.
- What's missing (for a quantum mechanic)? Features arising from linear structure: entanglement and non-orthogonality.

Quantum Shannon theory provides
- A quantitative and operational theory of quantum correlation: qubits, cbits, ebits, cobits, sbits.
- It relies on a major simplifying assumption: computation is free.
- And a minor simplifying assumption: noise and data have regular structure.
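As a worked instance of the capacity formula $\max_{p(x)} I(X;Y)$ (an illustration under my own assumptions, using the standard binary symmetric channel rather than anything specific from the slides):

```python
import numpy as np

def h2(p):
    """Binary entropy H(p) in bits."""
    return 0.0 if p <= 0 or p >= 1 else float(-p*np.log2(p) - (1-p)*np.log2(1-p))

def mutual_information(px1, f):
    """I(X;Y) = H(Y) - H(Y|X) for Pr[X=1] = px1 over a BSC with flip probability f."""
    py1 = px1 * (1 - f) + (1 - px1) * f
    return h2(py1) - h2(f)

f = 0.11
best = max(mutual_information(px1, f) for px1 in np.linspace(0, 1, 1001))
print(best, 1 - h2(f))   # the maximum, at px1 = 1/2, equals the capacity 1 - H(f)
```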

Superdense coding
- To send $i \in \{0,1,2,3\}$: applying $\sigma_i$ to one half of the entangled state $|\Phi_0\rangle$ produces $|\Phi_i\rangle$, so transmitting that single qubit lets Bob recover $i$.
- 1 ebit + 1 qubit $\ge$ 2 cbits.

Before we get going: some unavoidable formalism
- We need quantum generalizations of:
  - probability distributions (density operators),
  - marginal distributions (partial trace),
  - noisy channels (quantum operations).

Mixing quantum states: the density operator
- Draw $|\psi_x\rangle$ with probability $p(x)$, then perform a measurement $\{|0\rangle, |1\rangle\}$.
- Probability of outcome $j$: $q_j = \sum_x p(x)\,|\langle j|\psi_x\rangle|^2 = \sum_x p(x)\,\mathrm{tr}[\,|j\rangle\langle j|\,|\psi_x\rangle\langle\psi_x|\,] = \mathrm{tr}[\,|j\rangle\langle j|\,\rho\,]$, where $\rho = \sum_x p(x)\,|\psi_x\rangle\langle\psi_x|$.
- The outcome probability is linear in $\rho$.

The partial trace
- Suppose that $\rho_{AB}$ is a density operator on $A \otimes B$ and Alice measures $\{M_k\}$ on $A$. The outcome probability is $q_k = \mathrm{tr}[(M_k \otimes I_B)\,\rho_{AB}]$.
- Define $\rho_A = \mathrm{tr}_B[\rho_{AB}] = \sum_j {}_B\langle j|\,\rho_{AB}\,|j\rangle_B$. Then $q_k = \mathrm{tr}[M_k \rho_A]$.
- $\rho_A$ describes the outcome statistics for all possible experiments by Alice alone.

Purification
- Suppose that $\rho_A$ is a density operator on $A$. Diagonalize $\rho_A = \sum_i \lambda_i |\varphi_i\rangle\langle\varphi_i|$ and let $|\psi\rangle = \sum_i \lambda_i^{1/2} |\varphi_i\rangle_A |i\rangle_B$.
- Note that $\rho_A = \mathrm{tr}_B[\,|\psi\rangle\langle\psi|\,]$: $|\psi\rangle$ is a purification of $\rho$.
- Symmetry: $\psi_A$ and $\psi_B$ have the same non-zero eigenvalues.

Quantum (noisy) channels: analogs of p(y|x)
- What reasonable constraints might such a channel $\mathcal{N}: A \to B$ satisfy?
  1) Take density operators to density operators.
  2) Convex linearity: a mixture of input states should be mapped to the corresponding mixture of output states.
- Constraint 1) must be interpreted very strictly: require that $(\mathcal{N} \otimes I_C)(\rho_{AC})$ always be a density operator too. This doesn't come for free! Let $T$ be the transpose map on $A$. If $|\psi\rangle = |00\rangle_{AC} + |11\rangle_{AC}$, then $(T \otimes I_C)(|\psi\rangle\langle\psi|)$ has negative eigenvalues.
- The resulting set of transformations on density operators is known as the trace-preserving, completely positive maps.
- Surprising fact: all such maps can, in principle, be realized physically.
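A minimal numerical sketch of the two constructions just introduced (code and helper names are mine; the input is the normalized two-qubit Bell state): it computes a partial trace and exhibits the negative eigenvalue produced by the transpose map:

```python
import numpy as np

def partial_trace_B(rho_AB, dA, dB):
    """rho_A = tr_B[rho_AB] = sum_j <j|_B rho_AB |j>_B."""
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
rho = np.outer(psi, psi.conj())

print(partial_trace_B(rho, 2, 2))   # I/2: Alice's marginal fixes her statistics

# The transpose map T on A is positive but not *completely* positive:
# (T ⊗ I_C)(rho) is the partial transpose, which is not a density operator here.
rho_TA = rho.reshape(2, 2, 2, 2).transpose(2, 1, 0, 3).reshape(4, 4)
print(np.linalg.eigvalsh(rho_TA))   # eigenvalues include -0.5
```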

Quantum channels: examples
- Adjoining an ancilla: $\rho \mapsto \rho \otimes |0\rangle\langle 0|$.
- Unitary transformations: $\rho \mapsto U\rho U^\dagger$.
- Partial trace: $\rho_{AB} \mapsto \mathrm{tr}_B[\rho_{AB}]$.
- That's it! All channels can be built out of these operations (the Stinespring dilation): adjoin $|0\rangle$, apply $U$, trace out the environment.
- Operator-sum representation: $\mathcal{N}(\rho) = \sum_k A_k \rho A_k^\dagger$ with $\sum_k A_k^\dagger A_k = I$.

Further examples
- The depolarizing channel: $\rho \mapsto (1-p)\rho + p\,I/2$.
- The dephasing channel: $\rho \mapsto \sum_j \langle j|\rho|j\rangle\;|j\rangle\langle j|$, equivalent to measuring $\{|j\rangle\}$ and then forgetting the outcome.

Notions of distinguishability
- Basic requirement: quantum channels do not increase distinguishability.
- Fidelity: $F(\rho,\sigma) = \left[\mathrm{tr}\sqrt{\rho^{1/2}\sigma\rho^{1/2}}\right]^2$. $F = 0$ for perfectly distinguishable states, $F = 1$ for identical states. Equivalently, $F(\rho,\sigma) = \max |\langle\varphi_\rho|\varphi_\sigma\rangle|^2$ over purifications. Monotonicity: $F(\mathcal{N}(\rho), \mathcal{N}(\sigma)) \ge F(\rho,\sigma)$.
- Trace distance: $T(\rho,\sigma) = \|\rho - \sigma\|_1$. $T = 2$ for perfectly distinguishable states, $T = 0$ for identical states. $T(\rho,\sigma) = 2\max |p(k{=}0|\rho) - p(k{=}0|\sigma)|$, where the maximum is over measurements $\{M_k\}$. Monotonicity: $T(\rho,\sigma) \ge T(\mathcal{N}(\rho), \mathcal{N}(\sigma))$.
- Back to information theory! Statements made today hold for both measures.

Quantifying uncertainty
- Let $\rho = \sum_x p(x)|\psi_x\rangle\langle\psi_x|$ be a density operator.
- von Neumann entropy: $H(\rho) = -\mathrm{tr}[\rho\log\rho]$, equal to the Shannon entropy of the eigenvalues of $\rho$.
- Analog of a joint random variable: $\rho_{AB}$ describes a composite system $A \otimes B$, and $H(A)_\rho = H(\rho_A) = H(\mathrm{tr}_B\,\rho_{AB})$.

Quantifying uncertainty: examples
- $H(|\psi\rangle\langle\psi|) = 0$
- $H(I/2) = 1$ and $H(I/2^n) = n$
- $H(\rho \otimes \sigma) = H(\rho) + H(\sigma)$
- $H(p\rho \oplus (1-p)\sigma) = H(p, 1-p) + pH(\rho) + (1-p)H(\sigma)$
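The following sketch (assumptions mine: NumPy, the qubit examples $|0\rangle$ and $|+\rangle$, and a depolarizing channel with $p = 0.3$) computes the quantities just defined and checks the monotonicity claim for trace distance:

```python
import numpy as np

def psd_sqrt(m):
    """Square root of a positive semidefinite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(m)
    return (v * np.sqrt(np.clip(w, 0, None))) @ v.conj().T

def vn_entropy(rho):
    """H(rho) = -tr[rho log rho]: Shannon entropy of the eigenvalues, in bits."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

def fidelity(rho, sigma):
    s = psd_sqrt(rho)
    return float(np.real(np.trace(psd_sqrt(s @ sigma @ s))) ** 2)

def trace_distance(rho, sigma):
    return float(np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

def depolarize(rho, p):
    return (1 - p) * rho + p * np.eye(2) / 2

rho   = np.array([[1, 0], [0, 0]], dtype=complex)            # |0><0|
sigma = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)    # |+><+|
print(vn_entropy(np.eye(2) / 2))                             # H(I/2) = 1
print(fidelity(rho, sigma), trace_distance(rho, sigma))      # 0.5 and sqrt(2)
# Channels do not increase distinguishability:
print(trace_distance(depolarize(rho, 0.3), depolarize(sigma, 0.3))
      <= trace_distance(rho, sigma))                         # True
```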

Compression
- Source of independent copies of $\rho_{AB}$: the effective support of $\rho_B^{\otimes n}$ has dimension $\sim 2^{nH(B)}$ (the typical subspace).
- Can compress $n$ copies of $B$ to a system of $\approx nH(B)$ qubits while preserving correlations with $A$.
- No statistical assumptions: just quantum mechanics!

The typical subspace
- Diagonalize $\rho = \sum_x p(x)|e_x\rangle\langle e_x|$. Then $\rho^{\otimes n} = \sum_{x^n} p(x^n)\,|e_{x^n}\rangle\langle e_{x^n}|$.
- The $\varepsilon$-typical projector $\Pi_t$ is the projector onto the span of the $|e_{x^n}\rangle$ such that $x^n$ is typical with respect to $p(\cdot)$.
- $\mathrm{tr}[\rho^{\otimes n}\,\Pi_t] \to 1$ as $n \to \infty$ (see the sketch at the end of this page). [Schumacher, Petz]

Quantifying information
- Uncertainty in $A$ when the value of $B$ is known: $H(A|B)_\rho = H(AB)_\rho - H(B)_\rho$.
- Example: for $|\Phi\rangle_{AB} = \tfrac{1}{\sqrt{2}}(|0\rangle_A|0\rangle_B + |1\rangle_A|1\rangle_B)$ we have $\Phi_B = I/2$, so $H(A|B)_\Phi = 0 - 1 = -1$. Conditional entropy can be negative!
- Information is that which reduces uncertainty: $I(A;B) = H(A) - H(A|B) = H(A) + H(B) - H(AB) \ge 0$.

Sending classical information
- Physical model of a noisy channel: prepare a state, apply a trace-preserving, completely positive map $\mathcal{N}$, then measure.
- HSW noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through $\mathcal{N}$ is given by the (regularization of the) Holevo formula $\chi(\mathcal{N}) = \max_{\{p_x,\rho_x\}}\left[H\big(\mathcal{N}(\bar\rho)\big) - \sum_x p_x H\big(\mathcal{N}(\rho_x)\big)\right]$, where $\bar\rho = \sum_x p_x \rho_x$. The codewords $X_1, X_2, \dots, X_n$ occupy a typical subspace of dimension $\sim 2^{nH(B)}$ in $B^{\otimes n}$.
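For a qubit source, the weight $\mathrm{tr}[\rho^{\otimes n}\Pi_t]$ can be summed exactly: it is just the probability that an i.i.d. string of eigenvalue labels is $\varepsilon$-typical. A small sketch under that observation (eigenvalues and parameters are my own choices):

```python
import numpy as np
from math import comb

def typical_weight(q, n, eps):
    """tr[rho^{tensor n} Pi_t] for a qubit rho with eigenvalues (q, 1-q):
    the probability that an i.i.d. string from (q, 1-q) is eps-typical."""
    total = 0.0
    for k in range(n + 1):              # k = multiplicity of the first eigenvector
        if abs(k / n - q) < eps / 2:    # frequency condition with |X| = 2
            total += comb(n, k) * q**k * (1 - q)**(n - k)
    return total

for n in [10, 100, 1000]:
    print(n, typical_weight(0.8, n, eps=0.05))   # tends to 1 as n grows
```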

Data processing inequality (strong subadditivity)
- Alice and Bob share a state; Bob applies a channel (an isometry $U$ followed by a partial trace), taking $\rho$ to $\sigma$. Then $I(A;B)_\rho \ge I(A;B)_\sigma$.
- Exercise: show that data processing implies $H(A|B)_\omega \ge H(A|BC)_\omega$ for any state $\omega$.

Optimality in the HSW theorem
- Assume there exists a code with rate $R$ and perfect decoding, built from input states and a well-chosen distinguishing measurement. Let $M$ be the random variable corresponding to the uniform distribution over messages.
- $nR = H(M)$  ($M$ has $nR$ bits of entropy)
  $= I(M;M')$  (perfect decoding: $M = M'$)
  $\le I(A;B)$  (data processing),
  and $I(A;B)$ is in turn bounded by the (regularized) Holevo quantity.

The additivity conjecture: the limit isn't necessary
- Conjectured in various forms by Holevo, Datta, Fukuda, King, Ruskai, Schumacher, Shirokov, Shor, Werner.
- Operational interpretation: Alice doesn't need to entangle her inputs across multiple uses of the channel. Codewords look like $|\varphi_{x_1}\rangle \otimes |\varphi_{x_2}\rangle \otimes \cdots \otimes |\varphi_{x_n}\rangle$.
- Why did they care so much? Counterexample by Hastings in 2008, based on the existence of highly entangled subspaces.

Surprises in high dimension
- Choose a random pure quantum state $|\psi\rangle \in_R \mathbb{C}^{d_A} \otimes \mathbb{C}^{d_B}$ with $d_A \le d_B$. What can we expect of $\psi$?
- On average, such states are highly entangled. [Lubkin, Lloyd, Page, Foong & Kanno, Sanchez-Ruiz, Sen]
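A Monte Carlo check of this claim (a sketch under my own choice of dimensions; the comparison value is Page's estimate for the average entanglement, roughly $\log_2 d_A - d_A/(2 d_B \ln 2)$ bits for $1 \ll d_A \le d_B$):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_pure_state(dA, dB):
    """Haar-random |psi> in C^dA ⊗ C^dB, stored as its dA x dB coefficient matrix."""
    g = rng.normal(size=(dA, dB)) + 1j * rng.normal(size=(dA, dB))
    return g / np.linalg.norm(g)

def entanglement_entropy(psi):
    """H(psi_A) in bits, from the squared Schmidt coefficients (singular values)."""
    lam = np.linalg.svd(psi, compute_uv=False) ** 2
    lam = lam[lam > 1e-15]
    return float(-np.sum(lam * np.log2(lam)))

dA, dB = 16, 64
samples = [entanglement_entropy(random_pure_state(dA, dB)) for _ in range(300)]
# Sample mean vs Page-type estimate: both are close to the maximum log2(dA).
print(np.mean(samples), np.log2(dA) - dA / (2 * dB * np.log(2)))
```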

Concentration of measure
- On the sphere $S^n$, the measure of the set of points at distance more than $\theta$ from a set $A$ of measure $1/2$ decays like $\exp[-n f(\theta)]$.
- LÉVY: Given an $\eta$-Lipschitz function $f : S^n \to \mathbb{R}$ with median $M$, the probability that a random $x \in_R S^n$ is further than $\varepsilon$ from $M$ is bounded above by $\exp(-n\varepsilon^2 C/\eta^2)$ for some constant $C > 0$.

Application to entropy
- Choose a random pure quantum state $|\psi\rangle \in_R \mathbb{C}^{d_A} \otimes \mathbb{C}^{d_B}$ ($d_A \le d_B$) and apply Lévy's lemma to the function $\psi \mapsto H(\psi_A)$ (an empirical illustration appears at the end of this page).

Random subspaces
1) Choose a fine net $F$ of states on the unit sphere of a subspace $S$.
2) $P(\text{not all states in } UF \text{ highly entangled}) \le |F| \cdot P(\text{one state isn't})$.
3) High entanglement for a sufficiently fine net implies the same for all states in $US$.

THEOREM: There exist subspaces of dimension $C\,d_A d_B\,\alpha^3/(\log d_A)^3$, all of whose states have entanglement at least $\log d_A - \alpha$. The probability that a random subspace has this property goes to 1 with $d_A d_B$.

In qubit language
- In a bipartite system of $n$ by $n + o(n)$ qubits, there exists a subspace of $2n - o(n)$ qubits in which all states have at least $n - o(1)$ ebits of entanglement.
- The subspace of nearly maximally entangled states is almost as big as the whole system!

Compare to pairs of qubits ($\mathbb{C}^2 \otimes \mathbb{C}^2$)
- The subspace spanned by two or more Bell pairs always contains some product states. (No subspaces of entangled states, let alone maximally entangled ones.)

Credit where credit is due
- Accidental quantum information theorists? Milman and Schechtman, Asymptotic Theory of Finite Dimensional Normed Spaces, Springer-Verlag, 1986.
- Others: Gowers, Gromov, Ledoux, Szarek, Talagrand.
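Lévy's lemma predicts that the entanglement entropy of random states concentrates around its typical value. A quick empirical sketch (sampling scheme and sample sizes are mine) showing the spread shrinking as the dimension grows:

```python
import numpy as np

rng = np.random.default_rng(2)

def marginal_entropy(d):
    """H(psi_A) in bits for a Haar-random pure state on C^d ⊗ C^d."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    g /= np.linalg.norm(g)
    lam = np.linalg.svd(g, compute_uv=False) ** 2
    lam = lam[lam > 1e-15]
    return -np.sum(lam * np.log2(lam))

# As the dimension grows, the entropies cluster ever more tightly
# around their (near-maximal) typical value.
for d in [4, 16, 64]:
    s = np.array([marginal_entropy(d) for _ in range(200)])
    print(d, round(s.mean(), 3), round(s.std(), 4))   # std shrinks rapidly with d
```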

Sending quantum information
- Physical model of a noisy channel: $|\psi\rangle \in \mathbb{C}^d$ enters an encoding (TPCP map), passes through $\mathcal{N}$, and is decoded (TPCP map).
- LSD noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can reliably send qubits to Bob ($\tfrac{1}{n}\log d$) through $\mathcal{N}$ is given by the (regularization of the) coherent-information formula $\max_\rho [H(B) - H(AB)] = \max_\rho [-H(A|B)]$. A conditional entropy! (A numerical sketch follows the reading list below.)

Take-home message
- Information theory can be generalized to analyze quantum information processing.
- It yields a rich theory of surprising conceptual simplicity.
- An operational approach to thinking about quantum mechanics: compression, message transmission, subspace transmission.
- Powerful techniques for dealing with noise.
- Measure concentration to explore high-dimensional spaces.

Further reading
- Nielsen & Chuang, Quantum Computation and Quantum Information.
- The additivity conjecture [Holevo, ICM Proceedings 2006].
- Hastings counterexample: arXiv:
- Entangled subspaces: arXiv:
- Quantum capacity problem: Open Systems and Information Dynamics, special issue 15(1).

Some things I haven't shown you
- Merging and splitting: the mother of all protocols [HOW quant-ph/; ADHW quant-ph/].
- Quantifying entanglement: formation, distillation and everything in between [HHHH quant-ph/].
- Beating teleportation: remote state preparation and its cousins [BHLSW quant-ph/; HLSW quant-ph/].
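Finally, a small sketch of the coherent information $-H(A|B)$ from the LSD formula above, evaluated for a qubit depolarizing channel with a maximally entangled input (my choice of input; it need not achieve the true maximum, so this is only a lower-bound illustration):

```python
import numpy as np

def vn_entropy(rho):
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

# Maximally entangled input Phi with depolarizing noise on the B half:
# rho_AB = (1-p)|Phi><Phi| + p I/4, with marginal rho_B = I/2.
phi = np.zeros((4, 4))
phi[np.ix_([0, 3], [0, 3])] = 0.5   # |Phi><Phi| for (|00> + |11>)/sqrt(2)

for p in [0.0, 0.1, 0.25]:
    rho_AB = (1 - p) * phi + p * np.eye(4) / 4
    Ic = vn_entropy(np.eye(2) / 2) - vn_entropy(rho_AB)   # H(B) - H(AB) = -H(A|B)
    print(p, Ic)   # p = 0 gives one qubit per use; noise reduces the rate
```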
