Intro Quant Info
Jeffrey Bub
Department of Philosophy
and
IPST
University of Maryland
Quantum Information
Resources
Classical Information
Some Relevant Quantum Mechanics
Quantum Information
Outline
1 Resources
    Web Resources
    Print Resources
2 Classical Information
    Shannon Entropy
    Conditional Entropy and Mutual Information
3 Some Relevant Quantum Mechanics
    Entangled States
    Measurement
    Quantum Operations
4 Quantum Information
    Von Neumann Entropy
    Accessible Information
Shannon Entropy
If the probabilities p_i are all equal (p_i = 1/k for all i), then H(X) = log k, and if some p_j = 1 (and so p_i = 0 for i ≠ j), then H(X) = 0 (taking 0 log 0 = lim_{x→0} x log x = 0). It can easily be shown that:

0 ≤ H(X) ≤ log k.
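These limiting cases are easy to verify numerically; a minimal sketch in Python (the helper name shannon_entropy is ours):

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum_i p_i log2 p_i, with the convention 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over k = 8 outcomes: H(X) = log2 8 = 3 bits
print(shannon_entropy([1/8] * 8))   # 3.0
# Deterministic distribution: H(X) = 0
print(shannon_entropy([1.0, 0.0]))  # 0.0
```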
Shannon's source coding theorem: for a source of i.i.d. random variables, a compression rate of H(X) bits per symbol is optimal.
The Shannon entropy H(X ) is a measure of the minimal
physical resources, in terms of the average number of bits per
symbol, that are necessary and sufficient to reliably store the
output of a source of messages. In this sense, it is a measure
of the amount of information per symbol produced by an
information source.
The only relevant feature of a message with respect to reliable
compression and decompression is the sequence of
probabilities associated with the individual symbols: the
nature of the physical systems embodying the representation
of the message through their states is irrelevant to this notion
of compression, as is the content or meaning of the message.
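For example (our illustrative numbers, not from the slides), a biased binary source emitting 0 with probability 0.9 can be reliably compressed to just under half a bit per symbol:

```python
import math

# A biased binary source: 0 with probability 0.9, 1 with probability 0.1.
p = 0.9
h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(round(h, 3))  # 0.469: n source symbols compress to about 0.469 n bits
```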
a1 : 00
a2 : 01
a3 : 10
a4 : 11
a1 : 0
a2 : 10
a3 : 110
a4 : 111
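If the four symbols occur with probabilities 1/2, 1/4, 1/8, 1/8 (an assumed distribution, since the slide's probabilities are not shown here), the variable-length code above is optimal: its average length equals H(X):

```python
import math

probs = [1/2, 1/4, 1/8, 1/8]   # assumed symbol probabilities for a1..a4
lengths = [1, 2, 3, 3]          # lengths of the codewords 0, 10, 110, 111

avg_len = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * math.log2(p) for p in probs)
print(avg_len, entropy)  # 1.75 1.75: the code meets the Shannon limit
```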
Conditional Entropy
From the probabilities p(x) we can calculate p(y) as:

p(y) = ∑_x p(y|x) p(x)
There are

2^{nH(X,Y)} / 2^{nH(Y)} = 2^{n(H(X,Y)−H(Y))} = 2^{nH(X|Y)}

typical X-sequences associated with a given Y-sequence.
Note that H(X|Y) ≠ H(Y|X).
The equality

H(X, Y) − H(Y) = H(X|Y)

follows from the 'chain rule' equality

H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y) = H(Y, X)

derived from the logarithmic definitions of the quantities:

H(X, Y) := −∑_{x,y} p(x, y) log p(x, y)
         = −∑_{x,y} p(x)p(y|x) log (p(x)p(y|x))
         = −∑_{x,y} p(x)p(y|x) log p(x) − ∑_{x,y} p(x)p(y|x) log p(y|x)
         = H(X) + H(Y|X)
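The chain rule can be checked numerically for any joint distribution; a sketch with illustrative numbers:

```python
import math

def H(dist):
    """Shannon entropy of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A joint distribution p(x, y) over X in {0,1}, Y in {0,1} (illustrative numbers).
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
p_x = {x: sum(v for (a, _), v in p_xy.items() if a == x) for x in (0, 1)}

h_joint = H(p_xy.values())
h_x = H(p_x.values())
# H(Y|X) = sum_x p(x) H(Y | X = x)
h_y_given_x = sum(
    p_x[x] * H([p_xy[(x, y)] / p_x[x] for y in (0, 1)]) for x in (0, 1)
)
print(abs(h_joint - (h_x + h_y_given_x)) < 1e-12)  # True: H(X,Y) = H(X) + H(Y|X)
```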
Entangled States
Any Bell state can be transformed into any other Bell state by a local unitary transformation, X, Y, or Z, where X, Y, Z are the Pauli spin matrices:

X = σ_x = |0⟩⟨1| + |1⟩⟨0| = (0 1; 1 0)
Y = σ_y = −i|0⟩⟨1| + i|1⟩⟨0| = (0 −i; i 0)
Z = σ_z = |0⟩⟨0| − |1⟩⟨1| = (1 0; 0 −1).

For example:

X⊗I · |4⟩ = X⊗I · (1/√2)(|0⟩|1⟩ − |1⟩|0⟩) = −(1/√2)(|0⟩|0⟩ − |1⟩|1⟩) = −|3⟩.
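This relation can be verified directly with matrices; a sketch using numpy, with |3⟩ and |4⟩ written out as in the example above:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])   # Pauli sigma_x
I = np.eye(2)

ket0 = np.array([1, 0])
ket1 = np.array([0, 1])
# |4> = (1/sqrt 2)(|0>|1> - |1>|0>),  |3> = (1/sqrt 2)(|0>|0> - |1>|1>)
bell4 = (np.kron(ket0, ket1) - np.kron(ket1, ket0)) / np.sqrt(2)
bell3 = (np.kron(ket0, ket0) - np.kron(ket1, ket1)) / np.sqrt(2)

result = np.kron(X, I) @ bell4
print(np.allclose(result, -bell3))  # True: X (x) I maps |4> to -|3>
```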
⟨A⟩ = Tr(|Ψ⟩⟨Ψ| A ⊗ I)
    = Tr_Q(Tr_E(|Ψ⟩⟨Ψ|) A)
    = Tr_Q((∑_i p_i |i⟩⟨i|) A)
    = Tr_Q(ρA)
where Tr_Q(·) = ∑_i ⟨q_i| · |q_i⟩, for any orthonormal basis {|q_i⟩} in H_Q, is the partial trace over H_Q.
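A minimal partial-trace sketch for two qubits (the helper name partial_trace_E is ours), applied to one half of a Bell state:

```python
import numpy as np

def partial_trace_E(rho_QE, dQ=2, dE=2):
    """Trace out the second factor: (Tr_E rho)_{ij} = sum_k rho_{(i,k),(j,k)}."""
    r = rho_QE.reshape(dQ, dE, dQ, dE)
    return np.einsum('ikjk->ij', r)

# Reduced state of either half of the Bell state (1/sqrt 2)(|00> + |11>)
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
print(partial_trace_E(rho))  # maximally mixed: [[0.5, 0], [0, 0.5]]
```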
Note that the states |r_j⟩ are not in general orthogonal. Since the |e_j⟩ are orthogonal, we can express the density operator representing the state of Q as:

ρ = ∑_i w_i |r_i⟩⟨r_i|.
and all these different mixtures with the same density operator ρ
must be physically indistinguishable.
Measurement
from which it follows that U, defined for any product state |ψ⟩|0⟩ (for any |ψ⟩ ∈ H_Q), can be extended to a unitary operator on the Hilbert space H_Q ⊗ H_M.
If we now perform a standard projective measurement on M to determine the value m of the pointer observable, defined by the projection operator

P_m = I_Q ⊗ |m⟩⟨m|

then the probability of obtaining the outcome m is:

p(m) = ⟨0|⟨ψ|U† P_m U|ψ⟩|0⟩
     = ∑_{m′,m″} ⟨m′|⟨ψ|M†_{m′} (I_Q ⊗ |m⟩⟨m|) M_{m″}|ψ⟩|m″⟩
     = ∑_{m′,m″} ⟨ψ|M†_{m′} ⟨m′|m⟩⟨m|m″⟩ M_{m″}|ψ⟩
     = ⟨ψ|M†_m M_m|ψ⟩

and, more generally, if the initial state of Q is a mixed state ρ, then

p(m) = Tr_Q(M_m ρ M†_m).
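Both expressions for p(m) give the same numbers; a sketch with the simple choice M_0 = |0⟩⟨0|, M_1 = |1⟩⟨1| (our illustrative example):

```python
import numpy as np

M0 = np.array([[1, 0], [0, 0]])   # M_0 = |0><0|
M1 = np.array([[0, 0], [0, 1]])   # M_1 = |1><1|

psi = np.array([1, 1]) / np.sqrt(2)       # |psi> = (|0> + |1>)/sqrt 2
rho = np.outer(psi, psi.conj())

# Pure-state form <psi| M_m^dag M_m |psi> and mixed-state form Tr(M_m rho M_m^dag)
for M in (M0, M1):
    p_pure = psi.conj() @ M.conj().T @ M @ psi
    p_mixed = np.trace(M @ rho @ M.conj().T)
    print(p_pure, p_mixed)  # 0.5 for each outcome, in both forms
```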
The final state of QM after the projective measurement on M yielding the outcome m is:

P_m U|ψ⟩|0⟩ / √(⟨0|⟨ψ|U† P_m U|ψ⟩|0⟩) = M_m|ψ⟩|m⟩ / √(⟨ψ|M†_m M_m|ψ⟩).

So the final state of M is |m⟩ and the final state of Q is:

M_m|ψ⟩ / √(⟨ψ|M†_m M_m|ψ⟩);

and, more generally, if the initial state of Q is a mixed state ρ, then the final state of Q is:

M_m ρ M†_m / Tr_Q(M_m ρ M†_m).
P_m P_{m′} = δ_{mm′} P_m.
Given a set of positive operators {E_m} such that ∑_m E_m = I, measurement operators M_m can be defined via

M_m = U√E_m,
As a special case we can take U = 1 and M_m = √E_m.
Conversely, given a set of measurement operators {M_m}, there exist unitary operators U_m such that M_m = U_m√E_m, where {E_m} is a POVM.
Except for the standard case of projective measurements, one might wonder why it is useful to single out such unitary transformations, and why in the general case such a process should be called a measurement of Q.

Suppose we know that a system with a 2-dimensional Hilbert space is in one of two nonorthogonal states:

|ψ_1⟩ = |0⟩
|ψ_2⟩ = (1/√2)(|0⟩ + |1⟩)

It is impossible to reliably distinguish these states by a quantum measurement, even in the above generalized sense. Here 'reliably' means that the state is identified correctly with zero probability of error.
|φ_1⟩ = |ψ_2⟩⊥
|φ_2⟩ = |ψ_1⟩⊥
|φ_3⟩ = ((1 + √2)|0⟩ + |1⟩) / √(2√2(1 + √2))

with coefficients √2/(1+√2), √2/(1+√2), 1/(1+√2), respectively.
If the input state is |ψ_1⟩ = |0⟩, we have the transition:

|ψ_1⟩|0⟩ →_U √E_1 |0⟩|1⟩ + √E_3 |0⟩|3⟩ = α|φ_1⟩|1⟩ + β|φ_3⟩|3⟩

(because √E_2 |ψ_1⟩ = √E_2 |0⟩ = 0).

If the input state is |ψ_2⟩ = (1/√2)(|0⟩ + |1⟩), we have the transition:

|ψ_2⟩|0⟩ →_U √E_2 (1/√2)(|0⟩ + |1⟩)|2⟩ + √E_3 (1/√2)(|0⟩ + |1⟩)|3⟩ = γ|φ_2⟩|2⟩ + δ|φ_3⟩|3⟩

(because √E_1 |ψ_2⟩ = √E_1 (1/√2)(|0⟩ + |1⟩) = 0), where α, β, γ, δ are real numerical coefficients.
Quantum Operations
The map E is linear (or convex-linear) in the sense that E(∑_i p_i ρ_i) = ∑_i p_i E(ρ_i), positive in the sense that E maps positive operators to positive operators, and completely positive in the sense that E ⊗ I is a positive map on the extension of H_Q to a Hilbert space H_Q ⊗ H_E, associated with the addition of any ancilla system E to Q.
E(ρ) = ∑_i E_i ρ E_i†
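A sketch of such an operator-sum map: the phase-flip channel, with Kraus operators E_0 = √(1−p) I and E_1 = √p Z (an illustrative choice, not from the slides):

```python
import numpy as np

p = 0.25
E0 = np.sqrt(1 - p) * np.eye(2)       # sqrt(1-p) I
E1 = np.sqrt(p) * np.diag([1, -1])    # sqrt(p) Z

def channel(rho):
    """E(rho) = sum_i E_i rho E_i^dagger"""
    return E0 @ rho @ E0.conj().T + E1 @ rho @ E1.conj().T

# Completeness: sum_i E_i^dagger E_i = I, so the map is trace preserving.
completeness = E0.conj().T @ E0 + E1.conj().T @ E1
rho = np.array([[0.5, 0.5], [0.5, 0.5]])   # the state |+><+|
print(np.allclose(completeness, np.eye(2)))  # True
print(channel(rho))  # off-diagonals shrink by (1 - 2p): [[0.5, 0.25], [0.25, 0.5]]
```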
Von Neumann Entropy
where the p_i are the eigenvalues of ρ and the pure states |i⟩ are orthonormal eigenstates of ρ.
This is the spectral representation of ρ, and any density operator (a positive, hence Hermitian, operator) can be expressed in this way.
Since ρ has unit trace, ∑_i p_i = 1, and so the spectral representation of ρ represents a classical probability distribution of orthogonal, and hence distinguishable, pure states.

If we measure a Q-observable with eigenstates |i⟩, then the outcomes can be associated with the values of a random variable X, where Pr(X = i) = p_i. Then

H(X) = −∑_i p_i log p_i
Now,

−Tr(ρ log ρ) = −∑_i p_i log p_i

(because the eigenvalues of ρ log ρ are p_i log p_i and the trace of an operator is the sum of its eigenvalues), so a natural generalization of the Shannon entropy for any mixture of quantum states with density operator ρ is the von Neumann entropy:

S := −Tr(ρ log ρ)
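A minimal numerical sketch, computing S from the eigenvalues of ρ as the spectral argument above suggests (the helper name von_neumann_entropy is ours):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    return -sum(p * np.log2(p) for p in evals if p > 1e-12)

# A pure state has S = 0; the maximally mixed qubit state has S = 1 bit.
pure = np.array([[1, 0], [0, 0]])   # |0><0|
mixed = np.eye(2) / 2               # I/2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))  # ~0 and 1.0
```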
and so:

S(A) = S(B) = −∑_i p_i log p_i.
ρ′ = |0⟩⟨0|ρ|0⟩⟨0| + |0⟩⟨1|ρ|1⟩⟨0|
   = Tr(ρ)|0⟩⟨0|
   = |0⟩⟨0|.

So S(ρ′) = 0 ≤ S(ρ).
Accessible Information
Sup H(X : Y), the supremum of the mutual information between X and the measurement outcome Y over all possible measurements.
H(X : Y ) ≤ H(X )
σ ⊗ |0⟩⟨0| →_E ∑_y √E_y σ √E_y ⊗ |y⟩⟨y|.
S(P : Q) = S(P : Q, M)
S(P′ : Q′, M′) ≤ S(P : Q, M)
Finally:

S(P′ : M′) ≤ S(P′ : Q′, M′)

because discarding systems never increases mutual information, and so:

S(P′ : M′) ≤ S(P : Q),

which (following some algebraic manipulation) reduces to the statement of the Holevo bound, i.e.,

H(X : Y) ≤ S(ρ) − ∑_x p_x S(ρ_x).
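The bound is easy to evaluate for a concrete ensemble; a sketch using the two nonorthogonal states |0⟩ and (1/√2)(|0⟩ + |1⟩) from the measurement discussion, with equal priors (our illustrative choice):

```python
import numpy as np

def S(rho):
    """von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    return -sum(p * np.log2(p) for p in evals if p > 1e-12)

# Ensemble: |0> and (|0> + |1>)/sqrt 2, each with probability 1/2.
psi0 = np.array([1, 0])
psi1 = np.array([1, 1]) / np.sqrt(2)
rho0 = np.outer(psi0, psi0)
rho1 = np.outer(psi1, psi1)
rho = 0.5 * rho0 + 0.5 * rho1

# Holevo quantity chi = S(rho) - sum_x p_x S(rho_x); pure states have S(rho_x) = 0.
chi = S(rho) - 0.5 * S(rho0) - 0.5 * S(rho1)
print(chi)  # about 0.6: any measurement yields at most chi bits about x
```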