
Resources

Classical Information
Some Relevant Quantum Mechanics
Quantum Information

Introduction to Quantum Information

Jeffrey Bub

Department of Philosophy
and
IPST
University of Maryland

RIT on Quantum Information and Computation, 2010


Outline
1. Resources
   Web Resources
   Print Resources
2. Classical Information
   Shannon Entropy
   Conditional Entropy and Mutual Information
3. Some Relevant Quantum Mechanics
   Entangled States
   Measurement
   Quantum Operations
4. Quantum Information
   Von Neumann Entropy
   Accessible Information


Web Resources

Sam Lomonaco: A Rosetta Stone for Quantum Mechanics with an Introduction to Quantum Computation, [Link]
Todd Brun: Lecture Notes on Quantum Information Processing, [Link] tbrun/Course/[Link]
Valerio Scarani: Quantum Information: Primitive Notions and Quantum Correlations, [Link]
John Preskill: Lecture Notes on Quantum Computation, [Link]


Print Resources

Sam Lomonaco: A Rosetta Stone for Quantum Mechanics with an Introduction to Quantum Computation, in AMS Short Course Lecture Notes: Quantum Computation (Providence: AMS, 2000).
Michael A. Nielsen and Isaac L. Chuang: Quantum Computation and Quantum Information (Cambridge: Cambridge University Press, 2000).
Chris J. Isham: Lectures on Quantum Theory: Mathematical and Structural Foundations (London: Imperial College Press, 1995).


Print Resources

Hoi-Kwong Lo, Sandu Popescu, Tim Spiller (eds.): Introduction to Quantum Computation and Information (World Scientific, 1998).
L. Diosi: A Short Course in Quantum Information Theory (Springer, 2007).
Michel Le Bellac: A Short Introduction to Quantum Information and Quantum Computation (Cambridge University Press, 2005).


Shannon Entropy

Fundamental question considered by Shannon: how to quantify the minimal physical resources required to store messages produced by a source, so that they could be communicated via a channel without loss and reconstructed by a receiver.

Shannon's source coding theorem (or noiseless channel coding theorem) answers this question.


Shannon Entropy

Basic idea: consider a source that produces long sequences (messages) composed of symbols from a finite alphabet a_1, a_2, ..., a_k, where the individual symbols are produced with probabilities p_1, p_2, ..., p_k. A given sequence of symbols is represented as a sequence of values of independent and identically distributed (i.i.d.) discrete random variables X_1, X_2, ...

A typical sequence of length n, for large n, will contain close to p_i n occurrences of the symbol a_i, for i = 1, ..., k. So the probability of a sufficiently long typical sequence (assuming independence) will be:

p(x_1, x_2, ..., x_n) = p(x_1)p(x_2)...p(x_n) ≈ p_1^{p_1 n} p_2^{p_2 n} ... p_k^{p_k n}


Shannon Entropy

Taking the logarithm (conventionally, in information theory, to the base 2) yields:

log p(x_1, ..., x_n) ≈ log(p_1^{p_1 n} p_2^{p_2 n} ... p_k^{p_k n})
                     ≈ n Σ_i p_i log p_i
                     = −nH(X)

where H(X) := −Σ_i p_i log p_i is the Shannon entropy of the source.
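The definition can be checked numerically. A minimal Python sketch (the biased-coin distribution is an arbitrary illustration, not from the text):

```python
from math import log2

def shannon_entropy(probs):
    """H(X) = -sum_i p_i log2 p_i, with the convention 0 log 0 = 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin has entropy exactly 1 bit.
H_coin = shannon_entropy([0.5, 0.5])

# A biased coin is less uncertain, so its entropy falls below 1 bit,
# and a deterministic source has entropy 0.
H_biased = shannon_entropy([0.9, 0.1])
H_certain = shannon_entropy([1.0, 0.0])
```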


Shannon Entropy
If the probabilities p_i are all equal (p_i = 1/k for all i), then H(X) = log k, and if some p_j = 1 (and so p_i = 0 for i ≠ j), then H(X) = 0 (taking 0 log 0 := lim_{x→0} x log x = 0). It can easily be shown that:

0 ≤ H(X) ≤ log k.

A source that produces one of two distinguishable symbols with equal probability, such as the toss of a fair coin, is said to have a Shannon entropy of 1 bit: ascertaining which symbol is produced is associated with an amount of information equal to 1 bit. If we already know which symbol will be produced (so the probabilities are 0 and 1), the entropy is 0: there is no uncertainty, and no information gain.

Shannon Entropy

If we encoded each of the k distinct symbols as a distinct binary number, i.e., as a distinct string of 0's and 1's, we would need strings composed of log k bits to represent each symbol (2^{log k} = k).

Shannon's analysis shows that messages produced by a stochastic source can be compressed, in the sense that (as n → ∞ and the probability of an atypical n-length sequence tends to zero) n-length sequences can be encoded without loss of information using nH(X) bits rather than the n log k bits required if we encoded each of the k symbols a_i as a distinct string of 0's and 1's: this is a compression, since nH(X) < n log k except for equiprobable distributions.


Shannon Entropy
Shannon's source coding theorem: the compression rate of H(X) bits per symbol produced by a source of i.i.d. random variables is optimal.

The Shannon entropy H(X) is a measure of the minimal physical resources, in terms of the average number of bits per symbol, that are necessary and sufficient to reliably store the output of a source of messages. In this sense, it is a measure of the amount of information per symbol produced by an information source.

The only relevant feature of a message with respect to reliable compression and decompression is the sequence of probabilities associated with the individual symbols: the nature of the physical systems embodying the representation of the message through their states is irrelevant to this notion of compression, as is the content or meaning of the message.

Shannon Entropy

As a simple example of compression, consider an information source that produces sequences of symbols from a 4-symbol alphabet a_1, a_2, a_3, a_4 with probabilities 1/2, 1/4, 1/8, 1/8. Each symbol can be represented by a distinct 2-digit binary number:

a_1: 00
a_2: 01
a_3: 10
a_4: 11

So without compression we need two bits per symbol of storage space to store the output of the source.


Shannon Entropy

The Shannon entropy of the source is:

H(X) = −(1/2) log(1/2) − (1/4) log(1/4) − (1/8) log(1/8) − (1/8) log(1/8) = 7/4

Shannon's source coding theorem: there is a compression scheme that uses an average of 7/4 bits per symbol rather than two bits per symbol; such a compression scheme is optimal.
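The arithmetic can be checked directly (a minimal Python sketch of the four-symbol example above):

```python
from math import log2

# Probabilities of the four symbols a1..a4 from the example.
probs = [1/2, 1/4, 1/8, 1/8]

# H(X) = -sum_i p_i log2 p_i = 1/2·1 + 1/4·2 + 1/8·3 + 1/8·3 = 7/4
H = -sum(p * log2(p) for p in probs)
```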


Shannon Entropy

The optimal scheme is provided by the following encoding:

a_1: 0
a_2: 10
a_3: 110
a_4: 111

for which the average length of a compressed sequence is:

(1/2)·1 + (1/4)·2 + (1/8)·3 + (1/8)·3 = 7/4

bits per symbol.
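A Python sketch of this code: because no codeword is a prefix of another, a bit stream decodes uniquely, and the expected length matches H(X) = 7/4 bits per symbol (the sample message is an arbitrary illustration):

```python
# The optimal prefix-free code from the slide.
code = {'a1': '0', 'a2': '10', 'a3': '110', 'a4': '111'}
probs = {'a1': 1/2, 'a2': 1/4, 'a3': 1/8, 'a4': 1/8}

def encode(symbols):
    return ''.join(code[s] for s in symbols)

def decode(bits):
    """Greedy decoding works because the code is prefix-free."""
    inverse = {v: k for k, v in code.items()}
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ''
    return out

# Expected number of bits per symbol under the source distribution.
avg_len = sum(probs[s] * len(code[s]) for s in code)
```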


Conditional Entropy

So far, we've assumed a noiseless channel between the source and the receiver.

An information channel maps inputs consisting of values of a random variable X onto outputs consisting of values of a random variable Y, and the map will generally not be 1-1 if the channel is noisy. So consider the conditional probabilities p(y|x) of obtaining an output value y for a given input value x, for all x, y.


Conditional Entropy
From the probabilities p(x) we can calculate p(y) as:

p(y) = Σ_x p(y|x) p(x)

and we can also calculate p(x|y) by Bayes' rule from the probabilities p(y|x) and p(x), for all x, y, and hence the Shannon entropy of the conditional distribution p(x|y), for all x and a fixed y, denoted by H(X|Y = y).

The quantity

H(X|Y) = Σ_y p(y) H(X|Y = y)

is known as the conditional entropy. It is the expected value of H(X|Y = y) over all y.
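The steps above can be traced numerically. A Python sketch, assuming a hypothetical binary symmetric channel (crossover probability 0.1, uniform input) that is not from the text:

```python
from math import log2

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical channel: p(y|x), keyed as (y, x).
p_x = {0: 0.5, 1: 0.5}
p_y_given_x = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.1, (1, 1): 0.9}

# p(y) = sum_x p(y|x) p(x)
p_y = {y: sum(p_y_given_x[(y, x)] * p_x[x] for x in p_x) for y in (0, 1)}

# p(x|y) by Bayes' rule, then H(X|Y=y) for each y.
def H_X_given_Y_eq_y(y):
    post = [p_y_given_x[(y, x)] * p_x[x] / p_y[y] for x in p_x]
    return H(post)

# H(X|Y) = sum_y p(y) H(X|Y=y)
H_X_given_Y = sum(p_y[y] * H_X_given_Y_eq_y(y) for y in p_y)
```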

Conditional Entropy

If we think of H(X), the entropy of the distribution {p(x) : x ∈ X}, as a measure of the uncertainty of the X-value, then H(X|Y = y) is a measure of the uncertainty of the X-value, given the Y-value y, and H(X|Y) is a measure of the average uncertainty of the X-value, given a Y-value.

Putting it differently, the number of input sequences of length n that are consistent with a given output sequence (as n → ∞) is 2^{nH(X|Y)}, i.e., H(X|Y) is the number of bits per symbol of additional information needed, on average, to identify an input X-sequence from a given Y-sequence.


Conditional Entropy

This follows because there are 2^{nH(X,Y)} typical sequences of pairs (x, y), where the joint entropy H(X,Y) is calculated from the joint probability p(x, y). So there are

2^{nH(X,Y)} / 2^{nH(Y)} = 2^{n(H(X,Y) − H(Y))} = 2^{nH(X|Y)}

typical X-sequences associated with a given Y-sequence.

Note that H(X|Y) ≠ H(Y|X).


Conditional Entropy
The equality

H(X,Y) − H(Y) = H(X|Y)

follows from the 'chain rule' equality

H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y) = H(Y,X)

derived from the logarithmic definitions of the quantities:

H(X,Y) := −Σ_{x,y} p(x,y) log p(x,y)
         = −Σ_{x,y} p(x)p(y|x) log (p(x)p(y|x))
         = −Σ_{x,y} p(x)p(y|x) log p(x) − Σ_{x,y} p(x)p(y|x) log p(y|x)
         = H(X) + H(Y|X)
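The chain rule can be verified on a concrete distribution. A Python sketch, assuming an arbitrary hypothetical joint distribution p(x, y) (not from the text):

```python
from math import log2

def H(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

# Arbitrary joint distribution over two binary variables.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal p(x) and joint entropy H(X,Y).
p_x = {x: sum(v for (x2, _), v in p_xy.items() if x2 == x) for x in (0, 1)}
H_XY = H(p_xy.values())
H_X = H(p_x.values())

# H(Y|X) computed directly from the conditional distributions p(y|x).
H_Y_given_X = sum(
    p_x[x] * H([p_xy[(x, y)] / p_x[x] for y in (0, 1)]) for x in (0, 1)
)
```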

Mutual Information

The mutual information H(X:Y) (sometimes written I(X:Y)) of two random variables is a measure of how much information they have in common: the sum of the information content of the two random variables, as measured by the Shannon entropy (in which joint information is counted twice), minus their joint information:

H(X:Y) = H(X) + H(Y) − H(X,Y)


Mutual Information

Note that H(X:X) = H(X), as we would expect.

Also, since H(X,Y) = H(X) + H(Y|X), it follows that

H(X:Y) = H(X) − H(X|Y) = H(Y) − H(Y|X)

i.e., the mutual information of two random variables represents the average information gain about one random variable obtained by measuring the other: the difference between the initial uncertainty of one of the random variables, and the average residual uncertainty of that random variable after ascertaining the value of the other random variable.
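The identities relating H(X:Y) to the conditional entropies can be checked in a few lines of Python (same style of hypothetical joint distribution as before, chosen only for illustration):

```python
from math import log2

def H(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

# Arbitrary joint distribution; its marginals are written out directly.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
p_x = [0.5, 0.5]          # marginal of X
p_y = [0.6, 0.4]          # marginal of Y

H_XY = H(p_xy.values())
I_XY = H(p_x) + H(p_y) - H_XY        # H(X:Y)
H_X_given_Y = H_XY - H(p_y)          # chain rule
H_Y_given_X = H_XY - H(p_x)
```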


Entangled States

Consider a quantum system Q which is part of a compound system QE (E for 'environment,' although E could be any quantum system of which Q is a subsystem). Pure states of QE are represented as rays or unit vectors in a tensor product Hilbert space H_Q ⊗ H_E.

A general pure state of QE is a state of the form:

|Ψ⟩ = Σ_{ij} c_ij |q_i⟩|e_j⟩

where the |q_i⟩ ∈ H_Q form a complete set of orthonormal states (a basis) in H_Q and the |e_j⟩ ∈ H_E form a basis in H_E. If the coefficients c_ij are such that |Ψ⟩ cannot be expressed as a product state |Q⟩|E⟩, then |Ψ⟩ is called an entangled state.


Entangled States

For any state |Ψ⟩ of QE, there exist orthonormal bases |i⟩ ∈ H_Q, |j⟩ ∈ H_E such that |Ψ⟩ can be expressed in a biorthogonal correlated form as:

|Ψ⟩ = Σ_i √p_i |i⟩|i⟩

where the coefficients p_i are real and non-negative, and Σ_i p_i = 1.

This representation is referred to as the Schmidt decomposition. The Schmidt decomposition is unique if and only if the p_i are all distinct.
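Numerically, the Schmidt coefficients are the singular values of the coefficient matrix c_ij, so they can be obtained with a singular value decomposition. A numpy sketch, using an arbitrary example state (not from the text):

```python
import numpy as np

# Coefficient matrix c_ij of a normalized two-qubit pure state:
# |Psi> = sum_ij c_ij |q_i>|e_j>, with sum_ij |c_ij|^2 = 1.
c = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / 2.0

# SVD: c = U diag(s) V^dagger. The columns of U and V give the Schmidt
# bases, and p_i = s_i^2 are the Schmidt weights.
u, s, vh = np.linalg.svd(c)
p = s**2
```

For this particular c both Schmidt weights come out equal, so (by the uniqueness clause above) the decomposition is not unique.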


Entangled States

An example is the biorthogonal EPR state:

|Ψ⟩ = (|0⟩|1⟩ − |1⟩|0⟩)/√2

say, the singlet state of two spin-1/2 particles (the Schmidt form with positive coefficients is obtained by absorbing the relative phases into the definition of the basis vectors).

In the singlet state, |0⟩ and |1⟩ can be taken as representing the two eigenstates of spin in the z-direction, but since the state is symmetric, |Ψ⟩ retains the same form for spin in any direction.


Entangled States

The EPR argument exploits the fact that spin measurements in the same direction on the two particles, which could be arbitrarily far apart, will yield outcomes that are perfectly anti-correlated for any spin direction.

Bell's counterargument exploits the fact that when the spin is measured on one particle in a direction θ_1 to the z-axis, and on the other particle in a direction θ_2 to the z-axis, the probability of finding the same outcome for both particles (both 1 or both 0) is sin²((θ_1 − θ_2)/2). It follows that 3/4 of the outcome pairs are the same when θ_1 − θ_2 = 2π/3.


Entangled States

Suppose, for many EPR pairs, spin is measured in one of three directions 2π/3 apart, chosen randomly for each particle.

It follows that, averaging over the nine possible pairs of measurement directions, half the outcome pairs will be the same ((1/9)(3·0 + 6·(3/4)) = 1/2). On the other hand, from Bell's inequality, derived under Einstein's realist assumptions of separability and locality, it can be shown that no more than 4/9 of the outcome pairs can be the same.
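The averaging over the nine direction pairs can be spelled out in a short Python sketch, using the sin²((θ_1 − θ_2)/2) agreement probability from the previous slide:

```python
from math import sin, pi

def p_same(t1, t2):
    """Probability that the two outcomes agree for analyzer angles t1, t2."""
    return sin((t1 - t2) / 2) ** 2

# Three measurement directions 2*pi/3 apart, chosen for each particle.
angles = [0, 2 * pi / 3, 4 * pi / 3]

# Average agreement over the nine equally likely direction pairs:
# 3 equal-angle pairs contribute 0, the 6 unequal pairs contribute 3/4.
avg = sum(p_same(a, b) for a in angles for b in angles) / 9
```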


Entangled States

This means that the dynamical evolution of a quantum system can result in a state representing correlational information that no classical computer can simulate.

For example, no classical computer can be programmed to perform the following task: for any pair of input angles θ_1, θ_2 at different locations, output a pair of values (0 or 1) such that the values are perfectly correlated when θ_1 − θ_2 = π, perfectly anti-correlated when θ_1 = θ_2, and 75% correlated when θ_1 − θ_2 = 2π/3, where the response time between being given the input and producing the output in each case is less than the time taken by light to travel between the two locations.


Entangled States

The four states:

|1⟩ = (1/√2)(|0⟩|1⟩ − |1⟩|0⟩)
|2⟩ = (1/√2)(|0⟩|1⟩ + |1⟩|0⟩)
|3⟩ = (1/√2)(|0⟩|0⟩ − |1⟩|1⟩)
|4⟩ = (1/√2)(|0⟩|0⟩ + |1⟩|1⟩)

form an orthonormal basis, called the Bell basis, in the 2 × 2-dimensional Hilbert space.


Entangled States
Any Bell state can be transformed into any other Bell state by a local unitary transformation, X, Y, or Z, where X, Y, Z are the Pauli spin matrices:

X = σ_x = |0⟩⟨1| + |1⟩⟨0| = [ 0 1 ; 1 0 ]
Y = σ_y = −i|0⟩⟨1| + i|1⟩⟨0| = [ 0 −i ; i 0 ]
Z = σ_z = |0⟩⟨0| − |1⟩⟨1| = [ 1 0 ; 0 −1 ]

For example:

X ⊗ I · |1⟩ = X ⊗ I · (1/√2)(|0⟩|1⟩ − |1⟩|0⟩) = −(1/√2)(|0⟩|0⟩ − |1⟩|1⟩) = −|3⟩.
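This example, and the general claim that a local Pauli maps Bell states to Bell states, can be checked with numpy (a minimal sketch, with the Bell states numbered as in the text):

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# The Bell basis, numbered as on the slide.
b1 = (np.kron(ket0, ket1) - np.kron(ket1, ket0)) / np.sqrt(2)
b2 = (np.kron(ket0, ket1) + np.kron(ket1, ket0)) / np.sqrt(2)
b3 = (np.kron(ket0, ket0) - np.kron(ket1, ket1)) / np.sqrt(2)
b4 = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

X = np.array([[0.0, 1.0], [1.0, 0.0]])   # sigma_x
I2 = np.eye(2)

# X on the first qubit only: X tensor I.
out = np.kron(X, I2) @ b1
```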

Entangled States

If QE is a closed system in an entangled pure state represented by

|Ψ⟩ = Σ_i √p_i |i⟩|i⟩

in the Schmidt decomposition, the expected value of any Q-observable A on H_Q can be computed as:

⟨A⟩ = Tr(|Ψ⟩⟨Ψ| (A ⊗ I))
    = Tr_Q(Tr_E(|Ψ⟩⟨Ψ|) A)
    = Tr_Q((Σ_i p_i |i⟩⟨i|) A)
    = Tr_Q(ρA)


Entangled States

So the expected value of the Q-observable A can be expressed as:

⟨A⟩ = Tr_Q(ρA)

where:

Tr_Q(·) = Σ_i ⟨q_i| · |q_i⟩, for any orthonormal basis {|q_i⟩} in H_Q, is the partial trace over H_Q,

Tr_E(·) is the partial trace over H_E, and

ρ = Σ_i p_i |i⟩⟨i|, an operator on H_Q, is the reduced density operator of the open system Q, a positive operator with unit trace.
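The partial trace and the trace formula ⟨A⟩ = Tr_Q(ρA) can be made concrete with numpy, taking the singlet state as the example (the reshape-and-contract implementation of the partial trace is one standard way to do it, not prescribed by the text):

```python
import numpy as np

# Singlet state (|01> - |10>)/sqrt(2) as a 4-component vector.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho_QE = np.outer(psi, psi.conj())

# Partial trace over E (the second qubit): view rho_QE with indices
# [q, e, q', e'] and contract e with e'.
rho_Q = np.trace(rho_QE.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# Expectation of a Q-observable via <A> = Tr(rho A), e.g. A = sigma_z.
A = np.array([[1.0, 0.0], [0.0, -1.0]])
expA = np.trace(rho_Q @ A).real
```

For the singlet, ρ comes out as the maximally mixed state I/2, so ⟨σ_z⟩ = 0.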


Entangled States

Since the density operator ρ yields the statistics of all Q-observables via the trace equation, ρ is taken as representing the quantum state of the system Q.

If QE is an entangled pure state, then the open system Q is in a mixed state ρ, i.e., ρ ≠ ρ²; for pure states, ρ is a projection operator onto a ray and ρ = ρ².

A mixed state represented by a density operator ρ = Σ_i p_i |i⟩⟨i| can be regarded as a mixture of pure states |i⟩ prepared with prior probabilities p_i, but this representation is not unique, not even if the states combined in the mixture are orthogonal.


Entangled States

For example, the equal-weight mixture of orthonormal states |0⟩, |1⟩ in a 2-dimensional Hilbert space H_2 has precisely the same statistical properties, and hence the same density operator ρ = I/2, as:

1. the equal-weight mixture of any pair of orthonormal states, e.g., the states (1/√2)(|0⟩ + |1⟩), (1/√2)(|0⟩ − |1⟩), or
2. the equal-weight mixture of the nonorthogonal states |0⟩, (1/2)|0⟩ + (√3/2)|1⟩, (1/2)|0⟩ − (√3/2)|1⟩, 120° apart, or
3. the uniform continuous distribution over all possible states in H_2.
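The first two cases, and the mixture of the three nonorthogonal states, can be checked directly with numpy (a minimal sketch of the ensembles listed above):

```python
import numpy as np

def mix(states, weights):
    """Density operator of a mixture of pure states with given weights."""
    return sum(w * np.outer(s, s.conj()) for s, w in zip(states, weights))

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

# Three nonorthogonal states 120 degrees apart.
t1 = ket0
t2 = 0.5 * ket0 + (np.sqrt(3) / 2) * ket1
t3 = 0.5 * ket0 - (np.sqrt(3) / 2) * ket1

rho_a = mix([ket0, ket1], [0.5, 0.5])
rho_b = mix([plus, minus], [0.5, 0.5])
rho_c = mix([t1, t2, t3], [1/3, 1/3, 1/3])
```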


Entangled States

More generally, for any basis of orthonormal states |e_j⟩ ∈ H_E, the entangled state |Ψ⟩ can be expressed as:

|Ψ⟩ = Σ_{ij} c_ij |q_i⟩|e_j⟩ = Σ_j √w_j |r_j⟩|e_j⟩

where the normalized states |r_j⟩ = Σ_i (c_ij/√w_j) |q_i⟩ are relative states to the |e_j⟩ (w_j = Σ_i |c_ij|²).

Note that the states |r_j⟩ are not in general orthogonal. Since the |e_j⟩ are orthogonal, we can express the density operator representing the state of Q as:

ρ = Σ_j w_j |r_j⟩⟨r_j|.


Entangled States

In effect, a measurement of an E-observable with eigenstates |e_j⟩ will leave the composite system QE in one of the states |r_j⟩|e_j⟩ with probability w_j, and a measurement of an E-observable with eigenstates |i⟩ (the orthogonal states of the Schmidt decomposition) will leave the system QE in one of the states |i⟩|i⟩ with probability p_i.

Since Q and E could be widely separated from each other in space, no measurement at E could affect the statistics of any Q-observable; or else measurements at E would allow superluminal signaling between Q and E.

Quantum Information
Resources
Entangled States
Classical Information
Measurement
Some Relevant Quantum Mechanics
Quantum Operations
Quantum Information

Entangled States

In effect, a measurement of an E -observable with eigenstates


|ei i will leave the composite system QE in one of the states
|ri i|ei i with probability wi , and a measurement of an
E -observable with eigenstates |ii (the orthogonal states of the
Schmidt decomposition) will leave the system QE in one of
the states |ii|ii with probability pi .
Since Q and E could be widely separated from each other in
space, no measurement at E could affect the statistics of any
Q-observable; or else measurements at E would allow
superluminal signaling between Q and E .


It follows that the mixed state ρ can be realized as a mixture of orthogonal states |i⟩ (the eigenstates of ρ) with weights p_i, or as a mixture of non-orthogonal relative states |r_j⟩ with weights w_j in infinitely many ways, depending on the choice of basis in H_E:

    ρ = Σ_i p_i |i⟩⟨i| = Σ_j w_j |r_j⟩⟨r_j|

and all these different mixtures with the same density operator ρ must be physically indistinguishable.
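A minimal NumPy illustration of this indistinguishability: the equal-weight mixtures of {|0⟩, |1⟩} and of {|+⟩, |−⟩} are physically distinct preparations with one and the same density operator I/2:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

def proj(v):
    """Projector |v><v| onto a normalized state vector v."""
    return np.outer(v, v.conj())

rho_z = 0.5 * proj(ket0) + 0.5 * proj(ket1)    # mixture of orthogonal states
rho_x = 0.5 * proj(plus) + 0.5 * proj(minus)   # mixture of a rotated basis

# Same density operator I/2, so no measurement statistics can tell them apart
assert np.allclose(rho_z, rho_x)
assert np.allclose(rho_z, np.eye(2) / 2)
```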


Note that any mixed state density operator ρ on H_Q can be 'purified' by adding a suitable ancilla system E, in the sense that ρ is the partial trace of a pure state |Ψ⟩ ∈ H_Q ⊗ H_E over H_E.

A purification of a mixed state is not unique: it depends on the choice of the purifying state |Ψ⟩ in H_Q ⊗ H_E, or equivalently on the choice of basis in H_E.


The Hughston-Jozsa-Wootters theorem (1993) shows that for any mixture of pure states |r_j⟩ with weights w_j, where ρ = Σ_j w_j |r_j⟩⟨r_j|, there is a purification of ρ and a suitable measurement on the system E that will leave Q in precisely that mixture.

So an observer at E can remotely prepare Q in any mixture that corresponds to the density operator ρ (and of course all these different mixtures are physically indistinguishable).

Similar results were proved earlier by Schrödinger (1935), Jaynes (1957), and Gisin (1989).


Measurement

A standard von Neumann ‘yes-no’ measurement is associated


with a projection operator; so a standard observable is
represented in the spectral representation as a sum of
projection operators, with coefficients representing the
eigenvalues of the observable.
Such a measurement is the quantum analogue of the
measurement of a property of a system in classical physics.
Classically, we think of a property of a system as being
associated with a subset in the state space (phase space) of
the system, and determining whether the system has the
property amounts to determining whether the state of the
system lies in the corresponding subset.


In quantum mechanics, the counterpart of a subset in phase


space is a closed linear subspace in Hilbert space.
Just as the different possible values of an observable
(dynamical quantity) of a classical system correspond to the
subsets in a mutually exclusive and collectively exhaustive set
of subsets covering the classical state space, so the different
values of a quantum observable correspond to the subspaces
in a mutually exclusive (i.e., orthogonal) and collectively
exhaustive set of subspaces spanning the quantum state space.


In quantum mechanics, and especially in the theory of


quantum information (where any read-out of the quantum
information encoded in a quantum state requires a quantum
measurement), it is useful to consider a more general class of
measurements than the projective measurements associated
with the determination of the value of an observable.
It is common to speak of generalized measurements and
generalized observables. A generalized measurement is not a
procedure that reveals whether or not a quantum system has
some sort of generalized property. Rather, the point of the
generalization is to exploit the difference between quantum
and classical states for new possibilities in the representation
and manipulation of information.


A quantum measurement can be characterized, completely generally, as a certain sort of interaction between two quantum systems, Q (the measured system) and M (the measuring system).

We suppose that Q is initially in a state |ψ⟩ and that M is initially in some standard state |0⟩, where {|m⟩} is an orthonormal basis of 'pointer' eigenstates in H_M.


The interaction is defined by a unitary transformation U on the Hilbert space H_Q ⊗ H_M that yields the transition:

    |ψ⟩|0⟩ → Σ_m M_m|ψ⟩|m⟩

where {M_m} is a set of linear operators (the Kraus operators) defined on H_Q satisfying the completeness condition:

    Σ_m M_m† M_m = I.

(The symbol † denotes the adjoint or Hermitian conjugate.)


The completeness condition guarantees that this evolution is unitary, because it guarantees that U preserves inner products, i.e.

    ⟨φ|⟨0|U†U|ψ⟩|0⟩ = Σ_{m,m′} ⟨φ|M_m† M_m′|ψ⟩⟨m|m′⟩
                    = Σ_m ⟨φ|M_m† M_m|ψ⟩
                    = ⟨φ|ψ⟩

from which it follows that U, defined for any product state |ψ⟩|0⟩ (for any |ψ⟩ ∈ H_Q), can be extended to a unitary operator on the Hilbert space H_Q ⊗ H_M.


Any set of linear operators {Mm } defined on the Hilbert space of


the system Q satisfying the completeness condition defines a
measurement in this general sense, with the index m labeling the
possible outcomes of the measurement, and any such set is
referred to as a set of measurement operators.

If we now perform a standard projective measurement on M to determine the value m of the pointer observable, defined by the projection operator

    P_m = I_Q ⊗ |m⟩⟨m|

then the probability of obtaining the outcome m is:

    p(m) = ⟨0|⟨ψ|U† P_m U|ψ⟩|0⟩
         = Σ_{m′,m″} ⟨m′|⟨ψ|M_m′† (I_Q ⊗ |m⟩⟨m|) M_m″|ψ⟩|m″⟩
         = Σ_{m′,m″} ⟨ψ|M_m′† M_m″|ψ⟩ ⟨m′|m⟩⟨m|m″⟩
         = ⟨ψ|M_m† M_m|ψ⟩

and, more generally, if the initial state of Q is a mixed state ρ, then

    p(m) = Tr(M_m ρ M_m†).
The final state of QM after the projective measurement on M yielding the outcome m is:

    P_m U|ψ⟩|0⟩ / √(⟨0|⟨ψ|U† P_m U|ψ⟩|0⟩) = M_m|ψ⟩|m⟩ / √(⟨ψ|M_m† M_m|ψ⟩).

So the final state of M is |m⟩ and the final state of Q is:

    M_m|ψ⟩ / √(⟨ψ|M_m† M_m|ψ⟩);

and, more generally, if the initial state of Q is a mixed state ρ, then the final state of Q is:

    M_m ρ M_m† / Tr(M_m ρ M_m†).
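These formulas are straightforward to sketch in NumPy (the helper name `measure` is ours, not a library call); here they are checked on an ordinary projective measurement of |+⟩ in the computational basis:

```python
import numpy as np

def measure(kraus_ops, rho):
    """Outcome probabilities and post-measurement states for Kraus operators."""
    results = []
    for M in kraus_ops:
        p = np.trace(M @ rho @ M.conj().T).real        # p(m) = Tr(M_m rho M_m^dag)
        post = M @ rho @ M.conj().T / p if p > 1e-12 else None
        results.append((p, post))
    return results

# Example: projective measurement in the computational basis
P0 = np.diag([1.0, 0.0]).astype(complex)
P1 = np.diag([0.0, 1.0]).astype(complex)
assert np.allclose(P0.conj().T @ P0 + P1.conj().T @ P1, np.eye(2))  # completeness

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())
outcomes = measure([P0, P1], rho)
probs = [p for p, _ in outcomes]
assert np.allclose(probs, [0.5, 0.5])
```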

This general notion of measurement covers the case of standard projective measurements. In this case {M_m} = {P_m}, where {P_m} is the set of projection operators defined by the spectral measure of a standard quantum observable represented by a self-adjoint operator. It also covers the measurement of generalized observables associated with positive operator valued measures (POVMs).

Let

    E_m = M_m† M_m

then the set {E_m} defines a set of positive operators ('effects') such that

    Σ_m E_m = I.


A POVM can be regarded as a generalization of a projection valued measure (PVM), in the sense that Σ_m E_m = I defines a 'resolution of the identity' without requiring the PVM orthogonality condition:

    P_m P_m′ = δ_mm′ P_m.

Note that for a POVM:

    p(m) = ⟨ψ|E_m|ψ⟩.


Given a set of positive operators {E_m} such that Σ_m E_m = I, measurement operators M_m can be defined via

    M_m = U√E_m,

where U is a unitary operator, from which it follows that

    Σ_m M_m† M_m = Σ_m E_m = I.


As a special case we can take U = 1 and M_m = √E_m.

Conversely, given a set of measurement operators {M_m}, there exist unitary operators U_m such that M_m = U_m√E_m, where {E_m} is a POVM.

Setting aside the standard case of projective measurements, one might wonder why it is useful to single out such unitary transformations, and why in the general case such a process should be called a measurement of Q.

Suppose we know that a system with a 2-dimensional Hilbert space is in one of two nonorthogonal states:

    |ψ_1⟩ = |0⟩
    |ψ_2⟩ = (|0⟩ + |1⟩)/√2

It is impossible to reliably distinguish these states by a quantum measurement, even in the above generalized sense. Here 'reliably' means that the state is identified correctly with zero probability of error.

Suppose there is such a measurement, defined by two measurement operators M_1, M_2 satisfying the completeness condition.

Then we require

    p(1) = ⟨ψ_1|M_1† M_1|ψ_1⟩ = 1

to represent reliability if the state is |ψ_1⟩; and

    p(2) = ⟨ψ_2|M_2† M_2|ψ_2⟩ = 1

to represent reliability if the state is |ψ_2⟩.


By the completeness condition we must have

    ⟨ψ_1|M_1† M_1 + M_2† M_2|ψ_1⟩ = 1

from which it follows that ⟨ψ_1|M_2† M_2|ψ_1⟩ = 0, i.e., M_2|ψ_1⟩ = M_2|0⟩ = 0.

Hence

    M_2|ψ_2⟩ = M_2 (|0⟩ + |1⟩)/√2 = M_2|1⟩/√2

and so:

    p(2) = ⟨ψ_2|M_2† M_2|ψ_2⟩ = ⟨1|M_2† M_2|1⟩/2


But by the completeness condition we also have

    ⟨1|M_2† M_2|1⟩ ≤ ⟨1|M_1† M_1 + M_2† M_2|1⟩ = ⟨1|1⟩ = 1

from which it follows that

    p(2) ≤ 1/2

which contradicts p(2) = 1.


However, it is possible to perform a measurement in the


generalized sense, with three possible outcomes, that will allow us
to correctly identify the state some of the time, i.e., for two of the
possible outcomes, while nothing about the identity of the state
can be inferred from the third outcome.


Here's how: The three operators

    E_1 = (√2/(1+√2)) (|0⟩ − |1⟩)(⟨0| − ⟨1|)/2
    E_2 = (√2/(1+√2)) |1⟩⟨1|
    E_3 = I − E_1 − E_2

are all positive operators and E_1 + E_2 + E_3 = I, so they define a POVM.
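The positivity and completeness claims can be checked directly in NumPy:

```python
import numpy as np

s2 = np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# The three POVM elements from the text
E1 = (s2 / (1 + s2)) * np.outer(ket0 - ket1, ket0 - ket1) / 2
E2 = (s2 / (1 + s2)) * np.outer(ket1, ket1)
E3 = np.eye(2) - E1 - E2

# Each element is a positive operator ...
for E in (E1, E2, E3):
    assert np.all(np.linalg.eigvalsh(E) >= -1e-12)

# ... and together they resolve the identity
assert np.allclose(E1 + E2 + E3, np.eye(2))
```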


In fact, E_1, E_2, E_3 are each multiples of projection operators onto the states

    |φ_1⟩ = |ψ_2⟩⊥ = (|0⟩ − |1⟩)/√2
    |φ_2⟩ = |ψ_1⟩⊥ = |1⟩
    |φ_3⟩ = [(1+√2)|0⟩ + |1⟩] / √(2√2(1+√2))

with coefficients √2/(1+√2), √2/(1+√2), 2/(1+√2) respectively.


The measurement involves a system M with three orthogonal pointer states |1⟩, |2⟩, |3⟩. The appropriate unitary interaction U results in the transition, for an input state |ψ⟩:

    |ψ⟩|0⟩ → Σ_m M_m|ψ⟩|m⟩

where M_m = √E_m.

If the input state is |ψ_1⟩ = |0⟩, we have the transition:

    |ψ_1⟩|0⟩ → √E_1|0⟩|1⟩ + √E_3|0⟩|3⟩ = α|φ_1⟩|1⟩ + β|φ_3⟩|3⟩

(because √E_2|ψ_1⟩ = √E_2|0⟩ = 0).

If the input state is |ψ_2⟩ = (|0⟩ + |1⟩)/√2, we have the transition:

    |ψ_2⟩|0⟩ → √E_2 (|0⟩+|1⟩)/√2 |2⟩ + √E_3 (|0⟩+|1⟩)/√2 |3⟩ = γ|φ_2⟩|2⟩ + δ|φ_3⟩|3⟩

(because √E_1|ψ_2⟩ = √E_1 (|0⟩+|1⟩)/√2 = 0), where α, β, γ, δ are real numerical coefficients.

We see that a projective measurement of the pointer of M that yields the outcome m = 1 indicates, with certainty, that the input state was |ψ_1⟩ = |0⟩. In this case, the measurement leaves the system Q in the state |φ_1⟩.

A measurement outcome m = 2 indicates, with certainty, that the input state was |ψ_2⟩ = (|0⟩ + |1⟩)/√2, and in this case the measurement leaves the system Q in the state |φ_2⟩.

If the outcome is m = 3, the input state could have been either |ψ_1⟩ = |0⟩ or |ψ_2⟩ = (|0⟩ + |1⟩)/√2, and Q is left in the state |φ_3⟩.
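A NumPy check of the discrimination statistics, using p(m) = ⟨ψ|E_m|ψ⟩: the 'wrong' conclusive outcome never occurs for either input state, and each state is conclusively identified with probability √2/(2(1+√2)) ≈ 0.29:

```python
import numpy as np

s2 = np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
psi1 = ket0
psi2 = (ket0 + ket1) / s2

E1 = (s2 / (1 + s2)) * np.outer(ket0 - ket1, ket0 - ket1) / 2
E2 = (s2 / (1 + s2)) * np.outer(ket1, ket1)
E3 = np.eye(2) - E1 - E2

def p(E, psi):
    """Outcome probability p(m) = <psi|E_m|psi> for a POVM element."""
    return (psi.conj() @ E @ psi).real

# Outcome 1 never occurs for psi2; outcome 2 never occurs for psi1
assert np.isclose(p(E1, psi2), 0.0)
assert np.isclose(p(E2, psi1), 0.0)

# Each state is conclusively identified with probability sqrt(2)/(2(1+sqrt(2)))
assert np.isclose(p(E1, psi1), s2 / (2 * (1 + s2)))
assert np.isclose(p(E2, psi2), s2 / (2 * (1 + s2)))
```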


Quantum Operations

When a closed system QE initially in a product state ρ ⊗ ρ_E evolves under a unitary transformation, Q can be shown to evolve under a quantum operation, i.e., a completely positive linear map:

    E: ρ → ρ′
    E(ρ) = Tr_E(U(ρ ⊗ ρ_E)U†)

The map E is linear (or convex-linear) in the sense that E(Σ_i p_i ρ_i) = Σ_i p_i E(ρ_i), positive in the sense that E maps positive operators to positive operators, and completely positive in the sense that E ⊗ I is a positive map on the extension of H_Q to a Hilbert space H_Q ⊗ H_E, associated with the addition of any ancilla system E to Q.


Every quantum operation (i.e., completely positive linear map) on a Hilbert space H_Q has a (non-unique) representation as a unitary evolution on an extended Hilbert space H_Q ⊗ H_E, i.e.,

    E(ρ) = Tr_E(U(ρ ⊗ ρ_E)U†)

where ρ_E is an appropriately chosen initial state of an ancilla system E (which we can think of as the environment of Q).

It turns out that it suffices to take ρ_E as a pure state, i.e., |0⟩⟨0|, since a mixed state of E can always be purified by enlarging the Hilbert space (i.e., adding a further ancilla system). So the evolution of a system Q described by a quantum operation can always be modeled as the unitary evolution of a system QE, for an initial pure state of E.

Also, every quantum operation on a Hilbert space H_Q has a (non-unique) operator sum representation intrinsic to H_Q:

    E(ρ) = Σ_i E_i ρ E_i†

where E_i = ⟨i|U|0⟩ for some orthonormal basis {|i⟩} of E.

If the operation is trace-preserving (or nonselective), then Σ_i E_i† E_i = I. For operations that are not trace-preserving (or selective), Σ_i E_i† E_i ≤ I. This corresponds to the case where the outcome of a measurement on QE is taken into account (selected) in the transition ρ → E(ρ).
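As a concrete sketch of an operator-sum representation, here is the standard amplitude-damping channel (the decay parameter gamma below is an arbitrary choice for illustration); its two Kraus operators satisfy the trace-preserving condition:

```python
import numpy as np

# Amplitude-damping channel: a standard trace-preserving quantum operation.
# gamma is a hypothetical decay probability chosen for this example.
gamma = 0.3
E0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
E1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)

# Completeness: sum_i E_i^dagger E_i = I
assert np.allclose(E0.conj().T @ E0 + E1.conj().T @ E1, np.eye(2))

def channel(rho):
    """Operator-sum form E(rho) = sum_i E_i rho E_i^dagger."""
    return E0 @ rho @ E0.conj().T + E1 @ rho @ E1.conj().T

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_out = channel(np.outer(plus, plus.conj()))
assert np.isclose(np.trace(rho_out).real, 1.0)   # trace preserved
```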


If there is no interaction between Q and E, then E(ρ) = U_Q ρ U_Q†, with U_Q U_Q† = I, i.e., there is only one operator in the sum. In this case, U = U_Q ⊗ U_E and

    E(ρ) = Tr_E((U_Q ⊗ U_E)(ρ ⊗ |0⟩⟨0|)(U_Q† ⊗ U_E†)) = U_Q ρ U_Q†.

So unitary evolution is a special case of the operator sum representation of a quantum operation and, of course, another special case is the transition ρ → E(ρ) that occurs in a quantum measurement process, where E_i = M_i.


A trace-preserving operation corresponds to a non-selective measurement:

    E(ρ) = Σ_i M_i ρ M_i†

while an operation that is not trace-preserving corresponds to a selective measurement, where the state 'collapses' onto the corresponding measurement outcome:

    M_i ρ M_i† / Tr(M_i ρ M_i†).


The operator sum representation applies to quantum


operations between possibly different input and output Hilbert
spaces, and characterizes the following general situation: a
quantum system in an unknown initial state ρ is allowed to
interact unitarily with other systems prepared in standard
states, after which some part of the composite system is
discarded, leaving the final system in a state ρ0 . The
transition ρ → ρ0 is defined by a quantum operation.
So a quantum operation represents, quite generally, the
unitary evolution of a closed quantum system, the nonunitary
evolution of an open quantum system in interaction with its
environment, and evolutions that result from a combination of
unitary interactions and selective or nonselective
measurements.

The creed of the Church of the Larger Hilbert Space is that


every state can be made pure, every measurement can be
made ideal, and every evolution can be made unitary—on a
larger Hilbert space.
The Creed originates with John Smolin. This formulation is
due to Ben Schumacher. See his Lecture Notes on Quantum
Information Theory.


Von Neumann Entropy

Information in Shannon’s sense is a quantifiable resource


associated with the output of a (suitably idealized) stochastic
source of symbolic states, where the physical nature of the
systems embodying these states is irrelevant to the amount of
classical information associated with the source.
The quantity of information associated with a stochastic
source is defined by its optimal compressibility, and this is
given by the Shannon entropy.
The fact that some feature of the output of a stochastic
source can be optimally compressed is, ultimately, what
justifies the attribution of a quantifiable resource to the
source.


Information is represented physically in the states of physical


systems. The essential difference between classical and
quantum information arises because of the different
distinguishability properties of classical and quantum states.
Only sets of orthogonal quantum states are reliably
distinguishable (i.e., with zero probability of error), as are sets
of different classical states (which are represented by disjoint
singleton subsets in a phase space, and so are orthogonal as
subsets of phase space in a sense analogous to orthogonal
subspaces of a Hilbert space).


Classical information is that sort of information represented in a set of distinguishable states—states of classical systems, or orthogonal quantum states—and so can be regarded as a subcategory of quantum information, where the states may or may not be distinguishable.

The idea behind quantum information is to extend Shannon's notion of compressibility to a stochastic source of quantum states, which may or may not be distinguishable. For this we need to define a suitable measure of information for probability distributions of quantum states—mixtures—as a generalization of the notion of Shannon entropy.


Consider a system QE in an entangled state |Ψ⟩. Then the subsystem Q is in a mixed state ρ, which can always be expressed as:

    ρ = Σ_i p_i |i⟩⟨i|

where the p_i are the eigenvalues of ρ and the pure states |i⟩ are orthonormal eigenstates of ρ.

This is the spectral representation of ρ, and any density operator—a positive (hence Hermitian) operator—can be expressed in this way.


The representation is unique if and only if the pi are all distinct. If


some of the pi are equal, there is a unique representation of ρ as a
sum of projection operators with the distinct values of the pi as
coefficients, but some of the projection operators will project onto
multi-dimensional subspaces.


Since ρ has unit trace, Σ_i p_i = 1, and so the spectral representation of ρ represents a classical probability distribution of orthogonal, and hence distinguishable, pure states.

If we measure a Q-observable with eigenstates |i⟩, then the outcomes can be associated with the values of a random variable X, where Pr(X = i) = p_i. Then

    H(X) = −Σ_i p_i log p_i

is the Shannon entropy of the probability distribution of measurement outcomes.


Now,

    −Tr(ρ log ρ) = −Σ_i p_i log p_i

(because the eigenvalues of ρ log ρ are p_i log p_i and the trace of an operator is the sum of its eigenvalues), so a natural generalization of Shannon entropy for any mixture of quantum states with density operator ρ is the von Neumann entropy:

    S := −Tr(ρ log ρ)

which coincides with the Shannon entropy for measurements in the eigenbasis of ρ.
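In code, the definition reduces to the Shannon entropy of the eigenvalue spectrum; a minimal sketch (using log base 2, the information-theoretic convention):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]      # 0 log 0 = 0 by convention
    return float(-np.sum(evals * np.log2(evals)))

# Pure state: minimum entropy S = 0
assert np.isclose(von_neumann_entropy(np.diag([1.0, 0.0])), 0.0)

# Completely mixed qubit state I/2: maximum entropy S = log2(2) = 1
assert np.isclose(von_neumann_entropy(np.eye(2) / 2), 1.0)
```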


For a completely mixed state ρ = I/d, where dim H_Q = d, the d eigenvalues of ρ are all equal to 1/d and S = log d.

log d is the maximum value of S in a d-dimensional Hilbert space. The von Neumann entropy S is zero, the minimum value, if and only if ρ is a pure state, where the eigenvalues of ρ are 1 and 0.

So 0 ≤ S ≤ log d, where d is the dimension of H_Q.


We can think of the Shannon entropy as a measure of the


average amount of information gained by identifying the state
produced by a known stochastic source. Alternatively, the
Shannon entropy represents the optimal compressibility of the
information produced by an information source.
The von Neumann entropy does not, in general, represent the
amount of information gained by identifying the quantum
state produced by a stochastic source characterized as a
mixed state, because nonorthogonal quantum states in a
mixture cannot be reliably identified.



The von Neumann entropy can be interpreted in terms of


compressibility via Schumacher’s source coding theorem for
quantum information, a generalization of Shannon’s source coding
theorem for classical information.


For an elementary two-state quantum system with a


2-dimensional Hilbert space considered as representing the
output of an elementary quantum information source, S = 1
for an equal weight distribution over two orthogonal states
(i.e., for the density operator ρ = I/2), so Schumacher takes
the basic unit of quantum information as the ‘qubit.’
By analogy with the term ‘bit,’ the term ‘qubit’ refers to the
basic unit of quantum information in terms of the von
Neumann entropy, and to an elementary two-state quantum
system considered as representing the possible outputs of an
elementary quantum information source.


The difference between quantum information as measured by von


Neumann entropy S and classical information as measured by
Shannon entropy H can be brought out by considering the
quantum notions of conditional entropy and mutual information,
and in particular the peculiar feature of inaccessibility associated
with quantum information.


For a composite system AB, conditional von Neumann entropy and mutual information are defined in terms of the joint entropy S(A, B) = −Tr(ρ_AB log ρ_AB) by analogy with the corresponding notions for Shannon entropy:

    S(A|B) = S(A, B) − S(B)

    S(A : B) = S(A) − S(A|B)
             = S(B) − S(B|A)
             = S(A) + S(B) − S(A, B)


The joint entropy satisfies the subadditivity inequality:

S(A, B) ≤ S(A) + S(B)

with equality if and only if A and B are uncorrelated, i.e.,


ρAB = ρA ⊗ ρB .


S(A|B) can be negative, while the conditional Shannon entropy is always positive or zero.

Consider the entangled state |Ψ⟩ = (|00⟩ + |11⟩)/√2. Since |Ψ⟩ is a pure state, S(A, B) = 0. But S(A) = S(B) = 1. So S(A|B) = S(A, B) − S(B) = −1.

In fact, for a pure state |Ψ⟩ of a composite system AB, S(A|B) < 0 if and only if |Ψ⟩ is entangled.
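The Bell-state example can be verified numerically; here the partial traces are taken by reshaping ρ_AB into a rank-4 tensor (one implementation choice among several):

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits, from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Bell state |Psi> = (|00> + |11>)/sqrt(2) on a 2x2 composite system
psi = np.zeros(4, dtype=complex)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho_AB = np.outer(psi, psi.conj())

# Partial traces via reshaping rho_AB into the tensor t[a, b, a', b']
t = rho_AB.reshape(2, 2, 2, 2)
rho_A = np.einsum('ikjk->ij', t)   # trace over B
rho_B = np.einsum('kikj->ij', t)   # trace over A

S_AB, S_A, S_B = entropy(rho_AB), entropy(rho_A), entropy(rho_B)
assert np.isclose(S_AB, 0.0)                      # pure joint state
assert np.isclose(S_A, 1.0) and np.isclose(S_B, 1.0)
assert np.isclose(S_AB - S_B, -1.0)               # S(A|B) = S(A,B) - S(B) = -1
```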


For a composite system AB in a product state ρ ⊗ σ, it


follows from the definition of joint entropy that:

S(A, B) = S(ρ ⊗ σ) = S(ρ) + S(σ) = S(A) + S(B).


If AB is in a pure state |Ψ⟩, it follows from the Schmidt decomposition theorem that |Ψ⟩ can be expressed as

    |Ψ⟩ = Σ_i √p_i |i⟩|i⟩

from which it follows that

    ρ_A = Tr_B(|Ψ⟩⟨Ψ|) = Σ_i p_i |i⟩⟨i|
    ρ_B = Tr_A(|Ψ⟩⟨Ψ|) = Σ_i p_i |i⟩⟨i|;

and so:

    S(A) = S(B) = −Σ_i p_i log p_i.
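In NumPy, the Schmidt coefficients √p_i of a two-party pure state are just the singular values of the coefficient matrix c_ij, which makes S(A) = S(B) easy to verify (the state below is an arbitrary made-up example):

```python
import numpy as np

def entropy_from_probs(p):
    """Shannon entropy in bits of a probability vector (zeros dropped)."""
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical pure state of a 2x2 system, as a coefficient matrix c_ij
c = np.array([[0.8, 0.1], [0.2, 0.5]], dtype=complex)
c /= np.linalg.norm(c)

# Schmidt coefficients sqrt(p_i) are the singular values of c
svals = np.linalg.svd(c, compute_uv=False)
p = svals ** 2
assert np.isclose(p.sum(), 1.0)

# The reduced states rho_A = c c^dag and rho_B = c^T c* share the spectrum {p_i}
rho_A = c @ c.conj().T
rho_B = c.T @ c.conj()
S_A = entropy_from_probs(np.linalg.eigvalsh(rho_A))
S_B = entropy_from_probs(np.linalg.eigvalsh(rho_B))
assert np.isclose(S_A, S_B)
assert np.isclose(S_A, entropy_from_probs(p))
```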


For a mixed state prepared as a mixture of states ρ_i with weights p_i, it can be shown that

    S(Σ_i p_i ρ_i) ≤ H(p_i) + Σ_i p_i S(ρ_i)

with equality if and only if the states ρ_i have support on orthogonal subspaces.

The entropy H(p_i) is referred to as the entropy of preparation of the mixture ρ.


If the states ρ_i are pure states, then S(ρ) ≤ H(p_i).

For example, suppose H_Q is 2-dimensional and
p_1 = p_2 = 1/2; then H(p_i) = 1.
So if we had a classical information source producing the
symbols 1 and 2 with equal probabilities, no compression of
the information would be possible. However, if the symbols 1
and 2 are encoded as nonorthogonal quantum states |r_1⟩ and
|r_2⟩, then S(ρ) < 1.
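A quick numerical illustration (NumPy sketch; the particular overlap angle t = π/8 is an arbitrary choice, not from the slides): an equal-weight mixture of two nonorthogonal pure qubit states has entropy strictly below 1 bit.

```python
import numpy as np

def S(rho):
    """Von Neumann entropy via eigenvalues (0 log 0 = 0)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

# Two nonorthogonal pure states: |r1> = |0>, |r2> = cos t |0> + sin t |1>,
# each prepared with probability 1/2
t = np.pi / 8
r1 = np.array([1.0, 0.0])
r2 = np.array([np.cos(t), np.sin(t)])
rho = 0.5 * np.outer(r1, r1) + 0.5 * np.outer(r2, r2)

# <r1|r2> = cos t != 0, so S(rho) < H(1/2, 1/2) = 1
print(S(rho))
```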


Von Neumann Entropy

According to Schumacher’s source coding theorem, since S(ρ) < 1,
quantum compression is possible, i.e., we can transmit long
sequences of qubits reliably using S < 1 qubits per quantum state
produced by the source.


Von Neumann Entropy

The von Neumann entropy of a mixture of states ρ_i with weights
p_i, ρ = ∑_i p_i ρ_i, is a concave function of the states in the
distribution, i.e.,

S(∑_i p_i ρ_i) ≥ ∑_i p_i S(ρ_i).


Von Neumann Entropy


To see this, consider a composite system AB in the state

ρ_AB = ∑_i p_i ρ_i ⊗ |i⟩⟨i|.

We have

S(A) = S(∑_i p_i ρ_i)
S(B) = S(∑_i p_i |i⟩⟨i|) = H(p_i)

and

S(A, B) = H(p_i) + ∑_i p_i S(ρ_i)

since the states ρ_i ⊗ |i⟩⟨i| have support on orthogonal subspaces.
By subadditivity, S(A) + S(B) ≥ S(A, B), so:

S(∑_i p_i ρ_i) ≥ ∑_i p_i S(ρ_i).


Von Neumann Entropy

It turns out that projective measurements never decrease
entropy, i.e., if ρ′ = ∑_i P_i ρ P_i, then S(ρ′) ≥ S(ρ), but
generalized measurements can decrease entropy.
Consider, for example, the generalized measurement on a
qubit in the initial state ρ defined by the measurement
operators M_1 = |0⟩⟨0| and M_2 = |0⟩⟨1|. (Note that these
operators do define a generalized measurement because
M_1†M_1 + M_2†M_2 = |0⟩⟨0| + |1⟩⟨1| = I.)



Von Neumann Entropy

After the measurement

ρ′ = |0⟩⟨0|ρ|0⟩⟨0| + |0⟩⟨1|ρ|1⟩⟨0|
   = Tr(ρ)|0⟩⟨0|
   = |0⟩⟨0|.

So S(ρ′) = 0 ≤ S(ρ).
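This example is easy to check directly. The sketch below (NumPy; the helper `S` and the choice of a maximally mixed input state are conveniences of the sketch) applies the slide's measurement operators to ρ = I/2:

```python
import numpy as np

def S(rho):
    """Von Neumann entropy via eigenvalues (0 log 0 = 0)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

# Measurement operators from the slide: M1 = |0><0|, M2 = |0><1|
M1 = np.array([[1, 0], [0, 0]], dtype=complex)
M2 = np.array([[0, 1], [0, 0]], dtype=complex)

# Completeness: M1^dag M1 + M2^dag M2 = I
assert np.allclose(M1.conj().T @ M1 + M2.conj().T @ M2, np.eye(2))

rho = np.eye(2) / 2        # maximally mixed qubit, S(rho) = 1
rho_prime = M1 @ rho @ M1.conj().T + M2 @ rho @ M2.conj().T

# The post-measurement state is the pure state |0><0|, so entropy drops to 0
print(S(rho), S(rho_prime))
```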


Accessible Information

The ability to exploit quantum states to perform new sorts of
information-processing tasks arises because quantum states
have different distinguishability properties from classical
states. Of course, it is not the mere lack of distinguishability
of quantum states that is relevant here, but the different sort
of distinguishability enjoyed by quantum states.
This indistinguishability is reflected in the limited accessibility
of quantum information.


Accessible Information

Consider a classical information source in Shannon’s sense,
with Shannon entropy H(X). Suppose the source produces
symbols represented as the values x (in an alphabet X) of a
random variable X, with probabilities p_x, and that the
symbols are encoded as quantum states ρ_x, x ∈ X.
The mutual information H(X : Y) is a measure of how much
information one gains, on average, about the value of the
random variable X on the basis of the outcome Y of a
measurement on a given quantum state.


Accessible Information

The accessible information is defined as:

sup H(X : Y)

where the supremum is taken over all possible measurements.


Accessible Information

The Holevo bound on mutual information provides an
important upper bound on the accessible information:

H(X : Y) ≤ S(ρ) − ∑_x p_x S(ρ_x)

where ρ = ∑_x p_x ρ_x and the measurement outcome Y is
obtained from a measurement defined by a POVM {E_y}.
Since S(ρ) − ∑_x p_x S(ρ_x) ≤ H(X), with equality if and only if
the states ρ_x have orthogonal support, we have:

H(X : Y) ≤ H(X)
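The bound can be checked numerically for a small ensemble. In the sketch below (NumPy; the ensemble {|0⟩, |+⟩} with equal weights and the computational-basis POVM are arbitrary illustrative choices, and the helpers `S` and `H` are not from the slides), the Holevo quantity χ = S(ρ) − ∑_x p_x S(ρ_x) is compared with H(X : Y):

```python
import numpy as np

def S(rho):
    """Von Neumann entropy via eigenvalues (0 log 0 = 0)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def H(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Ensemble: x = 1, 2 encoded as |0> and |+> = (|0>+|1>)/sqrt(2), p_x = 1/2
p = np.array([0.5, 0.5])
kets = [np.array([1.0, 0.0]), np.array([1.0, 1.0]) / np.sqrt(2)]
rhos = [np.outer(v, v) for v in kets]

rho = sum(px * rx for px, rx in zip(p, rhos))
chi = S(rho) - sum(px * S(rx) for px, rx in zip(p, rhos))  # Holevo quantity

# Computational-basis measurement, as a POVM {E_y} with E_y = |y><y|
E = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
joint = np.array([[px * np.trace(rx @ Ey) for Ey in E]
                  for px, rx in zip(p, rhos)])             # p(x, y)

# H(X:Y) = H(X) + H(Y) - H(X,Y) from the joint distribution
mutual = H(joint.sum(1)) + H(joint.sum(0)) - H(joint.ravel())
print(mutual, chi)   # H(X:Y) <= chi < H(X) = 1, since the states overlap
```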


Accessible Information

Note that the value of X can be reliably determined from the
outcome Y if and only if H(X : Y) = H(X).
If the states ρ_x are orthogonal pure states, then in principle
there exists a measurement that will distinguish the states,
and for such a measurement H(X : Y) = H(X).
In this case, the accessible information is the same as the
entropy of preparation of the quantum states, H(X).
But if the states are nonorthogonal, then H(X : Y) < H(X)
and there is no measurement, even in the generalized sense,
that will enable the reliable identification of X.


Accessible Information

If the values of X are encoded as the pure states of a qubit,
then H(X : Y) ≤ S(ρ) and S(ρ) ≤ 1. It follows that at most 1
bit of information can be extracted from a qubit by
measurement.
If X has k equiprobable values, H(X) = log k. Alice could
encode these k values into a qubit by preparing it in an
equal-weight mixture of k nonorthogonal pure states, but Bob
could extract at most 1 bit of information about the
value of X.


Accessible Information

For an n-state quantum system associated with an n-dimensional
Hilbert space, S(ρ) ≤ log n. So even though Alice could encode
any amount of information into such an n-state quantum system
(by preparing the state as a mixture of nonorthogonal states), the
most information that Bob could extract from the state by
measurement is log n, which is the same as the maximum amount
of information that could be encoded into and extracted from an
n-state classical system.


Accessible Information

It might seem, then, that the inaccessibility of quantum
information as quantified by the Holevo bound would thwart
any attempt to exploit quantum information to perform
nonclassical information-processing tasks.
Surprisingly, the inaccessibility of quantum information can
actually be exploited in information-processing tasks that
transcend the scope of classical information.


Deriving the Holevo Bound

To derive the Holevo bound (see Nielsen and Chuang,
Theorem 12.1), suppose Alice encodes the distinguishable
symbols of a classical information source with entropy H(X)
as quantum states ρ_x (not necessarily orthogonal).
That is, Alice has a quantum system P, the preparation
device, with an orthonormal pointer basis |x⟩ corresponding to
the values of the random variable X, which are produced by
the source with probabilities p_x.


Deriving the Holevo Bound

The preparation interaction correlates the pointer states |x⟩
with the states ρ_x of a quantum system Q, so that the final
state of P and Q after the preparation interaction is:

ρ_PQ = ∑_x p_x |x⟩⟨x| ⊗ ρ_x .

Alice sends the system Q to Bob, who attempts to determine
the value of the random variable X by measuring the state of
Q.


Deriving the Holevo Bound

The initial state of P, Q, and Bob’s measuring instrument M
is:

ρ_PQM = ∑_x p_x |x⟩⟨x| ⊗ ρ_x ⊗ |0⟩⟨0|

where |0⟩⟨0| is the initial ready state of M.
Bob’s measurement can be described by a quantum operation
E on the Hilbert space H_Q ⊗ H_M that stores a value of y,
associated with a POVM {E_y} on H_Q, in the pointer state
|y⟩ of M, i.e., E is defined for any state σ ∈ H_Q and initial
ready state |0⟩ ∈ H_M by:

σ ⊗ |0⟩⟨0| −E→ ∑_y √E_y σ √E_y ⊗ |y⟩⟨y|.


Deriving the Holevo Bound

From the definition of quantum mutual information:

S(P : Q) = S(P : Q, M)

because M is initially uncorrelated with PQ, and

S(P′ : Q′, M′) ≤ S(P : Q, M)

because it can be shown that quantum operations never
increase mutual information (primes here indicate states after
the application of E).


Deriving the Holevo Bound

The notation S(P : Q, M) refers to the mutual information between
the system P and the composite system consisting of the system Q
and the measuring device M, in the initial state
ρ_PQ = ∑_x p_x |x⟩⟨x| ⊗ ρ_x. That is, the comma notation refers to
the joint system:

S(P : Q, M) = S(P) − S(P|Q, M) = S(P) + S(Q, M) − S(P, Q, M)


Deriving the Holevo Bound

Finally:

S(P′ : M′) ≤ S(P′ : Q′, M′)

because discarding systems never increases mutual information,
and so:

S(P′ : M′) ≤ S(P : Q)

which (following some algebraic manipulation) is the statement of
the Holevo bound, i.e., S(P′ : M′) ≤ S(P : Q) reduces to
H(X : Y) ≤ S(ρ) − ∑_x p_x S(ρ_x).


Deriving the Holevo Bound


To see this, note that

ρ_PQ = ∑_x p_x |x⟩⟨x| ⊗ ρ_x

So S(P) = H(p_x), S(Q) = S(∑_x p_x ρ_x) = S(ρ), and:

S(P, Q) = H(p_x) + ∑_x p_x S(ρ_x)

since the states |x⟩⟨x| ⊗ ρ_x have support on orthogonal
subspaces in H_P ⊗ H_Q.
It follows that

S(P : Q) = S(P) + S(Q) − S(P, Q)
         = S(ρ) − ∑_x p_x S(ρ_x)

which is the right hand side of the Holevo bound.
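This identity can be verified by constructing ρ_PQ explicitly and comparing S(P : Q) with S(ρ) − ∑_x p_x S(ρ_x). A sketch (NumPy; the ensemble p = (1/2, 1/2) with ρ_1 = |0⟩⟨0| and ρ_2 = |+⟩⟨+| is an arbitrary illustrative choice, and the helper `S` is not from the slides):

```python
import numpy as np

def S(rho):
    """Von Neumann entropy via eigenvalues (0 log 0 = 0)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

# Toy ensemble: p = (1/2, 1/2), rho_1 = |0><0|, rho_2 = |+><+|
p = [0.5, 0.5]
kets = [np.array([1.0, 0.0]), np.array([1.0, 1.0]) / np.sqrt(2)]
rhos = [np.outer(v, v) for v in kets]

# rho_PQ = sum_x p_x |x><x| (x) rho_x, with |x><x| the pointer projectors
proj = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
rho_PQ = sum(px * np.kron(Px, rx) for px, Px, rx in zip(p, proj, rhos))

d = 2   # both P and Q are qubits here
rho_P = rho_PQ.reshape(d, d, d, d).trace(axis1=1, axis2=3)   # trace out Q
rho_Q = rho_PQ.reshape(d, d, d, d).trace(axis1=0, axis2=2)   # trace out P

lhs = S(rho_P) + S(rho_Q) - S(rho_PQ)                        # S(P : Q)
rhs = S(rho_Q) - sum(px * S(rx) for px, rx in zip(p, rhos))  # S(rho) - sum
print(lhs, rhs)   # equal, as the derivation shows
```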

Deriving the Holevo Bound

For the left hand side:

ρ_P′M′ = Tr_Q′(ρ_P′Q′M′)
       = Tr_Q′(∑_xy p_x |x⟩⟨x| ⊗ √E_y ρ_x √E_y ⊗ |y⟩⟨y|)
       = ∑_xy p_x Tr(√E_y ρ_x √E_y) |x⟩⟨x| ⊗ |y⟩⟨y|
       = ∑_xy p(x, y) |x⟩⟨x| ⊗ |y⟩⟨y|

since p(x, y) = p_x p(y|x) = p_x Tr(ρ_x E_y) = p_x Tr(√E_y ρ_x √E_y),
and so S(P′ : M′) = H(X : Y).


Deriving the Holevo Bound

The Holevo bound limits the representation of classical bits by
qubits. Putting it another way, the Holevo bound
characterizes the resource cost of encoding classical bits as
qubits: one qubit per classical bit is necessary and sufficient.
Can we represent qubits by bits? If so, what is the cost of a
qubit in terms of bits?


Deriving the Holevo Bound

This question is answered by a result of Barnum, Hayden,
Jozsa, and Winter (2001): a quantum source of
nonorthogonal signal states can be compressed with arbitrarily
high fidelity to α qubits per signal plus any number of
classical bits per signal if and only if α is at least as large as
the von Neumann entropy S of the source.
This means that a generic quantum source cannot be
separated into a classical part and a quantum part: quantum
information cannot be traded for any amount of classical
information.
