Quantum Computing Physics FY Notes
How can you get more and more out of less and less? The smaller computers
get, the more powerful they seem to become: there's more number-crunching ability in a
21st-century cellphone than you'd have found in a room-sized, military computer 50
years ago. Yet, despite such amazing advances, there are still plenty of complex
problems that are beyond the reach of even the world's most powerful computers—and
there's no guarantee we'll ever be able to tackle them. One problem is that the basic
switching and memory units of computers, known as transistors, are now approaching
the point where they'll soon be as small as individual atoms. If we want computers that
are smaller and more powerful than today's, we'll soon need to do our computing in a
radically different way. Entering the realm of atoms opens up powerful new possibilities
in the shape of quantum computing, with processors that could work millions of times
faster than the ones we use today. Sounds amazing, but the trouble is that quantum
computing is hugely more complex than traditional computing and operates in the Alice
in Wonderland world of quantum physics, where the "classical," sensible, everyday laws
of physics no longer apply. What is quantum computing and how does it work? Let's
take a closer look!
Quantum computing means storing and processing information using individual atoms, ions, electrons, or
photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater
complexity of designing computers that can operate in the weird world of quantum physics.
Conventional computers have two tricks that they do really well: they can store numbers
in memory and they can process stored numbers with simple mathematical operations
(like add and subtract). They can do more complex things by stringing together the
simple operations into a series called an algorithm (multiplying can be done as a series
of additions, for example). Both of a computer's key tricks—storage and processing—
are accomplished using switches called transistors, which are like microscopic versions
of the switches you have on your wall for turning on and off the lights. A transistor can
either be on or off, just as a light can either be lit or unlit. If it's on, we can use a
transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of
ones and zeros can be used to store any number, letter, or symbol using a code based
on binary (so computers store an upper-case letter A as 01000001 and a lower-case one
as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a
string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and
most common symbols). Computers calculate by using circuits called logic gates, which
are made from a number of transistors connected together. Logic gates compare
patterns of bits, stored in temporary memories called registers, and then turn them into
new patterns of bits—and that's the computer equivalent of what our human brains
would call addition, subtraction, or multiplication. In physical terms, the algorithm that
performs a particular calculation takes the form of an electronic circuit made from a
number of logic gates, with the output from one gate feeding in as the input to the next.
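To make this concrete, here's a tiny Python sketch (an illustration of my own, not from the article) of both tricks: storing a character as a pattern of bits, and wiring logic gates into an adder. The function names are made up for the example.

```python
# A toy illustration of the two "tricks" described above: storing characters
# as bit patterns, and building arithmetic out of logic gates.

def to_bits(ch):
    """Return the 8-bit binary string for a character (e.g. 'A' -> '01000001')."""
    return format(ord(ch), "08b")

def half_adder(a, b):
    """Add two single bits using logic gates: XOR gives the sum, AND the carry."""
    return a ^ b, a & b  # (sum_bit, carry_bit)

def full_adder(a, b, carry_in):
    """Chain two half adders to add two bits plus an incoming carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

print(to_bits("A"), to_bits("a"))   # 01000001 01100001
print(full_adder(1, 1, 1))          # (1, 1): 1 + 1 + 1 = binary 11
```

Chaining full adders bit by bit, with each carry feeding the next stage, is exactly the "output from one gate feeding in as the input to the next" idea described above.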
It sounds amazing, and it is, but there's a catch. The more information you need to
store, the more binary ones and zeros—and transistors—you need to do it. Since most
conventional computers can only do one thing at a time, the more complex the problem
you want them to solve, the more steps they'll need to take and the longer they'll need
to do it. Some computing problems are so complex that they need more computing
power and time than any modern machine could reasonably supply; computer scientists
call those intractable problems.
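A back-of-the-envelope sketch (mine, not the article's) shows why: even checking a billion possibilities per second, exhaustively searching all combinations of n bits quickly outruns any realistic machine.

```python
# Back-of-the-envelope: exhaustive search over n bits needs 2**n checks.
# At one billion checks per second, the time balloons exponentially.

SECONDS_PER_YEAR = 3.15e7

for n in (20, 40, 60, 80):
    seconds = 2**n / 1e9
    print(f"n = {n}: {seconds:.3g} s (~{seconds / SECONDS_PER_YEAR:.3g} years)")

# n = 20 takes about a millisecond; n = 80 takes roughly 38 million years.
```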
Photo: Richard Feynman.
Quantum theory is the branch of physics that deals with the world of atoms and the
smaller (subatomic) particles inside them. You might think atoms behave the same way
as everything else in the world, in their own tiny little way—but that's not true: on the
atomic scale, the rules change and the "classical" laws of physics we take for granted in
our everyday world no longer automatically apply. As Richard P. Feynman, one of the
greatest physicists of the 20th century, once put it: "Things on a very small scale
behave like nothing you have any direct experience about... or like anything that you
have ever seen." (Six Easy Pieces, p116.)
If you've studied light, you may already know a bit about quantum theory. You might
know that a beam of light sometimes behaves as though it's made up of particles (like a
steady stream of cannonballs), and sometimes as though it's waves of energy rippling
through space (a bit like waves on the sea). That's called wave-particle duality and it's
one of the ideas that comes to us from quantum theory. It's hard to grasp that
something can be two things at once—a particle and a wave—because it's totally alien
to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum
theory, however, that's just the kind of crazy thing that can happen. The most striking
example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird
world of quantum theory, we can imagine a situation where something like a cat could
be alive and dead at the same time!
What does all this have to do with computers? Suppose we keep on pushing Moore's
Law—keep on making transistors smaller until they get to the point where they obey not
the ordinary laws of physics (like old-style transistors) but the more bizarre laws of
quantum mechanics. The question is whether computers designed this way can do
things our conventional computers can't. If we can predict mathematically that they
might be able to, can we actually make them work like that in practice?
People have been asking those questions for several decades. Among the first were
IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the
door for quantum computing in the 1960s when he proposed that information is a
physical entity that could be manipulated according to the laws of physics. One
important consequence of this is that computers waste energy manipulating the bits
inside them (which is partly why computers use so much energy and get so hot, even
though they appear to be doing not very much at all). In the 1970s, building on
Landauer's work, Bennett showed how a computer could circumvent this problem by
working in a "reversible" way, implying that a quantum computer could carry out
massively complex computations without using massive amounts of energy. In 1981,
physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic
machine that would work in a similar way to an ordinary computer but according to the
principles of quantum physics. The following year, Richard Feynman sketched out
roughly how a machine using quantum principles could carry out basic computations. A
few years later, Oxford University's David Deutsch (one of the leading lights in quantum
computing) outlined the theoretical basis of a quantum computer in more detail. How did
these great scientists imagine that quantum computers might work?
Instead of bits, a quantum computer uses quantum bits, or qubits, which, thanks to
superposition, can hold 0 and 1 at the same time. Just as a quantum computer can store
multiple numbers at once, so it can process
them simultaneously. Instead of working in serial (doing a series of things one at a time
in a sequence), it can work in parallel (doing multiple things at the same time). Only
when you try to find out what state it's actually in at any given moment (by measuring it,
in other words) does it "collapse" into one of its possible states—and that gives you the
answer to your problem. Estimates suggest a quantum computer's ability to work in
parallel would make it millions of times faster than any conventional computer... if only
we could build it! So how would we do that?
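As a rough illustration of superposition and collapse (my own sketch, not from the article), a single qubit can be simulated as a two-amplitude vector in Python with NumPy:

```python
import numpy as np

# A minimal sketch of a qubit as a 2-element state vector, put into an
# equal superposition and then "measured" (collapsed) many times.

zero = np.array([1, 0], dtype=complex)      # |0>
one  = np.array([0, 1], dtype=complex)      # |1>

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero                            # (|0> + |1>) / sqrt(2)

def measure(state, rng=np.random.default_rng()):
    """Collapse the state: return 0 or 1 with probability |amplitude|^2."""
    probs = np.abs(state) ** 2
    return rng.choice([0, 1], p=probs)

# Repeated measurements give ~50/50 outcomes, because until measured
# the qubit is in both states at once.
counts = [measure(state) for _ in range(1000)]
print(sum(counts) / 1000)                   # roughly 0.5
```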
Photo: A single atom or ion can be trapped in an optical cavity—the space between mirrors—and controlled by
precise pulses from laser beams.
In practice, there are lots of possible ways of containing atoms and changing their
states using laser beams, electromagnetic fields, radio waves, and an assortment of
other techniques. One method is to make qubits using quantum dots, which are
nanoscopically tiny particles of semiconductors inside which individual charge carriers,
electrons and holes (missing electrons), can be controlled. Another method makes
qubits from what are called ion traps: you add or take away electrons from an atom to
make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a
nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states
with laser pulses. In another technique, the qubits are photons inside optical cavities
(spaces between extremely tiny mirrors). Don't worry if you don't understand; not many
people do. Since the entire field of quantum computing is still largely abstract and
theoretical, the only thing we really need to know is that qubits are stored by atoms or
other quantum-scale particles that can exist in different states and be switched between
them.
What can quantum computers do that ordinary computers can't?
Although people often assume that quantum computers must automatically be better
than conventional ones, that's by no means certain. So far, just about the only thing we
know for certain that a quantum computer could do better than a normal one is
factorisation: finding two unknown prime numbers that, when multiplied together, give
a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter
Shor demonstrated an algorithm that a quantum computer could follow to find the "prime
factors" of a large number, which would speed up the problem enormously. Shor's
algorithm really excited interest in quantum computing because virtually every modern
computer (and every secure, online shopping and banking website) uses public-key
encryption technology based on the virtual impossibility of finding prime factors quickly
(it is, in other words, essentially an "intractable" computer problem). If quantum
computers could indeed factor large numbers quickly, today's online security could be
rendered obsolete at a stroke. But what goes around comes around, and some
researchers believe quantum technology will lead to much stronger forms of encryption.
(In 2017, Chinese researchers demonstrated for the first time how quantum encryption
could be used to make a very secure video call from Beijing to Vienna.)
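To see where the quantum speedup fits, here's a hedged Python sketch (mine, not Shor's code) of the classical skeleton of Shor's algorithm. The find_period function is a brute-force stand-in for the one step a quantum computer would accelerate; brute-forcing it is exactly what becomes intractable for large numbers.

```python
from math import gcd
from random import randrange

# Classical skeleton of Shor's factoring algorithm (illustrative sketch).
# A quantum computer would replace find_period() with a fast quantum
# subroutine; everything else stays classical.

def find_period(a, N):
    """Smallest r > 0 with a**r % N == 1 (slow brute-force stand-in)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N):
    while True:
        a = randrange(2, N)
        g = gcd(a, N)
        if g > 1:
            return g, N // g          # lucky guess already shared a factor
        r = find_period(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            f = gcd(pow(a, r // 2, N) - 1, N)
            if 1 < f < N:
                return f, N // f

print(shor_classical(15))             # e.g. (3, 5)
```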
Does that mean quantum computers are better than conventional ones? Not exactly.
Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any
other algorithms have been discovered that would be better performed by quantum
methods. Given enough time and computing power, conventional computers should still
be able to solve any problem that quantum computers could solve, eventually. In other
words, it remains to be proven that quantum computers are generally superior to
conventional ones, especially given the difficulties of actually building them. Who knows
how conventional computers might advance in the next 50 years, potentially making the
idea of quantum computers irrelevant—and even absurd.
Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in
quantum computers. Photo courtesy of Argonne National Laboratory.
Why is it so hard to make a quantum computer?
We have decades of experience building ordinary, transistor-based computers with
conventional architectures; building quantum machines means reinventing the whole
idea of a computer from the bottom up. First, there are the practical difficulties of making
qubits, controlling them very precisely, and having enough of them to do really useful
things. Next, there's a major difficulty with errors inherent in a quantum system—"noise"
as this is technically called—which seriously compromises any calculations a quantum
computer might make. There are ways around this ("quantum error correction"), but
they introduce a great deal more complexity. There's also the fundamental issue of how
you get data in and out of a quantum computer, which is, itself, a complex computing
problem. Some critics believe these issues are insurmountable; others acknowledge the
problems but argue the mission is too important to abandon.
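To give a flavor of quantum error correction without the full machinery, here's a toy Python simulation (my own sketch) of the simplest scheme, the three-qubit bit-flip code. The trick is that the parity measurements reveal which qubit flipped without revealing, or disturbing, the encoded amplitudes.

```python
import numpy as np

# Toy simulation of the 3-qubit bit-flip code: one logical qubit is spread
# across three physical qubits, so a single flipped qubit can be detected
# and undone without measuring (and destroying) the encoded superposition.

def encode(alpha, beta):
    """alpha|000> + beta|111> as a length-8 state vector."""
    state = np.zeros(8, dtype=complex)
    state[0b000] = alpha
    state[0b111] = beta
    return state

def flip(state, qubit):
    """Apply a bit-flip (X) error to one qubit (0 = leftmost)."""
    out = np.zeros_like(state)
    for i, amp in enumerate(state):
        out[i ^ (1 << (2 - qubit))] = amp
    return out

def syndrome(state):
    """Parities of qubits (0,1) and (1,2): they pinpoint the flipped
    qubit while saying nothing about alpha and beta."""
    for i, amp in enumerate(state):
        if abs(amp) > 0:
            b = [(i >> 2) & 1, (i >> 1) & 1, i & 1]
            return b[0] ^ b[1], b[1] ^ b[2]

alpha, beta = 0.6, 0.8
state = flip(encode(alpha, beta), 1)       # error hits the middle qubit
s = syndrome(state)                        # (1, 1) pinpoints qubit 1
correction = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[s]
if correction is not None:
    state = flip(state, correction)
print(state[0b000], state[0b111])          # alpha and beta recovered
```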
Theoretical physics isn’t the easiest field in science to translate into laymen’s terms. Until
recently it’s solely been the domain of geniuses like the late Stephen Hawking and fictional
characters such as The Big Bang Theory’s Sheldon Cooper. As companies like Google, IBM, and
Intel work to build quantum computer systems that offer supremacy over binary computers, it’s a
good time to learn some basic terms and concepts. We’ve got you covered fam.
Quantum computers are devices capable of performing computations using quantum bits, or
qubits.
Qubits
The first thing we need to understand is what a qubit actually is. A “classical computer,” like the
one you’re reading this on (either desktop, laptop, tablet, or phone), is also referred to as a
“binary computer” because all of the functions it performs are based on either ones or zeros.
On a binary computer the processor uses transistors to perform calculations. Each transistor can
be on or off, which indicates the one or zero used to compute the next step in a program or
algorithm.
There’s more to it than that, but the important thing to know about binary computers is the ones
and zeros they use as the basis for computations are called “bits.”
Quantum computers don’t use bits; they use qubits. Qubits, aside from sounding way cooler,
have extra functions that bits don’t. Instead of only being represented as a one or zero, qubits can
actually be both at the same time. Often qubits, when unobserved, are considered to be
“spinning.” Instead of referring to these types of “spin qubits” using ones or zeros, they’re
measured in states of “up,” “down,” and “both.”
Superposition
Qubits can be more than one thing at a time because of a strange phenomenon called
superposition. Quantum superposition in qubits can be explained by flipping a coin. We know
that the coin will land in one of two states: heads or tails. This is how binary computers think.
While the coin is still spinning in the air, assuming your eye isn’t quick enough to ‘observe’ the
actual state it’s in, the coin is actually in both states at the same time. Essentially until the coin
lands it has to be considered both heads and tails simultaneously.
A clever scientist by the name of Schrödinger explained this phenomenon using a cat which he
demonstrated as being both alive and dead at the same time.
Observation theory
Qubits work on the same principle. An area of related study called "observation theory" dictates
that watching a quantum particle changes how it behaves: observed, it acts like a particle; left
alone, it can act like a wave. Basically the universe acts one way when we're looking, another
way when we aren't. This means quantum computers,
using their qubits, can simulate the subatomic particles of the universe in a way that’s actually
natural: they speak the same language as an electron or proton, basically.
Different companies are approaching qubits in different ways because, as of right now, working
with them is incredibly difficult. Since observing them changes their state, and using them
creates noise – the more qubits you have the more errors you get – measuring them is
challenging to say the least.
This challenge is exacerbated by the fact that most quantum processors have to be kept at near
absolute-zero temperatures (colder than deep space) and require an amount of power that is unsustainably
high for the quality of computations. Right now, quantum computers aren’t worth the trouble and
money they take to build and operate.
In the future, however, they’ll change our entire understanding of biology, chemistry, and
physics. Simulations at the molecular level could be conducted that actually imitate physical
concepts in the universe we’ve never been able to reproduce or study.
Quantum Supremacy
For quantum computers to become useful to society we’ll have to achieve certain milestones
first. The point at which a quantum computer can process information and perform calculations
that a binary computer can’t is called quantum supremacy.
Quantum supremacy isn’t all fun and games, though; it presents another set of problems. When
quantum computers are fully functional even modest systems in the 100 qubit range may be able
to bypass binary security like a hot knife through butter.
This is because those qubits, which can be two things at once, figure out multiple solutions to a
problem at once. They don’t have to follow binary logic like “if one thing happens do this but if
another thing happens do something else.” Individual qubits can do both at the same time while
spinning, for example, and then produce the optimum result when properly observed.
Currently there’s a lot of buzz about quantum computers, and rightfully so. Google is pretty sure
its new Bristlecone processor will achieve quantum supremacy this year. And it’s hard to bet
against Google or one of the other big tech companies. Especially when Intel has already put a
quantum processor on a silicon chip and you can access IBM’s in the cloud right now.
No matter your feelings on quantum computers, qubits, or half-dead/half-alive cats, the odds are
pretty good that quantum computers will follow the same path that IBM’s mainframes did.
They’ll get smaller, faster, more powerful, and eventually we’ll all be using them, even if we
don’t understand the science behind them.
Quantum computing
The Bloch sphere is a representation of a qubit, the fundamental building block of quantum computers.
The field of quantum computing is actually a sub-field of quantum information science, which
includes quantum cryptography and quantum communication. Quantum computing was started
in the early 1980s when Richard Feynman and Yuri Manin expressed the idea that a quantum
computer had the potential to simulate things that a classical computer could not.[2][3] In 1994,
Peter Shor published an algorithm that is able to efficiently solve some problems that are used in
asymmetric cryptography that are considered hard for classical computers.[4]
Qubits are fundamental to quantum computing and are somewhat analogous to bits in a classical
computer. Qubits can be in a 1 or 0 quantum state. But they can also be in a superposition of the
1 and 0 states. However, when qubits are measured the result is always either a 0 or a 1; the
probabilities of the two outcomes depend on the quantum state they were in.
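For example, a qubit in the state |ψ⟩ = α|0⟩ + β|1⟩ is measured as 0 with probability |α|² and
as 1 with probability |β|², where |α|² + |β|² = 1; with α = β = 1/√2, each outcome is equally
likely at 1/2.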
In the real world, where bits are read as "0" or "1," a pair of bits can be in only one of its four
possible states (00, 01, 10, or 11) at any time. In a quantum superposition state, however, all four
of the possible states can co-exist simultaneously.
• The particles that make up the universe are inherently uncertain and can
simultaneously exist in more than one place or more than one state of being.
• Photons are generated randomly in one of two quantum states.
• You can’t measure a quantum property without changing or disturbing it.
• You can clone some quantum properties of a particle, but not the whole
particle.
All these principles play a role in how quantum cryptography works.
These properties make it impossible to measure the quantum state of any system
without disturbing that system.
Photons are used for quantum cryptography because they offer all the necessary
qualities needed: Their behavior is well understood, and they are information
carriers in optical fiber cables. One of the best-known examples of quantum
cryptography currently is quantum key distribution (QKD), which provides a
secure method for key exchange.
How does quantum cryptography work?
In theory, quantum cryptography works by following a model that was developed
in 1984, known as BB84 after its inventors, Charles Bennett and Gilles Brassard.
The model assumes there are two people named Alice and Bob who wish to
exchange a message securely. Alice initiates the message by sending Bob a key.
The key is a stream of photons that travel in one direction. Each photon represents
a single bit of data -- either a 0 or 1. However, in addition to their linear travel,
these photons are oscillating, or vibrating, in a certain manner.
So, before Alice, the sender, initiates the message, the photons travel through a
polarizer. The polarizer is a filter that gives each photon a definite polarization: some
photons pass through unchanged, while others emerge oscillating in a different way. The
polarized states could be vertical (1), horizontal (0), 45 degrees right (1) or 45
degrees left (0). Each transmitted photon therefore carries a single bit, either 0 or 1,
in whichever scheme Alice uses.
The photons now travel across optical fiber from the polarizer toward the receiver,
Bob. This process uses a beam splitter that reads the polarization of each photon.
When receiving the photon key, Bob does not know the correct polarization basis for
each photon, so he chooses one at random for each measurement. Bob then tells Alice
which basis he used for each photon, and Alice tells him which of his choices were
correct; neither reveals the actual bit values. The photons read with the wrong
splitter are then discarded, and the remaining sequence is
considered the key.
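The exchange just described is essentially BB84. Here's a rough Python simulation of the sifting step (my own sketch; all names are illustrative), showing why only the matching-basis photons survive into the key:

```python
import random

# Rough simulation of BB84 key sifting: Alice sends random bits in random
# polarization bases, Bob measures in random bases, and they keep only
# the positions where the bases happened to match.

N = 20
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]   # + rectilinear, x diagonal
bob_bases   = [random.choice("+x") for _ in range(N)]

def bob_measures(bit, a_basis, b_basis):
    """Same basis: Bob reads the bit correctly.
    Wrong basis: the photon's polarization gives a random result."""
    return bit if a_basis == b_basis else random.randint(0, 1)

bob_results = [bob_measures(b, a, c)
               for b, a, c in zip(alice_bits, alice_bases, bob_bases)]

# Over a public channel they compare bases (never bit values) and discard
# the mismatches; what's left is the shared key.
key_alice = [b for b, a, c in zip(alice_bits, alice_bases, bob_bases) if a == c]
key_bob   = [b for b, a, c in zip(bob_results, alice_bases, bob_bases) if a == c]
print(key_alice == key_bob)   # True (with no eavesdropper and no noise)
```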
Let's suppose there is an eavesdropper present, named Eve. Eve attempts to listen
in and has the same tools as Bob. But Bob has the advantage of speaking to Alice
to confirm which polarizer type was used for each photon; Eve doesn't. Eve ends
up with an incorrect version of the final key.
Alice and Bob would also know if Eve was eavesdropping on them, because Eve
observing the flow of photons would disturb their states, changing the results that
Alice and Bob expect to see.
This method of cryptography has yet to be fully developed; however, there have
been successful implementations of it, and other protocols and quantum algorithms
beyond QKD build on the same principles.
Although the subject has been around for a couple of decades, quantum
cryptography (not to be confused with post-quantum cryptography) is quickly
becoming more critically relevant to our everyday lives because of how it can
safeguard vital data in a way that current encryption methods can’t.
Consider, for example, the trust you place in banks and commercial enterprises to
keep your credit card and other information safe while conducting business
transactions online. What if those companies – using current encryption methods –
could no longer guarantee the security of your private information? Granted,
cybercriminals are always trying to gain access to secure data, but when quantum
computers come online, that information will be even more vulnerable to being
hacked. In fact, hackers don’t even need to wait for quantum computers to start the
process because they’re collecting encrypted data now to decrypt later when the
quantum computers are ready. With quantum encryption, that strategy fails,
because any attempt to intercept the keys is detectable.
https://youtu.be/_5NQf8k3Jo0
https://quantumxc.com/blog/quantum-cryptography-explained/
https://www.youtube.com/watch?v=44G9UuB2RWI
Introduction to Quantum Teleportation
Alice wants to send quantum information to Bob. Specifically, suppose she wants
to send the qubit state |ψ⟩ = α|0⟩ + β|1⟩. This entails passing on
information about α and β to Bob.
There exists a theorem in quantum mechanics which states that you cannot simply
make an exact copy of an unknown quantum state. This is known as the no-cloning
theorem. As a result of this we can see that Alice can't simply generate a copy
of |ψ⟩ and give the copy to Bob. We can only copy classical states (not
superpositions).
However, by taking advantage of two classical bits and an entangled qubit pair,
Alice can transfer her state |ψ⟩ to Bob. We call this teleportation because, at
the end, Bob will have |ψ⟩ and Alice won't anymore.
To transfer a quantum bit, Alice and Bob must use a third party (Telamon) to send
them an entangled qubit pair. Alice then performs some operations on her qubit,
sends the results to Bob over a classical communication channel, and Bob then
performs some operations on his end to receive Alice’s qubit.
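For the adventurous, here's a compact NumPy sketch (mine, not from the source) that simulates the whole protocol on a three-qubit state vector: qubit 0 holds Alice's |ψ⟩, qubits 1 and 2 are the entangled pair, and two classical bits tell Bob which correction to apply.

```python
import numpy as np

rng = np.random.default_rng()

# Single-qubit gates, lifted onto a 3-qubit register with Kronecker products.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def on(gate, qubit):
    """Apply a 1-qubit gate to one qubit (0 = leftmost) of three."""
    ops = [I, I, I]
    ops[qubit] = gate
    return np.kron(np.kron(ops[0], ops[1]), ops[2])

def cnot(control, target):
    """Controlled-NOT on the 3-qubit register, built from projectors."""
    P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
    a = [I, I, I]; a[control] = P0
    b = [I, I, I]; b[control] = P1; b[target] = X
    return (np.kron(np.kron(a[0], a[1]), a[2])
            + np.kron(np.kron(b[0], b[1]), b[2]))

def measure(state, qubit):
    """Measure one qubit: pick 0/1 by the Born rule, collapse, renormalize."""
    bit = lambda i: (i >> (2 - qubit)) & 1
    p1 = sum(abs(amp) ** 2 for i, amp in enumerate(state) if bit(i))
    outcome = int(rng.random() < p1)
    state = np.array([amp if bit(i) == outcome else 0
                      for i, amp in enumerate(state)])
    return outcome, state / np.linalg.norm(state)

# Alice's unknown state |psi> = alpha|0> + beta|1> sits on qubit 0;
# Telamon's Bell pair (|00> + |11>)/sqrt(2) sits on qubits 1 (Alice) and 2 (Bob).
alpha, beta = 0.6, 0.8j
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)
state = np.kron(np.array([alpha, beta]), bell)

# Alice entangles her qubit with her half of the pair, then measures both.
state = on(H, 0) @ (cnot(0, 1) @ state)
m0, state = measure(state, 0)
m1, state = measure(state, 1)

# Bob applies corrections chosen by Alice's two classical bits.
if m1: state = on(X, 2) @ state
if m0: state = on(Z, 2) @ state

# Bob's qubit (qubit 2) now holds alpha and beta; Alice's copy is gone.
print(state.reshape(2, 2, 2)[m0, m1, :])   # ~ [0.6+0j, 0+0.8j]
```

Run it a few times: the measured bits m0 and m1 vary from run to run, but Bob's qubit always ends up as 0.6|0⟩ + 0.8i|1⟩.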
The no-cloning theorem prevents us from using classical error correction techniques
on quantum states. For example, we cannot create backup copies of a state in the
middle of a quantum computation and use them to correct subsequent errors.