Quantum Computing Physics FY Notes

Quantum computing utilizes quantum bits (qubits) to store and process information, allowing for potentially much faster computations than conventional computers, which rely on transistors. While quantum computers could solve complex problems like prime factorization more efficiently, significant challenges remain in their development, including error correction and data management. The future of quantum computing is uncertain, as advancements in conventional computing may continue to address many complex problems without the need for quantum solutions.


Quantum computing

How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized military computer 50
years ago. Yet, despite such amazing advances, there are still plenty of complex
problems that are beyond the reach of even the world's most powerful computers—and
there's no guarantee we'll ever be able to tackle them. One problem is that the basic
switching and memory units of computers, known as transistors, are now approaching
the point where they'll soon be as small as individual atoms. If we want computers that
are smaller and more powerful than today's, we'll soon need to do our computing in a
radically different way. Entering the realm of atoms opens up powerful new possibilities
in the shape of quantum computing, with processors that could work millions of times
faster than the ones we use today. Sounds amazing, but the trouble is that quantum
computing is hugely more complex than traditional computing and operates in the Alice
in Wonderland world of quantum physics, where the "classical," sensible, everyday laws
of physics no longer apply. What is quantum computing and how does it work? Let's
take a closer look!

Quantum computing means storing and processing information using individual atoms, ions, electrons, or
photons. On the plus side, this opens up the possibility of faster computers, but the drawback is the greater
complexity of designing computers that can operate in the weird world of quantum physics.

What is conventional computing?


You probably think of a computer as a neat little gadget that sits on your lap and lets
you send emails, shop online, chat to your friends, or play games—but it's much more
and much less than that. It's more, because it's a completely general-purpose machine:
you can make it do virtually anything you like. It's less, because inside it's little more
than an extremely basic calculator, following a prearranged set of instructions called a
program. Like the Wizard of Oz, the amazing things you see in front of you conceal
some pretty mundane stuff under the covers.

Conventional computers have two tricks that they do really well: they can store numbers
in memory and they can process stored numbers with simple mathematical operations
(like add and subtract). They can do more complex things by stringing together the
simple operations into a series called an algorithm (multiplying can be done as a series
of additions, for example). Both of a computer's key tricks—storage and processing—
are accomplished using switches called transistors, which are like microscopic versions
of the switches you have on your wall for turning on and off the lights. A transistor can
either be on or off, just as a light can either be lit or unlit. If it's on, we can use a
transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of
ones and zeros can be used to store any number, letter, or symbol using a code based
on binary (so computers store an upper-case letter A as 01000001 and a lower-case one
as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a
string of eight bits, you can store 256 different characters (such as A-Z, a-z, 0-9, and
most common symbols). Computers calculate by using circuits called logic gates, which
are made from a number of transistors connected together. Logic gates compare
patterns of bits, stored in temporary memories called registers, and then turn them into
new patterns of bits—and that's the computer equivalent of what our human brains
would call addition, subtraction, or multiplication. In physical terms, the algorithm that
performs a particular calculation takes the form of an electronic circuit made from a
number of logic gates, with the output from one gate feeding in as the input to the next.
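To make the gate idea concrete, here's a minimal Python sketch (an illustration of the principle, not how a real chip is wired): a half adder built from exactly the XOR and AND operations described above, plus the binary encoding of the letter A.

```python
# Binary storage: 'A' really is held as the bit pattern 01000001
print(format(ord("A"), "08b"))

def AND(a, b):   # logic gate: output is 1 only if both inputs are 1
    return a & b

def XOR(a, b):   # logic gate: output is 1 if exactly one input is 1
    return a ^ b

def half_adder(a, b):
    """Add two single bits, returning (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
```

Chaining adders like this, gate by gate, is how the electronic circuit for a whole calculation gets built.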

The trouble with conventional computers is that they depend on conventional
transistors. This might not sound like a problem if you go by the amazing progress
made in electronics over the last few decades. When the transistor was invented, back
in 1947, the switch it replaced (which was called the vacuum tube) was about as big as
one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer)
packs hundreds of millions (and up to 30 billion) transistors onto a chip of silicon the
size of your fingernail! Chips like these, which are called integrated circuits, are an
incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore
realized that the power of computers doubles roughly every 18 months—and it's been doing
so ever since. This apparently unshakeable trend is known as Moore's Law.

It sounds amazing, and it is, but it misses the point. The more information you need to
store, the more binary ones and zeros—and transistors—you need to do it. Since most
conventional computers can only do one thing at a time, the more complex the problem
you want them to solve, the more steps they'll need to take and the longer they'll need
to do it. Some computing problems are so complex that they need more computing
power and time than any modern machine could reasonably supply; computer scientists
call those intractable problems.

As Moore's Law advances, the number of intractable problems diminishes:
computers get more powerful and we can do more with them. The trouble is, transistors
are just about as small as we can make them: we're getting to the point where the laws
of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely
difficult computing problems we can't tackle because even the most powerful computers
find them intractable. That's one of the reasons why people are now getting interested in
quantum computing.
What is quantum computing?
“Things on a very small scale behave like nothing you have any direct
experience about... or like anything that you have ever seen.”

Richard Feynman
Quantum theory is the branch of physics that deals with the world of atoms and the
smaller (subatomic) particles inside them. You might think atoms behave the same way
as everything else in the world, in their own tiny little way—but that's not true: on the
atomic scale, the rules change and the "classical" laws of physics we take for granted in
our everyday world no longer automatically apply. As Richard P. Feynman, one of the
greatest physicists of the 20th century, once put it: "Things on a very small scale
behave like nothing you have any direct experience about... or like anything that you
have ever seen." (Six Easy Pieces, p116.)

If you've studied light, you may already know a bit about quantum theory. You might
know that a beam of light sometimes behaves as though it's made up of particles (like a
steady stream of cannonballs), and sometimes as though it's waves of energy rippling
through space (a bit like waves on the sea). That's called wave-particle duality and it's
one of the ideas that comes to us from quantum theory. It's hard to grasp that
something can be two things at once—a particle and a wave—because it's totally alien
to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum
theory, however, that's just the kind of crazy thing that can happen. The most striking
example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird
world of quantum theory, we can imagine a situation where something like a cat could
be alive and dead at the same time!

What does all this have to do with computers? Suppose we keep on pushing Moore's
Law—keep on making transistors smaller until they get to the point where they obey not
the ordinary laws of physics (like old-style transistors) but the more bizarre laws of
quantum mechanics. The question is whether computers designed this way can do
things our conventional computers can't. If we can predict mathematically that they
might be able to, can we actually make them work like that in practice?

People have been asking those questions for several decades. Among the first were
IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the
door for quantum computing in the 1960s when he proposed that information is a
physical entity that could be manipulated according to the laws of physics. One
important consequence of this is that computers waste energy manipulating the bits
inside them (which is partly why computers use so much energy and get so hot, even
though they appear to be doing not very much at all). In the 1970s, building on
Landauer's work, Bennett showed how a computer could circumvent this problem by
working in a "reversible" way, implying that a quantum computer could carry out
massively complex computations without using massive amounts of energy. In 1981,
physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic
machine that would work in a similar way to an ordinary computer but according to the
principles of quantum physics. The following year, Richard Feynman sketched out
roughly how a machine using quantum principles could carry out basic computations. A
few years later, Oxford University's David Deutsch (one of the leading lights in quantum
computing) outlined the theoretical basis of a quantum computer in more detail. How did
these great scientists imagine that quantum computers might work?

Quantum + computing = quantum computing


The key features of an ordinary computer—bits, registers, logic gates, algorithms, and
so on—have analogous features in a quantum computer. Instead of bits, a quantum
computer has quantum bits or qubits, which work in a particularly intriguing way. Where
a bit can store either a zero or a one, a qubit can store a zero, a one, both zero and one,
or an infinite number of values in between—and be in multiple states (store multiple
values) at the same time! If that sounds confusing, think back to light being a particle
and a wave at the same time, Schrödinger's cat being alive and dead, or a car being a
bicycle and a bus. A gentler way to think of the numbers qubits store is through the
physics concept of superposition (where two waves add to make a third one that
contains both of the originals). If you blow on something like a flute, the pipe fills up with
a standing wave: a wave made up of a fundamental frequency (the basic note you're
playing) and lots of overtones or harmonics (higher-frequency multiples of the
fundamental). The wave inside the pipe contains all these waves simultaneously: they're
added together to make a combined wave that includes them all. Qubits use
superposition to represent multiple states (multiple numeric values) simultaneously in a
similar way.

Just as a quantum computer can store multiple numbers at once, so it can process
them simultaneously. Instead of working in serial (doing a series of things one at a time
in a sequence), it can work in parallel (doing multiple things at the same time). Only
when you try to find out what state it's actually in at any given moment (by measuring it,
in other words) does it "collapse" into one of its possible states—and that gives you the
answer to your problem. Estimates suggest a quantum computer's ability to work in
parallel would make it millions of times faster than any conventional computer... if only
we could build it! So how would we do that?
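A rough sketch of those two ideas in Python (simulating the mathematics with NumPy, not running on any quantum hardware): a superposition is a weighted sum of the basis states, much like the note containing several harmonics at once, and each measurement collapses it to a single classical value.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])          # the state |0>
ket1 = np.array([0.0, 1.0])          # the state |1>

# An equal superposition: both basis states present at once,
# like a standing wave containing fundamental plus overtones
psi = (ket0 + ket1) / np.sqrt(2)

# Born rule: probability of each outcome is the squared amplitude
probs = np.abs(psi) ** 2             # -> [0.5, 0.5]

# "Measure" the qubit many times: each shot collapses to 0 or 1
rng = np.random.default_rng(0)
shots = rng.choice([0, 1], size=1000, p=probs)
print(probs, np.bincount(shots))     # roughly 500 zeros, 500 ones
```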

What would a quantum computer be like in reality?


In reality, qubits would have to be stored by atoms, ions (atoms with too many or too
few electrons), or even smaller things such as electrons and photons (energy packets),
so a quantum computer would be almost like a table-top version of the kind of particle
physics experiments they do at Fermilab or CERN. Now you wouldn't be racing particles
round giant loops and smashing them together, but you would need mechanisms for
containing atoms, ions, or subatomic particles, for putting them into certain states (so
you can store information), knocking them into other states (so you can make them
process information), and figuring out what their states are after particular operations
have been performed.

Photo: A single atom or ion can be trapped in an optical cavity—the space between mirrors—and controlled by
precise pulses from laser beams.

In practice, there are lots of possible ways of containing atoms and changing their
states using laser beams, electromagnetic fields, radio waves, and an assortment of
other techniques. One method is to make qubits using quantum dots, which are
nanoscopically tiny particles of semiconductors inside which individual charge carriers,
electrons and holes (missing electrons), can be controlled. Another method makes
qubits from what are called ion traps: you add or take away electrons from an atom to
make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a
nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states
with laser pulses. In another technique, the qubits are photons inside optical cavities
(spaces between extremely tiny mirrors). Don't worry if you don't understand; not many
people do. Since the entire field of quantum computing is still largely abstract and
theoretical, the only thing we really need to know is that qubits are stored by atoms or
other quantum-scale particles that can exist in different states and be switched between
them.
What can quantum computers do that ordinary computers can't?
Although people often assume that quantum computers must automatically be better
than conventional ones, that's by no means certain. So far, just about the only thing we
know for certain that a quantum computer could do better than a normal one is
factorisation: finding two unknown prime numbers that, when multiplied together, give
a third, known number. In 1994, while working at Bell Laboratories, mathematician Peter
Shor demonstrated an algorithm that a quantum computer could follow to find the "prime
factors" of a large number, which would speed up the problem enormously. Shor's
algorithm really excited interest in quantum computing because virtually every modern
computer (and every secure, online shopping and banking website) uses public-key
encryption technology based on the virtual impossibility of finding prime factors quickly
(it is, in other words, essentially an "intractable" computer problem). If quantum
computers could indeed factor large numbers quickly, today's online security could be
rendered obsolete at a stroke. But what goes around comes around, and some
researchers believe quantum technology will lead to much stronger forms of encryption.
(In 2017, Chinese researchers demonstrated for the first time how quantum encryption
could be used to make a very secure video call from Beijing to Vienna.)
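To see why factoring is such a big deal, here's a naive classical approach sketched in Python. This is ordinary trial division, not Shor's algorithm; it illustrates how the work grows steeply with the size of the number, which is exactly what public-key encryption counts on.

```python
def factor(n):
    """Find a nontrivial factor of n by trial division."""
    d = 2
    while d * d <= n:       # up to sqrt(n) candidates to try
        if n % d == 0:
            return d
        d += 1
    return n                # no divisor found: n is prime

# Easy for a toy number, but a real RSA modulus has 600+ digits,
# and sqrt(n) trial divisions would take longer than the age of
# the universe. Shor's quantum algorithm sidesteps that wall.
n = 3233
p = factor(n)
print(p, n // p)            # -> 53 61
```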

Does that mean quantum computers are better than conventional ones? Not exactly.
Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any
other algorithms have been discovered that would be better performed by quantum
methods. Given enough time and computing power, conventional computers should still
be able to solve any problem that quantum computers could solve, eventually. In other
words, it remains to be proven that quantum computers are generally superior to
conventional ones, especially given the difficulties of actually building them. Who knows
how conventional computers might advance in the next 50 years, potentially making the
idea of quantum computers irrelevant—and even absurd.

Quantum dots are probably best known as colorful nanoscale crystals, but they can also be used as qubits in
quantum computers. Photo courtesy of Argonne National Laboratory.
Why is it so hard to make a quantum computer?
We have decades of experience building ordinary, transistor-based computers with
conventional architectures; building quantum machines means reinventing the whole
idea of a computer from the bottom up. First, there are the practical difficulties of making
qubits, controlling them very precisely, and having enough of them to do really useful
things. Next, there's a major difficulty with errors inherent in a quantum system—"noise"
as this is technically called—which seriously compromises any calculations a quantum
computer might make. There are ways around this ("quantum error correction"), but
they introduce a great deal more complexity. There's also the fundamental issue of how
you get data in and out of a quantum computer, which is, itself, a complex computing
problem. Some critics believe these issues are insurmountable; others acknowledge the
problems but argue the mission is too important to abandon.
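For a feel of what "error correction" means, here's the classical idea that quantum error correction generalizes, sketched in Python. Real quantum codes are far more involved, not least because the no-cloning theorem (covered later in these notes) forbids simply copying a qubit.

```python
# Classical 3-bit repetition code: store each logical bit three
# times and decode by majority vote, so any single flipped bit
# is corrected. Quantum codes must achieve something similar
# *without* copying the state, which is why they add so much
# complexity to a quantum computer.

def encode(bit):
    return [bit, bit, bit]

def decode(triple):
    return 1 if sum(triple) >= 2 else 0

codeword = encode(1)       # [1, 1, 1]
codeword[0] ^= 1           # a noise event flips one bit -> [0, 1, 1]
print(decode(codeword))    # majority vote still recovers 1
```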

Understanding quantum computers: The basics


by TRISTAN GREENE — Mar 16, 2018 in ARTIFICIAL INTELLIGENCE


Theoretical physics isn’t the easiest field in science to translate into laymen’s terms. Until
recently it’s solely been the domain of geniuses like the late Stephen Hawking and fictional
characters such as The Big Bang Theory’s Sheldon Cooper. As companies like Google, IBM, and
Intel work to build quantum computer systems that offer supremacy over binary computers, it’s a
good time to learn some basic terms and concepts. We’ve got you covered fam.
Quantum computers are devices capable of performing computations using quantum bits, or
qubits.

Qubits
The first thing we need to understand is what a qubit actually is. A “classical computer,” like the
one you’re reading this on (either desktop, laptop, tablet, or phone), is also referred to as a
“binary computer” because all of the functions it performs are based on either ones or zeros.

On a binary computer the processor uses transistors to perform calculations. Each transistor can
be on or off, which indicates the one or zero used to compute the next step in a program or
algorithm.

There’s more to it than that, but the important thing to know about binary computers is the ones
and zeros they use as the basis for computations are called “bits.”

Quantum computers don’t use bits; they use qubits. Qubits, aside from sounding way cooler,
have extra functions that bits don’t. Instead of only being represented as a one or zero, qubits can
actually be both at the same time. Often qubits, when unobserved, are considered to be
"spinning." Instead of referring to these types of "spin qubits" using ones or zeros, they're
measured in states of "up," "down," or a superposition of both.

Superposition
Qubits can be more than one thing at a time because of a strange phenomenon called
superposition. Quantum superposition in qubits can be explained by flipping a coin. We know
that the coin will land in one of two states: heads or tails. This is how binary computers think.
While the coin is still spinning in the air, assuming your eye isn’t quick enough to ‘observe’ the
actual state it’s in, the coin is actually in both states at the same time. Essentially until the coin
lands it has to be considered both heads and tails simultaneously.

A scientist by the name of Schrödinger illustrated this phenomenon with a famous thought
experiment involving a cat that had to be considered both alive and dead at the same time.

Observation theory
Qubits work on the same principle. An area of related study called "observation theory" dictates
that an unobserved quantum particle can behave like a wave, but the act of watching it forces it
into a single definite state. Basically the universe acts one way when we're looking, another way
when we aren't. This means quantum computers, using their qubits, can simulate the subatomic
particles of the universe in a way that's actually natural: they speak the same language as an
electron or proton, basically.

Different companies are approaching qubits in different ways because, as of right now, working
with them is incredibly difficult. Since observing them changes their state, and using them
creates noise – the more qubits you have the more errors you get – measuring them is
challenging to say the least.

This challenge is exacerbated by the fact that most quantum processors have to be kept at
temperatures near absolute zero (colder than outer space) and require an amount of power that is
unsustainably high for the quality of computations they provide. Right now, quantum computers
aren't worth the trouble and money they take to build and operate.

In the future, however, they’ll change our entire understanding of biology, chemistry, and
physics. Simulations at the molecular level could be conducted that actually imitate physical
concepts in the universe we’ve never been able to reproduce or study.

Quantum Supremacy
For quantum computers to become useful to society we’ll have to achieve certain milestones
first. The point at which a quantum computer can process information and perform calculations
that a binary computer can’t is called quantum supremacy.

Quantum supremacy isn’t all fun and games though, it presents another set of problems. When
quantum computers are fully functional even modest systems in the 100 qubit range may be able
to bypass binary security like a hot knife through butter.

This is because those qubits, which can be two things at once, figure out multiple solutions to a
problem at once. They don’t have to follow binary logic like “if one thing happens do this but if
another thing happens do something else.” Individual qubits can do both at the same time while
spinning, for example, and then produce the optimum result when properly observed.

Currently there’s a lot of buzz about quantum computers, and rightfully so. Google is pretty sure
its new Bristlecone processor will achieve quantum supremacy this year. And it’s hard to bet
against Google or one of the other big tech companies. Especially when Intel has already put a
quantum processor on a silicon chip and you can access IBM’s in the cloud right now.

No matter your feelings on quantum computers, qubits, or half-dead/half-alive cats, the odds are
pretty good that quantum computers will follow the same path that IBM’s mainframes did.
They’ll get smaller, faster, more powerful, and eventually we’ll all be using them, even if we
don’t understand the science behind them.
Quantum computing

The Bloch sphere is a representation of a qubit, the fundamental building block of quantum computers.

Quantum computing is the use of quantum-mechanical phenomena such as superposition and
entanglement to perform computation. A quantum computer is used to perform such
computation, which can be implemented theoretically or physically.[1]:I-5

The field of quantum computing is actually a sub-field of quantum information science, which
includes quantum cryptography and quantum communication. Quantum computing was started
in the early 1980s when Richard Feynman and Yuri Manin expressed the idea that a quantum
computer had the potential to simulate things that a classical computer could not.[2][3] In 1994,
Peter Shor published an algorithm that is able to efficiently solve some problems that are used in
asymmetric cryptography that are considered hard for classical computers.[4]

Qubits are fundamental to quantum computing and are somewhat analogous to bits in a classical
computer. Qubits can be in a 1 or 0 quantum state. But they can also be in a superposition of the
1 and 0 states. However, when qubits are measured, the result is always either a 0 or a 1; the
probabilities of the two outcomes depend on the quantum state they were in.
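As a worked example (the amplitudes below are our own, chosen purely for illustration), the measurement probabilities are the squared magnitudes of the amplitudes:

```latex
% Born rule: measurement probabilities are squared amplitude magnitudes.
% Illustrative state, with amplitudes chosen for this example:
\[
  |\psi\rangle = \sqrt{\tfrac{1}{3}}\;|0\rangle + \sqrt{\tfrac{2}{3}}\;|1\rangle
  \quad\Longrightarrow\quad
  P(0) = \Bigl|\sqrt{\tfrac{1}{3}}\Bigr|^{2} = \tfrac{1}{3},
  \qquad
  P(1) = \Bigl|\sqrt{\tfrac{2}{3}}\Bigr|^{2} = \tfrac{2}{3}.
\]
% The probabilities sum to 1: measurement yields 0 about a third of
% the time and 1 about two thirds of the time.
```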

What is Quantum Computing?


In 1981, at Argonne National Labs, a man by the name of Paul Benioff used Max Planck's idea that
energy exists in individual units, as does matter, to theorize the concept of quantum computing.
Since then, the idea of manufacturing quantum computers for everyday use has become more
tangible with each new technological advance. Quantum computing rests on the principles of
quantum theory, the branch of modern physics that explains the behavior of matter and energy
at the atomic and subatomic level. Quantum computing makes use of quantum
phenomena, such as quantum bits, superposition, and entanglement to perform data operations.
Computing in this manner essentially tackles extremely difficult tasks that ordinary computers cannot
perform on their own.
In classical computing, a bit is the basic unit of information in a computer. Quantum computing
instead uses quantum bits, or qubits, as its unit of memory. A qubit is a two-state quantum-
mechanical system: one that can exist in either of two distinguishable quantum states, or in a
combination of both. Seeing the terms superposition and entanglement might be baffling, but
that's okay: we don't experience these phenomena in everyday life. Superposition is the principle
that, until we look at an object to check its state, it can exist in all of its possible states
simultaneously. Entanglement is the phenomenon by which two particles become correlated so
that measuring one instantly tells you about the other, regardless of the distance between them.

Why do these phenomena matter?


Entanglement and superposition are extremely important in advancing computing and
communications in ways that can benefit us enormously. These two phenomena can be used to
process an extremely large number of calculations that ordinary computers cannot. The power
of quantum computing is astonishing, and few people grasp the full capabilities it has to offer. While
classical computers process information strictly as 1s and 0s, quantum computers operate according
to the laws of quantum physics; this means information can be processed as 1s, as 0s, or as 1 and 0
simultaneously. This is possible because of the quantum-mechanical principle of superposition.

In the real world, where a pair of bits reads as "00," "01," "10," or "11," only one of those four
possible states can exist at any moment in time and space. In a quantum superposition state,
however, all four of the possible states can co-exist in time and space simultaneously.
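A small NumPy sketch of that claim (a simulation of the math, not a real device): two classical bits sit in exactly one of the four states, while a two-qubit register carries an amplitude for every state at once. The particular state below is also entangled, tying the two qubits' outcomes together.

```python
import numpy as np

# Two-qubit amplitudes, ordered as [00, 01, 10, 11].
# This "Bell state" holds 00 and 11 in superposition at once.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

probs = np.abs(bell) ** 2
print(probs)   # [0.5, 0, 0, 0.5]: you only ever observe 00 or 11,
               # so measuring one qubit instantly fixes the other.
```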

What is quantum cryptography?

Quantum cryptography is a method of encryption that uses the naturally occurring
properties of quantum mechanics to secure and transmit data in a way that cannot
be hacked.
Cryptography is the process of encrypting and protecting data so that only the
person who has the right secret key can decrypt it. Quantum cryptography is
different from traditional cryptographic systems in that it relies on physics, rather
than mathematics, as the key aspect of its security model.

Quantum cryptography is a system that cannot be compromised without the
knowledge of the message sender or the receiver. That is, it is impossible to copy
or view data encoded in a quantum state without alerting the sender or receiver.
Quantum cryptography should also remain safe against those using quantum
computing.

Quantum cryptography uses individual particles of light, or photons, to transmit
data over fiber optic wire. The photons represent binary bits. The security of the
system relies on the following principles of quantum mechanics:

• The particles that make up the universe are inherently uncertain and can
simultaneously exist in more than one place or more than one state of being.
• Photons are generated randomly in one of two quantum states.
• You can’t measure a quantum property without changing or disturbing it.
• You can clone some quantum properties of a particle, but not the whole
particle.
All these principles play a role in how quantum cryptography works.

These properties make it impossible to measure the quantum state of any system
without disturbing that system.

Photons are used for quantum cryptography because they offer all the necessary
qualities needed: Their behavior is well understood, and they are information
carriers in optical fiber cables. One of the best-known examples of quantum
cryptography currently is quantum key distribution (QKD), which provides a
secure method for key exchange.
How does quantum cryptography work?
In theory, quantum cryptography works by following a model that was developed
in 1984, now known as the BB84 protocol.

The model assumes there are two people named Alice and Bob who wish to
exchange a message securely. Alice initiates the message by sending Bob a key.
The key is a stream of photons that travel in one direction. Each photon represents
a single bit of data -- either a 0 or 1. However, in addition to their linear travel,
these photons are oscillating, or vibrating, in a certain manner.

So, before Alice, the sender, initiates the message, the photons travel through a
polarizer. The polarizer is a filter that enables certain photons to pass through it
with the same vibrations and lets others pass through in a changed state of
vibration. The polarized states could be vertical (1 bit), horizontal (0 bit), 45
degrees right (1 bit) or 45 degrees left (0 bit). The transmission has one of two
polarizations representing a single bit, either 0 or 1, in either scheme she uses.

The photons now travel across optical fiber from the polarizer toward the receiver,
Bob. This process uses a beam splitter that reads the polarization of each photon.
When receiving the photon key, Bob does not know the correct polarization of the
photons, so he chooses a polarization at random for each one. Bob then tells Alice
which polarizer he used for each photon, and Alice tells him whether it matched
the one she used to send it. The photons read with the wrong splitter are then
discarded, and the remaining sequence is considered the key.

Let's suppose there is an eavesdropper present, named Eve. Eve attempts to listen
in and has the same tools as Bob. But Bob has the advantage of speaking to Alice
to confirm which polarizer type was used for each photon; Eve doesn't. Eve ends
up rendering the final key incorrectly.
Alice and Bob would also know if Eve was eavesdropping on them, because Eve
observing the flow of photons would change the photon states that Alice and Bob
expect to see.
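The whole exchange is easy to mimic classically. Below is a simplified Python sketch of the protocol just described (it models the statistics, not real photons): random bits sent in random bases, public comparison of bases, and "sifting" down to the shared key.

```python
import random

random.seed(1)
n = 20
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]  # + rectilinear, x diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

def measure(bit, send_basis, meas_basis):
    # Same basis: the bit is read faithfully; wrong basis: a 50/50 coin toss
    return bit if send_basis == meas_basis else random.randint(0, 1)

bob_bits = [measure(b, sa, sb)
            for b, sa, sb in zip(alice_bits, alice_bases, bob_bases)]

# Public discussion compares bases (never the bits) and keeps the matches
alice_key = [a for a, sa, sb in zip(alice_bits, alice_bases, bob_bases) if sa == sb]
bob_key   = [b for b, sa, sb in zip(bob_bits, alice_bases, bob_bases) if sa == sb]
print(alice_key == bob_key, alice_key)  # True: without Eve, the keys agree
```

An eavesdropper measuring in her own random bases would disturb the photons and show up as disagreements when Alice and Bob spot-check part of the key.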

What quantum cryptography is used for and examples


Quantum cryptography enables users to communicate more securely compared to
traditional cryptography. After keys are exchanged between the involved parties,
there is little concern that a malicious actor could decode the data without the key.
If the key is observed when it is being constructed, the expected outcome changes,
alerting both the sender and the receiver.

This method of cryptography has yet to be fully developed; however, there have
been successful implementations of it:

• The University of Cambridge and Toshiba Corp. created a high-bit-rate
QKD system using the BB84 quantum cryptography protocol.
• The Defense Advanced Research Projects Agency Quantum Network,
which ran from 2002 to 2007, was a 10-node QKD network developed
by Boston University, Harvard University, and IBM Research.
• Quantum Xchange launched the first quantum network in the U.S.,
featuring 1,000 kilometers (km) of fiber optic cable.
• Commercial companies, such as ID Quantique, Toshiba, Quintessence
Labs, and MagiQ Technologies Inc., also developed commercial QKD
systems.

In addition to QKD, some of the more notable protocols and quantum algorithms
used in quantum cryptography are the following:

• quantum coin flipping;
• position-based quantum cryptography; and
• device-independent quantum cryptography.
Benefits of quantum cryptography
Benefits that come with quantum cryptography include the following:

• Provides secure communication. Instead of relying on difficult-to-crack
numbers, quantum cryptography is based on the laws of physics, which is a
more sophisticated and secure method of encryption.
• Detects eavesdropping. If a third party attempts to read the encoded
data, the quantum state changes, modifying the expected outcome
for the users.
• Offers multiple methods for security. Numerous quantum cryptography
protocols exist, and some, like QKD, can be combined with classical
encryption methods to increase security.

Limitations of quantum cryptography


Potential downsides and limitations that come with quantum cryptography include
the following:
• Changes in polarization and error rates. Photons may change
polarization in transit, which potentially increases error rates.
• Range. The maximum range of quantum cryptography has typically been
around 400 to 500 km, except for Terra Quantum, as noted below.
• Expense. Quantum cryptography typically requires its own
infrastructure, using fiber optic lines and repeaters.
• Number of destinations. It is not possible to send keys to two or more
locations in a quantum channel.

Differences between traditional cryptography and quantum cryptography


The classic form of cryptography is the process of mathematically scrambling data
so that only someone who has the right key can read it. Traditional cryptography
has two different types of key distribution: symmetric key and asymmetric key.
Symmetric key algorithms work by using a single key to encrypt and decrypt
information, whereas asymmetric cryptography uses two keys -- a public key to
encrypt messages and a private key to decode them. Traditional cryptography
methods have been trusted because it would take classical computers an
impractically long time to factor the large numbers that make up public and
private keys.
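Here's a toy, textbook-style Python example of the asymmetric idea, with deliberately tiny numbers (real keys use primes hundreds of digits long). Anyone can encrypt with the public pair (n, e); only the holder of d can decrypt, unless they can factor n.

```python
# Toy RSA with classic textbook numbers -- never this small in practice.
p, q = 61, 53
n = p * q                 # public modulus: 3233
e = 17                    # public exponent
d = 2753                  # private exponent: e*d = 1 mod (p-1)*(q-1) = 3120

message = 65
ciphertext = pow(message, e, n)      # encrypt with the public key
recovered = pow(ciphertext, d, n)    # decrypt with the private key
print(ciphertext, recovered)         # -> 2790 65
```

Recovering d from (n, e) requires factoring n into p and q, which is why a fast quantum factoring machine would break this whole scheme.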

Unlike traditional cryptography, which is based on mathematics, quantum
cryptography is based on the laws of quantum mechanics. It is also much harder
to break, since the act of observing the involved photons changes the
expected outcome, making both the sender and receiver aware of the presence of
an eavesdropper. Quantum cryptography also typically has a distance or range
associated with it since the process requires fiber optic cables and repeaters spaced
out to boost the signal.
What’s the difference between post-quantum cryptography and quantum
cryptography?

Post-quantum cryptography refers to cryptographic algorithms (usually public-key
algorithms) that are thought to be secure against an attack by a quantum computer.
These complex mathematical equations take traditional computers months or even
years to break. However, quantum computers running Shor’s algorithm will be
able to break math-based systems in moments.

Quantum cryptography, on the other hand, uses the principles of quantum
mechanics to send secure messages; unlike mathematical encryption, it is
virtually un-hackable.


Future of quantum cryptography implementation


Quantum computers are in the early phases and need more development before a
broad audience can start using quantum communications. Even though there are
limitations to quantum cryptography, such as not being able to send keys to two
locations at once, the field is still steadily growing.
Recent advances, for example, include improvements in range. Swiss quantum
technology company Terra Quantum announced a breakthrough for quantum
cryptography in terms of range. Previously, the distance for quantum cryptography
was limited to a maximum of 400 to 500 km. Terra Quantum's development
enables quantum cryptography keys to be transmitted over more than 40,000 km.
Instead of building a new optical line filled with numerous repeaters, Terra
Quantum's development enables quantum keys to be distributed inside standard
optical fiber lines that are already being used in telecom networks.

Quantum cryptography sounds complex – probably because it is. That’s why we
put together this “encryption guide for dummies” as a way of explaining what
quantum cryptography is and taking some of the complexity out of it.

Although the subject has been around for a couple of decades, quantum
cryptography (not to be confused with post-quantum cryptography) is quickly
becoming more critically relevant to our everyday lives because of how it can
safeguard vital data in a way that current encryption methods can’t.

Consider, for example, the trust you place in banks and commercial enterprises to
keep your credit card and other information safe while conducting business
transactions online. What if those companies – using current encryption methods –
could no longer guarantee the security of your private information? Granted,
cybercriminals are always trying to gain access to secure data, but when quantum
computers come online, that information will be even more vulnerable to being
hacked. In fact, hackers don’t even need to wait for quantum computers to start the
process because they’re collecting encrypted data now to decrypt later when the
quantum computers are ready. With quantum encryption, that’s not the case
because your information will be unhackable.

Quantum cryptography video links:

https://youtu.be/_5NQf8k3Jo0
https://quantumxc.com/blog/quantum-cryptography-explained/
https://www.youtube.com/watch?v=44G9UuB2RWI
Introduction to Quantum Teleportation
Alice wants to send quantum information to Bob. Specifically, suppose she wants
to send the qubit state |ψ⟩ = α|0⟩ + β|1⟩. This entails passing on information
about α and β to Bob.
There exists a theorem in quantum mechanics which states that you cannot simply
make an exact copy of an unknown quantum state. This is known as the no-cloning
theorem. As a result of this we can see that Alice can't simply generate a copy
of |ψ⟩ and give the copy to Bob. We can only copy classical states (not
superpositions).
However, by taking advantage of two classical bits and an entangled qubit pair,
Alice can transfer her state |ψ⟩ to Bob. We call this teleportation because, at
the end, Bob will have |ψ⟩ and Alice won't anymore.

To transfer a quantum bit, Alice and Bob must use a third party (Telamon) to send
them an entangled qubit pair. Alice then performs some operations on her qubit,
sends the results to Bob over a classical communication channel, and Bob then
performs some operations on his end to receive Alice’s qubit.
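Below is a sketch of the whole protocol as a NumPy state-vector simulation (our own illustrative code, with an arbitrarily chosen state to teleport; no quantum hardware involved). It follows the steps just described: share a Bell pair, let Alice entangle and measure, send her two classical bits, and let Bob apply the matching corrections.

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P0 = np.diag([1, 0]).astype(complex)   # projector |0><0|
P1 = np.diag([0, 1]).astype(complex)   # projector |1><1|

def op(gates):                  # tensor a per-qubit gate list together
    out = np.array([[1.0 + 0j]])
    for g in gates:
        out = np.kron(out, g)
    return out

def cnot(n, c, t):              # CNOT = |0><0|_c (x) I  +  |1><1|_c (x) X_t
    a = [I] * n; a[c] = P0
    b = [I] * n; b[c] = P1; b[t] = X
    return op(a) + op(b)

alpha, beta = 0.6, 0.8          # Alice's (arbitrary) state to teleport
psi = np.array([alpha, beta], dtype=complex)

# Register |psi>|0>|0>; entangle qubits 1 and 2 into a Bell pair
state = np.kron(psi, np.kron([1, 0], [1, 0])).astype(complex)
state = cnot(3, 1, 2) @ op([I, H, I]) @ state

# Alice: CNOT from qubit 0 onto qubit 1, then H on qubit 0
state = op([H, I, I]) @ cnot(3, 0, 1) @ state

# Alice measures qubits 0 and 1: sample an outcome, then project
rng = np.random.default_rng(7)
projs = {(m0, m1): op([P1 if m0 else P0, P1 if m1 else P0, I])
         for m0 in (0, 1) for m1 in (0, 1)}
outcomes = list(projs)
probs = np.array([np.linalg.norm(projs[o] @ state) ** 2 for o in outcomes])
probs /= probs.sum()
m0, m1 = outcomes[rng.choice(4, p=probs)]
state = projs[(m0, m1)] @ state
state /= np.linalg.norm(state)

# Bob applies X^m1 then Z^m0 to his qubit; it becomes |psi>
if m1: state = op([I, I, X]) @ state
if m0: state = op([I, I, Z]) @ state

# Read off Bob's qubit (index = q0*4 + q1*2 + q2)
base = (m0 << 2) | (m1 << 1)
bob = np.array([state[base], state[base | 1]])
print((m0, m1), np.round(bob, 6))   # Bob's amplitudes match (0.6, 0.8)
```

Note that Bob only recovers |ψ⟩ after receiving the two classical bits, so teleportation cannot send information faster than light.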

The no-cloning theorem


The no-cloning theorem is a result of quantum mechanics which forbids the creation
of identical copies of an arbitrary unknown quantum state. It was stated
by Wootters, Zurek, and Dieks in 1982, and has profound implications in quantum
computing and related fields.

The no-cloning theorem prevents us from using classical error correction techniques
on quantum states. For example, we cannot create backup copies of a state in the
middle of a quantum computation, and use them to correct subsequent errors.

The no-cloning theorem is a vital ingredient in quantum cryptography, as it forbids
eavesdroppers from creating copies of a transmitted quantum cryptographic key.

Fundamentally, the no-cloning theorem protects the uncertainty principle in quantum
mechanics. If one could clone an unknown state, then one could make as many copies
of it as one wished, and measure each dynamical variable with arbitrary precision,
thereby bypassing the uncertainty principle.
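A quick NumPy illustration of why the obvious copying strategy fails (our own sketch): a CNOT gate duplicates the basis states |0⟩ and |1⟩ perfectly, yet fed a superposition it produces an entangled pair rather than two independent copies, and the theorem says no gate can do better.

```python
import numpy as np

# CNOT in the basis order 00, 01, 10, 11 (first qubit is the control)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

plus = np.array([1, 1]) / np.sqrt(2)      # |+> = (|0> + |1>)/sqrt(2)
zero = np.array([1, 0])

attempt = CNOT @ np.kron(plus, zero)      # try to "copy" |+> onto |0>
true_copy = np.kron(plus, plus)           # what a genuine clone would be

print(np.round(attempt, 3))    # [0.707 0. 0. 0.707]: an entangled Bell state
print(np.round(true_copy, 3))  # [0.5 0.5 0.5 0.5]: not the same vector!
```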
