Programming Quantum Neural Networks on NISQ Systems: An Overview of Technologies and Methodologies

Stefano Markidis

Special Issue: Quantum Machine Learning 2022, edited by Prof. Dr. Andreas Wichert
https://doi.org/10.3390/e25040694
Abstract: Noisy Intermediate-Scale Quantum (NISQ) systems and associated programming interfaces
make it possible to explore and investigate the design and development of quantum computing
techniques for Machine Learning (ML) applications. Among the most recent quantum ML approaches,
Quantum Neural Networks (QNNs) emerged as an important tool for data analysis. With the advent of QNNs, higher-level programming interfaces for them have been developed. In this paper, we survey
the current state-of-the-art high-level programming approaches for QNN development. We discuss
target architectures, critical QNN algorithmic components, such as the hybrid workflow of Quantum
Annealers and Parametrized Quantum Circuits, QNN architectures, optimizers, gradient calculations,
and applications. Finally, we overview the existing programming QNN frameworks, their software
architecture, and associated quantum simulators.
Keywords: Quantum Neural Networks; QNN programming frameworks; Amazon Braket; D-Wave
Ocean; Intel HQCL; Microsoft QDK; Nvidia CUDA Quantum; OriginQ QPanda; Qiskit Machine Learning; PennyLane; Rigetti Grove; Strawberry Fields; TensorFlow Quantum; Torch Quantum;
Zapata Orquestra
Citation: Markidis, S. Programming Quantum Neural Networks on NISQ Systems: An Overview of Technologies and Methodologies. Entropy 2023, 25, 694. https://doi.org/10.3390/e25040694. Academic Editor: Andreas Wichert. Received: 1 March 2023; Revised: 3 April 2023; Accepted: 4 April 2023; Published: 20 April 2023. Copyright: © 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction

Quantum computing is emerging as a disruptive and promising approach to attacking computational and data analysis problems. Quantum computing relies on three essential quantum effects that are not directly accessible to classical computing systems [1,2]: (i) calculation on a superposition of quantum states, somewhat reminiscent of parallel computing; (ii) entanglement to correlate different quantum states; and (iii) quantum tunneling. These three effects can be exploited to seek the so-called quantum advantage [3] over classical algorithms by, for instance, computing on a superposition or hopping between optimization landscapes via quantum tunneling. The first critical quantum computing applications with a quantum advantage are in the areas of cryptology and search algorithms, with Shor's and Grover's algorithms being the most famous. Today, researchers' attention is turning to the possibility of developing quantum Machine Learning (ML) applications [4,5] for classical and quantum data, e.g., data encoded as a superposition of quantum states resulting from quantum simulations or sensing.

The early quantum ML approaches rely on the so-called quantum Basic Linear Algebra Subprograms (qBLAS) primitives [4]. Examples of qBLAS routines are the Quantum Fourier Transform (QFT), Quantum Phase Estimation (QPE) for obtaining eigenstates and eigenphases, and the Harrow–Hassidim–Lloyd (HHL) algorithm for solving linear systems [6]. These qBLAS-based ML methods are quantum counterparts of classical ML approaches, such as quantum Principal Component Analysis (PCA) [7], quantum regression with least-square fitting [8], quantum topological analysis [9], quantum Bayesian inference [10], and the quantum Support Vector Machine (SVM) [11]. While these quantum ML methods exhibit a clear quantum advantage with respect to the corresponding classical algorithms, severe constraints, such as embedding classical data into quantum states, the need for quantum memory (qRAM) [12], and output analysis and post-processing, limit their immediate applicability on Noisy Intermediate-Scale Quantum (NISQ) computers [13].
Conversely, a second family of quantum ML methods, based on heuristics and hybrid
classical-quantum computing instead of purely quantum BLAS primitives, can readily
exploit the NISQ systems, albeit not demonstrating a crystal clear quantum advantage
yet [14,15]. These methods target the development of the so-called Quantum Neural
Networks (QNN). Similarly to classical Neural Networks (NNs), in QNNs an optimization process provides the weights and biases of the network by minimizing a loss function measured (or sampled) on the quantum computer. This survey focuses on this second family of quantum ML methods that can readily use NISQ systems.
With quantum computer hardware becoming widely available on several cloud services (e.g., via IBM, Google, Rigetti, Amazon Braket, and Microsoft Azure Quantum, to mention a few examples), there is increased interest on the quantum software side, specifically in designing and developing programming abstractions, patterns, and templates to assist application developers and data scientists in implementing QNNs in a productive and high-performance manner.
Regarding software development for quantum computing, there is already an ecosys-
tem of programming approaches to express quantum algorithms in terms of the quantum
gate and circuit abstractions. Examples of established programming systems [16] for the quantum computing model are QASM [17], akin to an assembly language for classical CPUs, IBM's Qiskit [18], Google's Cirq, and Rigetti's PyQuil [19], to mention a few. These
programming models use an offloading paradigm, similar to the one used for Graphical
Processing Units (GPU) programming languages: the quantum language provides means
to define quantum circuits on a CPU, offload or launch the quantum circuit on the QPU from
the CPU (via a connection to the cloud), execute the circuit, measure an observable several
times, and finally return the measurements to the CPU.
While these programming systems enable the development and implementation of
quantum computing primitives, such as QFT and QPE, data scientists and application
developers require higher-level programming models that allow them to express their algo-
rithms in terms of quantum neural units, layers, loss functions, optimizers, and automatic
differentiation (to cite a few of the technologies critical to QNN development). Higher-level
programming frameworks, such as TensorFlow [20] or PyTorch [21] for quantum comput-
ers, are needed to increase the programmer’s productivity in developing applications on
quantum computers. In addition, together with means to express neural network concepts
and abstractions for training QNNs, quantum programming frameworks must integrate
with classical deep-learning frameworks to leverage existing software infrastructure.
In recent years, QNN software has bloomed, leading to the transition of classical NN software to quantum-enabled versions (examples are TensorFlow Quantum and Torch Quantum), the development of QNN abstractions and templates on top of existing quantum computing frameworks (for instance, the Qiskit machine learning library built on top of IBM Qiskit), and the creation of new programming frameworks, such as Xanadu's PennyLane, targeting specifically differentiable programming and QNNs.
This article aims to provide an outlook on the different technologies and methodolo-
gies used for developing QNNs, and an overview of existing higher-level QNN program-
ming frameworks. Section 2 reviews the current target quantum computer architectures,
approaches for implementing QNNs, and methodologies, including QNN approaches,
optimizers, differentiation techniques, and applications. In Section 3, we overview different
and emerging software frameworks for developing QNNs, emphasizing characteristic fea-
tures, software organization, and associated computer simulators. Finally, we summarize
the review and outline future challenges for QNN frameworks in Section 4.
technologies in use, such as QNN building blocks, optimizers, and automatic differentia-
tion techniques.
NNs [40], such as Hopfield networks [41], Boltzmann machines [42], Restricted Boltzmann
Machines (RBM) [43], and used as a part of the Deep Belief Network (DBN) model [44].
[Figure 1 (diagram): The hybrid QA workflow. Formulate the loss function with the Ising model / QUBO matrix; embed the loss function into the Chimera graph of the Quantum Annealer; let the annealer transition to the ground state; sample the solution (the minimum of the loss function and the corresponding weights and biases) several times to obtain a distribution; resample as needed.]
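The sampling step of this workflow can be mimicked classically. The sketch below, a toy stand-in rather than a real annealer interface, draws bit-strings from a Boltzmann distribution over a small QUBO's energies, so that repeated "anneals" yield a distribution peaked at the ground state (the QUBO matrix, inverse temperature, and shot count are illustrative assumptions):

```python
import itertools, math, random

def qubo_energy(x, Q):
    """Energy of bit-string x under QUBO matrix Q: E(x) = sum_ij Q_ij x_i x_j."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def sample_solutions(Q, n, shots=2000, beta=3.0, seed=0):
    """Mimic the QA sampling step: draw bit-strings with probability
    proportional to exp(-beta * E), then histogram the results."""
    rng = random.Random(seed)
    states = list(itertools.product([0, 1], repeat=n))
    weights = [math.exp(-beta * qubo_energy(x, Q)) for x in states]
    total = sum(weights)
    counts = {}
    for _ in range(shots):
        r, acc = rng.random() * total, 0.0
        for x, w in zip(states, weights):
            acc += w
            if r <= acc:
                counts[x] = counts.get(x, 0) + 1
                break
    return counts

# Toy 2-variable QUBO whose ground state is x = (1, 0) with energy -1
Q = [[-1.0, 2.0], [0.0, 0.5]]
counts = sample_solutions(Q, 2)
best = max(counts, key=counts.get)   # most frequent sample ~ ground state
```

The histogram of repeated samples plays the role of the distribution obtained from many annealing runs; on real hardware, noise and a finite annealing schedule take the place of the Boltzmann weights assumed here.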
Finally, similarly to NN, we can use the back-propagation step to update the QNN
parameters. The loss function value drives an optimization step to determine new updated
parameter values (w and b) to minimize the loss function. We repeat this process for each
training sample. An essential point about PQC loss functions is that they are not limited to QUBO problems, as in QA, but are more general; in fact, it is possible to solve Ising problems with PQCs.
QNNs implemented with PQCs are a very active and fast-growing research area. Several QNN architectures, often mimicking their classical counterparts, have been developed, including quantum fully connected, convolutional [30,50]/quanvolutional [51], recurrent [30], GAN [52], and tensor networks [53].
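The equivalence between the Ising and QUBO formulations rests on the standard substitution s_i = 2x_i − 1 mapping spins to bits. A minimal sketch in plain Python (the three-spin problem and all coefficients are illustrative, not taken from the paper):

```python
import itertools

def ising_to_qubo(h, J):
    """Convert Ising energy E(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j
    (spins s_i in {-1, +1}) to QUBO form E(x) = sum_ij Q_ij x_i x_j + offset
    (bits x_i in {0, 1}) via the substitution s_i = 2 x_i - 1."""
    n = len(h)
    Q = [[0.0] * n for _ in range(n)]
    offset = -sum(h)                 # constant terms from -h_i
    for i in range(n):
        Q[i][i] += 2.0 * h[i]        # linear terms sit on the diagonal (x_i^2 = x_i)
    for (i, j), coupling in J.items():
        Q[i][j] += 4.0 * coupling
        Q[i][i] -= 2.0 * coupling
        Q[j][j] -= 2.0 * coupling
        offset += coupling
    return Q, offset

def ising_energy(s, h, J):
    return sum(h[i] * s[i] for i in range(len(h))) + \
           sum(c * s[i] * s[j] for (i, j), c in J.items())

def qubo_energy(x, Q, offset):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n)) + offset

# Toy 3-spin problem
h = [0.5, -1.0, 0.25]
J = {(0, 1): 1.0, (1, 2): -0.75}
Q, offset = ising_to_qubo(h, J)

# Brute-force check: both formulations agree on every configuration
for x in itertools.product([0, 1], repeat=3):
    s = [2 * xi - 1 for xi in x]
    assert abs(ising_energy(s, h, J) - qubo_energy(x, Q, offset)) < 1e-12
```

Because the two energies agree configuration by configuration, a loss function posed in one form can always be solved in the other, which is why PQC methods can tackle Ising problems and QA methods QUBO ones.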
A significant research effort addresses the so-called barren plateau problem [54] in the PQC optimization landscape: in several PQCs, the average value of the gradient tends to zero and, as the Hilbert-space dimension increases, an increasing fraction of states leads to a flat optimization landscape. For this reason, the optimizer cannot converge to the minimum of the loss function. To address this issue, a few techniques have been proposed, including an initialization strategy that randomly initializes only a subset of the parameters [55], using a local instead of a global loss function [56], and data re-uploading [57].
Figure 2. Diagram of the basic workflow for training a PQC-based QNN: qubits initialized in |0⟩ pass through a parametrized circuit U(w, b); the loss function L(w(t), b(t)) is evaluated from measurements, the parameters are updated (w(t) → w(t+1)), and the circuit is re-executed.
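The update/re-execute loop of Figure 2 can be sketched with a toy one-qubit, one-parameter PQC, assuming the exact expectation ⟨Z⟩ = cos θ of RY(θ)|0⟩ in place of a real device, and the parameter-shift rule for the gradient (the target value and learning rate are illustrative):

```python
import math

def expval_z(theta):
    """<Z> after applying RY(theta) to |0>: the state is
    cos(theta/2)|0> + sin(theta/2)|1>, so <Z> = cos(theta)."""
    return math.cos(theta)

def loss(theta, target=-0.5):
    """Squared distance between the measured expectation and a target value."""
    return (expval_z(theta) - target) ** 2

def parameter_shift_grad(theta, target=-0.5):
    """Gradient of the loss: the parameter-shift rule gives the exact
    derivative of <Z>, then the classical chain rule handles the square."""
    d_expval = 0.5 * (expval_z(theta + math.pi / 2) - expval_z(theta - math.pi / 2))
    return 2.0 * (expval_z(theta) - target) * d_expval

theta, lr = 0.1, 0.5
for _ in range(200):                  # the update / re-execute loop of Figure 2
    theta -= lr * parameter_shift_grad(theta)

assert loss(theta) < 1e-6             # converged to <Z> = -0.5
```

On real hardware, `expval_z` would be estimated from repeated shots rather than computed exactly, but the loop structure, measure, update, re-execute, is the same.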
layers have a CNOT gate chain connecting every qubit with its neighbor. Strongly
entangling layers feature a CNOT gate chain also connecting non-neighbor qubits [65].
Random entangling layers have single qubit rotations and CNOT gates, acting on ran-
domly chosen qubits. Another entangling layer is the so-called 2-design, consisting of
qubit rotations and Controlled-Z (CZ gate) entangling layers [56].
• Pooling Layers. Pooling layers reduce the quantum circuit size, typically by grouping together several qubits and performing operations that reduce the quantum state dimensionality. A common way to implement pooling layers is to measure a subset of the qubits and then use the measurement outcomes to control the following operations. Pooling layers are an important component of quantum convolutional networks [66].
• Measurement Layers. Measurement layers are used to extract classical information (bits) from the superposition of quantum states in the QNN. Measurement layers are typically single-qubit measurements of the output qubits that provide classical values for the QNN output.
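As a concrete sketch of a basic entangling layer (single-qubit rotations followed by a CNOT chain over neighboring qubits), the layer unitary can be built explicitly with NumPy; the three-qubit size and rotation angles are illustrative, and no specific framework is assumed:

```python
import numpy as np

def kron_all(mats):
    """Tensor product of a list of single-qubit matrices."""
    out = np.eye(1)
    for m in mats:
        out = np.kron(out, m)
    return out

def ry(theta):
    """Single-qubit RY rotation."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def cnot(control, target, n):
    """CNOT as a 2^n x 2^n permutation matrix acting on qubits control/target."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for basis in range(dim):
        bits = [(basis >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        out = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        U[out, basis] = 1.0
    return U

def basic_entangling_layer(thetas):
    """One RY rotation per qubit, then a CNOT chain over neighbors."""
    n = len(thetas)
    U = kron_all([ry(t) for t in thetas])
    for q in range(n - 1):
        U = cnot(q, q + 1, n) @ U
    return U

U = basic_entangling_layer([0.3, 1.2, -0.7])
assert np.allclose(U.conj().T @ U, np.eye(8))   # the layer is unitary
```

Strongly entangling layers would extend the CNOT chain to non-neighbor pairs; the construction is otherwise identical.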
In addition, the basic CV QNN layer consists of displacement and squeezing gates, interferometers that mimic the linear transformation of a neural network, and a Kerr gate that introduces nonlinearity to mimic the neural network activation function [30]. Figure 3 shows a few simple QNN layer examples used to construct the full PQC.
[Figure 3: Example QNN layer circuits on four qubits: an encoding block S(x) followed by parametrized unitaries U(θ1)–U(θ4), single-qubit rotations R(θ1)–R(θ4), and entangling CNOT gates.]
How to compose QNN layers automatically into a PQC for solving a specific problem while minimizing the impact of noise on real quantum machines is an active research area and has led to the development of the SuperCircuit [67] and Supernet [68] approaches.
2.7. Applications
QNNs have been used in many applications, similarly to classical NNs. QAs and D-Wave machines are among the most successful quantum computing platforms in existing QML applications. A few examples include image classification (MNIST dataset) [44,84],
computational biology [85], and high-energy physics [86]. The PennyLane QNN framework
has found applications in image classification [87], cyber-security [88], medical [89], and
high-energy physics problems [90,91]. TensorFlow Quantum has been used for image
classification [66], remote sensing [92], and medical applications [93].
allows us to map the problem defined in the problem definition phase onto the
hardware qubits of the QA.
• Utilities. This component provides a set of utility functions that can be used to
analyze the results of the quantum annealing runs, visualize the embeddings, and
debug the models.
OpenJIJ (https://github.com/OpenJij/OpenJij, accessed on 3 April 2023) is an open-source library that simulates QAs and can be used for experimentation without access to D-Wave computers.
QPanda provides both C++ and Python interfaces. Regarding PQC development, QPanda
exploits the quantum machine learning VQNet library [103,104]. QPanda also provides
several noiseless and adjustable simulation backends.
3.7. PennyLane
PennyLane is a Python library designed explicitly for differentiable computing, focus-
ing on QNNs and quantum simulations. PennyLane is developed by Xanadu and is one of
the best existing tools for prototyping and designing new QNN methods and architectures.
The PennyLane framework can be divided into the following software components:
• PennyLane Templates. This software component provides higher-level building blocks for constructing QNNs: a library of ready-to-use templates of widely used PQC architectures. For instance, templates can be used to encode data into quantum states or to select pre-made QNN layers.
• Gradients and Training. This software layer provides optimization tools to train the
quantum circuits. It includes automatic differentiation libraries, such as libraries from
NumPy [105], PyTorch [21], JAX [106], and TensorFlow [20], and integrates them into
the quantum computing framework.
• Quantum Operators and Measurements. This software layer provides different quan-
tum operators, including quantum gates, noisy channels, state preparations, and
measurements. As for the measurement, PennyLane supports results from quantum
devices: observable expectation, its variance, single measurement samples, and com-
putational basis state probabilities.
• Quantum Circuit/Device. This software component provides the interface between the software and the hardware. In PennyLane, calculations involving the execution of
one or more quantum circuits are formulated as quantum node objects. The quantum
nodes are used to express the quantum circuit, pin the computation to a specific
device, and execute it. This software layer comprises PennyLane plugins for different
quantum hardware devices and simulators. These plugins enable users to execute
quantum circuits on different devices and return the measurement outcomes.
PennyLane provides several quantum computer simulators, including a state simula-
tor of qubit-based quantum systems, Gaussian states (for operations on CV architectures),
qubit-based quantum circuit architectures written in TensorFlow for automatic differentia-
tion, and qubit-based quantum circuit architectures for automatic differentiation with the
autograd library [107].
classes provide methods for configuring the PQC, its initialization, and performing
the forward and backward passes.
• Classifiers and Regressors. To train and use Quantum Neural Networks, qiskit-machine-learning provides different learning algorithms, such as the NeuralNetworkClassifier and NeuralNetworkRegressor. These take a QNN as input and then use it for classification or regression. Two convenience implementations are provided to allow an easy start: the Variational Quantum Classifier (VQC) and the Variational Quantum Regressor (VQR).
• Qiskit. At the bottom of the qiskit-machine-learning software stack, there is
Qiskit that provides quantum gate and circuits primitives (including parametrized
gates), gradients, and optimizers.
In addition, qiskit-machine-learning provides a connector to PyTorch for implementing hybrid classical-quantum NNs, e.g., networks in which some nodes are classical and some are quantum. This hybrid architecture is obtained by embedding a quantum layer in a classical PyTorch network. Regarding quantum computer simulators, the Qiskit Aer module provides different quantum computer simulator backends, including ideal and noisy state-vector, density-matrix, and unitary simulation backends.
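The hybrid pattern, backpropagating through a quantum node embedded between classical nodes, can be sketched framework-free: a one-qubit expectation stands in for the embedded quantum layer, and its parameter-shift gradient enters the classical chain rule (all names and values below are illustrative, not the qiskit-machine-learning API):

```python
import math

def quantum_node(angle):
    """Stand-in for the embedded quantum layer: <Z> of RY(angle)|0> = cos(angle)."""
    return math.cos(angle)

def quantum_node_grad(angle):
    """Parameter-shift rule: d<Z>/d(angle) = (<Z>(a + pi/2) - <Z>(a - pi/2)) / 2."""
    return 0.5 * (quantum_node(angle + math.pi / 2) - quantum_node(angle - math.pi / 2))

def hybrid_forward(x, w_in, w_out, b):
    # classical pre-processing -> quantum node -> classical post-processing
    return w_out * quantum_node(w_in * x) + b

def hybrid_grad_w_in(x, w_in, w_out, b):
    # chain rule: the classical factors x and w_out flow around the quantum gradient
    return w_out * quantum_node_grad(w_in * x) * x

x, w_in, w_out, b = 0.8, 1.5, 2.0, 0.1
analytic = -w_out * x * math.sin(w_in * x)        # closed form for this toy model
assert abs(hybrid_grad_w_in(x, w_in, w_out, b) - analytic) < 1e-12
```

A PyTorch connector automates exactly this composition: the quantum layer exposes forward and backward passes, and the framework's autograd stitches them into the surrounding classical graph.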
3.14. Summary
To summarize the features of the different QNN programming frameworks, we provide an overview in Table 1, listing the target quantum architectures (including, possibly, those of future implementations), the main programming languages, the availability of quantum simulators, and the distinctive features of each framework.
Table 1. Overview of different QNN frameworks for programming QNN on NISQ systems.
4. Conclusions
In this paper, we surveyed the current state-of-the-art high-level programming ap-
proaches for QNN development. We discussed target architectures, quantum data, critical
QNN algorithmic components, such as the hybrid workflow of QA and PQC, optimizers,
and techniques for performing gradient calculations on quantum computer hardware and
simulators. We also presented existing QNN programming frameworks. The field of QNN methods and programming frameworks evolves quickly, and new techniques and methods will certainly be developed to tackle current QNN limitations. Currently, one of the main QNN challenges is addressing the barren plateau problem in the optimization landscape.
Additional quantum computer architectures will become available to QNN developers and users in the future. Examples are PsiQuantum's photonic fusion-based quantum chip [112] and Microsoft's topological quantum computers [113]. Despite the potential Cambrian explosion of different quantum computer architectures, programming these new quantum systems will likely retain the existing quantum computing abstractions (gates, circuits, measurements, QNN layers, ...) and reuse existing programming approaches to ensure portability across different platforms, an issue already important in the HPC field.
An example of a portable quantum programming framework is PennyLane, which allows
for developing specific plugins to support different and possibly new QPU devices.
Following the existing development of machine learning frameworks, such as TensorFlow, it is likely that in the future, QNN frameworks will rely more and more on domain-specific languages and compiler technologies to provide an Intermediate Representation (IR) that can be translated to different quantum hardware (and simulator) backends. Compiler toolchains, such as LLVM and MLIR [114–116], are already in use by the Intel Quantum SDK [98] and CUDA Quantum. These technologies might have a prominent role in the future of programming QNNs on quantum computers.
Funding: Funded by the European Union. This work has received funding from the European High
Performance Computing Joint Undertaking (JU) and Sweden, Finland, Germany, Greece, France,
Slovenia, Spain, and the Czech Republic under grant agreement No. 101093261.
Institutional Review Board Statement: Not applicable.
Data Availability Statement: Not applicable.
Conflicts of Interest: The author declares no conflict of interest.
References
1. Nielsen, M.A.; Chuang, I. Quantum Computation and Quantum Information; Cambridge University Press: Cambridge, UK, 2002.
2. Rieffel, E.G.; Polak, W.H. Quantum Computing: A Gentle Introduction; MIT Press: Cambridge, MA, USA, 2011.
3. Bravyi, S.; Gosset, D.; König, R. Quantum advantage with shallow circuits. Science 2018, 362, 308–311. [CrossRef] [PubMed]
4. Biamonte, J.; Wittek, P.; Pancotti, N.; Rebentrost, P.; Wiebe, N.; Lloyd, S. Quantum machine learning. Nature 2017, 549, 195–202.
[CrossRef]
5. Schuld, M.; Petruccione, F. Supervised Learning with Quantum Computers; Springer: Berlin/Heidelberg, Germany, 2018; Volume 17.
6. Harrow, A.W.; Hassidim, A.; Lloyd, S. Quantum algorithm for linear systems of equations. Phys. Rev. Lett. 2009, 103, 150502.
[CrossRef] [PubMed]
7. Lloyd, S.; Mohseni, M.; Rebentrost, P. Quantum principal component analysis. Nat. Phys. 2014, 10, 631–633. [CrossRef]
8. Wiebe, N.; Braun, D.; Lloyd, S. Quantum algorithm for data fitting. Phys. Rev. Lett. 2012, 109, 050505. [CrossRef]
9. Lloyd, S.; Garnerone, S.; Zanardi, P. Quantum algorithms for topological and geometric analysis of data. Nat. Commun. 2016,
7, 10138. [CrossRef]
10. Low, G.H.; Yoder, T.J.; Chuang, I.L. Quantum inference on Bayesian networks. Phys. Rev. A 2014, 89, 062315. [CrossRef]
11. Rebentrost, P.; Mohseni, M.; Lloyd, S. Quantum support vector machine for big data classification. Phys. Rev. Lett. 2014,
113, 130503. [CrossRef]
12. Giovannetti, V.; Lloyd, S.; Maccone, L. Quantum random access memory. Phys. Rev. Lett. 2008, 100, 160501. [CrossRef]
13. Preskill, J. Quantum computing in the NISQ era and beyond. Quantum 2018, 2, 79. [CrossRef]
14. Schuld, M.; Killoran, N. Is quantum advantage the right goal for quantum machine learning? Prx Quantum 2022, 3, 030101.
[CrossRef]
15. Boixo, S.; Smelyanskiy, V.N.; Shabani, A.; Isakov, S.V.; Dykman, M.; Denchev, V.S.; Amin, M.H.; Smirnov, A.Y.; Mohseni, M.;
Neven, H. Computational multiqubit tunnelling in programmable quantum annealers. Nat. Commun. 2016, 7, 10327. [CrossRef]
[PubMed]
16. Heim, B.; Soeken, M.; Marshall, S.; Granade, C.; Roetteler, M.; Geller, A.; Troyer, M.; Svore, K. Quantum programming languages.
Nat. Rev. Phys. 2020, 2, 709–722. [CrossRef]
17. Cross, A.; Javadi, A.; Alexander, T.; Bishop, L.; Ryan, C.A.; Heidel, S.; de Beaudrap, N.; Smolin, J.; Gambetta, J.; Johnson, B.R.
Open Quantum Assembly Language. In Proceedings of the ACM SIGPLAN Conference on Programming Language Design and
Implementation, Virtual, 20 June 2021.
18. Wille, R.; Van Meter, R.; Naveh, Y. IBM’s Qiskit tool chain: Working with and developing for real quantum computers.
In Proceedings of the 2019 Design, Automation & Test in Europe Conference & Exhibition (2019), Florence, Italy, 25–29 March
2019; pp. 1234–1240.
19. Smith, R.S.; Curtis, M.J.; Zeng, W.J. A practical quantum instruction set architecture. arXiv 2016, arXiv:1608.03355.
20. Abadi, M.; Barham, P.; Chen, J.; Chen, Z.; Davis, A.; Dean, J.; Devin, M.; Ghemawat, S.; Irving, G.; Isard, M.; et al. Tensorflow:
A system for large-scale machine learning. In Proceedings of the Osdi, Savannah, GA, USA, 2–4 November 2016; Volume 16,
pp. 265–283.
21. Paszke, A.; Gross, S.; Massa, F.; Lerer, A.; Bradbury, J.; Chanan, G.; Killeen, T.; Lin, Z.; Gimelshein, N.; Antiga, L.; et al. Pytorch:
An imperative style, high-performance deep learning library. In Proceedings of the 33rd International Conference on Neural
Information Processing Systems, Vancouver, Canada, 8 December 2019; pp. 8026–8037.
22. Yarkoni, S.; Raponi, E.; Bäck, T.; Schmitt, S. Quantum annealing for industry applications: Introduction and review. Rep. Prog.
Phys. 2022, 85, 104001. [CrossRef] [PubMed]
23. Aramon, M.; Rosenberg, G.; Valiante, E.; Miyazawa, T.; Tamura, H.; Katzgraber, H.G. Physics-inspired optimization for quadratic
unconstrained problems using a digital annealer. Front. Phys. 2019, 7, 48. [CrossRef]
24. Nakayama, H.; Koyama, J.; Yoneoka, N.; Miyazawa, T. Description: Third Generation Digital Annealer Technology; Fujitsu Limited:
Tokyo, Japan, 2021.
25. Goto, H. Quantum computation based on quantum adiabatic bifurcations of Kerr-nonlinear parametric oscillators. J. Phys. Soc.
Jpn. 2019, 88, 061015. [CrossRef]
26. Susa, Y.; Nishimori, H. Variational optimization of the quantum annealing schedule for the Lechner-Hauke-Zoller scheme. Phys.
Rev. A 2021, 103, 022619. [CrossRef]
27. Kaye, P.; Laflamme, R.; Mosca, M. An Introduction to Quantum Computing; OUP: Oxford, UK, 2006.
28. Henriet, L.; Beguin, L.; Signoles, A.; Lahaye, T.; Browaeys, A.; Reymond, G.O.; Jurczak, C. Quantum computing with neutral
atoms. Quantum 2020, 4, 327. [CrossRef]
29. Lloyd, S.; Braunstein, S.L. Quantum computation over continuous variables. Phys. Rev. Lett. 1999, 82, 1784. [CrossRef]
30. Killoran, N.; Bromley, T.R.; Arrazola, J.M.; Schuld, M.; Quesada, N.; Lloyd, S. Continuous-variable quantum neural networks.
Phys. Rev. Res. 2019, 1, 033063. [CrossRef]
31. Markidis, S. On the Physics-Informed Neural Networks for Quantum Computers. arXiv 2022, arXiv:2209.14754.
32. LaRose, R.; Coyle, B. Robust data encodings for quantum classifiers. Phys. Rev. A 2020, 102, 032420. [CrossRef]
33. Broughton, M.; Verdon, G.; McCourt, T.; Martinez, A.J.; Yoo, J.H.; Isakov, S.V.; Massey, P.; Halavati, R.; Niu, M.Y.; Zlokapa, A.;
et al. Tensorflow quantum: A software framework for quantum machine learning. arXiv 2020, arXiv:2003.02989.
34. McClean, J.R.; Rubin, N.C.; Sung, K.J.; Kivlichan, I.D.; Bonet-Monroig, X.; Cao, Y.; Dai, C.; Fried, E.S.; Gidney, C.; Gimby, B.; et al.
OpenFermion: The electronic structure package for quantum computers. Quantum Sci. Technol. 2020, 5, 034014. [CrossRef]
35. Sun, Q.; Berkelbach, T.C.; Blunt, N.S.; Booth, G.H.; Guo, S.; Li, Z.; Liu, J.; McClain, J.D.; Sayfutyarova, E.R.; Sharma, S.; et al.
PySCF: The Python-based simulations of chemistry framework. Wiley Interdiscip. Rev. Comput. Mol. Sci. 2018, 8, e1340. [CrossRef]
36. Hu, F.; Wang, B.N.; Wang, N.; Wang, C. Quantum machine learning with D-wave quantum computer. Quantum Eng. 2019, 1, e12.
[CrossRef]
37. Nath, R.K.; Thapliyal, H.; Humble, T.S. A review of machine learning classification using quantum annealing for real-world
applications. SN Comput. Sci. 2021, 2, 365. [CrossRef]
38. Boothby, T.; King, A.D.; Roy, A. Fast clique minor generation in Chimera qubit connectivity graphs. Quantum Inf. Process. 2016,
15, 495–508. [CrossRef]
39. Klymko, C.; Sullivan, B.D.; Humble, T.S. Adiabatic quantum programming: Minor embedding with hard faults. Quantum Inf.
Process. 2014, 13, 709–729. [CrossRef]
40. MacKay, D.J.C. Information Theory, Inference and Learning Algorithms; Cambridge University Press: Cambridge, UK, 2003.
41. Bauckhage, C.; Sanchez, R.; Sifa, R. Problem solving with Hopfield networks and adiabatic quantum computing. In Proceedings
of the 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; pp. 1–6.
42. Dorband, J.E. A Boltzmann machine implementation for the D-Wave. In Proceedings of the 2015 12th International Conference on Information Technology-New Generations, Las Vegas, NV, USA, 13–15 April 2015; pp. 703–707.
43. Dixit, V.; Selvarajan, R.; Alam, M.A.; Humble, T.S.; Kais, S. Training restricted Boltzmann machines with a D-Wave quantum annealer. Front. Phys. 2021, 9, 589626. [CrossRef]
44. Adachi, S.H.; Henderson, M.P. Application of quantum annealing to training of deep neural networks. arXiv 2015,
arXiv:1510.06356.
45. Benedetti, M.; Lloyd, E.; Sack, S.; Fiorentini, M. Parameterized quantum circuits as machine learning models. Quantum Sci.
Technol. 2019, 4, 043001. [CrossRef]
46. Farhi, E.; Neven, H. Classification with quantum neural networks on near term processors. arXiv 2018, arXiv:1802.06002.
47. Chen, H.; Wossnig, L.; Severini, S.; Neven, H.; Mohseni, M. Universal discriminative quantum neural networks. Quantum Mach.
Intell. 2021, 3, 1. [CrossRef]
48. Ruder, S. An overview of gradient descent optimization algorithms. arXiv 2016, arXiv:1609.04747.
49. Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980.
50. Cong, I.; Choi, S.; Lukin, M.D. Quantum convolutional neural networks. Nat. Phys. 2019, 15, 1273–1278. [CrossRef]
51. Henderson, M.; Shakya, S.; Pradhan, S.; Cook, T. Quanvolutional neural networks: Powering image recognition with quantum
circuits. Quantum Mach. Intell. 2020, 2, 2. [CrossRef]
52. Huang, H.L.; Du, Y.; Gong, M.; Zhao, Y.; Wu, Y.; Wang, C.; Li, S.; Liang, F.; Lin, J.; Xu, Y.; et al. Experimental quantum generative
adversarial networks for image generation. Phys. Rev. Appl. 2021, 16, 024051. [CrossRef]
53. Huggins, W.; Patil, P.; Mitchell, B.; Whaley, K.B.; Stoudenmire, E.M. Towards quantum machine learning with tensor networks.
Quantum Sci. Technol. 2019, 4, 024001. [CrossRef]
54. McClean, J.R.; Boixo, S.; Smelyanskiy, V.N.; Babbush, R.; Neven, H. Barren plateaus in quantum neural network training
landscapes. Nat. Commun. 2018, 9, 4812. [CrossRef] [PubMed]
55. Grant, E.; Wossnig, L.; Ostaszewski, M.; Benedetti, M. An initialization strategy for addressing barren plateaus in parametrized
quantum circuits. Quantum 2019, 3, 214. [CrossRef]
56. Cerezo, M.; Sone, A.; Volkoff, T.; Cincio, L.; Coles, P.J. Cost function dependent barren plateaus in shallow parametrized quantum
circuits. Nat. Commun. 2021, 12, 1791. [CrossRef]
57. Pérez-Salinas, A.; Cervera-Lierta, A.; Gil-Fuster, E.; Latorre, J.I. Data re-uploading for a universal quantum classifier. Quantum
2020, 4, 226. [CrossRef]
58. Wolf, M.M. Quantum Channels & Operations: Guided Tour; Lecture Notes; Niels-Bohr Institute: Copenhagen, Denmark, 2012;
Volume 5, p. 13. Available online: https://mediatum.ub.tum.de/doc/1701036/document.pdf (accessed on 3 April 2023).
59. Schuld, M.; Killoran, N. Quantum machine learning in feature Hilbert spaces. Phys. Rev. Lett. 2019, 122, 040504. [CrossRef]
60. Schuld, M.; Sweke, R.; Meyer, J.J. Effect of data encoding on the expressive power of variational quantum-machine-learning
models. Phys. Rev. A 2021, 103, 032430. [CrossRef]
61. Havlíček, V.; Córcoles, A.D.; Temme, K.; Harrow, A.W.; Kandala, A.; Chow, J.M.; Gambetta, J.M. Supervised learning with
quantum-enhanced feature spaces. Nature 2019, 567, 209–212. [CrossRef]
62. Kyriienko, O.; Paine, A.E.; Elfving, V.E. Solving nonlinear differential equations with differentiable quantum circuits. Phys. Rev.
A 2021, 103, 052416. [CrossRef]
63. Paine, A.E.; Elfving, V.E.; Kyriienko, O. Quantum kernel methods for solving differential equations. arXiv 2022, arXiv:2203.08884.
64. Heim, N.; Ghosh, A.; Kyriienko, O.; Elfving, V.E. Quantum model-discovery. arXiv 2021, arXiv:2111.06376.
65. Schuld, M.; Bocharov, A.; Svore, K.M.; Wiebe, N. Circuit-centric quantum classifiers. Phys. Rev. A 2020, 101, 032308. [CrossRef]
66. Chen, G.; Chen, Q.; Long, S.; Zhu, W.; Yuan, Z.; Wu, Y. Quantum convolutional neural network for image classification. Pattern
Anal. Appl. 2022, 26, 655–667. [CrossRef]
67. Wang, H.; Ding, Y.; Gu, J.; Lin, Y.; Pan, D.Z.; Chong, F.T.; Han, S. QuantumNAS: Noise-adaptive search for robust quantum
circuits. In Proceedings of the 2022 IEEE International Symposium on High-Performance Computer Architecture (HPCA), Seoul,
Republic of Korea, 2–6 April 2022; pp. 692–708.
68. Du, Y.; Huang, T.; You, S.; Hsieh, M.H.; Tao, D. Quantum circuit architecture search for variational quantum algorithms. NPJ
Quantum Inf. 2022, 8, 62. [CrossRef]
69. Zhu, D.; Linke, N.M.; Benedetti, M.; Landsman, K.A.; Nguyen, N.H.; Alderete, C.H.; Perdomo-Ortiz, A.; Korda, N.; Garfoot, A.;
Brecque, C.; et al. Training of quantum circuits on a hybrid quantum computer. Sci. Adv. 2019, 5, eaaw9918. [CrossRef] [PubMed]
70. Nelder, J.A.; Mead, R. A simplex method for function minimization. Comput. J. 1965, 7, 308–313. [CrossRef]
71. Bonet-Monroig, X.; Wang, H.; Vermetten, D.; Senjean, B.; Moussa, C.; Bäck, T.; Dunjko, V.; O’Brien, T.E. Performance comparison
of optimization methods on variational quantum algorithms. arXiv 2021, arXiv:2111.13454.
72. Virtanen, P.; Gommers, R.; Oliphant, T.E.; Haberland, M.; Reddy, T.; Cournapeau, D.; Burovski, E.; Peterson, P.; Weckesser, W.;
Bright, J.; et al. SciPy 1.0: Fundamental algorithms for scientific computing in Python. Nat. Methods 2020, 17, 261–272. [CrossRef]
73. Sweke, R.; Wilde, F.; Meyer, J.; Schuld, M.; Fährmann, P.K.; Meynard-Piganeau, B.; Eisert, J. Stochastic gradient descent for hybrid
quantum-classical optimization. Quantum 2020, 4, 314. [CrossRef]
74. Spall, J.C. An overview of the simultaneous perturbation method for efficient optimization. Johns Hopkins APL Tech. Dig. 1998,
19, 482–492.
75. Stokes, J.; Izaac, J.; Killoran, N.; Carleo, G. Quantum natural gradient. Quantum 2020, 4, 269. [CrossRef]
76. Amari, S.I. Natural gradient works efficiently in learning. Neural Comput. 1998, 10, 251–276. [CrossRef]
77. Baydin, A.G.; Pearlmutter, B.A.; Radul, A.A.; Siskind, J.M. Automatic differentiation in machine learning: A survey. J. Mach.
Learn. Res. 2018, 18, 1–43.
78. Bergholm, V.; Izaac, J.; Schuld, M.; Gogolin, C.; Alam, M.S.; Ahmed, S.; Arrazola, J.M.; Blank, C.; Delgado, A.; Jahangiri, S.; et al.
PennyLane: Automatic differentiation of hybrid quantum-classical computations. arXiv 2018, arXiv:1811.04968.
79. Guerreschi, G.G.; Smelyanskiy, M. Practical optimization for hybrid quantum-classical algorithms. arXiv 2017, arXiv:1701.01450.
80. Schuld, M.; Bergholm, V.; Gogolin, C.; Izaac, J.; Killoran, N. Evaluating analytic gradients on quantum hardware. Phys. Rev. A
2019, 99, 032331. [CrossRef]
81. Wierichs, D.; Izaac, J.; Wang, C.; Lin, C.Y.Y. General parameter-shift rules for quantum gradients. Quantum 2022, 6, 677. [CrossRef]
82. Jones, T.; Gacon, J. Efficient calculation of gradients in classical simulations of variational quantum algorithms. arXiv 2020,
arXiv:2009.02823.
83. Koczor, B.; Benjamin, S.C. Quantum analytic descent. Phys. Rev. Res. 2022, 4, 023017. [CrossRef]
84. Liu, J.; Spedalieri, F.M.; Yao, K.T.; Potok, T.E.; Schuman, C.; Young, S.; Patton, R.; Rose, G.S.; Chamka, G. Adiabatic quantum
computation applied to deep learning networks. Entropy 2018, 20, 380. [CrossRef]
85. Li, R.Y.; Di Felice, R.; Rohs, R.; Lidar, D.A. Quantum annealing versus classical machine learning applied to a simplified
computational biology problem. NPJ Quantum Inf. 2018, 4, 14. [CrossRef] [PubMed]
86. Mott, A.; Job, J.; Vlimant, J.R.; Lidar, D.; Spiropulu, M. Solving a Higgs optimization problem with quantum annealing for
machine learning. Nature 2017, 550, 375–379. [CrossRef] [PubMed]
87. Konar, D.; Gelenbe, E.; Bhandary, S.; Sarma, A.D.; Cangi, A. Random quantum neural networks (RQNN) for noisy image
recognition. arXiv 2022, arXiv:2203.01764.
88. Suryotrisongko, H.; Musashi, Y. Evaluating hybrid quantum-classical deep learning for cybersecurity botnet DGA detection.
Procedia Comput. Sci. 2022, 197, 223–229. [CrossRef]
89. Shahwar, T.; Zafar, J.; Almogren, A.; Zafar, H.; Rehman, A.U.; Shafiq, M.; Hamam, H. Automated detection of Alzheimer’s via
hybrid classical quantum neural networks. Electronics 2022, 11, 721. [CrossRef]
90. Blance, A.; Spannowsky, M. Quantum machine learning for particle physics using a variational quantum classifier. J. High Energy
Phys. 2021, 2021, 212. [CrossRef]
91. Guan, W.; Perdue, G.; Pesah, A.; Schuld, M.; Terashi, K.; Vallecorsa, S.; Vlimant, J.R. Quantum machine learning in high energy
physics. Mach. Learn. Sci. Technol. 2021, 2, 011003. [CrossRef]
92. Otgonbaatar, S.; Datcu, M. Classification of remote sensing images with parameterized quantum gates. IEEE Geosci. Remote Sens.
Lett. 2021, 19, 8020105. [CrossRef]
93. Sengupta, K.; Srivastava, P.R. Quantum algorithm for quicker clinical prognostic analysis: An application and experimental
study using CT scan images of COVID-19 patients. BMC Med. Inform. Decis. Mak. 2021, 21, 227. [CrossRef]
94. Garcia-Alonso, J.; Rojo, J.; Valencia, D.; Moguel, E.; Berrocal, J.; Murillo, J.M. Quantum software as a service through a quantum
API gateway. IEEE Internet Comput. 2021, 26, 34–41. [CrossRef]
95. Liu, D.C.; Nocedal, J. On the limited memory BFGS method for large scale optimization. Math. Program. 1989, 45, 503–528.
[CrossRef]
96. Zaman, M.; Tanahashi, K.; Tanaka, S. PyQUBO: Python library for mapping combinatorial optimization problems to QUBO form.
IEEE Trans. Comput. 2021, 71, 838–850. [CrossRef]
97. Wu, X.C.; Khalate, P.; Schmitz, A.; Premaratne, S.; Rasch, K.; Daraeizadeh, S.; Kotlyar, R.; Ren, S.; Paykin, J.; Rose, F.; et al. Intel
Quantum SDK Version 1.0: Extended C++ Compiler, Runtime and Quantum Hardware Simulators for Hybrid Quantum-Classical
Applications. Bull. Am. Phys. Soc. 2023. Available online: https://meetings.aps.org/Meeting/MAR23/Session/RR08.4 (accessed
on 3 April 2023).
98. Khalate, P.; Wu, X.C.; Premaratne, S.; Hogaboam, J.; Holmes, A.; Schmitz, A.; Guerreschi, G.G.; Zou, X.; Matsuura, A. An
LLVM-based C++ Compiler Toolchain for Variational Hybrid Quantum-Classical Algorithms and Quantum Accelerators. arXiv
2022, arXiv:2202.11142.
99. Matsuura, A.; Premaratne, S.; Wu, X.C.; Sawaya, N.; Schmitz, A.; Khalate, P.; Daraeizadeh, S.; Guerreschi, G.G.; Khammassi, N.;
Rasch, K.; et al. An Intel Quantum Software Development Kit for Efficient Execution of Variational Algorithms. In Proceedings
of the APS March Meeting Abstracts, Chicago, IL, USA, 14–18 March 2022; Volume 2022, p. N36-006.
100. Wecker, D.; Svore, K.M. LIQUi|>: A software design architecture and domain-specific language for quantum computing. arXiv
2014, arXiv:1402.4467.
101. Ngo, T.A.; Nguyen, T.; Thang, T.C. A Survey of Recent Advances in Quantum Generative Adversarial Networks. Electronics 2023,
12, 856. [CrossRef]
102. Rao, P.; Chandani, Z.; Wilson, A.; Schweitz, E.; Schmitt, B.; Santana, A.; Lelbach, B.; McCaskey, A. Benchmarking of quantum
generative adversarial networks using NVIDIA’s Quantum Optimized Device Architecture. Bull. Am. Phys. Soc. 2023. Available
online: https://meetings.aps.org/Meeting/MAR23/Session/AAA05.4 (accessed on 3 April 2023).
103. Chen, Z.Y.; Xue, C.; Chen, S.M.; Guo, G.P. VQNet: Library for a quantum-classical hybrid neural network. arXiv 2019,
arXiv:1901.09133.
104. Bian, H.; Jia, Z.; Dou, M.; Fang, Y.; Li, L.; Zhao, Y.; Wang, H.; Zhou, Z.; Wang, W.; Zhu, W.; et al. VQNet 2.0: A New Generation
Machine Learning Framework that Unifies Classical and Quantum. arXiv 2023, arXiv:2301.03251.
105. Van der Walt, S.; Colbert, S.C.; Varoquaux, G. The NumPy array: A structure for efficient numerical computation. Comput. Sci.
Eng. 2011, 13, 22–30. [CrossRef]
106. Frostig, R.; Johnson, M.J.; Leary, C. Compiling machine learning programs via high-level tracing. Syst. Mach. Learn. 2018, 4, 1–3.
107. Paszke, A.; Gross, S.; Chintala, S.; Chanan, G.; Yang, E.; DeVito, Z.; Lin, Z.; Desmaison, A.; Antiga, L.; Lerer, A. Automatic
differentiation in pytorch. 2017. Available online: https://openreview.net/forum?id=BJJsrmfCZ (accessed on 3 April 2023).
108. Killoran, N.; Izaac, J.; Quesada, N.; Bergholm, V.; Amy, M.; Weedbrook, C. Strawberry fields: A software platform for photonic
quantum computing. Quantum 2019, 3, 129. [CrossRef]
109. Gulli, A.; Pal, S. Deep Learning with Keras; Packt Publishing Ltd.: Birmingham, UK, 2017.
110. Hibat-Allah, M.; Mauri, M.; Carrasquilla, J.; Perdomo-Ortiz, A. A Framework for Demonstrating Practical Quantum Advantage:
Racing Quantum against Classical Generative Models. arXiv 2023, arXiv:2303.15626.
111. Dou, M.; Zou, T.; Fang, Y.; Wang, J.; Zhao, D.; Yu, L.; Chen, B.; Guo, W.; Li, Y.; Chen, Z.; et al. QPanda: High-performance
quantum computing framework for multiple application scenarios. arXiv 2022, arXiv:2212.14201.
112. Bartolucci, S.; Birchall, P.; Bombin, H.; Cable, H.; Dawson, C.; Gimeno-Segovia, M.; Johnston, E.; Kieling, K.; Nickerson, N.; Pant,
M.; et al. Fusion-based quantum computation. Nat. Commun. 2023, 14, 912. [CrossRef]
113. Nayak, C.; Simon, S.H.; Stern, A.; Freedman, M.; Sarma, S.D. Non-Abelian anyons and topological quantum computation. Rev.
Mod. Phys. 2008, 80, 1083. [CrossRef]
114. McCaskey, A.; Nguyen, T. A MLIR dialect for quantum assembly languages. In Proceedings of the 2021 IEEE International
Conference on Quantum Computing and Engineering (QCE), Broomfield, CO, USA, 17–22 October 2021; pp. 255–264.
115. Ittah, D.; Häner, T.; Kliuchnikov, V.; Hoefler, T. QIRO: A static single assignment-based quantum program representation for
optimization. ACM Trans. Quantum Comput. 2022, 3, 1–32. [CrossRef]
116. Ittah, D.; Häner, T.; Kliuchnikov, V.; Hoefler, T. Enabling dataflow optimization for quantum programs. arXiv 2021,
arXiv:2101.11030.