2018, ArXiv
Clustering is a complex process of finding relevant hidden patterns in unlabeled datasets, broadly known as unsupervised learning. The support vector clustering algorithm is a well-known clustering algorithm based on support vector machines and Gaussian kernels. In this paper, we have investigated the support vector clustering algorithm in the quantum paradigm. We have developed a quantum algorithm based on the quantum support vector machine and a quantum kernel (Gaussian and polynomial) formulation. The investigation exhibits an approximately exponential speed-up in the quantum version with respect to the classical counterpart.
Quantum Information Processing
The support vector clustering algorithm is a well-known clustering algorithm based on support vector machines using Gaussian or polynomial kernels. The classical support vector clustering algorithm works well in general, but its performance degrades when applied on big data. In this paper, we have investigated the performance of the support vector clustering algorithm implemented in a quantum paradigm for possible runtime improvements. We have developed and analyzed a quantum version of the support vector clustering algorithm. The proposed approach is based on the quantum support vector machine [1] and quantum kernels (i.e., Gaussian and polynomial). The classical support vector clustering algorithm converges in O(N²d) runtime complexity, where N is the number of input objects and d is the dimension of the feature space. Our proposed quantum version converges in approximately O(log(Nd)) runtime complexity. The cluster identification phase with the adjacency matrix exhibits O(√(N³)) runtime complexity in the quantum version, whereas the runtime complexity in the classical implementation is O(N²). The proposed quantum version of the SVM clustering method demonstrates a significant speed-up gain in the overall runtime complexity as compared to the classical counterpart.
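For context, the sketch below is a minimal classical Python illustration (not taken from the paper) of the cluster-identification phase referred to above: after the kernel-space sphere has been trained, every pair of points is tested by sampling the line segment between them, and clusters are the connected components of the resulting adjacency matrix. The helper names radius_sq, sphere_r_sq and n_samples are assumptions made for illustration; it is this quadratic pair loop that the quantum formulation is reported to reduce to O(√(N³)).

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def adjacency_matrix(X, radius_sq, sphere_r_sq, n_samples=10):
    # X: (n, d) array of input points. radius_sq(x) and sphere_r_sq are assumed
    # to come from the trained kernel-space sphere (illustrative names).
    n = len(X)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            segment = (X[i] + t * (X[j] - X[i]) for t in np.linspace(0.0, 1.0, n_samples))
            connected = all(radius_sq(p) <= sphere_r_sq for p in segment)
            A[i, j] = A[j, i] = int(connected)
    return A

def cluster_labels(A):
    # Clusters are the connected components of the adjacency graph.
    _, labels = connected_components(csr_matrix(A), directed=False)
    return labels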
Physical review letters, 2014
Supervised machine learning is the classification of new data based on already classified training examples. In this work, we show that the support vector machine, an optimized binary classifier, can be implemented on a quantum computer, with complexity logarithmic in the size of the vectors and the number of training examples. In cases where classical sampling algorithms require polynomial time, an exponential speedup is obtained. At the core of this quantum big data algorithm is a nonsparse matrix exponentiation technique for efficiently performing a matrix inversion of the training data inner-product (kernel) matrix.
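Classically, the optimization that this quantum algorithm accelerates reduces to solving the least-squares SVM linear system built from the kernel matrix. The sketch below is a hedged NumPy illustration of that system (the variable names K, y and gamma are illustrative); the quantum algorithm replaces the dense solve with quantum exponentiation and inversion of the kernel matrix.

import numpy as np

def ls_svm_solve(K, y, gamma=1.0):
    # Solve the least-squares SVM system F [b; alpha] = [0; y],
    # where F = [[0, 1^T], [1, K + I/gamma]].
    n = len(y)
    F = np.zeros((n + 1, n + 1))
    F[0, 1:] = 1.0                      # first row: [0, 1^T]
    F[1:, 0] = 1.0                      # first column: [1, ...]
    F[1:, 1:] = K + np.eye(n) / gamma   # regularized kernel matrix
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(F, rhs)       # classical O(N^3) dense solve
    return sol[0], sol[1:]              # bias b and coefficients alpha

def predict(K_test, alpha, b):
    # K_test[i, j] = k(x_test_i, x_train_j)
    return np.sign(K_test @ alpha + b)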
Machine-learning tasks frequently involve problems of manipulating and classifying large numbers of vectors in high-dimensional spaces. Classical algorithms for solving such problems typically take time polynomial in the number of vectors and the dimension of the space. Quantum computers are good at manipulating high-dimensional vectors in large tensor product spaces. This paper provides supervised and unsupervised quantum machine learning algorithms for cluster assignment and cluster finding. Quantum machine learning can take time logarithmic in both the number of vectors and their dimension, an exponential speed-up over classical algorithms. In machine learning, information processors perform tasks of sorting, assembling, assimilating, and classifying information [1-2]. In supervised learning, the machine infers a function from a set of training examples. In unsupervised learning, the machine tries to find hidden structure in unlabeled data. Recent studies and applications focus in particular on the problem of large-scale machine learning [2] (big data), where the training set and/or the number of features is large. Various results on quantum machine learning investigate the use of quantum information processors to perform machine learning tasks [3-9], including pattern matching [3], Probably Approximately Correct learning [4], feedback learning for quantum measurement [5], binary classifiers [6-7], and quantum support vector machines [8].
arXiv (Cornell University), 2022
Quantum computing is a promising paradigm based on quantum theory for performing fast computations. Quantum algorithms are expected to surpass their classical counterparts in terms of computational complexity for certain tasks, including machine learning. In this paper, we design, implement, and evaluate three hybrid quantum k-Means algorithms, exploiting different degree of parallelism. Indeed, each algorithm incrementally leverages quantum parallelism to reduce the complexity of the cluster assignment step up to a constant cost. In particular, we exploit quantum phenomena to speed up the computation of distances. The core idea is that the computation of distances between records and centroids can be executed simultaneously, thus saving time, especially for big datasets. We show that our hybrid quantum k-Means algorithms can be more efficient than the classical version, still obtaining comparable clustering results.
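The step that these hybrid algorithms parallelize is the cluster-assignment step of k-Means, in which the distance from every record to every centroid must be computed. The sketch below is a purely classical, illustrative version of that step (not the authors' quantum circuits); the quantum variants estimate these distances jointly by encoding records and centroids into quantum states.

import numpy as np

def assign_clusters(X, centroids):
    # Squared Euclidean distance of every record to every centroid, computed
    # sequentially here; the quantum variants evaluate these N x k distances
    # in superposition, collapsing the cost of this step.
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)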
ArXiv, 2019
In this paper, we have proposed a deep quantum SVM formulation and further demonstrated a quantum clustering framework based on the quantum deep SVM formulation, deep convolutional neural networks, and quantum K-Means clustering. We have investigated the runtime computational complexity of the proposed quantum deep clustering framework and compared it with a possible classical implementation. Our investigation shows that the proposed quantum version of the deep clustering formulation demonstrates a significant performance gain (exponential speed-up gains in many components) against the possible classical implementation. The proposed theoretical quantum deep clustering framework is also an interesting and novel research direction towards quantum-classical machine learning formulations aimed at achieving maximum performance.
International Journal of Advanced Research in Computer and Communication Engineering, 2023
The lack of enough labeled data is a great issue when designing a real-life scheme. Data labeling is time-consuming as well as costly. Semi-supervised learning (SSL) is a way to solve the issues of data labeling. SSL uses a tiny quantity of labeled data to find the labels of massive quantities of unlabeled data. This paper presents a quantum-classical SSL mechanism named "Iterative Labels Finding (ILF)" by combining the Quantum Support Vector Machine algorithm (QSVM) and the Ising Models Based Binary Clustering algorithm. The proposed method performs a matching and iteration process to discover the labels of unlabeled data. ILF is designed for binary classification purposes. We have illustrated the experimental results of ILF with a real-time dataset and with a practical example. From the experimental results, we have found ILF to be a highly efficient approach for quantum SSL.
2020 International Joint Conference on Neural Networks (IJCNN), 2020
Recently, more researchers have become interested in the domain of quantum machine learning, as it can manipulate and classify large numbers of vectors in high-dimensional space in reasonable time. In this paper, we propose a new approach called Quantum Collaborative K-means, which is based on combining several clustering models built on quantum K-means. This collaboration consists of exchanging the information of each algorithm locally in order to find a common underlying structure for clustering. Comparing the classical version of collaborative clustering to our approach, we obtain an exponential speed-up: while the classical version takes O(K × L × M × N), the quantum version takes only O(K × L × log(M × N)). Compared to the quantum version of K-means, we also get a better solution in terms of the clustering validation criteria. The empirical evaluations validate the benefits of the proposed approach.
IJEER, 2022
Quantum machine learning (QML) is an evolving field which is capable of surpassing classical machine learning in solving classification and clustering problems. The enormous growth in data size has started creating barriers for classical machine learning techniques. QML stands out as a strong candidate for handling big and complex data. In this paper, quantum support vector machine (QSVM) based models for the classification of three benchmarking datasets, namely Iris species, Pumpkin seed, and Raisin, have been constructed. These QSVM-based classification models are implemented on real-time superconducting quantum computers/simulators. The performance of these classification models is evaluated in terms of execution time and accuracy and compared with the classical support vector machine (SVM) based models. The kernel-based QSVM models for the classification of the datasets, when run on the IBMQ_QASM_simulator, appeared to be 232, 207, and 186 times faster than the SVM-based classification model. The results indicate that quantum computers/algorithms deliver quantum speed-up.
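As a hedged usage sketch (not the exact code from the paper), a kernel-based QSVM classifier of this kind can be assembled with the qiskit-machine-learning package, assuming its QSVC and FidelityQuantumKernel interfaces:

from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Iris is one of the three datasets named above; split into train/test sets.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Encode classical features into a quantum state and build a fidelity kernel.
feature_map = ZZFeatureMap(feature_dimension=X.shape[1], reps=2)
kernel = FidelityQuantumKernel(feature_map=feature_map)

qsvc = QSVC(quantum_kernel=kernel)
qsvc.fit(X_tr, y_tr)
print("test accuracy:", qsvc.score(X_te, y_te))

QSVC follows the scikit-learn estimator interface, so fit, predict and score behave like their classical SVC counterparts.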
arXiv (Cornell University), 2023
In this paper, two novel measurement-based clustering algorithms are proposed based on quantum parallelism and entanglement. The Euclidean distance metric is used as a measure of 'similarity' between the data points. The first algorithm follows a divisive approach, and the bound for each cluster is determined based on the number of ancillae used to label the clusters. The second algorithm is based on unsharp measurements, where we construct the set of effect operators with a Gaussian probability distribution to cluster similar data points. We specifically implemented the algorithm on a concentric-circle data set for which the classical clustering approach fails. It is found that the presented clustering algorithms perform better than the classical divisive one, both in terms of clustering and time complexity, which is found to be O(kN log N) for the first and O(N²) for the second one. Along with that, we also implemented the algorithm on the Churrtiz data set of cities and the Wisconsin breast cancer dataset, where we found an accuracy of approximately 97.43%, which for the latter case is achieved by an appropriate choice of the variance of the Gaussian window.
2017
The aim of the project is to study two of the most widely used machine learning strategies, namely the K-Nearest Neighbours algorithm and the Perceptron Learning algorithm, in a quantum setting, and to study the speed-ups that the quantum modules allow over the classical counterparts. The study is primarily based on the following three papers: 1. Quantum Perceptron Models, by N. Wiebe, A. Kapoor and K. M. Svore. 2. Quantum Algorithm for K-Nearest Neighbors Classification Based on the Metric of Hamming Distance, by Y. Ruan, X. Xue, H. Liu, J. Tan, and X. Li. 3. Quantum Algorithms for Nearest-Neighbor Methods for Supervised and Unsupervised Learning, by N. Wiebe, A. Kapoor and K. M. Svore.
Machine Learning, 2013
We show how the quantum paradigm can be used to speed up unsupervised learning algorithms. More precisely, we explain how it is possible to accelerate learning algorithms by quantizing some of their subroutines. Quantization refers to the process that partially or totally converts a classical algorithm to its quantum counterpart in order to improve performance. In particular, we give quantized versions of clustering via minimum spanning tree, divisive clustering and k-medians that are faster than their classical analogues. We also describe a distributed version of k-medians that allows the participants to save on the global communication cost of the protocol compared to the classical version. Finally, we design quantum algorithms for the construction of a neighbourhood graph, outlier detection as well as smart initialization of the cluster centres. Keywords: Unsupervised learning, Clustering, Quantum learning, Quantum information processing, Grover's algorithm. 1 Introduction: Consider the following scenario, which illustrates a highly challenging clustering task. Imagine that you are an employee of the Department of Statistics of the United Nations. Your boss gives you the demographic data of all the Earth inhabitants and asks you to analyse…
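One of the subroutines being quantized here is minimum finding, for instance locating the closest cluster centre to a given point inside k-medians. The sketch below shows the classical linear scan (illustrative names only); Grover-based minimum finding (Dürr-Høyer) performs the same search in roughly the square root of the number of candidates, which is the kind of speed-up exploited by the quantized routines.

import numpy as np

def closest_median(x, medians):
    # Classical O(k) scan over candidate medians using the L1 distance;
    # a quantized version would locate the same index with ~O(sqrt(k)) queries.
    dists = [np.abs(x - m).sum() for m in medians]
    return int(np.argmin(dists))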
2020
Classical Support Vector Machine is hugely popular for classifying data efficiently, whether it is linear or non-linear in nature. SVM has been used immensely to assist in the precise classification of a data point. The kernel trick of SVM has also elevated the performance of the classical algorithm. But SVM suffers from a lot of problems on a classical machine when higher dimensions are introduced or large datasets are taken up. So, in order to enhance the efficiency of the Support Vector Machine, the idea of running it on a quantum machine takes over. A quantum machine uses qubits, quantum bits that can represent 0, 1, and superposition states of 0 and 1. This use of qubits introduces the concept of 'parallel processing'. The quantum machine utilises a different version of the SVM algorithm for performing the task of classification. In the algorithm, classical data is transformed into quantum data and then analysed on a quantum machine. For this experiment, the outcomes from both Classical Machi...
Physical Review Letters, 2001
We propose a novel clustering method that is based on physical intuition derived from quantum mechanics. Starting with given data points, we construct a scale-space probability function. Viewing the latter as the lowest eigenstate of a Schrödinger equation, we use simple analytic operations to derive a potential function whose minima determine cluster centers. The method has one parameter, determining the scale over which cluster structures are searched. We demonstrate it on data analyzed in two dimensions (chosen from the eigenvectors of the correlation matrix). The method is applicable in higher dimensions by limiting the evaluation of the Schrödinger potential to the locations of data points.
Advances in Neural Information Processing Systems 14, 2002
We propose a novel clustering method that is an extension of ideas inherent to scale-space clustering and support-vector clustering. Like the latter, it associates every data point with a vector in Hilbert space, and like the former it puts emphasis on their total sum, which is equal to the scale-space probability function. The novelty of our approach is the study of an operator in Hilbert space, represented by the Schrödinger equation of which the probability function is a solution. This Schrödinger equation contains a potential function that can be derived analytically from the probability function. We associate minima of the potential with cluster centers. The method has one variable parameter, the scale of its Gaussian kernel. We demonstrate its applicability on known data sets. By limiting the evaluation of the Schrödinger potential to the locations of data points, we can apply this method to problems in high dimensions.
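A minimal sketch of this construction, evaluated (as the abstract suggests) only at the data points themselves: the Parzen-window sum ψ is treated as the ground state of a Schrödinger operator, and the potential recovered from it, V(x) = E - d/2 + (1/(2σ²ψ(x))) Σᵢ ||x - xᵢ||² exp(-||x - xᵢ||²/2σ²), has minima near cluster centres. The code is an illustrative reconstruction, with sigma as the single scale parameter.

import numpy as np

def quantum_potential(X, sigma):
    # X: (n, d) data matrix. Returns the Schrodinger potential V evaluated
    # at every data point, shifted so that min V = 0 (which fixes E).
    d = X.shape[1]
    diffs = X[:, None, :] - X[None, :, :]        # pairwise differences
    r2 = (diffs ** 2).sum(axis=2)                # squared distances
    g = np.exp(-r2 / (2.0 * sigma ** 2))         # Gaussian terms
    psi = g.sum(axis=1)                          # wavefunction at each point
    V = (r2 * g).sum(axis=1) / (2.0 * sigma ** 2 * psi) - d / 2.0
    return V - V.min()

Points with the smallest V values lie near cluster centres; the full method then moves points downhill on V (for example by gradient descent) and groups those that converge to the same minimum.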
Neural Networks, 2003
Refined concepts, such as Rademacher estimates of model complexity and nonlinear criteria for weighting empirical classification errors, represent recent and promising approaches to characterize the generalization ability of Support Vector Machines (SVMs). The advantages of those techniques lie in both improving the SVM representation ability and yielding tighter generalization bounds. On the other hand, they often make Quadratic-Programming algorithms no longer applicable, and SVM training cannot benefit from efficient, specialized optimization techniques. The paper considers the application of Quantum Computing to solve the problem of effective SVM training, especially in the case of digital implementations. The presented research compares the behavioral aspects of conventional and enhanced SVMs; experiments on both synthetic and real-world problems support the theoretical analysis. At the same time, the related differences between Quadratic-Programming and Quantum-based optimization techniques are considered.
International Journal of Data Mining, Modelling and Management, 2010
The emerging field of quantum computing has recently created much interest in the computer science community due to the new concepts it suggests for storing and processing data. In this paper, we explore some of these concepts to cope with the data clustering problem. Data clustering is a key task for most fields like data mining and pattern recognition. It aims to discover cohesive groups in large datasets. In our work, we cast this problem as an optimisation process and we describe a novel framework, which relies on a quantum representation to encode the search space and a quantum evolutionary search strategy to optimise a quality measure in quest of a good partitioning of the dataset. Results on both synthetic and real data are very promising and show the ability of the method to identify valid clusters and also its effectiveness compared to other evolutionary algorithms.
Quantum Foundations, Probability and Information, 2018
The aim of this paper is to provide a quantum counterpart of the well-known minimum-distance classifier named Nearest Mean Classifier (NMC). In particular, we refer to the following previous works: i) in [13] we introduced a detailed quantum version of the NMC, named Quantum Nearest Mean Classifier (QNMC), for two-dimensional problems and proposed a generalization to arbitrary dimensions; ii) in [12] the n-dimensional problem was analyzed in detail and a particular encoding for arbitrary n-feature vectors into density operators was presented. In this paper, we introduce a new promising encoding of arbitrary n-dimensional patterns into density operators, starting from the two-feature encoding provided in [13]. Further, unlike the NMC, the QNMC turns out not to be invariant under rescaling of the features of each pattern. This property allows us to introduce a free parameter whose variation provides, in some cases, an improvement of the QNMC performance. We show experimental results where: i) the NMC and QNMC performances are compared on different datasets; ii) the effects of the non-invariance under uniform rescaling for the QNMC are investigated.
arXiv: Quantum Physics, 2018
We present an algorithm for quantum-assisted cluster analysis (QACA) that makes use of the topological properties of a D-Wave 2000Q quantum processing unit (QPU). Clustering is a form of unsupervised machine learning, where instances are organized into groups whose members share similarities. The assignments are, in contrast to classification, not known a priori, but generated by the algorithm. We explain how the problem can be expressed as a quadratic unconstrained binary optimization (QUBO) problem, and show that the introduced quantum-assisted clustering algorithm is, regarding accuracy, equivalent to commonly used classical clustering algorithms. Quantum annealing algorithms belong to the class of metaheuristic tools, applicable for solving binary optimization problems. Hardware implementations of quantum annealing, such as the quantum annealing machines produced by D-Wave Systems, have been subject to multiple analyses in research, with the aim of characterizing the technology…
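As an illustration of the QUBO encoding step (one common formulation, not necessarily the exact one used by QACA), binary variables x[i, c] can assign point i to cluster c, small within-cluster pairwise distances are rewarded, and a one-hot penalty enforces a single cluster per point:

def clustering_qubo(D, k, penalty):
    # D: (n, n) symmetric pairwise distance matrix.
    # Returns a QUBO as a dict {(p, q): weight} over variables p = i * k + c,
    # where x[i, c] = 1 means point i is assigned to cluster c.
    n = D.shape[0]
    Q = {}

    def var(i, c):
        return i * k + c

    def add(p, q, w):
        Q[(p, q)] = Q.get((p, q), 0.0) + w

    # Within-cluster cost: d_ij contributes when i and j share a cluster.
    for i in range(n):
        for j in range(i + 1, n):
            for c in range(k):
                add(var(i, c), var(j, c), D[i, j])

    # One-hot penalty per point: penalty * (sum_c x[i, c] - 1)^2.
    for i in range(n):
        for c in range(k):
            add(var(i, c), var(i, c), -penalty)
            for c2 in range(c + 1, k):
                add(var(i, c), var(i, c2), 2.0 * penalty)
    return Q

The resulting dictionary can then be handed to an annealer or a classical QUBO sampler, for example via dimod's BinaryQuadraticModel.from_qubo; that tooling choice is an assumption for illustration rather than the paper's exact pipeline.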
This paper proposes a new quantum-like method for binary classification applied to classical datasets. Inspired by the quantum Helstrom measurement, this innovative approach has enabled us to define a new classifier, called Helstrom Quantum Centroid (HQC). This binary classifier (inspired by the concept of distinguishability between quantum states) acts on density matrices, called density patterns, that are the quantum encoding of classical patterns of a dataset. In this paper we compare the performance of HQC with respect to twelve standard (linear and non-linear) classifiers over fourteen different datasets. The experimental results show that HQC outperforms the other classifiers with respect to Balanced Accuracy and other statistical measures. Finally, we show that the performance of our classifier is positively correlated with the increase in the number of "quantum copies" of a pattern and the resulting tensor product thereof.
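A hedged sketch of the HQC idea in NumPy: each classical pattern is encoded as a density matrix (a "density pattern"), the two class centroids are the averages of these matrices, and a new pattern is labelled by the Helstrom measurement built from the positive eigenspace of p₀ρ₀ - p₁ρ₁. The simple padded amplitude encoding below is an illustrative choice rather than the paper's exact encoding, and only the single-copy case is shown (the paper's "quantum copies" would replace each density pattern by its tensor power).

import numpy as np

def density_pattern(x):
    # Encode a real feature vector as a pure-state density matrix |v><v|;
    # the appended 1.0 is an illustrative padding so the zero vector is encodable.
    v = np.append(x, 1.0)
    v = v / np.linalg.norm(v)
    return np.outer(v, v)

def helstrom_classifier(X, y):
    # Class centroids: averages of the density patterns of each class.
    rhos = np.array([density_pattern(x) for x in X])
    rho0, rho1 = rhos[y == 0].mean(axis=0), rhos[y == 1].mean(axis=0)
    p0, p1 = np.mean(y == 0), np.mean(y == 1)
    # Helstrom observable and projector onto its positive eigenspace.
    w, U = np.linalg.eigh(p0 * rho0 - p1 * rho1)
    P0 = U[:, w > 0] @ U[:, w > 0].T
    def predict(x):
        # Tr(rho P0) >= 1/2 is equivalent to Tr(rho P0) >= Tr(rho (I - P0)).
        overlap = np.trace(density_pattern(x) @ P0)
        return 0 if overlap >= 0.5 else 1
    return predict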
Mathematics, 2017
Data clustering is a vital tool for data analysis. This work shows that some existing useful methods in data clustering are actually based on quantum mechanics and can be assembled into a powerful and accurate data clustering method where the efficiency of computational quantum chemistry eigenvalue methods is therefore applicable. These methods can be applied to scientific data, engineering data and even text.