Machine Learning and Adiabatic Quantum Computing

By: Peter Wittek, Posted on: May 14, 2014

When we think about quantum computing, we expect lower computational complexity and higher storage capacity, but there is more to gain. Machine learning, a branch of artificial intelligence that seeks patterns in data, is particularly well suited to quantum computing. A core concept in computational learning theory is generalization performance: we are interested in how well a learned model will perform on previously unseen instances. This is an area where quantum computing can help. Some quantum learning models use the traditional quantum circuit model, while others rely on methods such as quantum process tomography. Here we take a quick glimpse at adiabatic quantum computing and how some learning methods map to this paradigm.

In quantum mechanics, the interactions among elementary particles and the impact of the environment are described by an operator, the Hamiltonian. The ground state of the Hamiltonian is the lowest-energy state the system can occupy. Many optimization problems in learning theory translate to a Hamiltonian, and the goal becomes finding its ground state. This is where the adiabatic theorem helps: if we change a system in its ground state gradually, allowing it time to adapt its configuration, it will still be in the ground state at the end of the process. We exploit this theorem by preparing the ground state of a simple Hamiltonian and then slowly deforming it into the target Hamiltonian, thus solving our optimization problem.
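
To make the interpolation concrete, here is a minimal classical sketch (not from the article) of an adiabatic schedule on a toy two-qubit system. The Hamiltonians H0 and H1 below are illustrative choices, and the spectral gap we print is the quantity that governs how slowly the evolution must proceed.

```python
# A minimal sketch of adiabatic interpolation on a toy 2-qubit system;
# H0 and H1 are illustrative choices, not taken from the article.
import numpy as np

sx = np.array([[0, 1], [1, 0]])   # Pauli X
sz = np.array([[1, 0], [0, -1]])  # Pauli Z
I2 = np.eye(2)

# Initial Hamiltonian: transverse field; its ground state (the uniform
# superposition) is easy to prepare.
H0 = -(np.kron(sx, I2) + np.kron(I2, sx))

# Target Hamiltonian: a small Ising-type problem whose ground state
# encodes the solution we are after.
H1 = -np.kron(sz, sz) + 0.5 * np.kron(sz, I2)

# Interpolate H(s) = (1 - s) H0 + s H1 and track the spectral gap;
# the adiabatic theorem requires slow evolution where the gap is small.
for s in np.linspace(0, 1, 5):
    H = (1 - s) * H0 + s * H1
    energies = np.linalg.eigvalsh(H)   # eigenvalues, sorted ascending
    print(f"s = {s:.2f}, gap = {energies[1] - energies[0]:.4f}")
```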

Perhaps the most straightforward example is boosting. Boosting combines several simple classifiers into a strong composite classifier. With a suitable formulation of the objective function, finding the optimal combination of weak classifiers becomes identical to finding the ground state of an Ising Hamiltonian, a model of interacting magnetic spins. To calculate the optimal classifier, we prepare the ground state of a simple Hamiltonian and adiabatically evolve to the Ising Hamiltonian. The real gain is that boosting on classical computers relies on a convex objective function, whereas a nonconvex objective function yields better generalization performance. Adiabatic evolution is not sensitive to convexity, so this way we are able to train better boosting models [1].
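
As a rough illustration of this mapping, the sketch below casts the selection of weak classifiers as a quadratic binary optimization, the Ising/QUBO form in the spirit of [1]. The weak-classifier predictions, labels, and regularization strength are made-up toy values, and the brute-force search at the end stands in for the adiabatic hardware.

```python
# A toy sketch of mapping boosting to a QUBO/Ising ground-state problem;
# the predictions, labels, and lambda are illustrative, not from [1].
import numpy as np
from itertools import product

# Rows: predictions (+/-1) of K = 3 weak classifiers on N = 4 samples
H = np.array([[ 1, -1,  1,  1],
              [ 1,  1, -1,  1],
              [-1,  1,  1, -1]])
y = np.array([1, 1, 1, -1])        # true labels
lam = 0.1                          # regularization strength

K, N = H.shape
# Objective: || (1/K) w @ H - y ||^2 + lam * sum(w) over binary w.
# Expanding the square yields a quadratic (QUBO) form, i.e. an Ising
# Hamiltonian with couplings Q and local fields `linear`.
Q = (H @ H.T) / K**2               # quadratic couplings
linear = -2 * (H @ y) / K + lam    # linear (field) terms

# Brute-force the ground state here; this is the step the adiabatic
# machine would perform.
best_w, best_E = None, np.inf
for w in product([0, 1], repeat=K):
    w = np.array(w)
    E = w @ Q @ w + linear @ w
    if E < best_E:
        best_w, best_E = w, E
print("selected weak classifiers:", best_w)
```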

Let us take a look at how we can achieve an exponential speedup in quantum machine learning. K-means clustering finds groups of similar instances in a collection of data, where the total number of clusters is K. Finding the optimal clustering is an NP-hard task, and approximating heuristics run in polynomial time on classical computers; merely listing which data instance belongs to which cluster already has linear complexity. If we allow the input and output data to be quantum superpositions, we can achieve an exponential speedup over classical heuristics [2]. An interim state in the quantum adiabatic algorithm for K-means clustering encodes the current cluster assignment in a tensor product state. The initial Hamiltonian encodes only the cluster labels, whereas the final Hamiltonian also encodes the distances between the cluster centroids and the individual data instances. By adiabatically changing the Hamiltonian to the final one, we rotate the cluster label states so that each label becomes associated with the data states that belong to it. Repeated application of the heuristic produces better and better assignments.
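
For contrast, here is a minimal sketch of the classical heuristic (Lloyd's algorithm) that this quantum approach accelerates. The data, K, and iteration count are arbitrary choices; the per-iteration distance computation is exactly the step the quantum algorithm replaces with operations on superpositions.

```python
# A minimal sketch of classical K-means (Lloyd's heuristic);
# the data set, K, and iteration count are arbitrary examples.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))           # N = 100 instances, 2 features
K = 3
centroids = X[rng.choice(len(X), K, replace=False)]

for _ in range(10):
    # Distance computation costs O(N * K * d) per iteration classically;
    # the quantum algorithm estimates these distances on superpositions.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)           # current cluster assignment
    # Recompute centroids, keeping the old one if a cluster goes empty
    centroids = np.array([X[labels == k].mean(axis=0)
                          if np.any(labels == k) else centroids[k]
                          for k in range(K)])

print("cluster sizes:", np.bincount(labels, minlength=K))
```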

As a last example, we look at how we can store exponentially more patterns using a quantum algorithm. A classical Hopfield network is a type of artificial neural network that serves as an associative memory: given a pattern, it retrieves the closest match stored in the network. In a classical system, the capacity of a Hopfield network depends on the number of artificial neurons in the system; with sophisticated training algorithms, the number of patterns that can be stored is about one fifth of the number of neurons. On quantum hardware, no such limit applies: if a pattern is represented by n bits, a quantum superposition may store up to 2^n patterns. To retrieve a pattern given an input, we represent both the memory and the new input as a Hamiltonian. The input Hamiltonian is chosen to shift the energy landscape of the memory toward the most similar stored pattern. We rely on the adiabatic theorem to evolve to the ground state of this composite Hamiltonian and identify the closest match [3]. The process is illustrated in the figure below.

[Figure: An overview of an adiabatic quantum associative memory. On the left, the excited state of a simple Hamiltonian H_0 is cooled to its ground state. On the right, we construct a composite Hamiltonian H_mem + H_inp from the stable states of the associative memory and the new input state. We adiabatically evolve the system from H_0 to H_mem + H_inp to find the closest match for the input pattern.]
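
A small classical sketch may help picture this energy landscape. Assuming a standard Hebbian storage rule (our assumption for illustration, not a detail taken from [3]), we build a memory Hamiltonian from two stored patterns, tilt it with an input term, and locate the ground state by exhaustive search; this minimum is what the adiabatic evolution would find on quantum hardware.

```python
# A classical sketch of an associative-memory energy landscape, assuming
# Hebbian storage; the patterns and partial input are illustrative.
import numpy as np
from itertools import product

patterns = np.array([[ 1, -1,  1, -1],
                     [-1, -1,  1,  1]])   # stored +/-1 patterns, n = 4
n = patterns.shape[1]

# Hebbian coupling matrix defines the memory Hamiltonian H_mem
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

# A noisy input defines H_inp, which tilts the landscape toward
# the stored pattern most similar to it
x_inp = np.array([1, -1, -1, -1])
gamma = 0.5                               # strength of the input term

# The ground state of H_mem + H_inp over all 2^n states is the
# retrieved pattern; adiabatic evolution would find this minimum.
best_x, best_E = None, np.inf
for bits in product([-1, 1], repeat=n):
    x = np.array(bits)
    E = -0.5 * x @ W @ x - gamma * (x_inp @ x)
    if E < best_E:
        best_x, best_E = x, E
print("retrieved pattern:", best_x)
```

Running this retrieves the first stored pattern, the one closest in Hamming distance to the noisy input, even though the input matches no stored pattern exactly.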

Small-scale adiabatic quantum computing is already feasible to implement, for instance with nuclear magnetic resonance. A solid-state implementation would be more attractive, as existing manufacturing processes could be adapted. Progress has been made in this direction, potentially involving hundreds of qubits, but this technology remains controversial. For this reason alone, it is worth keeping track of competing methods that could bring quantum learning closer to application.

[1] Denchev, V. S.; Ding, N.; Vishwanathan, S. & Neven, H. Robust classification with adiabatic quantum optimization. In Proceedings of the 29th International Conference on Machine Learning, 2012.

[2] Lloyd, S.; Mohseni, M. & Rebentrost, P. Quantum algorithms for supervised and unsupervised machine learning. arXiv:1307.0411, 2013.

[3] Neigovzen, R.; Neves, J. L.; Sollacher, R. & Glaser, S. J. Quantum pattern recognition with liquid-state nuclear magnetic resonance. Physical Review A, 79, 042321, 2009.

About the Author

Peter Wittek received his PhD in Computer Science from the National University of Singapore, and he also holds an MSc in Mathematics. He is interested in interdisciplinary synergies, such as scalable learning algorithms on supercomputers, computational methods in quantum simulations, and quantum machine learning. He has been involved in major EU research projects, and obtained several academic and industry grants.

Affiliated with the University of Borås, he works location-independently and has done research stints at several institutions, including the Indian Institute of Science, the Barcelona Supercomputing Center, Bangor University, Tsinghua University, the Centre for Quantum Technologies, and the Institute of Photonic Sciences.

He is the author of Quantum Machine Learning: What Quantum Computing Means to Data Mining, available on the Elsevier Store at a 25% discount. This book bridges the gap between abstract developments in quantum computing and the applied research on machine learning. It captures a broad array of highly specialized content in an accessible and up-to-date review of the growing academic field of quantum machine learning and its applications in industry.
