Transformative Benefits of Quantum Machine Learning


Quantum computing is poised to revolutionize the technological landscape, bringing solutions to problems once deemed unsolvable. In a previous post, we explored the fundamental differences between classical and quantum computers, highlighting the revolutionary potential of quantum algorithms. Among the many fields poised to benefit, machine learning stands out as one of the most promising areas for quantum enhancement.

What is Quantum Machine Learning?

Quantum machine learning (QML) combines the groundbreaking technologies of quantum computing and machine learning. By integrating quantum algorithms into the machine learning pipeline, we can achieve advantages such as exponential computational speed-ups, novel data representations, and improved data augmentation. While classical machine learning is already powerful, the addition of quantum computing opens up new possibilities for overcoming existing limitations.


The Bottlenecks of Machine Learning

Despite their impressive capabilities, classical machine learning models face significant challenges. One major bottleneck is the immense computational power required to train and deploy these models. As we enter the era of large-scale AI models, such as large language models (LLMs), the compute power demands skyrocket [1]. For instance, training a model like GPT-4 can cost over $100 million [2,3]. If this trend continues, some estimates suggest that by 2037, the compute costs could exceed the entire US GDP [4]. This scarcity of computational resources consolidates market power among a few key players, such as Nvidia, Google, Microsoft, and Amazon [1].

Another significant bottleneck is the curse of dimensionality. As the dimensionality of data increases, the complexity of finding useful patterns grows exponentially, often leading to overfitting and reduced performance of supervised learning methods. This necessitates new algorithmic techniques that can handle high-dimensional data accurately and efficiently.

All in all, we need new algorithmic techniques to handle high-dimensional data more accurately while also cutting down on the time and costs of training these models.

Figure: projected growth of AI training compute costs. Source: Lennart Heim [4]


How Quantum Machine Learning Can Help

Quantum computing offers a promising solution to overcome these challenges. The advantages of quantum machine learning vary based on the specific algorithm and its application. To better understand these benefits, let's explore two main categories of quantum machine learning algorithms.

Based on Linear Algebra

The first category of quantum machine learning (QML) algorithms focuses on accelerating linear algebra subroutines. Before deep learning dominated the scene, many machine learning methods—such as support vector machines, principal component analysis, and clustering techniques—relied heavily on linear algebra. Interestingly, quantum computing is fundamentally rooted in linear algebra, allowing many of these subroutines to be significantly sped up with quantum algorithms. Here are some notable examples:

  1. Fourier Transform: Widely used in machine learning tasks like speech recognition, audio analysis, and vibration analysis, the Fourier Transform extracts frequency-based features from raw signals and aids in denoising, image filtering, and audio compression. Classically, it operates with a complexity of \(O(N \log N)\) for \(N\) data points. The quantum version (Quantum Fourier Transform [5]) achieves this with a complexity of only \(O(\log^2(N))\), offering an exponential advantage.
  2. Harrow-Hassidim-Lloyd (HHL) Algorithm: This algorithm [6] solves linear systems of equations, a common requirement in many machine learning models such as support vector machines and recommender systems, which often involve matrix factorization. Classical algorithms, like the Conjugate Gradient method, have polynomial complexity \(O(N \cdot k)\) where \(k\) is the condition number of the matrix. The HHL algorithm, however, operates with a complexity of \(O(\log(N) \cdot \text{poly}(k, \frac{1}{\epsilon}))\), where \(\epsilon\) is the desired precision of the solution. This represents an exponential speedup in terms of dimensionality, assuming \(k\) and \(\frac{1}{\epsilon}\) are manageable.
  3. Quantum Phase Estimation: This algorithm [7] is used to estimate the eigenvalues of a unitary operator. In spectral graph theory applications, such as spectral clustering, QPE can be used to find the eigenvalues of the graph Laplacian, significantly speeding up the clustering process based on spectral properties. Principal component analysis, a method for dimensionality reduction, involves solving eigenvalue problems, which can also be accelerated using QPE. The classical complexity of these problems is around \(O(N^2)\), while the quantum complexity can be reduced to \(O(\log^2(N))\), providing an exponential advantage.

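To make the Fourier example concrete, the QFT on \(n\) qubits is, as a matrix, just the normalized discrete Fourier transform acting on \(N = 2^n\) amplitudes. A minimal NumPy sketch (a classical simulation for illustration, not a quantum speed-up) can build this matrix and verify its key properties:

```python
import numpy as np

def qft_matrix(n_qubits):
    """Build the QFT unitary on n qubits as a dense N x N matrix,
    where N = 2**n_qubits and entry (j, k) is omega**(j*k) / sqrt(N)."""
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return omega ** (j * k) / np.sqrt(N)

F = qft_matrix(3)  # 3 qubits hold N = 8 amplitudes
# The QFT is unitary: F @ F^dagger = I
assert np.allclose(F @ F.conj().T, np.eye(8))

# Applying it to a computational basis state spreads the amplitude
# uniformly in magnitude, as expected for a Fourier transform
state = np.zeros(8)
state[1] = 1.0
spectrum = F @ state
assert np.allclose(np.abs(spectrum), 1 / np.sqrt(8))
```

Note the catch that makes the exponential advantage subtle in practice: the quantum circuit applies this \(N \times N\) transform with only \(O(\log^2 N)\) gates, but the result lives in amplitudes that cannot all be read out directly.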
While these algorithms promise significant benefits over classical methods, they currently require advanced quantum hardware that is still under development. Therefore, while they hold the potential to revolutionize classical machine learning in the next five to ten years, their practical application remains largely theoretical at present.


Based on Data Processing

The second category of quantum machine learning (QML) algorithms focuses on data representation and processing rather than linear algebra subroutines. Instead of seeking exponential speed-ups, these algorithms aim for benefits such as higher accuracy, efficiency, energy savings, or even explainability. Let’s look at some examples:

  1. Encode Exponentially Large Data Samples: By leveraging superposition and entanglement, quantum algorithms can represent exponentially more values than a classical register of the same size, achieving up to an exponential reduction in the number of qubits needed. For example, the amplitude encoding method uses \(n\) qubits to encode up to \(2^n\) values [8]. Different efficient encoding methods can be applied depending on the input data type: in image processing, the flexible representation of quantum images [9] offers an exponential reduction in qubits, and quantum relaxations [10] can reduce qubit requirements in optimization problems. Once the data is encoded in a quantum state, another quantum algorithm, such as a quantum neural network or a kernel method, processes it to extract additional benefits.
  2. Reduced Complexity and Computing Costs: Some quantum machine learning models require significantly fewer training parameters to achieve the same expressive power. For instance, a quantum neural network combined with an efficient data encoding can have far fewer trainable parameters than its classical counterpart [11], thanks to the reduced number of qubits needed to process the data. This lower complexity translates into shorter training times and lower costs. Another example is tensor networks, a quantum-inspired model that can optimally compress neural networks, efficiently reducing training parameters, GPU usage, and time for both training and inference [12].
  3. Smoother Training and Fewer Chances of Overfitting: Variational quantum algorithms and quantum neural networks can offer smoother convergence of the loss function compared to classical models due to the reduced complexity of the resulting model. This often leads to better generalization capacity on the test set [13].
  4. Explainability: Some quantum machine learning algorithms enhance the explainability of solutions. This is the case for Bayesian networks. Inference in large classical Bayesian networks is computationally intensive, often requiring approximation methods for practical use; the complexity of exact inference scales exponentially with the number of nodes in the worst case. Quantum Bayesian networks [14], on the other hand, can perform inference more efficiently by exploiting quantum parallelism, providing direct access to joint and marginal probability distributions and a deeper understanding of the causal relationships between nodes.
  5. Efficiency: Quantum computing excels at optimization problems such as graph partitioning (MaxCut) and combinatorial problems (QUBO), and many machine learning problems can be formulated as combinatorial optimization. For example, unsupervised image segmentation can be transformed into a graph partitioning problem [15], where each pixel of the image corresponds to a node in the graph and the edges are weighted by the similarity between neighbouring pixels. The optimal segmentation is the partition of the graph that maximizes the dissimilarity between segments.
  6. Novel Representations of Data: Selecting suitable representations for input data is essential for achieving optimal performance in numerous classification or regression algorithms. Quantum algorithms such as quantum kernel methods [16] can generate novel data representations that are challenging or impossible to create with classical kernels. These methods are used in machine learning algorithms such as SVMs and spectral clustering.
  7. Efficient Sampling: The probabilistic nature of quantum computing allows for the generation of realistic distributions, useful in quantum generative models that leverage quantum mechanics to generate complex data distributions. Quantum generative models often demonstrate better generalization, particularly when data is scarce [17].

The advantage of these QML algorithms is that they require much less overhead from quantum hardware. Many of these algorithms can be run today for small or intermediate-sized problems and can scale as quantum hardware advances. Additionally, most quantum-inspired algorithms already provide significant advantages today, optimized to run on classical hardware such as GPUs while being ready to evolve to full-quantum models as quantum computers scale. This allows for significant benefits now while preparing for future advancements.

Quantum Machine Learning promises to revolutionize machine learning by overcoming current limitations like high computational costs and the curse of dimensionality. Leveraging the unique strengths of quantum computing, QML offers up to exponential speed-ups and novel data representations that enhance accuracy, efficiency, and scalability. While some quantum algorithms require advanced hardware still in development, many quantum-inspired techniques can already be used on classical hardware, offering immediate benefits. As quantum technology advances, QML will pave the way for a more powerful and accessible future in machine learning.

 

Enjoying our QML resources? Enroll in our FREE quantum fundamentals course.

Our Chief Science Officer and lead scientist, Dr. Laia Domingo has curated a quantum machine learning fundamentals introductory course for anyone to begin their QML journey!

  • Virtual, self-paced lessons

  • Quantum machine learning focus

  • Completely FREE

Subscribe to be notified when the course is released!

 

[1] Jai Vipra, Sarah Myers West. Computational Power & AI, accessed June 6, 2024. https://ainowinstitute.org/wp-content/uploads/2023/09/AI-Now_Computational-Power-an-AI.pdf

[2] Appenzeller, Bornstein, and Casado, “Navigating the High Cost of AI Compute.”

[3] Google Colab, “[PUBLIC] Cost Estimates for GPT-4,” https://colab.research.google.com/drive/1O99z9b1I5O66bT78r9ScslE_nOj5irN9?usp=sharing

[4] Lennart Heim, “This Can’t Go On(?) – AI Training Compute Costs,” heim.xyz (blog), June 1, 2023, https://blog.heim.xyz/this-cant-go-on-compute-training-costs.

[5] Coppersmith, D. (2002). An approximate Fourier transform useful in quantum factoring. ArXiv/0201067.

[6] Harrow, A. W., Hassidim, A., & Lloyd, S. (2009). Quantum Algorithm for Linear Systems of Equations. Phys. Rev. Lett., 103(15), 150502. https://doi.org/10.1103/PhysRevLett.103.150502

[7] Mande, N. S., & de Wolf, R. (2023). Tight Bounds for Quantum Phase Estimation and Related Problems. ArXiv:2305.04908.

[8] Rath, M., & Date, H. (2023). Quantum Data Encoding: A Comparative Analysis of Classical-to-Quantum Mapping Techniques and Their Impact on Machine Learning Accuracy. ArXiv. https://api.semanticscholar.org/CorpusID:265281517

[9] Le, P. Q., Iliyasu, A. M., Dong, F., & Hirota, K. (2011). A flexible representation of quantum images for polynomial preparation, image compression, and processing operations. Quantum Information Processing, 10, 63-84. https://doi.org/10.1007/s11128-010-0177-y

[10] Fuller, Bryce, Charles Hadfield, Jennifer R. Glick, Takashi Imamichi, Toshinari Itoko, Richard J. Thompson, Yang Jiao, Marna M. Kagele, Adriana W. Blom-Schieber, Rudy Raymond, and Antonio Mezzacapo. "Approximate Solutions of Combinatorial Problems via Quantum Relaxations." 2021. arXiv:2111.03167.

[11] Domingo, L., Djukic, M., Johnson, C. et al. Binding affinity predictions with hybrid quantum-classical convolutional neural networks. Sci Rep 13, 17951 (2023). https://doi.org/10.1038/s41598-023-45269-y

[12] Tomut, A., et al. (2024). CompactifAI: Extreme Compression of Large Language Models using Quantum-Inspired Tensor Networks. ArXiv:2401.14109v2.

[13] Domingo, L., Chehimi, M., Banerjee, S., He Yuxun, S., Konakanchi, S., Ogunfowora, L., Roy, S., Selvaras, S., Djukic, M., & Johnson, C. (2024). A hybrid quantum-classical fusion neural network to improve protein-ligand binding affinity predictions for drug discovery. ArXiv:2309.03919.

[14] Borujeni, S. E., Nannapaneni, S., Nguyen, N. H., Behrman, E. C., & Steck, J. E. (2021). Quantum circuit representation of Bayesian networks. Expert Systems with Applications, 176, 114768. https://doi.org/10.1016/j.eswa.2021.114768

[15] Venkatesh, S. M., Macaluso, A., Nuske, M., Klusch, M., & Dengel, A. (2023). Q-Seg: Quantum Annealing-based Unsupervised Image Segmentation. ArXiv:2311.12912.

[16] Paine, A. E., Elfving, V. E., & Kyriienko, O. (2023). Quantum kernel methods for solving regression problems and differential equations. Phys. Rev. A, 107(3), 032428. https://doi.org/10.1103/PhysRevA.107.032428

[17] Hibat-Allah, M., Mauri, M., Carrasquilla, J. et al. A framework for demonstrating practical quantum advantage: comparing quantum against classical generative models. Commun Phys 7, 68 (2024). https://doi.org/10.1038/s42005-024-01552-6
