Quantum Machine Learning and Optimisation in Finance
Drive financial innovation with quantum-powered algorithms and optimisation strategies

Product type: Paperback
Published: Dec 2024
Publisher: Packt
ISBN-13: 9781836209614
Length: 494 pages
Edition: 2nd Edition
Authors (2): Antoine Jacquier, Alexei Kondratyev
Table of Contents (21)

Preface
Chapter 1: The Principles of Quantum Mechanics
Part I: Analog Quantum Computing – Quantum Annealing
Chapter 2: Adiabatic Quantum Computing
Chapter 3: Quadratic Unconstrained Binary Optimisation
Chapter 4: Quantum Boosting
Chapter 5: Quantum Boltzmann Machine
Part II: Gate Model Quantum Computing
Chapter 6: Qubits and Quantum Logic Gates
Chapter 7: Parameterised Quantum Circuits and Data Encoding
Chapter 8: Quantum Neural Network
Chapter 9: Quantum Circuit Born Machine
Chapter 10: Variational Quantum Eigensolver
Chapter 11: Quantum Approximate Optimisation Algorithm
Chapter 12: Quantum Kernels and Quantum Two-Sample Test
Chapter 13: The Power of Parameterised Quantum Circuits
Chapter 14: Advanced QML Models
Chapter 15: Beyond NISQ
Bibliography
Index
Other Books You Might Enjoy

5.5 Deep Boltzmann Machine

Deep Boltzmann Machines (DBMs) can be constructed from several RBMs where the hidden layer of the first RBM becomes the visible layer of the second, and so on, as shown in Figure 5.4.

[Figure: two stacked RBMs, showing a visible layer, the hidden layer of RBM 1 serving as the visible layer of RBM 2, and the hidden layer of RBM 2]

Figure 5.4: Schematic representation of a DBM.
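
To make the stacking in Figure 5.4 concrete, here is a minimal NumPy sketch (not taken from the book) in which each RBM's hidden layer doubles as the visible layer of the RBM above it. The layer sizes, initialisation scale, and the helper names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
layer_sizes = [784, 256, 64]   # illustrative visible / hidden-1 / hidden-2 sizes

# One weight matrix and bias pair per stacked RBM, coupling adjacent layers.
rbm_stack = [
    {"W": 0.01 * rng.standard_normal((n_vis, n_hid)),
     "b_vis": np.zeros(n_vis),
     "b_hid": np.zeros(n_hid)}
    for n_vis, n_hid in zip(layer_sizes[:-1], layer_sizes[1:])
]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def propagate_up(v, stack):
    """Pass a visible configuration up the stack: each RBM's hidden
    activations serve as the visible input of the RBM above it."""
    h = v
    for rbm in stack:
        h = sigmoid(h @ rbm["W"] + rbm["b_hid"])
    return h
```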

A DBM can be trained layer by layer, one RBM at a time. This results in a powerful generative model capable of learning complex multivariate distributions and dependence structures. If the training dataset samples are labelled, however, the generative training of the DBM can also serve as the first step towards building a discriminative model. In this case, all DBM weights and biases found with the help of either CD or quantum Boltzmann sampling algorithms become the initial values of the weights and biases of the corresponding feedforward neural network. The discriminative model consists of all the layers of the original DBM plus an extra output layer that performs the assignment of the class labels. The discriminative...
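
The recipe above can be sketched in plain NumPy. The following is a hedged illustration, not the book's implementation: it performs greedy layer-by-layer CD-1 pretraining of a small RBM stack and then copies the learned weights and hidden biases into a feedforward classifier with an extra, randomly initialised output layer for the class labels. All sizes, learning rates, and the random data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def make_stack(sizes):
    """One RBM per pair of adjacent layers, e.g. sizes = [784, 256, 64]."""
    return [
        {"W": 0.01 * rng.standard_normal((nv, nh)),
         "b_vis": np.zeros(nv), "b_hid": np.zeros(nh)}
        for nv, nh in zip(sizes[:-1], sizes[1:])
    ]

def cd1_step(rbm, v0, lr=0.05):
    """Single contrastive-divergence (CD-1) parameter update for one RBM."""
    ph0 = sigmoid(v0 @ rbm["W"] + rbm["b_hid"])        # positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float)   # sample hidden units
    pv1 = sigmoid(h0 @ rbm["W"].T + rbm["b_vis"])      # one Gibbs step down
    ph1 = sigmoid(pv1 @ rbm["W"] + rbm["b_hid"])       # ... and back up
    rbm["W"] += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    rbm["b_vis"] += lr * (v0 - pv1).mean(axis=0)
    rbm["b_hid"] += lr * (ph0 - ph1).mean(axis=0)

def pretrain(stack, data, epochs=5):
    """Greedy layer-by-layer generative training: each RBM's hidden
    activations become the visible-layer data for the next RBM."""
    layer_input = data
    for rbm in stack:
        for _ in range(epochs):
            cd1_step(rbm, layer_input)
        layer_input = sigmoid(layer_input @ rbm["W"] + rbm["b_hid"])
    return stack

def init_classifier(stack, n_classes):
    """Copy every weight matrix and hidden bias into a feedforward net,
    then append a randomly initialised output layer for the class labels."""
    weights = [rbm["W"].copy() for rbm in stack]
    biases = [rbm["b_hid"].copy() for rbm in stack]
    weights.append(0.01 * rng.standard_normal((weights[-1].shape[1], n_classes)))
    biases.append(np.zeros(n_classes))
    return weights, biases

# Illustrative usage on random binary data.
data = (rng.random((128, 784)) < 0.5).astype(float)
stack = pretrain(make_stack([784, 256, 64]), data)
W, b = init_classifier(stack, n_classes=10)
```

The discriminative network initialised this way would then be fine-tuned with ordinary supervised gradient descent on the labelled samples.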
