Summary
In this chapter, we learned about energy-based models – a special class of powerful generative models. We learned how to build, train, and run RBMs to generate synthetic samples that are statistically indistinguishable from the original training dataset.
We familiarised ourselves with Boltzmann sampling and the Contrastive Divergence algorithm. Boltzmann sampling can be performed efficiently on NISQ-era quantum annealers, which may improve the quality of the model and deliver orders-of-magnitude speedups when generating new samples.
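The Contrastive Divergence update mentioned above can be sketched in a few lines of NumPy. This is a minimal, illustrative CD-1 step on a toy binary RBM – the layer sizes, learning rate, and toy dataset are assumptions for the sketch, not values from the chapter:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy RBM: 6 visible and 3 hidden binary units (illustrative sizes).
n_visible, n_hidden = 6, 3
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

def cd1_step(v0, lr=0.1):
    """One Contrastive Divergence (CD-1) update on a batch of binary vectors."""
    global W, b_v, b_h
    # Positive phase: sample hidden units conditioned on the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: one Gibbs step back to a reconstruction of the visibles.
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # Gradient approximation: data-driven statistics minus model statistics.
    batch = v0.shape[0]
    W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / batch
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

# Train on a tiny dataset of two repeated binary patterns.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]], dtype=float)
for _ in range(500):
    cd1_step(data)
```

The single Gibbs step in the negative phase is exactly what a quantum annealer could replace: instead of reconstructing samples one step at a time, the annealer draws (approximate) Boltzmann samples from the model directly.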
We learned how to stack individual RBMs to construct a DBM. Quantum annealing can be productively applied to pre-training a DBM before it is fine-tuned as a deep feedforward neural network classifier.
Finally, we explored the possibility of using RBMs and DBMs as the first stage of a machine learning pipeline, for tasks such as denoising and feature extraction.
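As a sketch of that pipeline idea, an RBM can serve as a feature extractor feeding a downstream classifier. The snippet below uses scikit-learn's `BernoulliRBM` and `LogisticRegression` on the bundled digits dataset; the hyperparameters are illustrative assumptions, not tuned values from the chapter:

```python
# Hedged sketch: an RBM as the feature-extraction stage of a pipeline.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel intensities to [0, 1], as BernoulliRBM expects

model = Pipeline([
    # Unsupervised stage: learn 64 binary latent features from raw pixels.
    ("rbm", BernoulliRBM(n_components=64, learning_rate=0.06,
                         n_iter=10, random_state=0)),
    # Supervised stage: classify digits from the RBM's hidden activations.
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X, y)
acc = model.score(X, y)
```

The same structure extends to a DBM-style stack: each trained RBM's hidden activations become the visible data for the next RBM, and the final classifier is fine-tuned on top.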
In the next chapter, we will shift our attention to gate model quantum...