Summary
In this chapter, we learned how to construct and train a generative QML model, the Quantum Circuit Born Machine (QCBM). We started with the general concept of a PQC as a generative model: the readout (measurement) operation produces a sample from the probability distribution encoded by the PQC parameters.
Next, we introduced the concept of a hardware-efficient PQC ansatz. To build a model that is compatible with QPU connectivity and can easily be embedded into a QPU graph, we used adjustable one-qubit gates and fixed two-qubit gates drawn from the native gate set of the given system.
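Such a hardware-efficient ansatz can be sketched in plain NumPy. The specific choices below — RY as the adjustable one-qubit gate, CZ as the fixed two-qubit entangler, and a line (nearest-neighbor) topology — are illustrative assumptions, not necessarily the chapter's exact gate set:

```python
import numpy as np

def ry(theta):
    # adjustable one-qubit rotation about the Y axis (illustrative choice)
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_1q(state, gate, qubit, n):
    # apply a one-qubit gate to `qubit` of an n-qubit statevector
    state = state.reshape([2] * n)
    state = np.tensordot(gate, state, axes=([1], [qubit]))
    state = np.moveaxis(state, 0, qubit)
    return state.reshape(-1)

def apply_cz(state, q1, q2, n):
    # fixed two-qubit entangler: flip the sign where both qubits are |1>
    state = state.reshape([2] * n)
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    state[tuple(idx)] *= -1
    return state.reshape(-1)

def born_probs(thetas, n, layers):
    # hardware-efficient ansatz: a layer of RY rotations followed by
    # CZ gates on a line topology, repeated `layers` times
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    k = 0
    for _ in range(layers):
        for q in range(n):
            state = apply_1q(state, ry(thetas[k]), q, n)
            k += 1
        for q in range(n - 1):
            state = apply_cz(state, q, q + 1, n)
    return np.abs(state) ** 2  # Born-rule probabilities over bitstrings

# readout: draw one measurement outcome (a bitstring index) via the Born rule
probs = born_probs(np.array([0.4, 1.1, -0.7, 0.9, 0.2, -1.3]), n=3, layers=2)
sample = np.random.default_rng(1).choice(8, p=probs / probs.sum())
```

Because only the one-qubit rotations carry parameters while the entanglers are fixed, the circuit maps directly onto the connectivity graph of a QPU whose native set contains these gates.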
Then, we studied differentiable and non-differentiable learning algorithms and experimented with a QCBM trained using a GA. A comparison with a classical benchmark (an RBM) demonstrated a realistic possibility of quantum advantage for generative quantum machine learning models.
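A GA is a natural fit here because it needs only samples of the loss, not gradients. The sketch below runs a GA over the angles of a compact two-qubit RY/CZ circuit; the truncation selection, uniform crossover, Gaussian mutation, and total-variation fitness are illustrative choices, not the chapter's exact operators or hyperparameters:

```python
import numpy as np

CZ = np.diag([1.0, 1.0, 1.0, -1.0])  # fixed two-qubit entangler

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def born_probs(thetas):
    # compact two-qubit QCBM: two layers of RY rotations, each followed by CZ
    state = np.array([1.0, 0.0, 0.0, 0.0])
    for k in range(0, len(thetas), 2):
        state = CZ @ np.kron(ry(thetas[k]), ry(thetas[k + 1])) @ state
    return state ** 2

def fitness(thetas, target):
    # negative total-variation distance to the target distribution
    # (higher is better); a gradient-free, sample-friendly objective
    return -0.5 * np.abs(born_probs(thetas) - target).sum()

def ga_train(target, pop_size=40, gens=200, mut_rate=0.3, mut_scale=0.2, seed=0):
    rng = np.random.default_rng(seed)
    n_params = 4  # two RY angles per layer, two layers
    pop = rng.uniform(-np.pi, np.pi, (pop_size, n_params))
    for _ in range(gens):
        scores = np.array([fitness(ind, target) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]          # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(n_params) < 0.5           # uniform crossover
            child = np.where(mask, a, b)
            hit = rng.random(n_params) < mut_rate       # Gaussian mutation
            child = child + hit * rng.normal(0.0, mut_scale, n_params)
            children.append(child)
        pop = np.vstack([parents] + children)           # elitist replacement
    scores = np.array([fitness(ind, target) for ind in pop])
    return pop[np.argmax(scores)]

# e.g. target the anti-correlated Bell distribution over two qubits
best = ga_train(np.array([0.0, 0.5, 0.5, 0.0]), gens=100)
```

Keeping the parents each generation makes the best fitness monotonically non-decreasing, which is convenient when comparing convergence across parameter settings.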
Finally, we explored the question of training algorithm convergence for various sets of model parameters...