Deep Learning with Quantum and Classical Parameters PDF + Discussion 11/16/23.
A detailed study of 'End-to-end Differentiation' in QML/QiML Artificial Intelligence will be presented, based on the following 3 Key Points.
-
Quantum-inspired workflows running on readily available CPUs, GPUs, or TPUs can optimize the parameters of quantum and classical deep-learning networks for prospective advantage. This is accomplished by combining information-rich simulated quantum circuits with more traditional CNNs/Transformers in the same model. 'The ability to compute quantum gradients means that quantum computations can become part of automatically differentiable hybrid computation pipelines.' 1
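As a minimal, library-free sketch of the quoted idea: the parameter-shift rule yields an exact gradient for a quantum expectation value, which can then drive ordinary gradient descent alongside classical parameters. The single-qubit RY circuit, target value, and learning rate below are illustrative assumptions, not taken from the cited works.

```python
import math

def expectation(theta):
    # Classically simulated 1-qubit circuit: RY(theta) applied to |0>,
    # then measure <Z> = P(0) - P(1). Analytically this equals cos(theta).
    return math.cos(theta / 2) ** 2 - math.sin(theta / 2) ** 2

def parameter_shift(theta):
    # Exact quantum gradient d<Z>/dtheta via the parameter-shift rule --
    # the mechanism that lets circuit outputs join autodiff pipelines.
    return (expectation(theta + math.pi / 2) - expectation(theta - math.pi / 2)) / 2

# Hybrid objective: push the circuit's <Z> toward a classical target of 0.
target, theta, lr = 0.0, 0.3, 0.5
for _ in range(100):
    # Chain rule: dL/dtheta = 2 * (<Z> - target) * d<Z>/dtheta.
    theta -= lr * 2 * (expectation(theta) - target) * parameter_shift(theta)
# theta converges toward pi/2, where cos(theta) = 0.
```

The same pattern is what frameworks like PennyLane automate: the simulator evaluates the circuit at shifted parameter values, and the resulting gradient is handed to the classical optimizer transparently.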
-
The 2019 'Quantum transfer learning' model by Andrea Mari combines A) an untrained ResNet deep neural network with a parameterized quantum circuit; both are limited in size and performance by available RAM and compute power. In addition, the model builds on a prior Stanford classical model, which enables comparisons against B) a fully trained ResNet model and C) an untrained ResNet model without a quantum circuit. 2 3
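A minimal sketch of the architecture behind variant A, assuming the general classical-to-quantum-to-classical ("dressed circuit") layout: a classical pre-layer projects ResNet features to a rotation angle, a simulated quantum layer produces an expectation value, and a classical post-layer maps it to a score. The single-qubit circuit, feature values, and weights are illustrative placeholders, not the paper's actual dimensions.

```python
import math

def quantum_layer(angle):
    # Simulated 1-qubit variational layer: encode the input as RY(angle)
    # and return <Z> = cos(angle) as the layer's differentiable output.
    return math.cos(angle)

def dressed_circuit(features, w_in, w_out, b_out):
    # Classical pre-net: project ResNet-style features to one rotation angle.
    angle = sum(w * x for w, x in zip(w_in, features))
    # Quantum layer sandwiched in the middle of the pipeline.
    q_out = quantum_layer(angle)
    # Classical post-net: map the expectation value to a class score.
    return w_out * q_out + b_out

# Illustrative stand-ins for extracted features and learned weights.
features = [0.2, -0.5, 0.9]
score = dressed_circuit(features, w_in=[0.3, 0.1, -0.2], w_out=1.5, b_out=0.05)
```

Because every stage is differentiable, gradients flow from the score back through the quantum layer into both classical nets, which is what allows all three parameter groups to be trained jointly.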
-
Many subsequent modifications of the QTL model have been made; however, widespread documentation of the combined benefits of quantum and classical parameters has not yet been realized. Here, steps to troubleshoot mainstream 'End-to-end Differentiation' models that use 'Simulator-specific algorithms' with quantum and classical parameters, toward better Artificial Intelligence accuracies and new utilities, will be discussed. 4
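One concrete troubleshooting step for any end-to-end differentiable pipeline is a gradient sanity check: compare the simulator's analytic (parameter-shift) gradient against a numerical finite-difference estimate. A large mismatch usually indicates a wiring bug between the quantum and classical stages. The circuit and tolerance below are illustrative assumptions.

```python
import math

def expectation(theta):
    # Simulated 1-qubit circuit: RY(theta) on |0>, measured in <Z> = cos(theta).
    return math.cos(theta)

def parameter_shift(theta):
    # Analytic gradient via the parameter-shift rule.
    return (expectation(theta + math.pi / 2) - expectation(theta - math.pi / 2)) / 2

def finite_difference(theta, eps=1e-6):
    # Independent numerical estimate: central-difference approximation.
    return (expectation(theta + eps) - expectation(theta - eps)) / (2 * eps)

def gradient_check(theta, tol=1e-4):
    # True when the two gradient paths agree; False flags a pipeline bug.
    return abs(parameter_shift(theta) - finite_difference(theta)) < tol

ok = gradient_check(1.23)
```

The same check generalizes to multi-parameter hybrid models by sweeping it over each trainable parameter before launching a full training run.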
'QML/QiML' is the term proposed at this current stage in industry: existing classical processing methods already use the 'QML' term, while the lesser-known 'QiML' refers specifically to quantum machine learning implemented on classical hardware. Lastly, thank you to PennyLane, Qiskit, and PyTorch for the continued innovations improving QiML workflows for important applications.
Cover: A) PennyLane. B) GitHub. C) Medium. D) Wikipedia. E) arXiv.