Recurrent Quantum Neural Networks, presented by Johannes Bausch, Cambridge University
Recurrent neural networks are the foundation of many sequence-to-sequence models in machine learning, such as machine translation and speech synthesis. In this work we construct a quantum recurrent neural network (QRNN) with demonstrable performance on non-trivial tasks such as sequence learning and integer digit classification. The QRNN cell is built from parametrized quantum neurons, which, in conjunction with amplitude amplification, create a nonlinear activation of polynomials of its inputs and cell state, and allow the extraction of a probability distribution over predicted classes at each step. To study the model's performance, we provide an implementation in PyTorch, which allows the relatively efficient optimization of parametrized quantum circuits with thousands of parameters. We establish a QRNN training setup by benchmarking optimization hyperparameters, and analyse suitable network topologies for simple memorisation and sequence prediction tasks from Elman's seminal paper (1990) on temporal structure.
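Since the abstract highlights a PyTorch implementation that optimizes parametrized quantum circuits by backpropagation, here is a minimal, self-contained sketch of that general idea (not the talk's actual code): a tiny two-qubit circuit simulated with complex tensors whose rotation angles are trained by gradient descent. The gate layout, loss, and target distribution are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the author's implementation): simulate a
# small parametrized quantum circuit with complex tensors so its gate angles
# can be trained by ordinary PyTorch autograd.
import torch

def ry(theta):
    """Single-qubit Y-rotation as a 2x2 complex matrix of a trainable angle."""
    c, s = torch.cos(theta / 2), torch.sin(theta / 2)
    return torch.stack([torch.stack([c, -s]), torch.stack([s, c])]).to(torch.cfloat)

CNOT = torch.tensor([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=torch.cfloat)

def circuit(thetas):
    """Two-qubit circuit: RY on each wire, then a CNOT, applied to |00>."""
    state = torch.zeros(4, dtype=torch.cfloat)
    state[0] = 1.0
    layer = torch.kron(ry(thetas[0]), ry(thetas[1]))  # parametrized layer
    return CNOT @ (layer @ state)

thetas = torch.nn.Parameter(torch.randn(2))
opt = torch.optim.Adam([thetas], lr=0.1)
target = torch.tensor([0.5, 0.0, 0.0, 0.5])  # assumed target: Bell-like output

for step in range(200):
    opt.zero_grad()
    out = circuit(thetas)
    probs = (out.conj() * out).real      # measurement probabilities
    loss = torch.sum((probs - target) ** 2)
    loss.backward()                      # gradients flow through the simulation
    opt.step()

print(probs.detach())
```

The talk's model works with far larger circuits and a recurrent cell structure; this sketch only illustrates the underlying mechanism of differentiating a simulated circuit with respect to its gate parameters.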