Advancing Quantum Machine Learning: Efficient Hybrid Quantum Computation / Ceschini, A.; Panella, M. - (2024), pp. 1-1. (Presented at the conference QUANTUM COMPUTING ANNUAL MEETING ICSC - SPOKE 10, Research and Innovation, held at Politecnico di Milano, Milan, Italy).
Advancing Quantum Machine Learning: Efficient Hybrid Quantum Computation
A. Ceschini; M. Panella
2024
Abstract
Quantum dropout randomly removes some gates of an overparametrized QNN during the training phase to avoid overfitting. We propose a framework for building classical ensembles of QNNs to mitigate current hardware limitations. We also present a hybrid U-Net architecture within a diffusion process for efficient image generation, achieving better results with fewer resources, and a Quantum Recurrent Neural Network based on Quantum Gated Recurrent Units.
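The abstract describes quantum dropout as randomly removing gates of an overparametrized circuit during training. The following is a minimal, hypothetical sketch of that idea in plain Python; the gate names, the circuit representation, and the choice to drop only entangling gates are illustrative assumptions, not the authors' actual implementation.

```python
import random

def quantum_dropout(circuit, p, rng):
    """Return a copy of `circuit` with each droppable gate removed with
    probability `p`. Single-qubit rotations are kept, so the trainable
    parametrized structure of the circuit survives each training step.

    `circuit` is assumed to be a list of (gate_name, wires) tuples.
    """
    def droppable(gate):
        # Assumption for this sketch: only entangling gates are dropped.
        return gate[0] in {"CNOT", "CZ"}
    return [g for g in circuit if not (droppable(g) and rng.random() < p)]

# Toy overparametrized two-qubit circuit (illustrative gate sequence).
circuit = [("RY", 0), ("RY", 1), ("CNOT", (0, 1)),
           ("RY", 0), ("CZ", (0, 1)), ("RY", 1)]

# A fresh random subset of gates would be dropped at every training step.
rng = random.Random(0)
pruned = quantum_dropout(circuit, 0.5, rng)
```

In an actual training loop, `quantum_dropout` would be applied to the circuit at each step before evaluating the loss, analogously to classical dropout on network weights.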