
Classical to Quantum Knowledge Distillation: a Study on the Impact of Hybridization

Piperno S.; Vittori G.; Windridge D.; Rosato A.; Panella M.
2025

Abstract

Knowledge distillation is a widely explored technique in classical machine learning, in which a smaller or more efficient model is trained to mimic the behavior of a larger, more complex model. In this study, we extend the concept of knowledge distillation from classical to quantum architectures, with the goal of improving the training of quantum models while potentially reducing the number of parameters compared to their classical counterparts. Given the inherent challenges in training quantum neural networks, leveraging knowledge from well-established classical models could provide valuable insights and advantages, particularly in terms of model efficiency and performance. In this work, we explore the potential benefits of this approach by evaluating a hybrid quantum model against a non-hybrid quantum baseline. While the proposed study is still at a preliminary stage, it aims to set the scene for further investigation into the most appropriate architecture for classical-to-quantum knowledge distillation, and thereby to enhance the development and optimization of quantum neural networks more generally.
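The full text is not attached to this record, so the paper's actual architectures and training procedure cannot be reproduced here. As a rough illustration of the setup the abstract describes, the following is a minimal PyTorch sketch assuming the standard soft-label distillation loss of Hinton et al. (2015); the `teacher` and `student` networks, the dummy data, and the hyperparameters `T` and `alpha` are purely hypothetical stand-ins, and in the paper's setting the student would be the quantum or hybrid quantum-classical model.

```python
import torch
import torch.nn.functional as F
from torch import nn

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-label distillation loss (Hinton et al., 2015): KL divergence
    between temperature-softened teacher and student distributions,
    blended with the usual cross-entropy on the hard labels."""
    soft_teacher = F.log_softmax(teacher_logits / T, dim=-1)
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher, log_target=True,
                  reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Hypothetical stand-ins: a pre-trained classical teacher (frozen) and a
# smaller student exposing the same logits interface; in the paper's
# setting the student would be a quantum or hybrid quantum model.
teacher = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 4))
student = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 4))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(32, 16)         # dummy batch of input features
y = torch.randint(0, 4, (32,))  # dummy class labels

teacher.eval()
with torch.no_grad():
    t_logits = teacher(x)       # teacher predictions, no gradients
s_logits = student(x)
loss = distillation_loss(s_logits, t_logits, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

In a classical-to-quantum variant, the forward pass of the student would run a parameterized quantum circuit (optionally preceded or followed by classical layers, depending on the degree of hybridization the paper studies), while the distillation loss itself is unchanged.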
2025 International Joint Conference on Neural Networks, IJCNN 2025
quantum computing; knowledge distillation; hybridization
04 Publication in conference proceedings::04b Conference paper published in a volume
Classical to Quantum Knowledge Distillation: a Study on the Impact of Hybridization / Piperno, S.; Vittori, G.; Windridge, D.; Rosato, A.; Panella, M. - (2025), pp. 1-8. (2025 International Joint Conference on Neural Networks, IJCNN 2025, Roma, Italia) [10.1109/IJCNN64981.2025.11227730].
Files attached to this item
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1757940
Warning: the displayed data have not been validated by the university.
