Piperno, S.; Vittori, G.; Windridge, D.; Rosato, A.; Panella, M. (2025). Classical to Quantum Knowledge Distillation: a Study on the Impact of Hybridization. In: 2025 International Joint Conference on Neural Networks (IJCNN 2025), Rome, Italy, pp. 1-8. DOI: 10.1109/IJCNN64981.2025.11227730.
Classical to Quantum Knowledge Distillation: a Study on the Impact of Hybridization
Piperno S.; Vittori G.; Windridge D.; Rosato A.; Panella M.
2025
Abstract
Knowledge distillation is a widely explored technique in classical machine learning, in which a smaller or more efficient model is trained to mimic the behavior of a larger, more complex model. In this study, we extend the concept of knowledge distillation from classical architectures to quantum architectures, with the goal of improving the training of quantum models while potentially reducing the number of parameters compared to their classical counterparts. Given the inherent challenges in training quantum neural networks, leveraging knowledge from well-established classical models could provide valuable insights and advantages, particularly in terms of model efficiency and performance. In this work, we explore the potential benefits of this approach, evaluating a hybrid quantum model against a non-hybrid quantum baseline. While this study is still at a preliminary stage, it aims to set the scene for further investigation into the most appropriate architecture for classical-to-quantum knowledge distillation, in order to enhance the development and optimization of quantum neural networks more generally.
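To make the classical-to-quantum setup concrete, the sketch below shows one plausible instance of the idea described in the abstract: a frozen classical teacher network produces softened targets, and a variational quantum circuit acts as the student, trained on a standard distillation loss that blends soft (teacher) and hard (label) terms. This is an illustrative assumption, not the architecture, hyperparameters, or framework used in the paper; the teacher network, the circuit layout, and the names (n_qubits, T, alpha) are hypothetical.

```python
# Minimal sketch of classical-to-quantum knowledge distillation,
# assuming a PyTorch classical teacher and a PennyLane variational
# circuit as the quantum student. Illustrative only.
import torch
import torch.nn.functional as F
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def circuit(inputs, weights):
    # Angle-encode classical features, then apply trainable entangling layers.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # One Pauli-Z expectation per qubit, used as the student's logits.
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (2, n_qubits, 3)}  # 2 variational layers
student = qml.qnn.TorchLayer(circuit, weight_shapes)

# Stand-in classical teacher (in practice, a pretrained model).
teacher = torch.nn.Sequential(
    torch.nn.Linear(n_qubits, 16), torch.nn.ReLU(), torch.nn.Linear(16, n_qubits)
)

def distillation_loss(x, y, T=2.0, alpha=0.5):
    # Soft targets from the frozen teacher, hard targets from the labels.
    with torch.no_grad():
        soft_targets = F.softmax(teacher(x) / T, dim=-1)
    student_logits = student(x)
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1), soft_targets,
        reduction="batchmean") * T * T  # rescale gradients, as in Hinton et al.
    hard_loss = F.cross_entropy(student_logits, y)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

A hybrid student, as studied in the paper, would additionally wrap the quantum layer between classical trainable layers; the non-hybrid baseline corresponds to using the circuit alone, as above.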


