Sequential Recommender Systems (SRSs) have predominantly shifted to neural-based models. Despite significant advances, Convolutional Neural Network (CNN)-based SRSs are rapidly being overshadowed by the more performant attention-based models. In this paper, we present a novel modification of two widely used CNN-based SRSs, Caser and CosRec. We improve their training by adapting the convolution and pooling operations so that they can be trained simultaneously on the whole input sequence rather than just on the last element. Our experimental results show that these modified CNN-based models achieve up to +65% in NDCG@10 compared to their original versions. Furthermore, they are also more competitive with SASRec, one of the most used attention-based SRS, surpassing it by up to +53% in NDCG@10. These findings suggest that CNN-based SRSs, with appropriate modifications, warrant further investigation and may offer viable alternatives to current attention-based approaches. Our code is available at https://github.com/antoniopurificato/recsys_conv_conf.
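This record does not spell out the paper's exact architectural change, but the abstract's core idea, adapting convolution and pooling so every position in the sequence yields a training signal, rests on causal convolution: left-padding so the output at step t depends only on items up to t. A minimal NumPy sketch of that ingredient (the function name and toy values are illustrative, not taken from the paper):

```python
import numpy as np

def causal_conv1d(x, w):
    """Causal 1D convolution: output[t] depends only on x[0..t].

    Left-padding with k-1 zeros means no future item leaks into the
    representation of position t, so a next-item loss can be applied
    at every position of the sequence, not just the last one.
    """
    k = len(w)
    padded = np.concatenate([np.zeros(k - 1), np.asarray(x, dtype=float)])
    return np.array([padded[t:t + k] @ w for t in range(len(x))])

seq = np.array([1.0, 2.0, 3.0, 4.0])   # toy scalar "item embeddings"
kernel = np.array([0.5, 0.5])          # width-2 kernel
print(causal_conv1d(seq, kernel))      # [0.5 1.5 2.5 3.5]
```

Changing the last item of `seq` leaves the first three outputs untouched; that causality is what allows supervision over the whole input sequence at once rather than only the final element.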
Are Convolutional Sequential Recommender Systems Still Competitive? Introducing New Models and Insights / Siciliano, Federico; Purificato, Antonio; Betello, Filippo; Tonellotto, Nicola; Silvestri, Fabrizio. (2025). IEEE International Joint Conference on Neural Networks (IJCNN), Rome, Italy. DOI: 10.1109/IJCNN64981.2025.11229036.
Authors and contributions:
- Siciliano, Federico (co-first author): Methodology
- Purificato, Antonio (co-first author): Methodology
- Betello, Filippo (co-first author): Methodology
- Tonellotto, Nicola (second author): Writing – Review & Editing
- Silvestri, Fabrizio (last author): Supervision

Year: 2025
| File | Type | License | Size | Format | Access |
|---|---|---|---|---|---|
| Siciliano_Are-convolutional_postprint_2025.pdf (DOI: 10.1109/IJCNN64981.2025.11229036) | Post-print (version after peer review, accepted for publication) | All rights reserved | 1.15 MB | Adobe PDF | Archive administrators only; contact the author |
| Siciliano_Are-convolutional_2025.pdf | Publisher's version (published with the publisher's layout) | All rights reserved | 2.83 MB | Adobe PDF | Archive administrators only; contact the author |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


