Ducho 2.0: Towards a More Up-to-Date Unified Framework for the Extraction of Multimodal Features in Recommendation / Attimonelli, Matteo; Danese, Danilo; Malitesta, Daniele; Pomo, Claudio; Gassi, Giuseppe; Di Noia, Tommaso. - (2024), pp. 1075-1078. (Paper presented at the ACM Web Conference 2024, held in Singapore) [10.1145/3589335.3651440].

Ducho 2.0: Towards a More Up-to-Date Unified Framework for the Extraction of Multimodal Features in Recommendation

Attimonelli, Matteo; Danese, Danilo; Malitesta, Daniele; Pomo, Claudio; Gassi, Giuseppe; Di Noia, Tommaso
2024

Abstract

In this work, we introduce Ducho 2.0, the latest stable version of our framework. Unlike its predecessor, Ducho 2.0 offers a more personalized user experience, allowing users to define and import custom extraction models fine-tuned on specific tasks and datasets. Moreover, the new version can extract and process features through multimodal-by-design large models. Notably, all these new features are backed by optimized data loading and storage on local memory. To showcase the capabilities of Ducho 2.0, we demonstrate a complete multimodal recommendation pipeline, from feature extraction/processing to the final recommendation. The goal is to provide practitioners and experienced scholars with a ready-to-use tool that, placed on top of any multimodal recommendation framework, lets them run extensive benchmarking analyses. All materials are accessible at: https://github.com/sisinflab/Ducho.
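To make the abstract's pipeline concrete, the following is a minimal illustrative sketch of the generic pattern that extraction frameworks such as Ducho automate: run each catalog item through a (pretrained or fine-tuned) backbone and persist one feature vector per item for a downstream recommender. This is not Ducho's actual API; the backbone, item identifiers, and file layout here are hypothetical stand-ins.

```python
# Hedged sketch: NOT Ducho's API. Illustrates the item -> backbone -> stored
# embedding pattern that multimodal extraction frameworks automate.
import numpy as np
import torch
import torch.nn as nn

# Hypothetical stand-in for a visual backbone (in practice, a pretrained or
# fine-tuned model such as those Ducho 2.0 lets users import).
backbone = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, stride=2),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),  # -> one (8,) feature vector per image
)

# Dummy item images standing in for a product catalog.
items = {"item_0": torch.rand(1, 3, 64, 64), "item_1": torch.rand(1, 3, 64, 64)}

features = {}
with torch.no_grad():  # inference only, no gradients needed
    for item_id, image in items.items():
        features[item_id] = backbone(image).squeeze(0).numpy()

# Persist one .npy file per item, a common layout recommenders load from.
for item_id, vec in features.items():
    np.save(f"{item_id}.npy", vec)
```

The stored vectors can then be fed to any multimodal recommendation model, which is the hand-off point the abstract describes between extraction/processing and the final recommendation step.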
ACM Web Conference 2024
Multimodal Recommendation; Deep Neural Networks
04 Publication in conference proceedings::04b Conference paper in volume
Files attached to this item
File: Attimonelli_Ducho-2.0_2024.pdf
Access: open access
Note: https://dl.acm.org/doi/pdf/10.1145/3589335.3651440
Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 2.1 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1725015
Citations
  • Scopus: 2