
Evaluating visual data analysis systems: A discussion report

Angelini, Marco; Catarci, Tiziana; Santucci, Giuseppe
2018

Abstract

Visual data analysis is a key tool for helping people make sense of and interact with massive data sets. However, existing evaluation methods (e.g., database benchmarks, individual user studies) fail to capture the key points that make systems for visual data analysis (or visual data systems) challenging to design. In November 2017, members of the Database and Visualization communities came together in a Dagstuhl seminar to discuss the grand challenges at the intersection of data analysis and interactive visualization. In this paper, we report on the discussions of the working group on the evaluation of visual data systems, which addressed questions centered on developing better evaluation methods, such as "How do the different communities evaluate visual data systems?" and "What could we learn from each other to develop evaluation techniques that cut across areas?". In their discussions, the group brainstormed initial steps towards new joint evaluation methods and developed a first concrete initiative, a trace repository of various real-world workloads and visual data systems, that enables researchers to derive evaluation setups (e.g., performance benchmarks, user studies) under more realistic assumptions and opens up new evaluation perspectives (e.g., broader meta-analysis across analysis contexts, reproducibility and comparability across systems).
2018
2018 Workshop on Human-In-the-Loop Data Analytics, HILDA 2018
Computational Theory and Mathematics; Information Systems; Computer Science Applications; Computer Vision and Pattern Recognition
04 Publication in conference proceedings::04b Conference paper in volume
Evaluating visual data analysis systems: A discussion report / Battle, Leilani; Angelini, Marco; Binnig, Carsten; Catarci, Tiziana; Eichmann, Philipp; Fekete, Jean-Daniel; Santucci, Giuseppe; Sedlmair, Michael; Willett, Wesley. - (2018), pp. 1-6. (Paper presented at the 2018 Workshop on Human-In-the-Loop Data Analytics, HILDA 2018, held in Houston, United States) [10.1145/3209900.3209901].
Files attached to this item

File: Battle_Evaluating-visual-data_2018.pdf
Access: restricted (archive administrators only); contact the author
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 843.66 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1180273
Citations
  • PMC: not available
  • Scopus: 13
  • Web of Science (ISI): 13