BEyond time and errors: novel evaLuation methods for Information Visualization

A Workshop of the ACM CHI 2008 Conference

April 5, 2008 - Florence, Italy

Organized by: Enrico Bertini, Adam Perer, Catherine Plaisant, Giuseppe Santucci.



  • [April 2, 2008] The final workshop agenda is ready (see below).

Workshop Description

The purpose of information visualization is to provide users with accurate visual representations of data and natural interaction tools to support discovery and sensemaking. These activities are often exploratory in nature, can take place over days, weeks, or months, and rarely follow a predefined or linear workflow. While the overall use of information visualization is accelerating, the growth of techniques for evaluating these systems has been slow. To understand these complex behaviors, evaluation efforts should be targeted at the component level, the system level, and the work-environment level. Commonly used evaluation metrics such as task completion time and number of errors appear insufficient to quantify the quality of an information visualization system; hence the name of the workshop: “beyond time and errors …”.

BELIV’08 aims to gather researchers in the field to continue the exploration of novel evaluation methods and to structure the knowledge on evaluation in information visualization around a schema in which researchers can easily identify unsolved problems and research gaps.

Workshop Agenda


Overview of the workshop and logistics
Introduction of participants

09:30 The problems we face

10:10 What to measure and how

  • Sean M. McNee, Ben Arnette.
    Productivity as a Metric for Visual Analytics: Reflections on E-Discovery. (Research paper)

10:30 - 11:00 Coffee break

12:30 - 13:30 Lunch

13:30 Qualitative methods and logging

15:00 - 15:30 Coffee break

15:30 Methodologies and Case Studies

  • Mark Whiting, Jereme Haack, and Carrie Varley.
    Creating Realistic, Scenario-Based Synthetic Data for Test and Evaluation of Information Analytics Software. (Research paper)
  • Theresa A. O’Connell and Yee-Yin Choong.
    User-Centered Evaluation Methodology for Interactive Visualizations. (Position paper)
  • Eliane R.A. Valiati, Carla M. Dal Sasso Freitas, and Marcelo S. Pimenta.
    Applying MILCs to the evaluation of information visualization techniques in three case studies. (Research paper)
  • Xiaobin Shen, Andrew Vande Moere, Peter Eades, and Seok-Hee Hong.
    The Long-Term Evaluation of Fisherman in a Partial-Attention Environment. (Research paper)

16:40 - 17:40 Discussion

Topics of Interest

Topics include, but are not limited to:

  • Evaluation in the visualization lifecycle
  • Utility characterization
  • Quality metrics
  • Insight characterization
  • Synthetic data set generation
  • Taxonomies of tasks
  • Benchmark development and repositories
  • Methodology of longitudinal case studies
  • Evaluation of early prototypes
  • Measuring adoption
  • Evaluation heuristics and guidelines

Important: We will not consider papers that merely report on the evaluation of a system, unless they illustrate a new method or provide insights into the benefits and drawbacks of a new evaluation method.

Format of the event (1 full day)

The workshop will take place over one full day and is limited to about 20 participants. Participants will introduce themselves and will be asked to write on sticky notes the topics they would like to discuss in the afternoon and the outcomes they hope the workshop will produce. Selected papers will then be presented in the morning and early afternoon. The majority of the afternoon will be dedicated to discussion and work in breakout sessions, ending with reports to the whole group for a closing synthesis. We will provide groups with paper, markers, sticky notes, and other supplies to facilitate brainstorming and reporting; this will also ease the preparation of a poster summarizing the workshop. An optional dinner will follow the workshop to further strengthen bonds between participants.

Important Dates

  • Submission: 31 Oct 2007
  • Notification: 28 Nov 2007
  • Camera-ready due: mid-March 2008
  • Workshop: 5 Apr 2008

How to participate

To participate in the workshop, you must have an accepted paper and be registered for both the workshop and the main CHI conference.

Paper Types
We accept two types of submissions: position papers and research papers.

  • Position papers are short statements (1-2 pages) describing a participant's relevant experience and ideas that can contribute to the discussion during the workshop. They will be made available to the workshop participants only.
  • Research papers are longer (4-8 pages) and present new work and unpublished results. Research papers will be peer-reviewed by members of the program committee and selected according to their novelty, quality, and relevance. Authors of accepted research papers will have the chance to revise their papers before they are published in the ACM Digital Library.

To submit a paper, send an email to: by October 31, 2007. Please indicate whether you are submitting a position paper or a research paper.

All complete research and position papers must be submitted by October 31st so that the peer review process can begin.

All submissions should be formatted in the ACM style. Suitable templates, in LaTeX and Word, can be downloaded from: Submissions should be in either PDF (preferred) or Word format.

Program Committee (Preliminary)

Paolo Buono (University of Bari, Italy)
Tiziana Catarci (University of Rome, Italy)
Jean-Daniel Fekete (INRIA, France)
Alan Dix (Lancaster University, UK)
George Grinstein (UMass, USA)
Kasper Hornbæk (University of Copenhagen, DK)
Robert Kosara (UNCC, USA)
Chris North (Virginia Tech, USA)
George Robertson (Microsoft Research, USA)
John Stasko (Georgia Tech, USA)


For any questions or problems, contact Enrico Bertini at: enrico.bertini AT or Adam Perer at adamp AT


Materials from BELIV'06, the first edition of the workshop organized at AVI 2006: