This is Part 2 of 7 of the highlights from “Illuminating the Path: The Research and Development Agenda for Visual Analytics.” Please see this post for an introduction to the study and access to the other 6 parts.
Data flooding is a term I use to capture the fact that “our ability to collect data is increasing at a faster rate than our ability to analyze it.” I therefore completely agree with the recommendation that new methods are required to “allow the analyst to examine this massive, multi-dimensional, multi-source, time-varying information stream to make decisions in a time-critical manner.”
We don’t want less information, but more, since “large data volumes allow analysts to discover more complete information about a situation.” Indeed, “scale brings opportunities as well.” For example, “analysts may be able to determine more easily when expected information is missing,” which sometimes “offers important clues […].”
However, while computer processing power and memory density have changed radically over the decades, “basic human skills and abilities do not change significantly over time.” Technological advances can certainly leverage our skills “but there are fundamental limits that we are asymptotically approaching,” hence the notion of information glut.
In other words, “human skills and abilities do not scale.” That said, the number of humans involved in analytical problem-solving does scale. Unfortunately, “most published techniques for supporting analysis are targeted for a single user at a time.” New techniques that “gracefully scale from a single user to a collaborative (multi-user) environment” therefore need to be developed.
However, current technologies and platforms used in the humanitarian and human rights communities do not address the need to handle ever-changing volumes of information. “Furthermore, current tools provide very little in the way of support for the complex tasks of analysis and discovery process.” There is clearly a scarcity of suitable platforms.
Admittedly, “creating effective visual representations is a labor-intensive process that requires a solid understanding of the visualization pipeline, characteristics of the data to be displayed, and the tasks to be performed.”
However, as is clear from the crisis mapping projects I have consulted on, “most visualization software is written with incomplete knowledge of at least some of this information.” Indeed, it is rarely possible for “the analyst, who has the best understanding of the data and task, to construct new tools.”
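The visualization pipeline mentioned above (raw data, data transformation, visual mapping, rendered view) can be made concrete with a toy sketch. Everything here is a hypothetical illustration of mine, not code from the NVAC study: the incident categories, the helper names, and the ASCII bar chart are all invented for the example.

```python
# Toy illustration of the classic visualization pipeline:
# raw data -> data transformation -> visual mapping -> view.
# All names and data below are hypothetical.

from collections import Counter

def transform(records):
    """Data transformation: aggregate raw event records by category."""
    return Counter(r["category"] for r in records)

def visual_map(counts, width=20):
    """Visual mapping: scale each count to a bar length."""
    peak = max(counts.values())
    return {cat: round(n / peak * width) for cat, n in counts.items()}

def render(bars):
    """View: draw a minimal ASCII bar chart."""
    return "\n".join(f"{cat:10s} {'#' * length}"
                     for cat, length in sorted(bars.items()))

records = [{"category": c}
           for c in ["flood", "flood", "fire", "flood", "fire", "quake"]]
print(render(visual_map(transform(records))))
```

Even in a sketch this small, the report’s point shows through: each stage encodes assumptions about the data (categorical, countable) and the task (comparing volumes), which is exactly the knowledge most visualization software is written without.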
The NVAC study thus recommends that “research is needed to create software that supports the most complex and time-consuming portions of the analytical process, so that analysts can respond to increasingly more complex questions.” To be sure, “we need real-time analytical monitoring that can alert first responders to unusual situations in advance.”
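To give a flavor of what “real-time analytical monitoring” might mean in practice, here is a minimal sketch of my own: flag incoming report volumes that spike far above a moving baseline. The class name, window size, threshold factor, and data stream are all hypothetical, and a real system would need far more sophistication than this.

```python
# Minimal sketch of real-time analytical monitoring: alert when an
# observation spikes above a moving baseline. All parameters and data
# are hypothetical.

from collections import deque

class VolumeMonitor:
    def __init__(self, window=5, factor=2.0):
        self.window = deque(maxlen=window)  # recent observations
        self.factor = factor  # how far above baseline triggers an alert

    def observe(self, count):
        """Return True if `count` is anomalously high vs. the recent baseline."""
        if len(self.window) == self.window.maxlen:
            baseline = sum(self.window) / len(self.window)
            alert = count > self.factor * baseline
        else:
            alert = False  # not enough history yet
        self.window.append(count)
        return alert

monitor = VolumeMonitor()
stream = [10, 12, 11, 9, 13, 12, 48, 11]  # hourly incident reports (hypothetical)
alerts = [t for t, n in enumerate(stream) if monitor.observe(n)]
print(alerts)  # → [6]: the spike to 48 reports is flagged
```

The point of the sketch is the shape of the problem, not the method: monitoring must keep pace with the stream and surface the unusual case to a human, rather than asking the analyst to watch everything.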