
Part 7: A Taxonomy for Visual Analytics

This is Part 7 of 7 of the highlights from the National Visualization and Analytics Center (NVAC). Unlike previous parts, this one focuses on a May 2009 article. Please see this post for an introduction to the study and access to the other 6 parts.

Jim Thomas, co-editor of “Illuminating the Path: The Research and Development Agenda for Visual Analytics” and Director of the National Visualization and Analytics Center (NVAC), recently called for the development of a taxonomy for visual analytics. Jim explains the importance of visual analytics as follows:

“Visual analytics are valuable because the tool helps to detect the expected, and discover the unexpected. Visual analytics combines the art of human intuition and the science of mathematical deduction to perceive patterns and derive knowledge and insight from them. With our success in developing and delivering new technologies, we are paving the way for fundamentally new tools to deal with the huge digital libraries of the future, whether for terrorist threat detection or new interactions with potentially life-saving drugs.”

In the latest edition of VAC Views, Jim expresses NVAC’s interest in helping to “define the study of visual analytics by providing an order and arrangement of topics—the taxa that are at the heart of studying visual analytics. The reason for such a “definition” is to more clearly describe the scope and intent of impact for the field of visual analytics.”

Jim and colleagues propose the following higher-order classifications (sketched as a simple data structure after the list):

  • Domain/Applications
  • Analytic Methods/Goals
  • Science and Technology
  • Data Types/Structures.
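To make the draft structure concrete, here is a minimal sketch of the four higher-order classes as a nested Python dictionary, populated with the sub-topics summarized in the rest of this post. The layout is my own illustration, not NVAC’s official schema:

```python
# The four proposed higher-order classes, with the sub-topics
# summarized in this post filed underneath each one.
VA_TAXONOMY = {
    "Domain/Applications": ["Security", "Health"],
    "Analytic Methods/Goals": [
        "Predictive", "Surveillance", "Watch/Warn/Alert",
        "Relationship Mapping", "Rare Event Identification",
    ],
    "Science and Technology": [
        "Analytic reasoning and human processes",
        "Interactive visualization",
        "Data representations and theory of knowledge",
        "Theory of communications",
        "Systems and evaluations",
    ],
    "Data Types/Structures": [
        "Text", "Image", "Video", "Graph Structures",
        "Models/Simulations", "Geospatial Coordinates", "Time",
    ],
}

def topics(category):
    """Return the sub-topics filed under one higher-order class."""
    return VA_TAXONOMY.get(category, [])

print(topics("Analytic Methods/Goals"))
```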

In his article in VAC Views, Jim requests feedback and suggestions for improving the more detailed taxonomy he provides in the graphic below. The graphic appears at low resolution in VAC Views and does not reproduce well here, so I summarize it below while giving feedback.

[Image: NVAC’s draft taxonomy for visual analytics, from VAC Views]

1. Domain/Applications

While Security and Health are included in the draft NVAC proposal as domains/applications, Humanitarian Crises, Conflict Prevention, and Disaster Management are missing.

Perhaps “domains” and “applications” should not be combined, since an “application” tends to be a subset of an associated “domain,” which creates some confusion. For example, law enforcement is a domain, and crime mapping analysis could be considered an application of visual analytics.

2. Analytic Methods/Goals

Predictive, Surveillance, Watch/Warn/Alert, Relationship Mapping, and Rare Event Identification are included. A host of other methods are not referred to here, such as cluster detection, a core focus of spatial analysis (a minimal sketch follows below). See the methods table in my previous blog post for examples of spatial cluster detection.
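To illustrate what spatial cluster detection can look like in practice, here is a minimal sketch using DBSCAN from scikit-learn; the coordinates are made-up illustrative values, and eps/min_samples would need tuning for real event data:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# (latitude, longitude) of hypothetical incident reports
points = np.array([
    [15.59, 32.53], [15.60, 32.54], [15.58, 32.52],  # dense group
    [13.18, 30.22], [13.19, 30.21],                  # second group
    [19.62, 37.22],                                  # isolated report
])

# Points within ~0.05 degrees of each other form a cluster
labels = DBSCAN(eps=0.05, min_samples=2).fit_predict(points)
print(labels)  # [0 0 0 1 1 -1]; -1 marks noise (no cluster)
```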

Again, I find that combining “analytic methods” and “goals” makes the classification somewhat confusing.

3. Science and Technology

This classification includes the following entries (each of which is elaborated on individually below):

  • Analytic reasoning and human processes
  • Interactive visualization
  • Data representations and theory of knowledge
  • Theory of communications
  • Systems and evaluations.

4. Data Types/Structures

This includes Text, Image, Video, Graph Structures, Models/Simulations, Geospatial Coordinates, Time, etc.

Returning now to the sub-classifications under “Science and Technology”:

Analytic reasoning and human processes

This sub-classification includes, for example, the following items:

  • Modes of inference
  • Knowledge creation
  • Modeling
  • Hypothesis refinement
  • Human processes (e.g., perception, decision-making).

Interactive visualization

This comprises:

  • The Science of Visualization
  • The Science of Interaction.

The former includes icons, positioning, motion, abstraction, etc., while the latter includes the language of discourse, design and art, user-tailored interaction, and simulation interaction.

Data representations and theory of knowledge

This includes (but is not limited to):

  • Data Sourcing
  • Scale and Complexity
  • Aggregation
  • Ontology
  • Predictions Representations.

Theory of communications

This sub-classification includes, for example, the following:

  • Story Creation
  • Theme Flow/Dynamics
  • Reasoning representation.

Systems and evaluations

This last sub-classification comprises:

  • Application Programming Interface
  • Lightweight Standards
  • Privacy.

Patrick Philippe Meier

Part 5: Data Visualization and Interactive Interface Design

This is Part 5 of 7 of the highlights from “Illuminating the Path: The Research and Development Agenda for Visual Analytics.” Please see this post for an introduction to the study and access to the other 6 parts.

Data Visualization

The visualization of information “amplifies human cognitive capabilities in six basic ways” (a short illustrative sketch follows the list) by:

  • Increasing cognitive resources, such as by using a visual resource to expand human working memory;
  • Reducing search, such as by representing a large amount of data in a small place;
  • Enhancing the recognition of patterns, such as when information is organized in space by its time relationships;
  • Supporting the easy perceptual inference of relationships that are otherwise more difficult to induce;
  • Enabling perceptual monitoring of a large number of potential events;
  • Providing a manipulable medium that, unlike static diagrams, enables the exploration of a space of parameter values.
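As a small illustration of the “reducing search” and “pattern recognition” points above, the sketch below (assuming pandas and matplotlib are available) collapses 180 days of synthetic incident counts into one compact timeline, where a weekly rhythm that would be invisible in a raw table shows up at a glance:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
days = pd.date_range("2009-01-01", periods=180, freq="D")
# Synthetic incident counts with a weekend bump plus noise
counts = 10 + 5 * (days.dayofweek >= 5) + rng.poisson(2, size=len(days))

events = pd.Series(counts, index=days)
events.plot(figsize=(8, 2.5), title="Incidents per day")
plt.ylabel("count")
plt.tight_layout()
plt.show()
```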

The table below provides additional information on how visualization amplifies cognition:

[Table: how information visualization amplifies cognition, from the NVAC study]

Clearly, “these capabilities of information visualization, combined with computational data analysis, can be applied to analytic reasoning to support the sense-making process.” The National Visualization and Analytics Center (NVAC) thus recommends developing “visually based methods to support the entire analytic reasoning process, including the analysis of data as well as structured reasoning techniques such as the construction of arguments, convergent-divergent investigation, and evaluation of alternatives.”

Since “well-crafted visual representations can play a critical role in making information clear […], the visual representations and interactions we develop must readily support users of varying backgrounds and expertise.” To be sure, “visual representations and interactions must be developed with the full range of users in mind, from the experienced user to the novice working under intense pressure […].”

As NVAC notes, “visual representations are the equivalent of power tools for analytical reasoning.” But just like real power tools, they can cause harm if used carelessly. Indeed, “poorly designed visualizations may lead to an incorrect decision and great harm. A famous example is the poor visualization of the O-ring data produced before the disastrous launch of the Challenger space shuttle […].”

Effective Depictions

This is why we need some basic principles for developing effective depictions, such as the following (a small example follows the list):

  • Appropriateness Principle: the visual representation should provide neither more nor less information than needed for the task at hand. Additional information may be distracting and make the task more difficult.
  • Naturalness Principle: experiential cognition is most effective when the properties of the visual representation most closely match the information being represented. This principle supports the idea that new visual metaphors are only useful for representing information when they match the user’s cognitive model of the information. Purely artificial visual metaphors can actually hinder understanding.
  • Matching Principle: representations of information are most effective when they match the task to be performed by the user. Effective visual representations should present affordances suggestive of the appropriate action.
  • Congruence Principle: the structure and content of a visualization should correspond to the structure and content of the desired mental representation.
  • Apprehension Principle: the structure and content of a visualization should be readily and accurately perceived and comprehended.
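As a toy example of the Appropriateness and Apprehension Principles at work, the sketch below plots the same synthetic series twice: once with decoration that adds nothing to a trend-reading task, and once stripped down so the trend itself is what the eye apprehends. The styling choices are mine, not the study’s:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.arange(24)
y = np.cumsum(np.random.default_rng(1).normal(0.5, 1.0, 24))

fig, (busy, plain) = plt.subplots(1, 2, figsize=(9, 3))

# Heavy markers, dashes, and grid lines add no information here
busy.plot(x, y, "o--", linewidth=3)
busy.grid(True)
busy.set_title("Decorated")

# Same data, minimal ink
plain.plot(x, y, color="black")
for side in ("top", "right"):
    plain.spines[side].set_visible(False)
plain.set_title("Task-appropriate")

plt.tight_layout()
plt.show()
```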

Further research is needed to understand “how best to combine time and space in visual representation.” For example, in the flow map, “spatial information is primary” in that it defines the coordinate system, but “why is this the case, and are there visual representations where time is foregrounded that could also be used to support analytical tasks?”

In sum, we must deepen our understanding of temporal reasoning and “create task-appropriate methods for integrating spatial and temporal dimensions of data into visual representations.”
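One simple way to experiment with foregrounding both dimensions, sketched here with synthetic coordinates and timestamps: plot events in space and encode time as color, so that spatial clusters and their temporal order are readable in a single view:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
n = 60
lon = 32.5 + rng.normal(0, 0.3, n)  # synthetic longitudes
lat = 15.5 + rng.normal(0, 0.3, n)  # synthetic latitudes
t = np.sort(rng.uniform(0, 30, n))  # day of the month per event

sc = plt.scatter(lon, lat, c=t, cmap="viridis")
plt.colorbar(sc, label="day")
plt.xlabel("longitude")
plt.ylabel("latitude")
plt.title("Events in space, colored by time")
plt.show()
```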

Interactive Interface Design

It is important in the visual analytics process that researchers focus on visual representations of data and interaction design in equal measure. “We need to develop a ‘science of interaction’ rooted in a deep understanding of the different forms of interaction and their respective benefits.”

For example, one promising approach for simplifying interactions is to use 3D graphical user interfaces. Another is to move beyond single modality (or human sense) interaction techniques.

Indeed, recent research suggests that “multi-modal interfaces can overcome problems that any one modality may have. For example, voice and deictic (e.g., pointing) gestures can complement each other and make it easier for the user to accomplish certain tasks.” In fact, studies suggest that “users prefer combined voice and gestural communication over either modality alone when attempting graphics manipulation.”
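As a minimal sketch of how such a combination might work mechanically, the code below fuses a voice event (which supplies the verb) with a pointing event (which supplies the object) when the two co-occur in time. The event types and the one-second window are my own illustrative assumptions, not from the study:

```python
from dataclasses import dataclass

@dataclass
class Voice:
    t: float      # seconds since session start
    command: str  # e.g. "move", "delete"

@dataclass
class Point:
    t: float
    target: str   # object under the pointing gesture

def fuse(voice, point, window=1.0):
    """Combine the two modalities if they co-occur within the window."""
    if abs(voice.t - point.t) <= window:
        return f"{voice.command} -> {point.target}"
    return None  # too far apart in time to be one utterance

print(fuse(Voice(t=10.2, command="move"), Point(t=10.6, target="marker_7")))
# prints: move -> marker_7
```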

Patrick Philippe Meier

Part 2: Data Flooding and Platform Scarcity

This is Part 2 of 7 of the highlights from “Illuminating the Path: The Research and Development Agenda for Visual Analytics.” Please see this post for an introduction to the study and access to the other 6 parts.

Data Flooding

Data flooding is a term I use to describe the fact that “our ability to collect data is increasing at a faster rate than our ability to analyze it.” I therefore completely agree with the recommendation that new methods are required to “allow the analyst to examine this massive, multi-dimensional, multi-source, time-varying information stream to make decisions in a time-critical manner.”

We don’t want less information but rather more, since “large data volumes allow analysts to discover more complete information about a situation.” To be sure, “scale brings opportunities as well.” For example, “analysts may be able to determine more easily when expected information is missing,” which sometimes “offers important clues […].”

However, while computer processing power and memory density have changed radically over the decades, “basic human skills and abilities do not change significantly over time.” Technological advances can certainly leverage our skills “but there are fundamental limits that we are asymptotically approaching,” hence the notion of information glut.

In other words, “human skills and abilities do not scale.” That said, the number of humans involved in analytical problem-solving does scale. Unfortunately, however, “most published techniques for supporting analysis are targeted for a single user at a time.” This means that new techniques that “gracefully scale from a single user to a collaborative (multi-user) environment” need to be developed.

Platform Scarcity

However, the technologies and platforms currently used in the humanitarian and human rights communities do not address the need to handle ever-changing volumes of information. “Furthermore, current tools provide very little in the way of support for the complex tasks of the analysis and discovery process.” There clearly is a platform scarcity.

Admittedly, “creating effective visual representations is a labor-intensive process that requires a solid understanding of the visualization pipeline, characteristics of the data to be displayed, and the tasks to be performed.”

However, as is clear from the crisis mapping projects I have consulted on, “most visualization software is written with incomplete knowledge of at least some of this information.” Indeed, it is rarely possible for “the analyst, who has the best understanding of the data and task, to construct new tools.”

The NVAC study thus recommends that “research is needed to create software that supports the most complex and time-consuming portions of the analytical process, so that analysts can respond to increasingly more complex questions.” To be sure, “we need real-time analytical monitoring that can alert first responders to unusual situations in advance.”
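As a hedged sketch of what such monitoring could look like at its very simplest, the code below flags observations that deviate sharply from a rolling baseline; the window length and threshold are illustrative assumptions, not recommendations from the study:

```python
from collections import deque
from statistics import mean, stdev

def monitor(stream, window=24, z_threshold=3.0):
    """Yield (index, value) when a count far exceeds the rolling baseline."""
    history = deque(maxlen=window)
    for i, value in enumerate(stream):
        if len(history) >= 2:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and (value - mu) / sigma > z_threshold:
                yield i, value
        history.append(value)

# Hourly report counts with a sudden spike at the end
counts = [4, 5, 3, 6, 4, 5, 4, 3, 5, 4, 6, 5, 4, 5, 3, 4, 40]
for i, v in monitor(counts):
    print(f"alert: hour {i} saw {v} reports")
```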

Patrick Philippe Meier

Part 1: Visual Analytics

This is Part 1 of 7 of the highlights from “Illuminating the Path: The Research and Development Agenda for Visual Analytics.” Please see this post for an introduction to the study and access to the other 6 parts.

NVAC defines Visual Analytics (VA) as “the science of analytical reasoning facilitated by interactive visual interfaces. People use VA tools and techniques to synthesize information and derive insights from massive, dynamic, ambiguous, and often conflicting data; detect the expected and discover the unexpected; provide timely, defensible, and understandable assessments; and communicate assessment effectively for action.”

The field of VA is necessarily multidisciplinary and combines “techniques from information visualization with techniques from computational transformation and analysis of data.” VA includes the following focus areas (the third is illustrated with a short sketch after the list):

  • Analytical reasoning techniques, “that enable users to obtain deep insights that directly support assessment, planning and decision-making”;
  • Visual representations and interaction techniques, “that take advantage of the human eye’s broad bandwidth pathway into the mind to allow users to see, explore, and understand large amounts of information at once”;
  • Data representation and transformations, “that convert all types of conflicting and dynamic data in ways that support visualization and analysis”;
  • Production, presentation and dissemination techniques, “to communicate information in the appropriate context to a variety of audiences.”
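As a toy illustration of the third focus area, the sketch below normalizes records with conflicting date and coordinate formats into one uniform shape that a visualization could consume; the field names and formats are invented for the example:

```python
from datetime import datetime

raw = [
    {"when": "2009-05-01", "where": "15.59,32.53", "what": "report A"},
    {"when": "01/05/2009", "where": "13.18,30.22", "what": "report B"},
]

def normalize(record):
    """Coerce mixed date and coordinate formats into a uniform record."""
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            when = datetime.strptime(record["when"], fmt)
            break
        except ValueError:
            continue
    else:
        raise ValueError(f"unrecognized date: {record['when']}")
    lat, lon = (float(x) for x in record["where"].split(","))
    return {"time": when, "lat": lat, "lon": lon, "text": record["what"]}

print([normalize(r) for r in raw])
```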

As is well known, “the human mind can understand complex information received through visual channels.” The goal of VA is thus to facilitate the analytical reasoning process “through the creation of software that maximizes human capacity to perceive, understand, and reason about complex and dynamic situations.”

In sum, “the goal is to facilitate high-quality human judgment with a limited investment of the analysts’ time.” This means, in part, to “expose all relevant data in a way that facilitates the reasoning process to enable action.” To be sure, solving a problem often means representing it so that the solution is more obvious (adapted from Herbert Simon). “Sometimes, the simple act of placing information on a timeline or a map can generate clarity and profound insight.” Indeed, both “temporal relationships and spatial patterns can be revealed through timelines and maps.”

VA also reduces the costs associated with sense-making in two primary ways (the second is sketched after the list), by:

  1. Transforming information into forms that allow humans to offload cognition onto easier perceptual processes;
  2. Allowing software agents to do some of the filtering, representation translation, interpretation, and even reasoning.
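A minimal sketch of the second point: a software “agent” pre-filters low-confidence records and re-represents the rest compactly before a human ever sees them. The record fields and the relevance rule are illustrative assumptions:

```python
records = [
    {"source": "field", "text": "road flooded near camp", "confidence": 0.9},
    {"source": "rumor", "text": "unverified shots heard", "confidence": 0.2},
    {"source": "field", "text": "clinic reopened", "confidence": 0.8},
]

def filter_agent(items, min_confidence=0.5):
    """Drop low-confidence records so the analyst reviews fewer items."""
    return [r for r in items if r["confidence"] >= min_confidence]

def summarize_agent(items):
    """Translate filtered records into a compact, scannable form."""
    return [f"[{r['source']}] {r['text']}" for r in items]

for line in summarize_agent(filter_agent(records)):
    print(line)
```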

That said, we should keep in mind that “human-designed visualizations are still much better than those created by our information visualization systems.” That is, there are more “highly evolved and widely used metaphors created by human information designers” than there are “successful new computer-mediated visual representations.”

Patrick Philippe Meier

Research Agenda for Visual Analytics

I just finished reading “Illuminating the Path: The Research and Development Agenda for Visual Analytics.” The National Visualization and Analytics Center (NVAC) published the 200-page book in 2004, and the volume is absolutely one of the best treatises I’ve come across on the topic yet. The purpose of the series of posts that follows is to share some highlights and excerpts relevant for crisis mapping.

[Image: cover of “Illuminating the Path”]

Co-edited by James Thomas and Kristin Cook, the book focuses specifically on homeland security, but there are numerous insights to be gained on how “visual analytics” can also illuminate the path for crisis mapping analytics. Recall that the field of conflict early warning originated in part from World War II and the lack of warning before the attack on Pearl Harbor.

Several coordinated systems for the early detection of a Soviet bomber attack on North America were set up in the early days of the Cold War. The Distant Early Warning Line, or DEW Line, was the most sophisticated of these. The point to keep in mind is that the national security establishment is often in the lead on initiatives that can also be applied for humanitarian purposes.

The motivation behind the launch of NVAC and this study was 9/11. In my opinion, this volume goes a long way toward validating the field of crisis mapping. I highly recommend it to colleagues in both the humanitarian and human rights communities. In fact, the book is directly relevant to my current consulting work with the UN’s Threat and Risk Mapping Analysis (TRMA) project in the Sudan.

So this week, iRevolution will be dedicated to sharing daily highlights from the NVAC study. Taken together, these posts will provide a good summary of the rich and in-depth 200-page study. Check back on this post for live links to the NVAC highlights:

Part 1: Visual Analytics

Part 2: Data Flooding and Platform Scarcity

Part 3: Data Tetris and Information Synthesis

Part 4: Automated Analysis and Uncertainty Visualized

Part 5: Data Visualization and Interactive Interface Design

Part 6: Mobile Technologies and Collaborative Analytics

Part 7: Towards a Taxonomy of Visual Analytics

Note that the sequence above does not correspond to specific individual chapters in the NVAC study; this structure is simply what made the most sense for the summary.

Patrick Philippe Meier