Display data with integrity.

[Image: "NJ color JPEG 6.26.jpg" — the New Jersey data-map discussed below]

In this digital century, education is rife with data. Student test scores, enrollment statistics, “data-driven” instruction, percentages of “highly qualified” teachers: data is virtually everywhere, especially in public schools. Unfortunately, just because data is common does not mean that it is routinely well handled. Sometimes data is suspiciously derived, sometimes it does not reflect what it purports to measure, and often it is poorly presented.

When data is presented for use in understanding, reasoning, and decision-making, the design of the display should reflect these serious purposes. Whether you’re making your own data display or critically engaging someone else’s, the principles of analytical design laid out by Edward Tufte (sometimes called “the da Vinci of data”) define a gold standard.

Below I use Tufte’s principles as a framework to discuss one data display from my dissertation about how students describe outstanding teaching. The data-map shown above presents demographic information about the data I used for the study (i.e., letters of recommendation written by students describing teachers who eventually won Princeton University’s Distinguished Teacher Award). The challenge was to show as much information as possible about the context of these winning high school teachers. When I recognized that making it look good would be far beyond my technical powers, I quickly sought out the assistance of graphic designer Roy Chambers.

Tufte’s Six Principles of Analytical Design

1. Show comparisons, contrasts, differences. A good display provides adequate information so that readers can reason for themselves about what they’re looking at, and raise their own questions. Studying the data-map of New Jersey, readers are able to ask (and answer) questions like: 

  • Are winners spread evenly throughout the state?

  • Which subject areas do winners teach? How wide is the range? 

  • Where in the state are wealthy districts clustered?

  • Among winners, what is the rough ratio of wealthy to moderate and lower-resourced districts?

  • What is the rough ratio of winners from public schools, as compared to independent and religious schools?

2. Show causality, mechanism, explanation, systematic structure. I wanted this data-map to provide a picture of the teachers who won the DTA during the 25 years of data I analyzed. What Roy Chambers produced is a unified visualization that reveals patterns and prompts new questions. For instance, the map shows at a glance that winners are distributed fairly evenly across counties. That winners skew toward wealthy districts is a little harder to spot, but it also shows up.

3. Show more than 1 or 2 variables (i.e., multivariate data) in a single display. How many times have you seen pie chart after pie chart after histogram after line graph in some wretched PowerPoint? If you can’t hold it in your head, it’s less likely the fault of the data than of the design. This map literally offers a “30,000-ft. perspective,” so it requires a little orientation at the outset. But once you have worked through the legend, you realize that the map encodes a lot of information for each of the 101 winners: the year of winning, subject taught, SES and type of school, category of municipal population, and county and location within county. That is well over 700 data points represented on one 8.5" x 11" sheet of paper!
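The “well over 700 data points” figure follows from simple arithmetic. A minimal sketch; the seven-variable list below is my reading of the legend as described in the paragraph above, not an official tally from the map itself:

```python
# Back-of-the-envelope count of the data-map's information density.
# Variable names are paraphrased from the article's description of the legend.
winners = 101
variables_per_winner = [
    "year of winning",
    "subject taught",
    "SES of district",
    "type of school",
    "category of municipal population",
    "county",
    "location within county",
]

total_points = winners * len(variables_per_winner)
print(total_points)  # prints 707 — "well over 700" on a single page
```

Seven encodings per winner is roughly what a map can carry at once: two of them (county, location) come free from position, and the rest ride on color, shape, and labels.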

4. Completely integrate words, numbers, images, diagrams. Maps fairly easily accommodate words, numbers, grids, and scales, but this principle may be applied to any display where diversity of evidence will do a fuller job of explanation. Tufte maintains that the key question is “How can something be explained?” Very often, a single mode of data (e.g., statistical data or anecdotes or news clippings) is not sufficient. Use whatever reliable data may be helpful, in whatever combination, to explain a phenomenon as clearly as possible. For example, in school-based research on academic performance, you might combine data from student perception surveys and/or focus groups with report card grades.

5. Thoroughly describe the evidence. Provide a detailed title, indicate the authors and sponsors, document the data sources, show complete measurement scales, point out relevant issues. The title of the data-map lets readers know that they’re looking at winner demographics for the DTA for the years 1989 to 2013. It clarifies what DTA stands for and confirms that beneath all the labels is the State of New Jersey. Roy Chambers is identified as the designer, with a website for further information. Credit is given to the cartographer J. A. Anderson, who made the 1869 map that Chambers updated to reflect contemporary county boundaries. The legend partially explains measurement scales for SES and the category of municipal population, but it does not provide the sources for this information. (Because this was not designed as a stand-alone document, these measurements are explained in the body of the dissertation which the map was designed to accompany.) 

6. Analytical presentations ultimately stand or fall depending on the quality, relevance, and integrity of their content. As Tufte’s subtitle has it, “Content counts most of all.” Ask “What are the content-reasoning tasks that this display is supposed to help with?” Is this the best way to show what the reader needs to know in order to understand, evaluate, and/or decide an issue?
 

Sources

Roy A. Chambers, "DTA (Princeton University Distinguished Secondary School Teaching Award) Winner Demographics: 1989-2013, New Jersey" (2014). Based on an 1869 map by J. A. Anderson.

Edward R. Tufte, “The Fundamental Principles of Analytical Design” in Beautiful Evidence (Graphics Press, 2006), pp. 122-139.  

 

[Copyright 2015 by Peter Horn, Ed.D. All rights reserved.]
