I think I’m getting jaded. I am beginning to wonder whether lobbying for balanced reporting of evaluation and research findings is a waste of time. With voices more influential than mine weighing in on the opposite side, I’m having trouble staying positive. Granted, I do find inspiration in the work of people much wiser than me who have confronted this issue. One such source is my favorite sociologist, Stanislav Andreski, who wrote the following in his book, Social Sciences as Sorcery:
In matters where uncertainty prevails and information is accepted mostly on trust, one is justified in trying to rouse the reading public to a more critical watchfulness by showing that in the study of human affairs evasion and deception are as a rule much more profitable than telling the truth.1
The problem is, wisdom like Andreski’s languishes on dusty library shelves and the dust-free shelves of the Open Library. Much more (dare I call it?) airtime goes to large and prestigious institutions that are comfortable spinning research results to suit their purposes.
Fortunately, I am not so demoralized as to pass up the opportunity to share yet another institution-stretching-the-truth-about-research-data story with you. This involves an evaluation project funded by the Robert Wood Johnson Foundation and conducted by Mathematica Policy Research and the John W. Gardner Center for Youth and Their Communities at Stanford University. [Read more]
It never hurts to revisit the basics of a method that we’ve chosen to apply to a task we want to accomplish or a problem that needs solving. So, the recent announcement of the Library Edge benchmarks is a good occasion to discuss that particular performance assessment method. In the third edition of his book, Municipal Benchmarks, University of North Carolina professor David Ammons describes three types of benchmarking:1
1. Comparison of performance statistics
2. Visioning initiatives
3. “Best practices” benchmarking
The idea behind item #1 is that the sufficiency of an organization’s performance can be judged by comparing its performance data with that of other organizations or against externally defined standards. Comparing different organizations using only performance data, without any reference to standards, is called comparative performance measurement. An Urban Institute handbook of the same name by Elaine Morley, Scott Bryant, and Harry Hatry gives an in-depth explanation of this method.2 [Read more...]
1 Ammons, D. N. (2012). Municipal benchmarks: Assessing local performance and establishing community standards, Armonk, NY: M.E. Sharpe, p. 15.
2 Morley, E., Bryant, S. P., & Hatry, H. P. (2001). Comparative performance measurement, Washington, DC: The Urban Institute Press.
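To make the distinction concrete, here is a minimal sketch of comparative performance measurement. The library names and figures are invented for illustration; only the method itself, comparing organizations on a common per-capita measure with no external standard involved, comes from the discussion above.

```python
# Comparative performance measurement: ranking organizations on a shared
# measure. All names and figures below are hypothetical.
libraries = {
    "Library A": {"visits": 1_200_000, "population": 250_000},
    "Library B": {"visits": 800_000, "population": 120_000},
    "Library C": {"visits": 2_500_000, "population": 900_000},
}

# Normalize raw counts to visits per capita so communities of
# different sizes can be compared directly.
visits_per_capita = {
    name: d["visits"] / d["population"] for name, d in libraries.items()
}

# Rank from highest to lowest. No external standard appears anywhere,
# which is what distinguishes this from standards-based benchmarking.
ranking = sorted(visits_per_capita, key=visits_per_capita.get, reverse=True)
print(ranking)
```

Note that the smallest library ranks first here: per-capita normalization rewards intensity of use, not raw volume.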
To begin this episode I want to introduce you to a couple of historical ideas on best practices in graphical data presentation—or using the more modern term, data visualization. (The peculiar title I’ve chosen comes from this history. Read on to see what it means.) Then I’ll step through a redesign of a bar chart to show you how effective graphical simplicity can be.
In the 1980s and 1990s, statistical graphing experts Edward Tufte, William Cleveland, and Howard Wainer promoted fair and clear designs for statistical charts.1 Nearly seventy years earlier American engineer Willard C. Brinton was doing the same thing in his 1914 book, Graphic Methods for Presenting Facts. Here’s a figure from the book:
Source: Brinton (1914), Graphic Methods for Presenting Facts, p. 21
Note that in his figure Brinton advocated for “accuracy of statement.” He did the same in this next… [Read more]
1 A more recent and quite definitive book on the principles of best data visualization practice is the second edition of Stephen Few’s book, Show Me the Numbers.
Nowadays libraries aspire to be data-driven. Almost everyone agrees that collecting and using data to improve organizational performance is a good thing. Implied in the various regimens promoting this idea (library assessment, managing-for-results, evidence-based practice, quality management, etc.) is the need for practicing two virtues: patience and determination. These virtues happen also to be part and parcel of information literacy. We repeatedly advise users to avoid settling for the most convenient and quickly accessible information that shows up, and urge them to put ample time and energy into thinking critically about their question, its context, and the complete range of potentially relevant information.
Being data-driven requires this same discipline. I mention discipline because this blog entry concerns a challenging but quite educational quantitative topic. If you follow this to its conclusion I guarantee your numeracy muscles will be invigoratingly exercised!
I begin with this caveat: When advocacy or public relations professionals fail to think critically about their data, they risk communicating the wrong message to their audiences. Last month I ran across an interesting case of this in a radio public service announcement by the Ohio Highway Patrol . . . [Read more]
The graphic below is a variant of one I blogged about in my prior entry. Its designers added a storm to create what might be called an inclement tug-of-war. This version of the graphic is from the Libraries Connect Communities: Public Library Funding & Technology Access Study 2011-2012 (PLFTAS) press kit. Both versions communicate this same basic message: The Great Recession led to severe and cumulative cuts in library funding and, at the same time, to an unrelenting surge in demand for traditional and technological services. The supporting data are the same in both versions. As I explained last time, since the data are not measures of actual library funding or usage, they don’t really confirm the graphic’s claims.
Source: ALA, Libraries Connect Communities, 2012.
Actually, there are funding and usage data that can shed light on these claims, namely the Institute of Museum and Library Services (IMLS) survey data collected annually from 9,000+ U.S. public libraries.1 I decided to examine data for 2005 through 2010 from this dataset.2 These cover the official span of the Great Recession (December 2007 to June 2009) with some leeway to identify trends that were ongoing prior to the recession.
First, I’ll summarize what the data show and then follow with details. For U.S. public libraries as a whole, funding cuts during the Great Recession were not nearly the calamity that library advocacy stories described, nor were the cuts cumulative over multiple years. Overall funding grew moderately through the recession and then began falling in 2010 . . . [Read more]
1 Each survey year has different counts of libraries reporting specific measures.
2 The 2010 public library survey data, issued last June, is the most recent dataset available from IMLS.
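A trend check like the one described can be sketched in a few lines. The dollar figures below are placeholders that merely mirror the pattern reported (moderate growth through the recession, then a decline in 2010); they are not actual IMLS totals.

```python
# Year-over-year percentage change in aggregate funding.
# Values are hypothetical placeholders, NOT real IMLS survey totals.
total_revenue = {  # survey year -> aggregate operating revenue ($ billions)
    2005: 9.7, 2006: 10.2, 2007: 10.7, 2008: 11.1, 2009: 11.4, 2010: 11.2,
}

years = sorted(total_revenue)
pct_change = {
    year: 100 * (total_revenue[year] / total_revenue[year - 1] - 1)
    for year in years[1:]
}

for year, change in pct_change.items():
    print(f"{year}: {change:+.1f}%")
```

The point of computing year-over-year changes, rather than eyeballing a single cumulative figure, is that it shows whether cuts were actually sustained across multiple years or confined to one.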
I admit it. I’ve been suffering from a case of statistician’s block. No inspiring ideas for this blog have presented themselves since July. Well, actually, a couple did surface but I resisted them. Very recently, though, the irresistible “infographic” shown here came to my attention. I am therefore pleased to return to my keyboard to discuss this captivating image with you!
Source: ALA, Libraries Connect Communities, 2012.
The infographic appears in the executive summary of the American Library Association’s (ALA) report, Libraries Connect Communities: Public Library Funding & Technology Access Study 2011-2012, published in June. The graphic’s basic message is an ongoing struggle between two sides. On the left the blue silhouetted figures represent public demand for technology services at libraries, with four percentages quantifying levels of use. The lone silhouette on the right side personifies library funding (is he a municipal budget official?), with a single percentage quantifying that. Apparently, the quantities on the left are, using the tug-of-war metaphor, overpowering the right side.
Let’s look a bit closer at the quantitative evidence in this infographic . . . [Read more]
A while back, in his 21st Century Library Blog, Steve Matthews commented on some data appearing in a report entitled The Library in the City, published by the Pew Charitable Trusts Philadelphia Research Initiative. Dr. Matthews was puzzled by an inconsistency between statistical trends highlighted in the report and standard per capita circulation, visits, and Internet computer measures. He noted, for example, that among the libraries studied, Columbus Metropolitan Library had the greatest cumulative decline in visits (-17%) over the seven-year study period. Yet in 2011 Columbus ranked 2nd in the group on visits per capita. The opposite was true for the Enoch Pratt Library in Baltimore. Although the library showed the second highest cumulative increase in visits (at 25%), its 2011 per capita visit rate was the lowest in the group. Curious patterns, indeed.
There are a couple of statistical dynamics at play here . . . [Read more]
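One of those dynamics is that cumulative percentage change and per-capita level answer different questions: a library starting from a very high per-capita base can lose ground for years and still outrank one that grew steadily from a low base. Here is a sketch with invented populations and visit counts, chosen only to reproduce the -17% / +25% pattern, not drawn from the Pew report.

```python
# Hypothetical figures showing how a declining library can still rank
# high per capita. None of these numbers come from the Pew report.
pop_a, pop_b = 800_000, 640_000      # service-area populations
visits_a_start = 8_000_000           # library A: high starting base
visits_b_start = 2_000_000           # library B: low starting base

visits_a_end = visits_a_start * (1 - 0.17)  # 17% cumulative decline
visits_b_end = visits_b_start * (1 + 0.25)  # 25% cumulative growth

per_cap_a = visits_a_end / pop_a     # ~8.3 visits per capita
per_cap_b = visits_b_end / pop_b     # ~3.9 visits per capita

# Despite the opposite trends, library A still ranks above library B.
print(per_cap_a > per_cap_b)
```

Percentage change measures direction relative to a library's own past; per-capita figures measure level relative to the population served. Nothing forces the two to agree.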