The graphic below is a variant of one I blogged about in my prior entry. Its designers added a storm to create what might be called an inclement tug-of-war. This version of the graphic is from the Libraries Connect Communities: Public Library Funding & Technology Access Study 2011-2012 (PLFTAS) press kit. Both versions communicate the same basic message: the Great Recession led to severe and cumulative cuts in library funding and, at the same time, to an unrelenting surge in demand for traditional and technological services. The supporting data are the same in both versions. As I explained last time, since the data are not measures of actual library funding or usage, they don’t really confirm the graphic’s claims.
Source: ALA, Libraries Connect Communities, 2012.
Actually, there are funding and usage data that can shed light on these claims, namely the Institute for Museum and Library Services (IMLS) survey data collected annually from 9000+ U.S. public libraries.1 I decided to examine data for 2005 through 2010 from this dataset.2 These cover the official span of the Great Recession (December 2007 to June 2009) with some leeway to identify trends that were ongoing prior to the recession.
First, I’ll summarize what the data show and then follow with details. For U.S. public libraries as a whole, funding cuts during the Great Recession were not nearly the calamity that library advocacy stories described, nor were the cuts cumulative over multiple years. Overall funding grew moderately through the recession and then began falling in 2010. Visits was the star statistic of the group, growing at a faster rate during the recession than before, until it fell substantially in 2010. Public Internet computer usage grew in fits and starts, without a consistent trend. Growth in program attendance began slowing before the recession, held steady, and then dropped significantly in 2010.
Now a caveat: Due to the nature of summary statistics, the prior paragraph—and the PLFTAS graphic also—are almost completely irrelevant to individual public libraries. Put another way, during the economic downturn there was no nationwide weather system determining the fate of each and every public library. A minority of libraries ended up in the pathway of the severe weather forecast by library advocates and sustained significant funding cuts. Yet other libraries had marked funding increases while others maintained stable funding. And hundreds and hundreds of libraries were somewhere in-between. During the recession and its immediate aftermath some libraries reported appreciable increases in demand for services. Others reported substantial decreases. Some saw demand mostly unchanged. Still others saw both increases and decreases over time.
This variety (variation) is probably the most important part of this story. However, before exploring this let’s cover the summary statistics. The IMLS surveys do not collect in-library WiFi usage, website access, technology class enrollment, and the like. The one indicator of technology utilization they do collect is public Internet computer users. This indicator appears with three others in the charts below. Program attendance and visits are included to see if there are signs of the surge advocacy campaigns have described. And total operating expenditures, adjusted for inflation, is used as an indicator of funding.3
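To make the expenditure figures comparable across years, each year’s dollars are converted to 2010 dollars. Here is a minimal sketch of that adjustment, using made-up deflator factors rather than the actual ones from the IMLS report:

```python
# Hypothetical deflators for converting nominal dollars to 2010 dollars.
# These numbers are placeholders, NOT the actual IMLS adjustment factors.
deflators = {2005: 1.11, 2006: 1.08, 2007: 1.05, 2008: 1.02, 2009: 1.01, 2010: 1.00}

def to_2010_dollars(amount, year):
    """Convert a nominal dollar amount to inflation-adjusted 2010 dollars."""
    return amount * deflators[year]

# Under these placeholder factors, $1,000,000 spent in 2005
# is roughly $1,110,000 in 2010 dollars.
adjusted = to_2010_dollars(1_000_000, 2005)
```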
The first chart here presents national totals for all U.S. libraries combined:
[Chart: national totals for visits, program attendance, public Internet computer users, and inflation-adjusted operating expenditures, 2005–2010]
The chart indicates that public library use and funding were on upswings prior to the recession. These upswings continued for inflation-adjusted funding, visits, and program attendance, with funding and visits falling in 2010. Growth in Public Internet computer usage was uneven but upward.
Since the PLFTAS graphic refers to annual growth in library utilization, the next chart depicts the data from the prior chart as annual rates of change:
[Chart: annual rates of change in the national totals]
Notice that growth in public Internet computer usage dipped to -0.4% in 2008, rose to 3%, and then fell again to 0.6%. (It’s possible this volatility is due to the newness of this measure, which IMLS began collecting in 2006.) Visits showed increasing growth through 2009, while program attendance had stable growth. By 2010, however, both measures had dropped significantly, with visits reaching negative growth. Either libraries could not sustain these levels, presumably due to shorter hours, branch closings, and other cutbacks, or patrons curtailed visits and attendance on their own. During the recession total inflation-adjusted public library expenditures increased annually. A decrease did not occur until 2010, when the rate of growth in expenditures dropped to nearly -3%.
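The rates in this chart are simple year-over-year percent changes in the national totals. A quick sketch of the calculation, with illustrative totals standing in for the actual IMLS figures:

```python
# Illustrative national visit totals (placeholders, not the actual IMLS data).
visits_total = {2005: 1.38e9, 2006: 1.42e9, 2007: 1.46e9,
                2008: 1.51e9, 2009: 1.57e9, 2010: 1.51e9}

def annual_pct_change(series):
    """Return {year: percent change from the prior year}."""
    years = sorted(series)
    return {y: 100 * (series[y] - series[prev]) / series[prev]
            for prev, y in zip(years, years[1:])}

changes = annual_pct_change(visits_total)
# changes[2010] comes out negative here, echoing the 2010 drop in visits
```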
Besides looking at national totals we can also consider typical values for the libraries surveyed. A commonly used typical data value is the statistical median, which identifies the middle of the data by marking the 50th percentile. Median annual rates of change for U.S. public libraries appear in this next chart:
[Chart: median annual rates of change for U.S. public libraries]
Compared with the -0.44% national figure for 2008 in the previous chart, the median rates of change in public Internet computer users look more sensible. These rates increased in the first half of the recession but decreased after that. Program attendance had a somewhat lower rate early on, dipping to 2% by 2010. The rate of growth in visits peaked in 2009 and then sank to 1% by 2010. Growth in inflation-adjusted operating expenditures began diminishing in 2009, essentially reaching zero (-0.11%) in 2010.
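Why can the median rate differ from the national rate in the previous chart? The national totals are dominated by the largest libraries, while the median weighs every library equally. A toy illustration with four made-up libraries:

```python
from statistics import median

# Illustrative per-library visit counts for 2009 and 2010 (made-up numbers).
libs = [(50_000, 55_000), (200_000, 190_000),
        (10_000, 10_100), (1_000_000, 1_100_000)]

pct = [100 * (b - a) / a for a, b in libs]  # each library's rate of change

# Aggregate rate (what a national-totals chart shows): big libraries dominate.
agg = 100 * (sum(b for _, b in libs) - sum(a for a, _ in libs)) / sum(a for a, _ in libs)

# Median library rate (what a medians chart shows): every library counts equally.
med = median(pct)
```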
To recap, the growth in visits confirms the PLFTAS graphic’s claim of increased utilization of traditional library services during the Great Recession. But program attendance does not, since its growth rate was actually higher before the recession. Nor is there evidence of continuously increasing demand for technology services. Nor of severe and cumulative cuts to library funding.4 As to the main thesis of the graphic, the tug-of-war between funding and usage, I have some interesting data to show you further on.
In the meantime, let’s explore the variation described earlier by looking at the spreads (distributions) of these measures. The histograms below show the distributions of changes from 2009 to 2010 only:
[Histograms: distributions of 2009-to-2010 changes in the four measures]
The values above the bars indicate the number of libraries falling within each range. Each bar covers a 5%-wide range. As you can see, plenty of libraries showed increases and plenty showed decreases, with most libraries clustering towards the center of the distributions. In each chart, the highest bar happens to fall at the 0% to +5% range. And the spread is fairly even left to right, except that the ranges for libraries with increases extend wider than those for decreases. (The distributions are positively skewed.) Notice also that the top two histograms (measured in millions) in the set of four have less clumping towards the center than the bottom two (measured in billions). Perhaps you have some ideas about what causes this pattern.
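The binning behind histograms like these is straightforward: each library’s percent change is assigned to a 5%-wide interval. A sketch with made-up change values:

```python
import math

# Made-up percent changes for a handful of libraries (illustration only).
changes = [-12.0, -4.0, -1.0, 0.5, 2.0, 3.0, 7.0, 18.0]

def bin_counts(values, width=5):
    """Return {lower_bound: count} for bins covering [lower, lower + width)."""
    counts = {}
    for v in values:
        lower = math.floor(v / width) * width
        counts[lower] = counts.get(lower, 0) + 1
    return counts

counts = bin_counts(changes)
# With these values, the 0-to-+5 bin holds the most libraries,
# echoing the shape of the actual distributions.
```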
I’m not going to discuss the histograms much except to point out how they stretch far leftward and even further rightward, indicating the minority of libraries with large single-year funding decreases or increases. The same goes for the output measures. Still, the majority of libraries had changes in visits and operating expenditures ranging between -10% and +10%. For public Internet computer usage this majority range is -15% to +15%. And for program attendance it is -20% to +20%.
The next chart shows the same histograms, with the proportion of libraries on either side of zero marked, as well as the medians.
[Histograms: the same distributions with medians and the proportions above and below zero marked]
Finally, to the tug-of-war issue. This is the claim that libraries have had to confront funding cuts and increased demands for services at the same time. The scatter plots below show the relationships between changes in funding and usage from 2009 to 2010. Very quickly, there is hardly any relationship between changes in funding and changes in visits or public Internet computer usage. If it were true that individual libraries that endured funding cuts simultaneously delivered more services to the public, then there would be more libraries clustered in the top left quadrant of these two charts, and fewer in the bottom left. In the (olive) scatter plot for public Internet computer usage about 23% of libraries appear in the upper left quadrant and 22% in the lower left. In the (gray) visits scatter plot about 23% lie in the upper left quadrant and 21% in the lower.
[Scatter plots: changes in operating expenditures versus changes in visits and in public Internet computer users, 2009–2010]
Notice that the trend lines in each plot slant slightly upward towards the right, meaning there is a very slight positive relationship between expenditures and utilization. (I say very slight because the lines are not that far from horizontal.) The strength of this relationship is estimated by a statistical quantity, R-squared, which indicates the extent to which changes in either of these two library utilization measures can be predicted from changes in operating expenditures. Here an R-squared of 0.004 translates to 0.4% predictability and 0.016 to 1.6% predictability; 100% would be perfect predictability. Bottom line: knowing how a library’s expenditures changed in 2010 doesn’t help in predicting how much its utilization measures changed. This low predictability is even worse than your average weather forecast! But I am exaggerating, as we shouldn’t trust this straight-line-oriented statistic too much. It might not be the best one to use here.5
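For the curious, the R-squared of a simple trend line is just the squared correlation between per-library changes in expenditures and changes in a usage measure. A self-contained sketch with illustrative points (not the actual scatter-plot data):

```python
# Illustrative per-library percent changes (placeholders, not the real data):
# x = change in operating expenditures, y = change in visits.
x = [-8.0, -3.0, 0.0, 2.0, 5.0, 9.0]
y = [4.0, -6.0, 1.0, -2.0, 3.0, 2.0]

def r_squared(x, y):
    """Squared Pearson correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy ** 2 / (sxx * syy)

# A value near 0 means changes in funding barely predict changes in usage.
r2 = r_squared(x, y)
```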
Anyway, there were indeed libraries that experienced funding cuts in 2010 and still provided more services that year. Good for them! But there was an almost equal number of libraries with funding cuts that provided fewer services. Plus, of all libraries that provided more services, more had added wherewithal to do so (increased funding) than did not. And there were also libraries with funding increases showing decreases in services.
Due to the variation we see here it is possible to make a variety of advocacy claims about the plight of libraries. And, of course, we want to do anything we can to help libraries that are in dire circumstances. But as information professionals we need to think twice about repeatedly presenting one-sided information to the public. Loyal library supporters who learn that the perfect storm we reported was more along the lines of scattered thunderstorms are likely to lose confidence in our assessments.
1 Each survey year has different counts of libraries reporting specific measures.
2 The 2010 public library survey data, issued last June, is the most recent dataset available from IMLS.
3 Adjustment for inflation was standardized on 2010 dollars using the method described in Institute for Museum and Library Services, Public Libraries Survey Fiscal Year 2009, p. 34. Adjustment factors are shown here: [table of adjustment factors]
4 Do not confuse the idea of “no evidence of so-and-so” with “evidence that so-and-so is untrue or did not occur.” Not finding evidence confirming a hypothesis is not the same as finding evidence disproving a hypothesis. Incidentally, this misconception showed up in the 2012 U.S. presidential campaign. (A misconception in a political campaign? Really?) The Democrats claimed that a lack of evidence confirming that voter fraud occurred meant that voter fraud did not occur. An unsound idea. An essay I wrote last year discussed this tricky issue in the context of statistical hypothesis testing.
5 This statistical tool (linear regression) could be understating the relationship between expenditures and the output measures. It might be that a curvilinear statistical model would work better here. But that’s kind of overkill for this discussion.