Sugar High

It’s very simple. Government agencies that issue distorted information in a time of crisis lose credibility and end up appearing incompetent. Even federal bureaucracies eventually learn this. Take the Gulf of Mexico oil disaster, for example. Last week CBS News reported that the U.S. Department of the Interior misrepresented findings from independent scientific assessments of the leakage. The Department’s press release said scientists estimated that the rate of leakage fell within a range of 12,000 to 19,000 barrels of oil per day. Yet scientists had submitted this range as an estimate only of the best-case scenario (the lower bound of their overall estimate). They had yet to come up with worst-case estimates (the upper bound), a fact the original press release omitted. This led CBS News to accuse the government of “sugar coating” the scientific findings. On June 3 the Department corrected the press release to accurately reflect the scientists’ report. (And I saw the press release was revised again on June 7.)

In the library world, however, this lesson has yet to be fully appreciated. For library marketeers sugar glaze is an essential part of the communication recipe. Consider a recent compendium of library advocacy “statistics” issued by OCLC in a two-page flier, How Libraries Stack Up: 2010.

Viewing the flier, one’s eyes are immediately drawn to a silhouetted map of the USA overwritten with these words:

Every day 300,000 Americans get job-seeking help at their public library.

The source for the 300,000 figure is “primary research” OCLC conducted involving 719 self-selected respondents to a questionnaire disseminated via “a post on various e-mail lists.”1 A footnote in the flier acknowledges that this and other data from the research are only estimates.

But how accurate are these estimates? I suspect the 300,000 figure is not a particularly solid one for two reasons: First, obtaining reliable counts of job-seeking services requires more effort than most libraries can apply to the task. Like reference question tallies, these services are not auditable transactions. Usually, these estimates are based on impressions of library staff. (Imagine if circulation statistics were reported based on staff opinions about how busy the circulation desk has been.) Second, the figure is also quite likely to be inaccurate (as an estimate of the estimates libraries make) because libraries responding to the survey were not a representative sample of public libraries nationally.2
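
To see how the second problem can distort an estimate, here is a small, purely illustrative Python simulation. Every number in it is invented (none comes from the OCLC study or the ALA brief); the point is only to show how a self-selected sample that skews toward busier libraries inflates an average, which then gets scaled up into a national figure.

```python
# Purely illustrative simulation of self-selection bias; every number here is
# invented and is NOT taken from the OCLC study or the ALA issue brief.
import random

random.seed(42)

N_LIBRARIES = 9_000  # assumed round number standing in for U.S. public libraries
# population[i] = the "true" average number of job seekers helped daily at library i
population = [random.gammavariate(2, 15) for _ in range(N_LIBRARIES)]
true_mean = sum(population) / N_LIBRARIES

# Convenience sample: suppose busier libraries are more likely to notice the
# e-mail post and respond, so the response probability grows with busyness.
busiest = max(population)
convenience = [x for x in population if random.random() < x / busiest]

# Probability sample of 719 libraries, the sample size reported in the flier,
# drawn at random for comparison.
probability = random.sample(population, 719)

def mean(values):
    return sum(values) / len(values)

print(f"true population mean:        {true_mean:6.1f}")
print(f"convenience-sample estimate: {mean(convenience):6.1f}  (n={len(convenience)})")
print(f"random-sample estimate:      {mean(probability):6.1f}  (n=719)")
# Multiplying either estimate by the number of libraries to get a national
# "Americans helped every day" figure also multiplies whatever bias the
# sample carries.
```

In runs like this, the convenience-sample estimate lands well above both the true mean and the random-sample estimate, which is exactly the worry with extrapolating from 719 self-selected respondents.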

The OCLC flier doesn’t really say what kind of job-seeking help Americans receive from libraries every day. Maybe they get self-service access to word processing, email, career websites, resume guides, and so forth. Or maybe they get comprehensive services like career counseling, guidance in resume writing, job interview coaching, and the like. As it is, we don’t have detailed nationwide data telling us what mix of services these patrons receive, nor how often they receive them. Based on Figure A4 in the ALA research brief the flier cites, we might speculate that patrons receive much more of the former (self-service access) than the latter (comprehensive services). Yet the phrase “get job-seeking help” implies patrons receive personalized, hands-on assistance from librarians. If most of these daily services are not personalized, then a more honest phrase should be used. “Job-seeking self-help” wouldn’t work. How about “access to job-seeking resources”?

On the next page there is an image of the help wanted section of a newspaper with these captions:

U.S. public libraries offering career assistance:  13,000
U.S. Department of Labor One-Stop Career Centers:  3,000

The suggestion is that, due to their prevalence, libraries provide 4.33 times as much job-seeking help as one-stop career centers do. There are two problems with this comparison. The first is a data measurement problem. Due to their prevalence, we might suppose that neighborhood convenience stores sell 4 times as many gallons of milk as supermarkets do. But we need actual milk sales figures from both types of stores to see whether this is true.
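
To make the milk analogy concrete, here is a toy calculation, again with invented numbers, showing why a ratio of outlet counts says nothing about the ratio of services delivered until per-outlet volumes are known.

```python
# Toy numbers (invented, not from any dataset) showing why counting outlets
# is not the same as counting the service they deliver.
convenience_stores = 120_000   # hypothetical count of outlets
supermarkets       =  30_000   # hypothetical count of outlets

milk_per_convenience_store = 40     # hypothetical gallons sold per day per store
milk_per_supermarket       = 1_500  # hypothetical gallons sold per day per store

outlet_ratio = convenience_stores / supermarkets
volume_ratio = (convenience_stores * milk_per_convenience_store) / (
    supermarkets * milk_per_supermarket
)

print(f"outlet ratio (stores per supermarket): {outlet_ratio:.1f}")  # 4.0
print(f"volume ratio (milk actually sold):     {volume_ratio:.2f}")  # about 0.11
# Four times as many outlets, yet roughly a tenth of the milk. Likewise, the
# 13,000 vs. 3,000 comparison needs per-site counts of job seekers served
# before it says anything about how much help each institution provides.
```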

The second problem is with the logic of the comparison, which depends upon a vague definition of the services being provided. Services from one-stop career centers are, generally speaking, more intensive than those delivered at a library. An analogy for this argument comes from the health care sector: While there may be more urgent care centers than hospitals, they do not collectively provide anywhere near the “amount” of care that hospitals provide, when scope and sophistication of care are considered.

OCLC flier graphic.  Source: How Libraries Stack Up: 2010.

A similar specious comparison appears in another diagram on the same page of the OCLC flier: “U.S. public libraries circulate as many materials every day as FedEx ships packages worldwide.” The truck graphic shown above amplifies the idea that these two accomplishments are equivalent. But they are hardly alike. Sure, in both settings about 8 million items move from point A to point B daily. But FedEx items are routinely transported across and between continents, whereas public library items move within a radius of a few miles of the library (ignoring interlibrary loans). Except for bookmobile and mail-delivered circulation, library patrons transport all of these items to and fro.

On the other hand, FedEx transports all of its parcels from their originating locales to final destination addresses. So the truck graphic is an apt symbol for FedEx’s efforts.  But putting the 7.9 million library circulation figure on the truck is a distortion of the facts, if the purpose is to compare the amount of work each institution performs.

Other comparisons in the flier are fairer and more sensible. The comparison of library card holders with credit card holders is a reasonable one. So is contrasting library visits with other sources of entertainment, like movies and sporting events (although it would be interesting to include arts and culture attendance also).

Library DVD circulation versus commercial video rentals, shown in the pie chart on page 2, is a fair comparison. Unfortunately, the chart has the same problem as the Business Week graph in my earlier post. The Redbox and Netflix rental data are reasonably reliable, but because OCLC’s survey sampling method was inadequate, the DVD circulation data are not. We cannot draw valid conclusions from the pie chart when part of the data are not credible. The same goes for the flier’s comparison of library meeting room usage with commercial conference centers. That comparison is also weak because libraries typically serve a different clientele than conference centers do.

Again, the message is simple: The more accurate and sound advocacy information is, the more valuable it will be to libraries—unsweetened.

 
—————————

1  Another source for the figure is the ALA Research and Statistics issue brief, A Perfect Storm Brewing, although it is unclear how this brief contributed to the figure.
2  The sampling method used was convenience sampling, a type of nonprobability sampling that paints an incomplete and insufficient picture of the larger population of interest.
