Readers of the, say, older persuasion may recall a time when children actually enjoyed games that required no peripheral devices, infrared sensors, or satellite tracking. There was one party game, simply called (I think) “Telephone,” where one player whispered a message to the next, and that player to the next, until the message was passed all the way around the circle of players. The fun came when everyone heard the amusing distortions that ended up in the final message.
In library advocacy research, though, message distortion is not amusing. I noticed a serious instance of it in a recent IMLS Research Brief, which cites an American Library Association (ALA) report finding that patron use of library computers for job-seeking purposes has “greatly increased.” The ALA report is Job-Seeking in U.S. Public Libraries, and the statement the IMLS brief cites is this:
As part of site visits to public libraries in nine states [conducted in three annual studies], the study research team has found greatly increased use of library technology for job-seeking and e-government.1
The ALA report is one of a series of “issue briefs” published by the Office for Research and Statistics that summarize and supplement key results from the multi-year Public Library Funding and Technology Access study. To date this project has issued three annual reports, beginning with the 2006/2007 edition. The project is a collaborative effort connected with the Public Libraries and the Internet longitudinal studies, which began in 1994.
Anyway, I wondered how big this increase actually was and what the level of job-seeking computer use had been before the big increase happened. So I went searching for the numbers in the ALA Public Library Funding annual studies. Turns out none of the studies measured frequency of job-seeking or e-government computer use by patrons. Nor did the studies compare frequencies of any reported computer uses from year to year. The 2006/2007 and 2007/2008 editions merely state that job-seeking and e-government were common uses reported by some patrons, without mentioning increases of any sort. The 2008/2009 edition reports increased patron job-seeking computer use, but does not describe this increase as substantial or “great.”
The quotation from the issue brief (above) says that the researchers detected this increase by means of interviews with staff and patrons during library site visits. These interviews, conducted in a few selected U.S. states each year, included a simple open-ended question to patrons: “What do you use [the library’s computers] for?” Job-seeking and e-government made the lists of most frequent responses (and each state’s list apparently differed from the others). But no frequency counts for these uses show up in the three studies, perhaps because the counts weren’t collected.2
Even if the researchers did tally these uses during the interviews, neither the interviewees nor the states where interviews were conducted were randomly selected. So we can’t say the tallies represent the larger patron population nationally. The 2007/2008 study reports a convenience sample3 of about 200 patrons who were using library computers at the time.4 The study then reports that “Interviews with users confirmed staff observations that much computer use in libraries is job-related…”5 Exactly how much, they don’t say. And we can’t tell how often unrepresented patrons, say teenagers who had yet to show up after school, might use computers for job-seeking.6
Selection bias of a different sort confounds year-to-year comparisons from the ALA studies. Because the annual interviews were conducted in different states where the job markets and online government services could differ significantly, there are no reliable baselines for comparing job-seeking and e-government computer usage between years. For instance, the 2006/2007 study included a site visit to a library in Nevada where staff reported long lines of patrons using library computers to apply for jobs at a newly opened gambling casino. Relying on such atypically high usage as a baseline for job-seeking computer use could mask actual increases in later study years in different states.
The question remains: what data are these “great increases” based on? None that I can find in the ALA studies. The issue brief does cite other figures that did increase over time, but these figures don’t describe patterns of computer use. The figures are from the survey portion of ALA’s studies, from questionnaire items that elicit library staff opinions. Staff were asked to identify the top five public Internet services that they believed to be “the most critical to the role of the library branch in its local community.” In the 2006/2007 study 44% of responding staff chose provision of job-seeking services as one of their top five priorities. In the 2007/2008 study this proportion was 62.2%, and in 2008/2009 it was 65.9%.
Basically, about 22 percentage points more of the staff votes (each staff respondent got up to five votes) went to patron job-seeking services in the 2008/2009 survey than in the 2006/2007 one. While these vote tallies may reflect changes in staff perceptions of computer use during this period, the votes are only opinions, and they don’t indicate how patrons actually used computers in libraries.
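The distinction here matters: a change of roughly 22 percentage points is not the same thing as a 22% increase. A quick back-of-the-envelope calculation, using only the survey percentages quoted above, shows the difference:

```python
# Survey percentages quoted above: share of responding staff who ranked
# job-seeking services among their top five priorities.
pct_2006, pct_2008 = 44.0, 65.9

point_change = pct_2008 - pct_2006         # absolute change, in percentage points
relative_change = point_change / pct_2006  # change relative to the 2006/2007 baseline

print(f"{point_change:.1f} percentage points")    # 21.9 percentage points
print(f"{relative_change:.0%} relative increase") # 50% relative increase
```

In other words, staff votes for job-seeking services rose by about half relative to the earlier survey, which is a substantial shift, but a shift in staff opinion, not a measurement of patron behavior.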
The truth is that we don’t have the usage data needed to support the assertion made in the ALA issue brief. Without valid baseline data, we can’t measure increases in patron job-seeking or e-government computer use at all, and we certainly can’t tell whether or not any increases have been great.
1 American Library Association, Oct. 2009, Job-Seeking in U.S. Public Libraries, p. 2; emphasis added.
2 Site visits, as well as the focus groups that the ALA studies held, belong to the category of qualitative research methods. As the ALA project illustrates, collecting essentially quantitative information using qualitative methods can lead to problems.
3 Convenience sampling is a type of “nonprobability sampling.” With nonprobability sampling, we have no statistical basis for claiming that our study findings describe the larger population that we had hoped our research would apply to. Using nonprobability sampling invites biased information into study results.
4 American Library Association, 2008, Libraries Connect Communities, p. 128.
5 American Library Association, 2008, p. 131; emphasis added.
6 The 2006/2007 ALA study cites a 2006 Baltimore, MD study where library computer use was found to depend on age group. See footnote on p. 169 of the ALA 2006/2007 study.