Making Incremental Improvements to Public Library Comparative Statistical Practices Ray Lyons Jason Holmes Library Assessment Conference Seattle, Washington.


1 Making Incremental Improvements to Public Library Comparative Statistical Practices Ray Lyons Jason Holmes Library Assessment Conference Seattle, Washington August 5, 2008

2 The Basic Problem
The ultimate goal of the library is public enlightenment. It is difficult to assess our impact on the community's enlightenment because we have no way to measure it.
We measure what we CAN measure.
We compare what we CAN compare.

3 Context of Comparative Statistics
Assessment: measures used as part of a more general process of assessing library value and effectiveness
Management practice: measures intended for an iterative, ongoing process of performance measurement

4 Public Library Assessment
The library profession traditionally applies a systems (industrial) model:
Inputs: resources supplied to the library
Outputs: products and services
(Diagram: Input → Throughput → Output)

5 Performance Measurement Model
(Diagram: Inputs → Outputs → Intermediate Outcomes → End Outcomes. Inputs and outputs represent EFFORTS; intermediate and end outcomes represent RESULTS, tracked by outcome measures.)

6 Performance Measurement Steps
1. Define long-term goals (desired outcomes)
2. Define medium- and short-term objectives
3. Develop programs aimed at the objectives
4. Specify measurement 'indicators'
5. Monitor indicators to track accomplishments

7 Rationale for Standardized Statistics
PLA Planning-for-Results approach to library management
Abandonment of established operational and performance standards
The ALA / PLA 1987 publication, Output Measures for Public Libraries: A Manual of Standardized Procedures, defines standard statistics and collection procedures

8 ALA / PLA Approach to Standardized Statistics
Useful for self-evaluation based on the service response choices a library makes
Should be interpreted with respect to library mission, goals, and objectives
Interpretation is left up to the library

9 Current Practices in Comparative Library Statistics
How are library statistics currently used for comparing public libraries?
What are the bases for these uses?
What purposes do they serve?

10 Survey of Ohio Public Libraries on Use of Comparative Statistical Measures
Exploratory study (available at: UsePerceptComparStats.pdf)
Responses via interview or online questionnaire
Stratified random sample of 90 Ohio public libraries; two strata: urban and rural counties
Response rate = 47% (42 libraries)
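The two-strata design described on this slide can be sketched in code. This is a hedched illustration only: the slide does not give the Ohio sampling-frame sizes or the per-stratum allocation, so the frames and the 45/45 split below are invented assumptions.

```python
# Sketch of a stratified random sample: draw separately from an
# urban-county stratum and a rural-county stratum, then combine.
# Frame sizes and the equal 45/45 allocation are hypothetical.
import random

urban = [f"urban_lib_{i}" for i in range(120)]   # invented sampling frame
rural = [f"rural_lib_{i}" for i in range(130)]   # invented sampling frame

random.seed(42)  # reproducible draw
sample = random.sample(urban, 45) + random.sample(rural, 45)  # n = 90

response_rate = 42 / 90  # 42 responding libraries, as reported
print(len(sample), f"{response_rate:.0%}")  # 90 47%
```

Sampling within each stratum separately guarantees both county types are represented, which a simple random sample of 90 would not.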

11 Survey Findings

12

13 Survey Findings: Frequency of Managerial Review of Input Measures

Input Measure                 Annually  Quarterly  Monthly  Weekly  Rarely  Not Sure
Operating expenditures        12%       9%         65%      15%     0%
Print Mat. Expenditures       12%       18%        62%      9%      0%
Electronic Mat. Expenditures  39%       21%        36%      3%      0%
Print Materials               56%       9%         29%      3%      0%
Print Subscriptions           74%       12%        15%      0%
Audio/Video Materials         58%       9%         30%      0%      3%      0%
Databases                     63%       14%        6%
Internet terminals            51%       11%        14%      9%      6%      9%
FTE Staff                     79%       6%         12%      0%      3%      0%

NOTE: Sky blue highlighting indicates measures that 50% or more library managerial teams review periodically. Light blue highlighting indicates higher frequencies that, combined, total 50% or more.

14 Survey Findings: Frequency of Managerial Review of Output Measures

Output Measure             Annually  Quarterly  Monthly  Weekly  Rarely  Not Sure
Circulation                9%        0%         83%      9%      0%
In-house Mat. Use          21%       12%        26%      3%      34%     0%
Interlibrary loan          23%       6%         66%      6%      0%
Visits                     21%       15%        54%      6%      3%      0%
Reference Transactions     32%       24%        34%      0%      9%      0%
Program attendance         32%       9%         51%      6%      0%
Electronic Mat. Use        19%       6%         60%      0%      9%      0%
Internet Terminal Use      15%       6%         66%      3%
Website Use                12%       9%         63%      0%      9%      3%

NOTE: Sky blue highlighting indicates measures that 50% or more library managerial teams review periodically. Light blue highlighting indicates higher frequencies that, combined, total 50% or more.

15 Survey Findings

16

17

18 Survey Findings: Statistical Measures Libraries Use in Comparisons with Other Libraries (Table Format)

Measure                              % of Libraries Using Measure
Material Expenditures                100.0%
Circulation                          96.8%
Operating Expenditures               90.3%
FTE Staff                            77.4%
Print Material Counts                48.4%
Audio/Video Material Counts          41.9%
Databases Available                  41.9%
Visits                               38.7%
Subscriptions                        35.5%
Interlibrary Loans                   35.5%
Electronic Materials Expenditures    32.3%
Librarians                           32.3%
Program attendance                   29.0%
Internet terminals                   25.8%
Reference Transactions               25.8%
Electronic Materials Usage           12.9%
Other (borrowers, salaries, etc.)    12.9%
In-house Material Usage              9.7%
Internet Terminal Usage              9.7%
Website Usage                        9.7%

19 Interpreting Library Measures There are no ‘right’ or ‘wrong’ scores on an output measure; ‘high’ and ‘low’ values are relative. The scores must be interpreted in terms of library goals, scores on other measures, and a broad range of other factors. - Van House, Weill, and McClure (1990)

20 Interpreting Library Measures
ALA / PLA policy since the 1980s: leave data interpretation to the local library.
"Each library staff should decide for themselves whether the [statistical] findings for that library were acceptable in terms of performance expectations."
- Ellen Altman (1990), describing the Public Library Performance Measurement Study by DeProspo et al. (1973)

21 Key Problems with Library Statistics
Lack of criteria for evaluating measures
Collection of standard statistics assumes that all library resources/activities counted are equivalent

22 Key Problems with Library Statistics
Standardization ignores differences in:
- Complexity
- Sophistication
- Relevance
- Quality (Merit)
- Value (Worth)
- Effectiveness
- Efficiency
- Significance

23 Key Problems with Library Statistics
Data imprecision due to:
- Inconsistent collection methods
- Mistakes
- Sampling error
- "Gaming"
- Statistical imputation
Imprecision makes individual library comparisons less valid

24 Key Problems with Library Statistics
Lack of reliable methods for identifying peer libraries
- Comparisons are either approximate or inaccurate
- Can result in incorrect or misleading conclusions
Data are self-reported and unaudited

25 Key Problems with Library Statistics
The More-is-Better Myth
- Views higher numbers as favorable performance, lower as unfavorable
- "More activity does not necessarily mean better activity" - Van House, Weill, and McClure (1990)
- Striving to earn higher numbers may compromise service quality

26 Key Problems with Library Statistics
Statistics say nothing about the performance adequacy, quality, effectiveness, or efficiency of library resources/activities
No consensus on the constructs that statistics can realistically reflect
Difficult to determine remedies for problems that statistics might reveal

27 Key Problems with Library Statistics
Variety of reasons for insufficient scores:
- Inadequate knowledge of community needs
- Staff skill deficiencies
- Inadequate staffing
- Inefficient workflows
- Inadequate planning
- Limited user competencies
... and others
Adapted from Poll and te Boekhorst (2007)

28 Output measures “reflect the interaction of users and library resources, constrained by the environment in which they operate. The meaning of a specific score on any measure depends on a broad range of factors including the library’s goals, the current circumstances of the library and its environment, the users, the manner in which the measure was constructed, and how the data were collected.” [emphasis added] - Van House, Weill, and McClure (1990)

29 Policy-Level Problems with Library Statistics
The PLA managing-for-results approach has produced undesirable results:
- Confusion about the meanings of statistical indicators
- Expectations that local libraries can interpret data productively have been too optimistic

30 Policy-Level Problems with Library Statistics
Exaggerated or inaccurate advocacy campaigns undermine the credibility of the assessment process, methods, and data
Biased advocacy studies are at cross-purposes with the need for accountability

31 Negative Side-Effects of Library Advocacy Practices
Advocacy narratives can 'dumb down' data analysis and assessment efforts:
- Encourage absurd interpretations of library statistics
- Misuse key assessment terms and concepts
- Promote unjustifiable conclusions drawn from studies that employed limited research methods

32 Improvements Needed
Commit to ensuring the credibility of assessment data and performance measurement findings:
- Discourage naïve, disingenuous, and unsupportable use of statistics or assessment findings
- Specify the profession's ideology regarding "rules of evidence"

33 Improvements Needed
Fuller understanding of the limitations of statistical indicators and comparison methods
- Discourage describing performance solely on the basis of standard statistics
"The measures are best used with other information about the library." - Van House, Weill, and McClure (1990)

34 Improvements Needed
Relate levels of resources/services to levels of community needs
Explore relationships among the entire set of standard statistical indicators, i.e., their complementary and conflicting dimensions
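One simple way to explore relationships among a set of standard indicators is pairwise correlation. The sketch below is an illustration only; the per-library figures are invented, not data from the survey:

```python
# Compute the Pearson correlation between two hypothetical indicators
# (annual circulation and annual visits) across a handful of libraries.
import statistics

# Invented per-library annual figures for illustration
circulation = [210_000, 95_000, 480_000, 33_000, 150_000]
visits      = [120_000, 60_000, 300_000, 25_000, 90_000]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

print(round(pearson(circulation, visits), 3))
```

A strong positive correlation would suggest the two indicators measure overlapping dimensions of activity; weak or negative correlations flag potentially conflicting dimensions worth closer study.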

35 Improvements Needed
Identify peer libraries using multiple indicators: community population + library budget + key demographic characteristics
Explore the feasibility of alternative sets of indicators depending on library type, size, mission, etc.
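The multi-indicator peer-identification idea can be sketched as a standardized-distance ranking: convert each indicator to z-scores so population, budget, and demographics are on comparable scales, then rank candidates by distance from the target library. All names and figures below are invented:

```python
# Rank candidate peer libraries by Euclidean distance in z-score space
# over three indicators. Data are hypothetical.
import math
import statistics

# (population, operating budget $, pct. college-educated) -- invented
libraries = {
    "Target": (48_000, 3_100_000, 0.31),
    "Lib A":  (52_000, 2_900_000, 0.29),
    "Lib B":  (9_000,    600_000, 0.22),
    "Lib C":  (47_000, 3_400_000, 0.35),
    "Lib D":  (210_000, 14_000_000, 0.40),
}

cols = list(zip(*libraries.values()))            # one tuple per indicator
means = [statistics.fmean(c) for c in cols]
sds = [statistics.stdev(c) for c in cols]

def z(row):
    """Standardize one library's indicators to z-scores."""
    return [(v - m) / s for v, m, s in zip(row, means, sds)]

target = z(libraries["Target"])
peers = sorted(
    (math.dist(z(row), target), name)
    for name, row in libraries.items() if name != "Target"
)
print([name for _, name in peers])  # ['Lib A', 'Lib C', 'Lib B', 'Lib D']
```

Without standardization, the budget column (in the millions) would dominate the distance and the demographic indicator would be ignored; the z-score step is what lets all three indicators contribute.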

36 Improvements Needed
Increased understanding of measurement and interpretation
- Draw reasonable conclusions and interpretations
- Apply basic behavioral science measurement practices

37 Behavioral Science Measurement Model
Conceptualization → Nominal Definition → Operational Definition → Measurement in the Real World
- Babbie (2007)

38 References
Ellen Altman, "Reflections on Performance Measures Fifteen Years Later," in Library Performance, Accountability, and Responsiveness: Essays in Honor of Ernest R. DeProspo, C.C. Curran and F.W. Summers, eds. (Norwood, NJ: Ablex, 1990)
Earl Babbie, The Practice of Social Research, 11th ed. (Belmont, CA: Thomson, 2007)
Roswitha Poll and Peter te Boekhorst, Measuring Quality: Performance Measurement in Libraries, 2nd ed. (Munich: K. G. Saur, 2007)
Nancy A. Van House et al., Output Measures for Public Libraries: A Manual of Standardized Procedures, 2nd ed. (Chicago: American Library Association, 1987)
Nancy A. Van House, Beth T. Weill, and Charles R. McClure, Library Performance: A Practical Approach (Chicago: American Library Association, 1990)

