Assessing and Evaluating Services in Libraries and Information Centers Towards Sustained Progress and Development

Dennis A. Alonzo
Dean, College of Education
Director, Special Programs
Head, Curriculum Development Center
University of Southeastern Philippines, Davao City

Key Concepts
- Reasons for Evaluation
- The Framework of Evaluation
- Identifying Performance Issues for Evaluation
- Evaluation Methods
- Performance Measurement for the eLib

Why Do We Need to Evaluate?
- To gather empirical data to inform decisions
- To serve as an internal control mechanism, ensuring that resources are used efficiently and effectively
- To convince funders and clients that the service is delivering the benefits that were expected when the investment was made

The Big Question in Evaluation
What do you want to know?

Ideal Library
- Set clear, tough and meaningful standards
- Tell users in a clear, straightforward way about services
- Consult widely about what services people need and how services can be improved
- Make services available to everyone who needs them
- Treat all people fairly
- Have polite and helpful staff

Ideal Library (continued)
- Use resources effectively by budgeting carefully
- Continually make improvements
- Work with other providers to provide a better service
- Show that users agree that the services provided are really good

Issues in Evaluation
- Collect information to facilitate decision making
- Justify increasing expenditures
- Evaluate the quality of services provided
- Plan for future improvements and directions
- Identify the extent to which problems can be solved
- Identify the differing needs of different user categories

Issues in Evaluation (continued)
- Plan public relations work and information dissemination
- Provide feedback to and evaluate contractors
- Involve users in management: allow users to rediscover a voice in library management and express views about service priorities
- Avoid "questionnaire fatigue"

Focus of Evaluation
- Appraisal of strengths and weaknesses?
- Effectiveness of its educational services?

Identifying Performance Issues
- The way the management structure functions
- Internal operations relating to information materials, such as cataloguing and classification
- Library/information services to users
- New programs of service delivery
- Alternative possibilities for doing anything
- The functioning of a total system prior to planning change

Evaluation Defined
Evaluation can and should enhance the quality of interventions (policies and programs) designed to solve or ameliorate problems in social and corporate settings (Owen, 2006). It is a process of knowledge production that uses rigorous empirical enquiry.

Logic of Evaluation (Fournier, 1995)
1. Establishing criteria of worth: on what dimensions must the evaluand do well?
2. Constructing standards: how well should the evaluand perform?
3. Measuring performance and comparing with standards: how well does the evaluand perform?
4. Synthesizing and integrating evidence into a judgment of merit or worth: what is the worth of the evaluand?
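As a concrete illustration of this four-step logic, the sketch below scores a hypothetical library service on a few criteria against preset standards and synthesizes a simple weighted judgment. The criteria, weights, standards and scores are all invented for illustration; they are not drawn from Fournier or Owen.

```python
# Minimal sketch of the four-step logic of evaluation, with invented data.

# Step 1: criteria of worth (dimensions on which the evaluand must do well),
# each with an illustrative weight.
criteria = {"user satisfaction": 0.4, "turnaround time": 0.3, "cost efficiency": 0.3}

# Step 2: standards (how well the evaluand should perform), as minimum
# acceptable scores on a 0-100 scale.
standards = {"user satisfaction": 75, "turnaround time": 70, "cost efficiency": 60}

# Step 3: measured performance of the evaluand (hypothetical survey/usage data).
performance = {"user satisfaction": 82, "turnaround time": 64, "cost efficiency": 71}

def synthesize(criteria, standards, performance):
    """Step 4: compare performance with standards and integrate the
    evidence into a single weighted judgment of merit."""
    shortfalls = {c: standards[c] - performance[c]
                  for c in criteria if performance[c] < standards[c]}
    overall = sum(criteria[c] * performance[c] for c in criteria)
    return overall, shortfalls

overall, shortfalls = synthesize(criteria, standards, performance)
print(f"Weighted overall score: {overall:.1f}/100")
print("Criteria below standard:", shortfalls or "none")
```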

Objects of an Evaluation
- Policies: legislative policies, large-scale policies, local policies
- Programs
- Products
- Persons/people

Criteria for Evaluation
- Success
- Efficiency
- Effectiveness
- Benefits
- Costs, which can be evaluated independently or in association with any of the above

Evaluation Forms and Approaches (Owen, 2006)
Five forms of program evaluation:
- Proactive Evaluation
- Clarificative Evaluation
- Interactive Evaluation
- Monitoring Evaluation
- Impact Evaluation

1. PROACTIVE EVALUATION
- Takes place before the program is designed
- Assists planners to make decisions about what type of program is needed
- Provides input about how best to develop the program in advance of the planning stage

Typical Issues
- Is there a need for a program?
- What do we know about the problem that the program will address?
- What is recognized as best practice in this area?
- Have there been attempts to find solutions to this problem?
- What does the relevant research or conventional wisdom tell us about this problem?
- What could we find out from external sources to rejuvenate an existing policy or program?

Major Approaches
- Needs Assessment or Needs Analysis
- Research Synthesis
- Meta-analysis
- Narrative Review
- Review of Best Practices

Evaluation Forms and Approaches (Owen, 2006)
Timing: early, after implementation.

2. CLARIFICATIVE EVALUATION
- Designed to assist stakeholders to conceptualize interventions and improve their coherence, thus increasing the chances that their implementation will lead to the desired outcomes
- Concentrates on making explicit the internal structure and functioning of an intervention
- Program logic or theory is developed/revised

Issues to be Addressed
- What are the intended outcomes of this program, and how is the program designed to achieve them?
- What is the underlying rationale for this program?
- What program structures or elements need to be modified to maximize the program's potential to achieve the intended outcomes?
- Is the program plausible?
- Which aspects of the program are amenable to a subsequent monitoring or impact assessment?

Approaches
- Evaluability Assessment (EA)
- Program Logic
- Ex-ante Evaluation

Evaluation Forms and Approaches (Owen, 2006)
Timing: after the program design has been clarified/finalized.

3. INTERACTIVE EVALUATION
- Provides systematic evaluation findings through which local providers can make decisions about the future direction of the program
- Provides assistance in planning and carrying out self-evaluations
- Focuses evaluation on organizational change and improvement, in most cases on a continuous basis
- Empowers providers and participants

Typical Issues to be Addressed
- What is this program trying to achieve?
- How is this program progressing?
- Is the delivery working? Is it consistent with the program plan?
- How could the delivery be changed so as to make it more effective?
- How could this organization be changed so as to make it more effective?

Approaches
- Responsive Evaluation (Stake, 1980)
- Action Research
- Developmental Evaluation
- Empowerment Evaluation
- Quality Review

Evaluation Forms and Approaches (Owen, 2006)
Conducted to determine the performance of each unit of the program.

4. MONITORING EVALUATION
- Appropriate when a program is well established and ongoing
- Involves the development of a system for regularly monitoring the progress of the program
- Includes a rapid response capability (Mangano, 1989) and provides timely information for organizational leaders (Owen & Lambert, 1998)

Typical Issues
- Is the program reaching the target population?
- Is implementation meeting program benchmarks?
- How is implementation progressing between sites?
- How is implementation progressing now compared to a month ago, or a year ago?
- Are our costs rising or falling?
- How can we fine-tune this program to make it more efficient?
- How can we fine-tune this program to make it more effective?
- Is there a site which needs attention to ensure more effective delivery?
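To make these monitoring questions concrete, here is a minimal sketch that checks hypothetical monthly site data against a benchmark and flags sites needing attention. The site names, benchmark value and all figures are invented for illustration.

```python
# Minimal monitoring sketch with invented data: compare each site's monthly
# reach against a benchmark and flag sites that need attention.

BENCHMARK_REACH = 0.60  # illustrative target: 60% of target population reached

# (site, month) -> (clients_reached, target_population, cost)
monthly_data = {
    ("Site A", "2024-05"): (540, 800, 12_000),
    ("Site A", "2024-06"): (610, 800, 12_500),
    ("Site B", "2024-05"): (300, 700, 11_000),
    ("Site B", "2024-06"): (280, 700, 11_400),
}

for (site, month), (reached, target, cost) in sorted(monthly_data.items()):
    reach_rate = reached / target
    status = "OK" if reach_rate >= BENCHMARK_REACH else "NEEDS ATTENTION"
    print(f"{site} {month}: reach {reach_rate:.0%}, "
          f"cost per client {cost / reached:.2f} -> {status}")
```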

Key Approaches
- Component Analysis
- Devolved Performance Evaluation
- Systems Analysis

Evaluation Forms and Approaches (Owen, 2006)
Timing: may be conducted during early implementation, but mostly done after program phase-out.

5. IMPACT EVALUATION
- Determines the range and extent of outcomes of a program
- Determines whether the program has been implemented as planned and how implementation has affected outcomes
- Provides evidence to funders, senior managers and politicians about the extent to which resources allocated to a program have been spent wisely
- Informs decisions about replication or extension of a program

Typical Issues
- Has the program been implemented as planned?
- Have the stated goals of the program been achieved?
- Have the needs of those served by the program been met?
- What are the unintended outcomes of the program?
- Does the implementation strategy lead to the intended outcomes?
- How do differences in implementation affect program outcomes?
- Is the program more effective for some participants than for others?
- Has the program been cost-effective?

Key Approaches
- Objectives-based (Tyler, 1950)
- Needs-based (Scriven, 1972)
- Goal-free
- Process-outcome
- Realistic Evaluation
- Performance Audit

Framework for Planning an Evaluation
1. Specifying the Evaluand
- What is the object of the evaluation?
- What is known about the evaluand? How was it developed? How long has it been in existence?
- What is the nature of the evaluand: policy, program, organization or product?
- Who are the key players in its development (actual or projected) and its implementation?

2. Purpose
- What is the fundamental reason for commissioning the evaluation?
- Consistent with the evaluation form, the evaluation is primarily concerned with: synthesis of information to aid program development; clarification of the program; improvement of the implementation of the program; monitoring program outcomes; or determining program impact.

3. Clients/Audiences
- To whom will the findings of the evaluation be directed?
- Identify your clients, the primary audience, and other people who will use the information to make decisions.

4. Resources
- What person-power and material resources are available to undertake the evaluation?
- The resources available determine the extent of the data management and the range of evaluation findings that can be provided.

5. Evaluation Focus/Foci
- Which element(s) of the program will need to be investigated: program context, program design, program implementation, program outcomes, or a combination?
- What is the state of development of the evaluand?

6. Evaluation Issues and Key Questions
- Identify the issues to be addressed.
- Questions lead the direction of the evaluation.

7. Data Management
- Identify the data collection strategy and analysis.
- Is sampling important? Is anything known about this from other sources?
- How will the data be collected?
- How will the data be analyzed to address the key evaluation questions? (A minimal sketch follows this list.)
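As a small illustration of the data-management step, the sketch below tabulates hypothetical user-survey ratings against one key evaluation question. The question, the rating scale and the responses are invented for illustration.

```python
# Minimal data-management sketch with invented survey data: summarize
# user ratings (1-5 scale) for a key evaluation question such as
# "How satisfied are you with the library's electronic services?"

from collections import Counter
from statistics import mean, median

responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4, 1, 4, 5, 3, 4]  # hypothetical ratings

print(f"n = {len(responses)}")
print(f"mean = {mean(responses):.2f}, median = {median(responses)}")

# Frequency distribution, useful for reporting findings to clients/audiences.
for rating, count in sorted(Counter(responses).items()):
    print(f"rating {rating}: {'#' * count} ({count})")
```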

8. Dissemination of Findings
- What strategies for reporting will be used?
- When will reporting take place?
- What kind of information will be included (findings, conclusions, judgments, recommendations)?
- Who will make the recommendations?

9. Codes of Behavior
- What are the ethical conditions which underlie the evaluation report?

10. Budget and Timeline
- Given the resources, what will be achieved at key time-points during the evaluation?

Performance Measurement for the eLib
- Access to electronic journals
- Word processing packages
- Excel and other statistical packages
- Demonstration software
- Internet use
- Bibliographic software
- Digitized books and journals
- Electronic information databases
- OPACs
- Networked CD-ROMs on local area networks
- Full-text outputs via bibliographic searching
- Web-based training packages

Performance Indicators for the eLib
- Informative content
- Reliability
- Validity
- Appropriateness
- Practicability
- Comparability

Performance Issues
- Skill levels
- Real use vs. browsing
- Recreational use
- Provision of unwanted/unanticipated services
- Queuing/booking/walkouts
- Problems with remote logging in
- Problems of outputting data

Performance Issues (continued)
- No defined service period
- Quality and reliability of internet data
- Non-use
- Changes over time
- Distributed resources
- Problems with the library's control
- The service-oriented culture
- PCs vs. Macs

Proposed List of Performance Indicators
- Percentage of target population reached by eLib services
- Number of log-ins to eLib services per capita per month
- Number of remote log-ins to eLib services per capita per month
- Number of electronic documents delivered per capita per month
- Cost per log-in per eLib service

Proposed List of Performance Indicators (continued)
- Reference enquiries submitted electronically per month
- Library computer workstation use rate
- Number of library workstations per capita
- Rejected log-ins as a percentage of total log-ins
- System availability
- Mean waiting time for access to library computer workstations
- IT expenditure as a percentage of total library expenditure
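Several of these indicators are simple ratios, so they are easy to compute from raw usage counts. The sketch below derives a few of them from hypothetical monthly figures; all numbers are invented for illustration.

```python
# Compute a few of the proposed eLib performance indicators from
# hypothetical monthly raw counts (all figures invented for illustration).

population = 12_000          # size of the target population
users_reached = 4_800        # distinct users of eLib services this month
logins = 9_600               # total log-ins this month
remote_logins = 2_400        # log-ins from outside the library
rejected_logins = 480        # failed/refused log-ins
documents_delivered = 7_200  # electronic documents delivered
service_cost = 15_000.00     # monthly cost of the eLib service

indicators = {
    "percentage of target population reached": 100 * users_reached / population,
    "log-ins per capita per month": logins / population,
    "remote log-ins per capita per month": remote_logins / population,
    "documents delivered per capita per month": documents_delivered / population,
    "cost per log-in": service_cost / logins,
    "rejected log-ins as % of total log-ins": 100 * rejected_logins / logins,
}

for name, value in indicators.items():
    print(f"{name}: {value:.2f}")
```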