Reading Discussion – s519 by Peter Hall – 1/21/09

Presentation transcript:

Reading Discussion – s519 by Peter Hall – 1/21/09
Ryan, J., McClure, C. R., & Bertot, J. C. (2001). Choosing measures to evaluate networked information resources and services: Selected issues. In C. R. McClure & J. C. Bertot (Eds.), Evaluating networked information services: Techniques, policy and issues (pp ). Medford, NJ: Information Today, Inc.

Abstract
- IT and information managers recognize the importance of measuring their organizations' networks and information flow.
- Two questions to consider: What are the key elements of a strategy for measuring these networks? What are the key issues that may arise when employing such a strategy?
- Research suggests successful managers develop a strategy and consider key issues before implementing these measures and proceeding with data collection, analysis, and use procedures.

Examples of Measurements
- Quantitative (statistics)
- Performance measures (statistical ratios)
- Qualitative (interviews, surveys, observations, shadowing, etc.)
- Any others?
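To make the quantitative/performance distinction concrete, here is a minimal Python sketch of a raw statistic versus a performance measure, i.e., a statistical ratio. The figures and variable names are hypothetical, not from the reading:

```python
# Hypothetical monthly figures for one library branch (illustrative only).
public_workstations = 24       # a quantitative statistic: a capacity count
workstation_sessions = 5_760   # a quantitative statistic: a use count

# A performance measure: a statistical ratio relating two raw statistics.
sessions_per_workstation = workstation_sessions / public_workstations
print(f"Sessions per workstation this month: {sessions_per_workstation:.1f}")
```

Qualitative methods (interviews, observation, shadowing) would then help explain why the ratio looks the way it does.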

The Research
- Sponsored by the U.S. Institute of Museum and Library Services
- Multi-method approach: site visits, case studies, interviews, focus groups, surveys, observation, path analysis
- Iterative approach: the study team adapted collection and analysis based on what it learned
- Limited scope: study participants were all library information managers
- Drawbacks and/or limitations?

Selection Issues
- What concerns and issues did these library information managers identify before adopting measures, data collection, and analysis?
- Why are you measuring something? What are the potential uses?
- Tradeoffs: new measures demand additional cost, time, and resources from an organization
- What will you do with your findings? Results may have different outcomes and effects for different stakeholders in the organization (e.g., management, employees, shareholders)

Measure Types
- Capacity: the ability of an organization to use or deliver information through its network
- Use: utilization of those resources or services
- Efficiency: relates services or resources used to services or resources provided
- Impact: the effect of a resource or service on an activity or situation
- Outcome: are goals achieved?
- Examples of each measure type?
- Other considerations: defining measures, data collection, and analysis; fair measures that cover all of an organization's operations
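As an illustration only (the numbers, names, and the goal threshold below are invented, not taken from the chapter), the five measure types might be operationalized for a networked public-access computing service roughly like this:

```python
# Hypothetical monthly data for a networked public-access service.
workstations = 24              # capacity: ability to deliver the service
open_hours = 280               # capacity: hours the service was offered
sessions = 5_760               # use: utilization of the service
session_hours = 4_200          # use: total connect time

# Efficiency: relates services/resources used to those provided.
available_hours = workstations * open_hours
utilization_rate = session_hours / available_hours

# Impact and outcome typically need survey or qualitative data, e.g. the
# share of users reporting the service helped them complete a task
# (impact), measured against a stated goal (outcome).
users_reporting_benefit = 0.62   # hypothetical survey result
goal = 0.50                      # hypothetical target
outcome_met = users_reporting_benefit >= goal

print(f"Use: {sessions} sessions")
print(f"Efficiency (utilization): {utilization_rate:.0%}")
print(f"Outcome goal achieved: {outcome_met}")
```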

Data Collection
- Sources: is input collected from all relevant stakeholders? Can data be collected uniformly enough for accurate analysis?
- Which data collection methods should be used: qualitative, quantitative, or both? Are there new collection techniques worth adopting?
- Privacy: are stakeholders free to give an accurate portrayal?
- Balancing the data collection burden across the organization
- Proper instruction and preparation are key
- Any other potential issues?
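One common way to support the uniformity concern (a sketch under assumed field names, not a procedure from the reading) is to agree on a shared record structure before any site begins collecting:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class UsageRecord:
    """One observation, structured identically at every collection site."""
    branch: str    # where the data were collected
    day: date      # reporting period
    measure: str   # e.g. "sessions", "session_hours"
    value: float   # the observed count or total
    method: str    # "log", "survey", "observation", ...

# Because every site reports the same fields, records pool directly.
records = [
    UsageRecord("Main", date(2009, 1, 5), "sessions", 212, "log"),
    UsageRecord("East", date(2009, 1, 5), "sessions", 97, "log"),
]
total = sum(r.value for r in records if r.measure == "sessions")
print(f"Pooled sessions: {total:.0f}")
```

Fixing the schema up front is also where privacy rules, i.e., what may be recorded at all, are easiest to enforce.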

Data Analysis and Use
- Combine new data with existing information, internal and external; for example, your new data alongside national studies, internal performance data, or composite measures. Is this information compatible for comparison?
- How do you analyze the results generated by a new measure? Is further instruction necessary? Who interprets them, and how? How are the data presented, and how are they used?
- Does the cost ultimately outweigh the benefit?
- Table examples
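For instance (a hypothetical sketch: the benchmark values and weights are invented, not drawn from any actual national study), comparing a new local measure against an external benchmark and rolling several measures into a composite might look like this:

```python
# Hypothetical local results and an invented "national" benchmark.
local = {"utilization": 0.86, "satisfaction": 0.71}
national = {"utilization": 0.74, "satisfaction": 0.78}

# Comparison is only meaningful if both sets were defined and collected
# the same way; otherwise the gap measures the method, not the service.
for measure, value in local.items():
    diff = value - national[measure]
    print(f"{measure}: local {value:.0%} vs national "
          f"{national[measure]:.0%} ({diff:+.0%})")

# A composite measure: a weighted roll-up of several individual measures.
weights = {"utilization": 0.4, "satisfaction": 0.6}
composite = sum(weights[m] * local[m] for m in weights)
print(f"Composite score: {composite:.2f}")
```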

Conclusions / Recommendations
- Information managers should choose network measures based on the context of change, complexity, utility, and resources
- A proper strategy is required before implementing network measures, data collection, and analysis
- Reduce and balance the staff burden of collection and analysis
- Provide proper instruction and training to educate key stakeholders
- Future and additional considerations and research

Discussion Topics
- Has anyone been involved in this process in their organization? Successes/failures, most important metrics, time consumed, costs/benefits? Did anything come of the data?
- Any other analytics tools used, e.g., external web-based tools or internal workflow documents?
- Thoughts on the reading: agree/disagree, glaring deficiencies, techniques used, research methods?

Work Flow Example