PPA 502 – Program Evaluation Lecture 4a – Qualitative Data Collection

Trained Observer Ratings  A systematic technique for assigning accurate, reliable grades based on direct visual observations.  Trained observers are persons who make ratings of conditions by comparing their perception of the condition to a prespecified rating scale.

Trained Observer Ratings  Ratings should contain written definitions or photographic benchmarks.  Systematic procedures to ensure the accuracy and consistency of ratings between raters and over time.  Most applications have focused on facilities maintenance.

Trained Observer Ratings  Advantages –Easy and often inexpensive way to quantify conditions and program outcomes that are otherwise difficult to measure. –Despite subjectivity, if done properly, they can introduce considerable objectivity and reliability. –Readily understood by public administrators. –Tend to focus on conditions as experienced by average citizen. –In most cases, ratings can be made by ordinary citizens.

Trained Observer Ratings  Disadvantages. –Can only assess characteristics that can be directly sensed. –Can be affected by conditions that limit observability. –Can impose some physical dangers. –Most measures are nominal or ordinal limiting statistical applications. –Not useful for assessing subtle conditions or complex changes. –Without considerable care, imprecision, poor repeatability, and lack of comparability can result. –Credibility can be affected if rater is also implementer. –Ratings can become intrusive.

Trained Observer Ratings  Steps. –Selecting characteristic to be rated. –Developing rating scales and forms (single versus multiple.) –Selecting the raters and conducting the ratings. –Quality control procedures. –Analyzing and presenting the results. –Time, cost, and staffing requirements.

Expert Judgment  Decision-makers and program evaluators should consider using expert judgment when programs are subject to high uncertainty. –Uncertain inputs. –Uncertain outcomes. –Uncertain causality.  Expert judgments should be reliable (reproducible) and valid.

Expert Judgment  Inside expert evaluations. –Managers and administrators. –Program evaluation staff. –Steps Collecting and collating information. Checking for validity and reliability. Implementing recommendations.

Expert Judgment  Using outside experts. –Selecting the experts.  Eliciting systematic judgments. –Unstructured, direct interaction (informed dialogue) procedure. –Structured, indirect interactions. –Delphi.

Expert Judgment  Delphi steps. –Develop the evaluation issues and questions that the experts will address. –Obtain data that the decision maker wants experts to examine and arrange any desired interviews with program staff. –Design the instrument for addressing the issues and questions. –Select and contact the experts. –Administer the instrument, round #1. –Collate, aggregate, and send the judgments from round #1 back to the experts. This step is done by the person designated to be in control.

Expert Judgment  Delphi steps (contd.) –Administer the instrument, round #2. –Repeat step #6. –Administer the instrument, round #3. –Repeat step #6. –Prepare final report on results.  Devil’s advocate.  Dialectical inquiry.

Focus Groups  Consumer focus of modern public administration.  What are focus groups? –An informal, small-group discussion designed to obtain in-depth qualitative information. Individuals are specifically invited to participate in the discussion. Participants usually have something in common (demographics, service receipt, agency employment).

Focus Groups  Focus group discussion are informal. Interaction is encouraged. Conversation led by a moderator whose role it is to foster interaction. Moderator manages discussion to make sure it does not stray too far from the topic of interest. Moderator also follows up on participant’s comments to obtain further details or to introduce new topics to the group.

Focus Groups  Focus groups have the following characteristics: –Each group is kept small to encourage interaction among the members. –Each session usually lasts ninety minutes. –The conversation focuses on a restricted number of topics. The actual number varies depending on the objective of the session but is usually no more than three to five related subjects. –The moderator has an agenda that outlines the major topics to be covered. These topics are usually narrowly defined to keep the conversation relevant.

Focus Groups  Differences between focus groups and surveys. –Size and selection of sample respondents. –Construction of questions. –How results are handled. Surveys – aggregation. Focus groups – disaggregation and detail.

Focus Groups  Focus groups most useful in the exploratory stages of research, or when an administrator wants to develop a deeper understanding of a program or service.

Focus Groups  Steps. –Participant selection. What target population or populations do you want information from? –You should not invite different target populations to the same session. –Definition should be careful and specific to the research questions. –But there are tradeoffs which can reduce the potential pool. Focus group recruiting is done over the phone with screening questionnaires. Focus group participants are usually compensated for their time.

Focus Groups  Steps. –The agenda. The agenda or moderator’s guide outlines the major topics to be covered. The moderator refers to this document during the discussion to make sure all major topics are covered. The focus group moderator usually writes his or her own guide. Structure. –General question to initiate interaction. –Moderator must decide on the spot whether to allow free- wheeling discussion or structured conversation. Should know in advance if order is critical.

Focus Groups  Steps. –Communication with moderator. Moderator must understand needs of sponsor in preparing the agenda. –In-house or professional research. –Facilities. Usually conducted in special facilities with video- and audio- taping equipment. Ethical considerations paramount here. –Schedules. Usually on weekday evenings. Multiple sessions. Competing activities. Lead time necessary for preparation. Minimum of three weeks.

Focus Groups  Reports. –Debriefing summaries. –Descriptions of the discussion and key points made. Not quantitative.  Pitfalls –Inappropriate to research questions (generalizability). –Inadequate communication between sponsor and moderator. –Sponsor defensiveness.