GSSR Research Methodology and Methods of Social Inquiry socialinquiry.wordpress.com January 17, 2012 I. Mixed Methods Research II. Evaluation Research

MULTIPLE-METHODS APPROACH Triangulation: applying 2 or more dissimilar measures and/or methods (research strategies) to investigate a given problem. Why do it: increased confidence in findings.
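
A minimal numeric sketch of this logic (all figures hypothetical): two dissimilar measures of the same quantity, say a survey estimate and one derived from administrative records, are compared; convergence within sampling error increases confidence in the finding, while divergence flags method-specific bias.

    # Hypothetical sketch of triangulation: do two dissimilar measures converge?
    survey_estimate, survey_se = 0.42, 0.03    # e.g., rate self-reported in a survey
    records_estimate, records_se = 0.47, 0.02  # e.g., rate from administrative records

    diff = abs(survey_estimate - records_estimate)
    combined_se = (survey_se**2 + records_se**2) ** 0.5  # SE of the difference

    # Rough rule of thumb: agreement within ~2 combined standard errors
    # supports the finding; a larger gap points to method-specific bias.
    print(f"difference = {diff:.3f}, combined SE = {combined_se:.3f}")
    print("convergent" if diff < 2 * combined_se else "divergent: check method effects")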

Key:
- what you want to study (i.e., the nature of the research question, the phenomena considered) should determine your research strategy/methods!
- the relative strengths & weaknesses of alternative approaches should be weighed in deciding which methods to select, and how best to combine them when possible.
Table 12.2 in Singleton and Straits, p. 399

Multiple methods can also be used within a single approach:
- allows exploiting the strengths of complementary methods while offsetting their weaknesses.
Ex: one approach (survey method), but mail questionnaire to a probability sample & face-to-face interviews with a smaller sample of non-respondents, to estimate non-response bias (see the sketch below); vignette experimental designs in survey research; use of archival records to identify groups for field research…
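
A back-of-the-envelope sketch of the non-response example (all numbers hypothetical): the face-to-face follow-up supplies an estimate for non-respondents, which is combined with the mail-survey estimate in proportion to the response rate.

    # Hypothetical non-response bias estimate from a mixed-mode design.
    n_sample = 1000                 # probability sample receiving the mail questionnaire
    n_respondents = 600             # returned questionnaires
    mean_respondents = 3.8          # mean outcome among mail respondents
    mean_nonrespondents = 3.1       # estimated from face-to-face follow-up interviews

    response_rate = n_respondents / n_sample
    adjusted_mean = (response_rate * mean_respondents
                     + (1 - response_rate) * mean_nonrespondents)

    # Non-response bias: gap between the respondents-only estimate and the
    # estimate that incorporates the follow-up information.
    print(f"naive = {mean_respondents:.2f}, adjusted = {adjusted_mean:.2f}, "
          f"bias = {mean_respondents - adjusted_mean:.2f}")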

II. EVALUATION RESEARCH (ebooks/method_techniques/index_en.htm)
Application of social research methods for:
(a) assessing social intervention programs & policies instituted to solve social problems;
(b) in the private sector: assessing policy, personnel, and products.
Major goal of evaluation: influence decision-making/policy formulation by providing empirically driven feedback.

Evaluation takes place within a political & organizational context, where researchers face multiple stakeholders.
Stakeholders: individuals, groups, or organizations that have a significant interest in how well a program/product functions/performs.
Ex: program sponsor (actor who initiates & funds the program/product); evaluation sponsor (who mandates & funds the evaluation); policymaker/decision maker who determines the fate of the program/product, …

Outcome of evaluation: Detailed technical report that describes the research design, methods and results. Plus: executive summaries, memos, oral reports geared to the needs of specific stakeholders.

Evaluation Strategies
Scientific-experimental models (see socialresearchmethods.net/kb/intreval.php)
- take values & methods from the social sciences;
- prioritize impartiality, accuracy, objectivity & the validity of the information generated.
Ex:
- experimental & quasi-experimental designs;
- objectives-based research that comes from education;
- econometrically-oriented perspectives, including cost-effectiveness and cost-benefit analysis;
- theory-driven evaluation.

Management-oriented systems models
- emphasize comprehensiveness in evaluation, placing evaluation within a larger framework of organizational activities.
Ex: the Program Evaluation and Review Technique (PERT); the Critical Path Method (CPM); the Logical Framework ("Logframe") model developed at the U.S. Agency for International Development; Units, Treatments, Observing operations, Settings (UTOS); Context, Input, Process, Product (CIPP).
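
To make the CPM idea concrete, a minimal forward-pass sketch on a hypothetical four-activity network (task names and durations are invented); a full CPM would add a backward pass to compute slack and identify the critical path.

    # Hypothetical activity network: durations in weeks and predecessor lists.
    durations = {"A": 3, "B": 2, "C": 4, "D": 2}
    predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

    # Forward pass: earliest finish of each task, visited in topological order.
    earliest_finish = {}
    for task in ["A", "B", "C", "D"]:
        earliest_start = max((earliest_finish[p] for p in predecessors[task]), default=0)
        earliest_finish[task] = earliest_start + durations[task]

    print("earliest finish times:", earliest_finish)
    print("minimum project duration:", max(earliest_finish.values()))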

Qualitative/anthropological models
Emphasize:
- the importance of observation;
- the need to retain the phenomenological quality of the evaluation context;
- the value of subjective human interpretation in the evaluation process.
Ex: naturalistic or 'Fourth Generation' evaluation; the various qualitative schools; critical theory & art criticism approaches; and the 'grounded theory' approach of Glaser and Strauss, among others.

Participant-oriented models
Emphasize the central importance of the evaluation participants, especially clients & users of the program or technology.
Ex: client-centered and stakeholder approaches; consumer-oriented evaluation systems.

Types of Evaluation
Formative Evaluation (Product):
- Needs assessment: who needs the program? How great is the need? What might work to meet the need?
- Evaluability assessment: is an evaluation feasible & how can stakeholders help shape its usefulness?
- Structured conceptualization: helps stakeholders define the program/technology, the target population, & the possible outcomes.
- Implementation evaluation: monitors the fidelity of the program or technology delivery.
- Process evaluation: investigates the process of delivering the program or technology, including alternative delivery procedures.

Summative Evaluation (Effects/Outcome):
- Outcome evaluation: did the program/technology produce demonstrable effects on specifically defined target outcomes? (effect assessment)
- Impact evaluation: broader; assesses the overall/net effects -- intended or unintended -- of the program/technology as a whole.
- Cost-effectiveness & cost-benefit analysis: address questions of efficiency by standardizing outcomes in terms of their dollar costs & values.
- Secondary analysis: reexamines existing data to address new questions or use methods not previously employed.
- Meta-analysis: integrates the outcome estimates from multiple studies to arrive at an overall/summary judgement on an evaluation question (see the sketch below).
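
As an illustration of the meta-analysis entry above, a minimal fixed-effect (inverse-variance) pooling sketch; the three effect estimates and standard errors are invented.

    # Hypothetical study results: (effect estimate, standard error).
    studies = [(0.30, 0.10), (0.12, 0.08), (0.25, 0.15)]

    # Fixed-effect meta-analysis: weight each study by the inverse of its variance.
    weights = [1 / se**2 for _, se in studies]
    pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5

    print(f"pooled effect = {pooled:.3f} (SE = {pooled_se:.3f})")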

Methodological Issues in Evaluation Research
Effect assessment: did the program/technology cause demonstrable effects?
The 'black box' paradigm -- we can observe:
- what goes into the 'black box': the inputs (here, the program/product/intervention), and
- what comes out of the box: the output (certain effects).
Theory as a guide to research.

Research Design & Internal Validity Ideal strategy for effect assessment: experiment, with units of analysis randomly assigned to at least 2 conditions (one with intervention present, one without).
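
A minimal simulated sketch of this ideal design (all data invented): units are randomly assigned to two conditions and the effect is estimated as the difference in mean outcomes.

    import random

    random.seed(1)
    units = list(range(200))
    random.shuffle(units)
    treated = set(units[:100])          # random assignment to the treatment condition

    # Hypothetical outcomes: noise plus a true treatment effect of 0.5.
    outcomes = {u: random.gauss(0, 1) + (0.5 if u in treated else 0.0) for u in units}

    mean_t = sum(outcomes[u] for u in treated) / len(treated)
    mean_c = sum(outcomes[u] for u in units if u not in treated) / (len(units) - len(treated))
    print(f"estimated effect (treatment - control): {mean_t - mean_c:.2f}")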

Measurement Validity
- need good conceptualization;
- need reliable and valid measures of cause (treatment program) and effect (expected outcome).
Issues with creating valid indicators of program outcomes.
Timing of outcome measurement: lagged effect of the program; continuous/gradual effects vs. instantaneous effects.
To increase measurement validity: multiple (independent) measurements & different points in time.

External Validity
- Random samples or 'true' experiments are most often not feasible → non-probability samples.
- Selection biases: self-selection into the treatment group; selection of program participants because they are likely to generate positive results / are available.
- The social context of the evaluation may threaten external validity.
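
A small simulation of the self-selection threat (all parameters invented): when more motivated units are more likely to enter the program and motivation also raises the outcome, the naive participant vs. non-participant comparison overstates the true effect.

    import random

    random.seed(2)
    true_effect = 0.3
    participants, others = [], []
    for _ in range(10_000):
        motivation = random.gauss(0, 1)
        joins = random.random() < (0.7 if motivation > 0 else 0.3)   # self-selection
        outcome = motivation + (true_effect if joins else 0.0) + random.gauss(0, 1)
        (participants if joins else others).append(outcome)

    naive = sum(participants) / len(participants) - sum(others) / len(others)
    print(f"true effect = {true_effect}, naive estimate = {naive:.2f}")   # inflated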