S519: Evaluation of Information Systems Analyzing data: Causation Ch5.

Step 5: Analyzing data
Dealing with the causation issue means being able to answer the following questions:
- How certain does the client need us to be that the evaluand "caused" a certain change?
- What are the basic principles for inferring causation?
- What types of evidence do we have available to help us identify or rule out possible causal links?
- How should we decide what blend of evidence will generate the needed level of certainty most cost-effectively?

Certainty about causation (D-Ch5)
Each decision-making context requires a different level of certainty, which shapes the analysis choices:
- Quantitative or qualitative analysis: all-quantitative, all-qualitative, or a mix of the two
- Sample selection
- Sample size

Inferring causation: basic principles
Two basic principles:
- Look for evidence for and against the suspected main cause (i.e., the evaluand)
- Look for evidence for and against any important alternative causes (i.e., rival explanations)
When there is too much evidence or too many candidate causes, focus on the primary ones; how far to go depends on the level of certainty your evaluation needs.
Stepwise process: put yourself in the shoes of your harshest critic and gather enough evidence to support your explanation; repeat until all remaining alternative explanations are ruled out.
Critical multiplism: triangulate across multiple methods and sources of evidence. The harder people attack, the more solid your answers need to be.

Inferring causation: 8 strategies
Strategy 1: Ask observers
Two possibilities:
- Ask actual or potential impactees
- Ask indirect impactees (e.g., co-workers, parents)
Design your interview questions to include causation questions:
- "How much has your knowledge increased as a result of attending this program?" elicits the primary cause.
- "Did anything else besides the program increase your knowledge in this area over the same period of time?" elicits other causes.
- "Please describe anything else that has happened to you, or someone you know, as a result of participating in this program." elicits the effects that people know or believe were caused by the program.

Inferring causation: 8 strategies
Strategy 1: Ask observers (continued)
Causation-rich questions tend to be leading (they direct the respondent to answer in a particular way), so be careful about wording when designing interview questions.
The causation question is not just whether the program produced the effect, but also what other factors enabled or inhibited the effect.
Individuals may not be reliable witnesses on causation questions; other evidence will be required to make justifiable causal inferences.

Inferring causation: 8 strategies
Strategy 1: Ask observers: methods
- Questionnaires to identify the targeted groups (people who experienced substantial changes); use open-ended questions to gather more opinions
- In-depth interviews with the targeted groups to probe causation

Inferring causation: 8 strategies
Strategy 2: Check whether the content of the evaluand matches the outcome
Example: an alcoholism treatment program is supposed to help alcoholics avoid relapses. Check whether the strategies alcoholics use to avoid relapses after the program are the same strategies taught in the program.

Inferring causation: 8 strategies
Strategy 3: Look for other telltale patterns that suggest one cause or another
The modus operandi method uses a detective metaphor to describe how potential causal explanations are identified and tested by looking for telltale evidence.
A silly example. Evidence: a naked man, dead, in the middle of the desert; personal belongings nearby; half a match in his hand. What was the cause of his death?

Inferring causation: 8 strategies
Strategy 4: Check whether the timing of outcomes makes sense
Common sense: an outcome should appear at the same time as, or after, whatever caused it, possibly with a considerable delay. The further downstream an outcome is, the longer it should take to appear.
Use timing to confirm or disconfirm causal links:
- Did the outcome appear before the evaluand was introduced? Did downstream outcomes appear too early?
- Is the timing of the outcomes logical given the possible causes?
- Do the further-downstream outcomes in the logic model occur out of sequence?
More in Lipsey, M. W. (1989). Design sensitivity: Statistical power for experimental research. Newbury Park, CA: Sage.

Inferring causation: 8 strategies
Strategy 4: Check whether the timing of outcomes makes sense
Example: a community health education program on diet and exercise
- Fairly immediate knowledge and skill gains: during or immediately after the intervention
- A short delay (days or weeks) before the knowledge and skills are translated into changed behavior
- A moderate delay (weeks or months) before we see changes in individual health indicators (weight, cholesterol, blood pressure, etc.)
- A long delay (months or years) before we see improvements in diabetes and heart disease
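The timing check above can be sketched as a small script. This is a minimal illustration with a hypothetical logic model and made-up dates, not a method from the text: each outcome should appear no earlier than the end of the program plus its minimum expected delay, and no earlier than the outcome upstream of it.

```python
from datetime import date, timedelta

# Hypothetical program end date and logic model (names, delays, and
# observed dates are illustrative only).
program_end = date(2024, 1, 31)

# (outcome, minimum expected delay after the program, observed date)
logic_model = [
    ("knowledge gain",    timedelta(days=0),  date(2024, 1, 31)),
    ("behavior change",   timedelta(days=7),  date(2024, 2, 20)),
    ("health indicators", timedelta(days=30), date(2024, 4, 10)),
]

def timing_ok(end, chain):
    """Flag outcomes that appear implausibly early or out of sequence."""
    prev = end
    for name, min_delay, observed in chain:
        if observed < end + min_delay or observed < prev:
            return False, name   # timing disconfirms this causal link
        prev = observed
    return True, None

ok, culprit = timing_ok(program_end, logic_model)
print(ok)
```

An outcome observed before the program ended, or a downstream outcome occurring before its upstream precursor, would make `timing_ok` return `False` along with the offending outcome's name.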

Exercise
Take the grantsmanship workshop as an example.
- List the potential outcomes along a timeline (fairly immediate, a short delay, a moderate delay, a long delay).
- Use the timing strategy to confirm or disconfirm the causal links, and state in one page how and why, for each scenario:
  - One month after the workshop, 3 proposals got grants
  - One year after the workshop, 3 proposals got grants
  - Three months after the workshop, some people write good proposals, but some do not
Think about your own solution, then form a group and discuss. Lab

Inferring causation: 8 strategies
Strategy 5: Check whether the "dose" is related logically to the "response"
The dose-response idea: the larger the dose of a drug, the stronger the response that follows. If more A (dose), then more B (response).
Compare smaller doses with larger doses (short of an overdose) to confirm or disconfirm causal links.
E.g., in a performance evaluation project, if we find that performance improved dramatically in a unit where the system was poorly implemented, the system is probably not the cause of the performance improvement.
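One simple way to quantify a dose-response pattern is a correlation between dose and response. The sketch below uses hypothetical data (hours of a workshop attended as the "dose", test score gain as the "response"); the variable names and numbers are invented for illustration.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

hours_attended = [1, 2, 4, 6, 8, 10]    # dose
score_gain     = [2, 3, 6, 9, 11, 14]   # response

# A strong positive r is consistent with (but does not by itself prove)
# a causal link; r near zero would disconfirm the dose-response pattern.
r = pearson_r(hours_attended, score_gain)
print(round(r, 3))
```

Note the caveat from the slide: correlation supports a causal link only alongside the other strategies, since a third variable (e.g., motivation) could drive both dose and response.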

Inferring causation: 8 strategies
Strategy 6: Make comparisons with a "control" or "comparison" group
Divide the participants into different groups: a treatment group that receives the evaluand versus a control or comparison group that does not.
Sampling should be done carefully (sample size, randomization) to ensure there are no systematic differences between the groups.
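Random assignment is the usual way to avoid systematic differences between the groups. A minimal sketch (participant names are placeholders; the fixed seed is only for reproducibility):

```python
import random

def random_assign(participants, seed=0):
    """Randomly split participants into a treatment and a control group,
    so that on average there are no systematic differences between them."""
    rng = random.Random(seed)      # fixed seed for a reproducible split
    shuffled = participants[:]     # copy; leave the caller's list intact
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]   # (treatment, control)

people = [f"p{i}" for i in range(20)]
treatment, control = random_assign(people)
print(len(treatment), len(control))  # 10 10
```

With small samples, randomization alone may still leave chance imbalances, which is one motivation for the statistical controls in strategy 7.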

Inferring causation: 8 strategies
Strategy 7: Control statistically for extraneous variables
When using control and comparison groups, try to rule out extraneous variables so that the groups have no systematic differences.
Statistical methods such as regression analysis can adjust for variables that could not be equalized by design.
Also try to identify other potential systematic differences. E.g., when evaluating students' math improvement: how should students be sampled, and what pre-existing differences remain? Is random sampling alone enough?
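A minimal sketch of "controlling statistically" for one extraneous variable: regress the outcome on the covariate, then compare the groups on the residuals. The data and variable names (prior math score as the covariate, post-program score as the outcome) are hypothetical; a real analysis would use a multiple-regression package rather than this hand-rolled version.

```python
def fit_line(xs, ys):
    """Least-squares intercept and slope of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical data: covariate, outcome, and group membership.
prior   = [50, 55, 60, 65, 70, 75, 80, 85]   # pre-program math score
post    = [58, 62, 69, 74, 80, 83, 90, 94]   # post-program math score
treated = [1, 0, 1, 0, 1, 0, 1, 0]           # 1 = attended the program

# Regress the outcome on the covariate and take residuals: what is left
# after removing the part of the outcome explained by prior ability.
a, b = fit_line(prior, post)
resid = [y - (a + b * x) for x, y in zip(prior, post)]

adj_treat = sum(r for r, t in zip(resid, treated) if t) / treated.count(1)
adj_ctrl  = sum(r for r, t in zip(resid, treated) if not t) / treated.count(0)
adjusted_effect = adj_treat - adj_ctrl   # group difference net of prior score
print(round(adjusted_effect, 2))
```

The raw group difference would mix the program's effect with pre-existing ability differences; the residual comparison removes the linear part of that confound.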

Inferring causation: 8 strategies
Strategy 8: Identify and check the underlying causal mechanism(s)
Look for an underlying mechanism that makes the case for causation more (or less) convincing.
Example: cigarette smoking and lung cancer. Correlational studies show the association; the mechanism, that carcinogens in cigarette smoke cause cancer, makes the causal claim convincing.
Such mechanisms normally come from the literature.

Putting them together
Do we need all the evidence we could collect from the 8 strategies? How do we select?
Put yourself in the shoes of a tough critic and identify the most threatening rival explanation, then choose the types of evidence that will most quickly and cost-effectively confirm or dispel that rival explanation.
Move on to the next most serious rival explanation, and so on.
Continue until you have amassed a body of evidence that gives you enough certainty to draw causal inferences.

Exercise: grantsmanship workshop (p. 57)
Claim: the grantsmanship workshop strengthens local communities.
- For (evidence)
- Against (evidence)
- Other alternative causes (i.e., rival explanations)
Use the strategies to confirm or disconfirm this evidence and these alternative causes, then put them together.
Form a group to discuss. Lab