1 The influence of the questionnaire design on the magnitude of change scores. Sandra Nolte (1), Gerald Elsworth (2), Richard Osborne (2). (1) Association of Dermatological Prevention, Hamburg, Germany; (2) Deakin University, Melbourne, Australia

2 The measurement of program outcomes is important because: it is the basis for continuous quality assurance / improvement; it delivers crucial information for a wide range of stakeholders; and it can / should deliver information on what works and what doesn't.

3 Bias in outcomes assessment However, while program evaluations are crucial, there are continuing concerns about biases that may threaten the validity of outcomes data. One such bias that is a common concern in pre-test / post-test data is Response Shift (Howard, 1979).

4 Response Shift A change in the common metric because of redefinition, reprioritisation and/or recalibration of the target construct (Schwartz & Sprangers, 1999). A common remedy to circumvent Response Shift is the collection of retrospective pre-test data: [actual pre-test - retrospective pre-test] = magnitude and direction of Response Shift; [post-test - retrospective pre-test] = true program outcome (Visser et al., 2005).
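
A worked example may help make the two difference scores on this slide concrete. The sketch below is illustrative only: the respondent's scores are hypothetical numbers on a 6-point scale, not data from the study, and the function names are invented for this example.

# Minimal sketch of the response-shift arithmetic described above.
# Scores are hypothetical; any per-respondent data on a common scale would do.

def response_shift(pre, then_pre):
    # Magnitude and direction of Response Shift: actual pre-test minus retrospective pre-test.
    return pre - then_pre

def adjusted_outcome(post, then_pre):
    # "True" program outcome under the then-test approach: post-test minus retrospective pre-test.
    return post - then_pre

# Hypothetical respondent: pre = 3.0, post = 4.5, retrospective pre-test = 2.5
pre, post, then_pre = 3.0, 4.5, 2.5
print(response_shift(pre, then_pre))    # 0.5 -> respondent recalibrated downward
print(adjusted_outcome(post, then_pre)) # 2.0 -> larger than the conventional change score of 1.5

In this toy case the conventional change score (post-test minus actual pre-test) is 1.5, while the then-test change score is 2.0, which is exactly the kind of divergence the study design is meant to probe.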

5 The retrospective pre-test Collected after an intervention, generally in close proximity to the post-test. How good (i.e. valid, reliable) are retrospective pre-test data? Past research has generally focused on comparing the retrospective pre-test with the actual pre-test; however, only a few studies tested the influence of the scores on each other, and none tested the psychometric performance of retrospective pre-tests.

6 Study aim 1) To explore the influence of posing retrospective pre-test questions on ratings of post-tests 2) To explore whether other types of questions (i.e. transition questions) influence post-tests

7 Research design Setting: chronic disease self-management courses Randomised design: three versions of the Health Education Impact Questionnaire (heiQ) were distributed at post-test (randomised within courses)

8 Research design Randomised design – Version I 1) post-test ONLY (n=331) (6-point Likert scale: strongly disagree to strongly agree)

9 Group I: post-test ONLY

10 Research design Randomised design – Version II 1) post-test ONLY (n=331) (6-point Likert scale: strongly disagree to strongly agree) 2) post-test + transition questions (n=304) (transition Qs: 5-point response scale: much worse to much better)

11 Group II: post-test + transition question

12 Research design Randomised design – Version III 1) post-test ONLY (n=331) (6-point Likert scale: strongly disagree to strongly agree) 2) post-test + transition questions (n=304) (transition Qs: 5-point response scale: much worse to much better) 3) post-test + retrospective pre-test (n=314) (both 6-point Likert scale: strongly disagree to strongly agree)

13 Group III: post-test + retro pre-test
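
As a rough illustration of the within-course randomisation described on slides 7 to 13, the sketch below assigns the three questionnaire versions at random within a single course. The version labels, participant IDs, and allocation procedure are assumptions for illustration; the study's actual allocation mechanism is not described on the slides.

import random

# Illustrative sketch: randomise the three post-test questionnaire versions
# within one course, roughly balancing the number of each version.
VERSIONS = ["post_only", "post_plus_transition", "post_plus_retro_pre"]

def assign_versions(participant_ids, seed=None):
    # Repeat the version list to cover all participants, then shuffle within the course.
    rng = random.Random(seed)
    versions = (VERSIONS * (len(participant_ids) // len(VERSIONS) + 1))[: len(participant_ids)]
    rng.shuffle(versions)
    return dict(zip(participant_ids, versions))

# Example: one course with six participants (hypothetical IDs)
print(assign_versions(["p1", "p2", "p3", "p4", "p5", "p6"], seed=42))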

14 Results Across the three randomised groups there were no significant differences in demographic characteristics or pre-test scores (i.e. scores collected before the intervention). The randomisation worked.

15 Results (cont.) Posing transition questions in addition to post-test questions had hardly any influence on post-test levels (Group II). In contrast, posing retrospective pre-test questions after an intervention had a significant influence on ratings of post-tests in six of the eight heiQ subscales: Post-test ONLY (Group I) mean post-test: 4.76; Post-test + retrospective pre-test (Group III) mean post-test: 4.96 (on the 6-point Likert scale).

16 [Table: mean (SD) scores for Group I, Group II and Group III]
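
The group comparison summarised above could be reproduced along the following lines. This is a minimal sketch under stated assumptions: the slides do not specify the statistical test, so a one-way ANOVA across the three groups is used here as one plausible choice, and the data frame, column names, and scores are toy values for illustration only, not the study's data.

import pandas as pd
from scipy import stats

# Toy data: post-test scores by randomised group (illustration only).
df = pd.DataFrame({
    "group": ["I", "I", "I", "II", "II", "II", "III", "III", "III"],
    "post_test": [4.7, 4.8, 4.75, 4.72, 4.8, 4.78, 4.95, 5.0, 4.9],
})

# Compare mean post-test scores across the three groups with a one-way ANOVA.
groups = [g["post_test"].values for _, g in df.groupby("group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")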

17 Conclusions Asking retrospective pre-test questions at post-test has a significant influence on the ratings of post-test levels. The influence was so substantial that it led to different conclusions about program effectiveness. It remains uncertain whether the application of retrospective pre-tests provides a more or less accurate reflection of the impact of chronic disease self-management programs.

18 Conclusions It remains uncertain whether the application of retrospective pre-tests provides a more or less accurate reflection of the impact of chronic disease self-management programs. However, the psychometric properties of retrospective pre-test data appear to be substantially weaker than those of the classic pre-test. The classic pre-test / post-test design may therefore be the more valid approach to evaluating self-management programs.

19 Discussion Possible explanations: 1. The cognitive task may have triggered distorted responses, consistent with established theories: effort justification (Hill & Betz, 2005), implicit theory of change (Ross, 1989), social desirability (Crowne & Marlowe, 1964). 2. The task of remembering pre-test levels might have been too complex for some respondents, making these data less reliable.

20 Discussion (cont.) 3. It remains to be shown what people think while responding to questionnaires; qualitative research into response processes is essential to help understand and interpret self-report data.

21 Thank you