



1 Brad Cousins University of Ottawa October 2010

2 Evaluation design options
Data quality assurance
– validity/credibility
– reliability/dependability
Instrument development and validation
Data collection strategies

3 Comparison groups?
– Yes, no, hybrid
– Black box, grey box, glass box
Data collected over time?
– Yes, no, hybrid
Mixed methods
– Quantitative, qualitative, simultaneous, sequential

4 One-shot, post only
– X O1
Comparative, post only
– X O1
– O2
Randomized control trial
– R X O1
– R O2
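The R/X/O notation (R = random assignment, X = treatment, O = observation) can be made concrete with a small sketch. This is an illustrative helper, not from the slides: it assumes a simple two-arm trial with a 50/50 split, as in the R X O1 / R O2 design.

```python
import random

def randomize(participants, seed=None):
    # Randomly split participants into treatment (R X O1) and
    # control (R O2) arms; a 50/50 split is assumed for illustration.
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

treatment, control = randomize(range(20), seed=42)
print(len(treatment), len(control))  # 10 10
```

Randomization is what licenses the attribution claim discussed under internal validity: with random assignment, the two arms differ only by chance before X is applied.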

5 Time series design
– O1 O2 O3 O4 X O5 O6 O7 O8
Pre-post comparative group design
– O1 X O3
– O2 O4
Delayed treatment group design
– O1 X O3 O5
– O2 O4 X O6
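To illustrate the time-series design, here is a minimal sketch of the simplest possible analysis for an O1..O4 X O5..O8 series: the shift in mean level after the intervention X. The function and data are hypothetical; a real interrupted-time-series analysis would also model trend and autocorrelation.

```python
def pre_post_shift(series, x_index):
    # Difference in mean level after vs. before the intervention X
    # in an O1 O2 O3 O4 X O5 O6 O7 O8 series (x_index = position of X).
    pre, post = series[:x_index], series[x_index:]
    return sum(post) / len(post) - sum(pre) / len(pre)

# Four observations before X, four after (illustrative data).
print(pre_post_shift([10, 11, 9, 10, 14, 15, 13, 14], 4))  # 4.0
```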

6 Major concepts: VALIDITY/CREDIBILITY
Key points
– A matter of degree on a continuum
– Describes the results or inferences, NOT the instrument
– Depends on both the instrument and the process
– Involves evidence and judgment
Internal validity/credibility
– Attribution: how confident can we be that the observed effects are attributable to the intervention?

7 Actual but non-program-related changes in participants
– Maturation
– History
Apparent changes dependent on who was observed
– Selection
– Attrition
– Regression
Changes related to methods of obtaining observations
– Testing
– Instrumentation


9 General principles
– Build on existing instruments and resources
– Ensure validity: face, content, construct
– Ensure reliability (eliminate ambiguity)
– Consider task demands
– Obtrusive vs. unobtrusive measures
– Use a conceptual framework as a guide
– Solicit demographic information at the end
– Pilot test

10 Scales: nominal, ordinal, interval
Selected response
– Multiple choice (tests)
– Fixed option: check all that apply, or check ONE option only
– Likert-type rating scales
   Frequency (observation): N R S F A (never, rarely, sometimes, frequently, always)
   Agreement (opinion): SD D A SA (strongly disagree, disagree, agree, strongly agree)
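Agreement scales like the one above are usually coded numerically for analysis. A minimal sketch, assuming the common SD=1..SA=4 coding (the mapping is an assumption, and whether Likert scores may be averaged as interval data is itself a judgment call, as the scale list implies):

```python
# Hypothetical numeric coding for the agreement scale on the slide.
AGREEMENT = {"SD": 1, "D": 2, "A": 3, "SA": 4}

def mean_score(responses):
    # Convert agreement labels to scores and average them.
    values = [AGREEMENT[r] for r in responses]
    return sum(values) / len(values)

print(mean_score(["SA", "A", "A", "D"]))  # 3.0
```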

11 Selected response (cont.)
– Rank-ordered preferences (avoid)
– Paired comparison
Constructed response
– Open-ended comments: structured or unstructured
– If 'other', ask the respondent to specify

12 Data collection formats
– Hardcopy: data-entry format
– Hardcopy: scannable format
– Internet format
Over-specify instructions
Use bold/italics and font variation judiciously
Put response options on the right-hand side
Stapling: booklet > upper left > left margin
Determine length judiciously (8 pages max)

13 Review of purpose / expectations
Space questions to permit response recording
Questions vs. prompts
Use of quantification

14 Ethics
– Ethics review board procedures/protocols
– Letters of informed consent
   Purpose
   How/why selected
   Demands / right to refuse
   Confidential vs. anonymous
   Contact information
– Issues and tensions

15 Interview tips
– Small talk: set the tone
– Audio recording: ask permission
– Develop shorthand or symbolic field-note skills
– Permit some wandering, but keep on track
– Minimize redundancy

16 Quantitative: sample for representation
– proportionate to the population
– random
Qualitative: sample to maximize variation
– Purposive sampling, based on prior knowledge of the case(s)
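The "proportionate to the population, random" idea can be sketched as proportionate stratified random sampling. The strata names, sizes, and function name below are illustrative only; note that rounding can shift a unit or two between strata.

```python
import random

def proportionate_sample(strata, total_n, seed=None):
    # Draw a random sample whose per-stratum sizes are proportionate
    # to the population. `strata` maps stratum name -> list of members.
    rng = random.Random(seed)
    population = sum(len(members) for members in strata.values())
    sample = {}
    for name, members in strata.items():
        k = round(total_n * len(members) / population)
        sample[name] = rng.sample(members, k)
    return sample

# Illustrative population: 600 urban, 400 rural members.
strata = {"urban": list(range(600)), "rural": list(range(400))}
sample = proportionate_sample(strata, total_n=100, seed=1)
print(len(sample["urban"]), len(sample["rural"]))  # 60 40
```

Purposive sampling, by contrast, is deliberately non-random: cases are chosen for the variation they add, so it has no analogous mechanical recipe.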

17 Colton, D., & Covert, R. W. (2007). Designing and constructing instruments for social research and evaluation. San Francisco: John Wiley and Sons, Inc. Creswell, J. W., & Miller, D. L. (2000). Determining validity in qualitative inquiry. Theory Into Practice, 39(3), 124-130. Fraenkel, J. R., & Wallen, N. E. (2003). How to design and evaluate research in education. New York: McGraw-Hill. McMillan, J. H. (2004). Educational research (4th ed.). Toronto: Pearson Allyn and Bacon, pp. 172-174. Shultz, K. S., & Whitney, D. J. (2005). Measurement theory in action: Case studies and exercises. Thousand Oaks, CA: SAGE Publications.


