Presentation on theme: "Concept of Measurement"— Presentation transcript:
1 Concept of Measurement The ability to demonstrate change or a relationship and to communicate those changes to others. Measurement describes the quality or quantity of an existing variable, e.g. ROM and strength.
2 Concept of Measurement We evaluate a patient's condition and response to treatment; we measure change. Measurement is the process of assigning numerals to objects to represent quantitative characteristics according to certain rules.
3 Variable: a characteristic that can be manipulated or observed and that can take on different values, either quantitatively or qualitatively. The ability to measure a variable depends on one's ability to define it.
4 Continuous Variable: can take on any value along a continuum within a defined range (e.g. ROM of 50°). Discrete Variable: described only in whole numbers (e.g. HR). Construct: an abstract concept invented to represent unmeasurable behaviors or ideas.
7 Ordinal - numbers indicate rank order of observations, e.g. MMT grades, pain scales, functional status.
8 Interval - equal intervals between numbers, but not related to a true zero and therefore not representing an absolute quantity, e.g. IQ, temperature in °F, calendar years.
9 Ratio - numbers represent units with equal intervals, measured from a true zero, e.g. weight, strength, BP.
10 Reliability - the extent to which a measurement is consistent and free from error (repeatability). The usefulness of a measurement in clinical research and decision making depends on the extent to which the therapist can rely on the data as an accurate and meaningful indicator of a behavior or attribute.
12 Measurement Errors Measurements are rarely perfectly reliable. Error in measurement: X = T + E, where X is the observed score, T the true score, and E the error. Errors can be systematic (predictable errors of measurement) or random (due to chance, unpredictable).
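The X = T + E model can be sketched in a few lines of Python. This is an illustration only: the true score, the systematic offset, and the error spread below are invented values, not from the source.

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

true_score = 50.0        # T: the subject's actual (unknowable) value, hypothetical
systematic_error = 2.0   # e.g. an instrument that always reads 2 units high
n_trials = 5

# Each observed score X = T + E, where E combines a predictable
# systematic component and an unpredictable random component.
observed = [true_score + systematic_error + random.gauss(0, 1.5)
            for _ in range(n_trials)]

mean_x = sum(observed) / n_trials
# Over repeated trials the random errors tend to average out,
# so the mean observed score drifts toward T + systematic error.
print(round(mean_x, 1))
```

Note how averaging repeated trials reduces random error but leaves systematic error untouched, which is why systematic error must be controlled at the protocol level.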
13 Sources of Measurement Error Rater error, inaccuracies in the measuring instruments, and variability of the characteristic being measured.
14 Sources of Measurement Error Development of a testing instrument involves a specific protocol that maximizes the reliability of the instrument. Errors are identified and then controlled or eliminated through careful planning, clear operational definitions, and inspection of equipment.
15 Estimate of Reliability Reliability is estimated as the ratio of true score variance (T) to total variance, i.e. true score variance plus error variance: reliability = T / (T + E). As error variance decreases, reliability increases; 1.00 indicates perfect reliability and .00 indicates no reliability.
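The ratio T / (T + E) can be checked with a small function; the variance values below are arbitrary examples, not data from the source.

```python
def reliability(true_variance: float, error_variance: float) -> float:
    """Reliability coefficient: true score variance over total variance."""
    return true_variance / (true_variance + error_variance)

# Mostly true-score variance -> high reliability
print(reliability(9.0, 1.0))   # 0.9
# Error variance as large as the true-score variance -> 0.5
print(reliability(9.0, 9.0))   # 0.5
# No true-score variance -> no reliability
print(reliability(0.0, 9.0))   # 0.0
```

The three calls trace the slide's endpoints: as error variance shrinks toward zero the coefficient rises toward 1.00, and with no true-score variance it falls to .00.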
16 Correlation: the degree of association between two sets of data or variables, e.g. the correlation between height and weight. It is not cause-and-effect research; it states only that two variables (X, Y) are related, with no true manipulation of variables. Acceptable levels of positive and negative correlation, with a level of significance: 1.0 perfect, .75 good, .50 poor.
17 Test-retest reliability establishes that an instrument is capable of measuring a variable with consistency; the analysis yields a test-retest reliability coefficient. Affected by: 1. Testing effects - practice or carryover effects of the test. 2. Rater bias - the same rater can be influenced by memory of the first score; controlled by blinding the tester.
18 Also affected by: 3. The test-retest interval - the time between tests. 4. Carryover and testing effects. Rater reliability - the rater is part of the measurement system and, in some cases, is the actual instrument.
19 Intrarater Reliability - the stability of data recorded by one individual across two or more trials. Interrater Reliability - the variation between two or more raters who measure the same group of subjects.
20 Validity of Measurement Validity concerns the extent to which an instrument measures what it is intended to measure, and implies that the measurement is relatively free from error. A valid test must also be reliable: low reliability means low validity, but high reliability does not automatically mean validity. Validity helps us make inferences about variables from relevant observable behaviors or responses; these inferences go beyond the simple values assigned to them.
21 Specificity of Validity Face Validity - the instrument appears to test what it is supposed to test. Content Validity - the items that make up an instrument adequately sample the universe of content that defines the variable being measured.
22 Specificity of Validity Criterion-related validity - indicates that the outcome of one instrument, the target test, can be used as a substitute for an established gold-standard criterion test; it can be concurrent or predictive.
23 Specificity of Validity Concurrent validity - established when the two measures are taken at relatively the same time. Predictive validity - establishes that the outcome of the target test can be used to predict a future criterion score. Prescriptive validity - establishes that the interpretation of a measurement is appropriate for determining an effective intervention.
24 Specificity of Validity Construct validity - establishes the ability of an instrument to measure an abstract construct and the degree to which the instrument reflects the theoretical components of the construct.
25 Evaluating Diagnostic Procedures A diagnostic test is used to screen for the presence or absence of a disease or abnormal condition. Test results may be dichotomous, categorical, or continuous.
26 Evaluating Diagnostic Procedures Sensitivity - the ability of the test to obtain a positive result when the target condition is present.
27 Evaluating Diagnostic Procedures Specificity - the ability of the test to obtain a negative result when the condition is truly absent.
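Both definitions reduce to simple ratios over a 2x2 table of test results versus true condition status. The counts below are hypothetical, chosen only to make the arithmetic visible.

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    # Of everyone who truly has the condition, what fraction tests positive?
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    # Of everyone truly free of the condition, what fraction tests negative?
    return true_neg / (true_neg + false_pos)

# Hypothetical 2x2 table: 90 true positives, 10 false negatives,
# 80 true negatives, 20 false positives.
print(sensitivity(90, 10))   # 0.9
print(specificity(80, 20))   # 0.8
```

A highly sensitive test rarely misses the condition when it is present; a highly specific test rarely flags it when it is absent.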
28 Measuring Change The goal of treatment is to effect a positive change. The difference between the outcome score and the initial score is the change, or gain, score, which is used to analyze the effect of a treatment or intervention.
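The gain score is a simple difference; the ROM values below are hypothetical, used only to show the calculation.

```python
initial_rom = 95    # degrees of knee flexion at the initial measurement (hypothetical)
outcome_rom = 120   # degrees after the course of treatment (hypothetical)

# Change (gain) score: outcome minus initial.
change_score = outcome_rom - initial_rom
print(change_score)   # 25
```

A positive change score indicates improvement on this measure; comparing change scores across groups is what supports inferences about treatment effects.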
29 Goals Measure change in an individual's performance or condition; measure differences between individuals in the amount of change; seek to identify factors that contribute to a good response; and draw inferences about treatment effects by looking at group differences.
30 Validity of change scores depends on: level of measurement, reliability, stability, and linearity.