Criteria for selection of a data collection instrument

1. Practicality of the instrument
- Concerns the instrument's cost and its appropriateness for the study population.
- How much will the instrument cost?
- How long will it take to administer the instrument?
- Will the population have the physical and mental abilities to complete the instrument?

- Are special motor skills or language abilities required of subjects?
- Does the researcher require special training to administer or score the instrument? If so, is this training available?
- Are funds available, and is someone available to analyze the data?

2. Reliability of the instrument
- Concerns the instrument's consistency and stability.
- Reliability is the degree of consistency with which the instrument measures the target attribute.
There are three different types of reliability:
A. Stability reliability: refers to an instrument's consistency over time.

- The same scores are obtained when the instrument is used with the same subjects twice. The correlation coefficient (the coefficient of stability) ranges between 0.00 and +1.00.
B. Equivalence reliability: concerns the degree to which two different forms of an instrument obtain the same results, or two or more observers using a single instrument obtain the same results. Both can be estimated with a correlation coefficient, as sketched below.
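As an illustration, here is a minimal sketch (with invented scores, assuming NumPy is available) of computing a coefficient of stability as the correlation between two administrations of the same instrument:

```python
# A sketch of stability (test-retest) reliability: correlate the scores
# from two administrations of the same instrument to the same subjects.
# All scores below are invented for illustration.
import numpy as np

time1 = np.array([12, 15, 9, 20, 17, 11, 14, 18])   # first administration
time2 = np.array([13, 14, 10, 19, 18, 10, 15, 17])  # same subjects, later

# np.corrcoef returns the 2x2 correlation matrix; entry [0, 1] is
# Pearson's r, which here serves as the coefficient of stability.
stability = np.corrcoef(time1, time2)[0, 1]
print(f"Coefficient of stability: {stability:.2f}")  # near 1.00 = consistent over time
```

Equivalence reliability can be estimated the same way, by correlating scores from two alternate forms of the instrument, or from two observers rating the same subjects, in place of the two time points.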

C. Internal consistency reliability: the extent to which all items on an instrument measure the same variable (a common index, Cronbach's alpha, is sketched below).
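Here is a minimal sketch of Cronbach's alpha with an invented matrix of item scores (rows are subjects, columns are items), again assuming NumPy:

```python
# A sketch of internal consistency via Cronbach's alpha:
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
import numpy as np

scores = np.array([  # rows = subjects, columns = items (invented data)
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
])

k = scores.shape[1]                          # number of items
item_vars = scores.var(axis=0, ddof=1)       # variance of each item across subjects
total_var = scores.sum(axis=1).var(ddof=1)   # variance of subjects' total scores
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")      # near 1.00: items measure the same variable
```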

3. Validity of the instrument
- Refers to the degree to which an instrument measures what it is supposed to be measuring.
- Validity may be established through the use of a panel of experts or through an examination of the existing literature on the topic.

The relationship between reliability and validity:
- The two are closely associated.
- Reliability is usually considered first, because reliability is a necessary condition for validity: an instrument cannot be valid unless it is reliable.
- The reliability of an instrument, however, tells nothing about its degree of validity.
- An instrument can be very reliable and have no validity; measurement that is not reliable cannot be valid. The sketch below illustrates this distinction.
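To make the point concrete, here is a minimal sketch with simulated data: an instrument that consistently tracks the wrong attribute is highly reliable yet has essentially no validity. All numbers are invented for illustration.

```python
# Reliability without validity: the instrument repeats its readings almost
# perfectly (high test-retest correlation) but tracks an attribute that is
# unrelated to the one we intend to measure (near-zero validity).
import numpy as np

rng = np.random.default_rng(0)
intended = rng.uniform(50, 90, size=30)     # attribute we intend to measure
unrelated = rng.uniform(150, 190, size=30)  # attribute the instrument actually tracks

reading1 = unrelated + rng.normal(0, 0.5, size=30)  # first administration
reading2 = unrelated + rng.normal(0, 0.5, size=30)  # second administration

r = lambda x, y: np.corrcoef(x, y)[0, 1]
print(f"Reliability (test-retest):      {r(reading1, reading2):.2f}")  # close to 1.00
print(f"Validity (vs. intended target): {r(reading1, intended):.2f}")  # close to 0.00
```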

Guidelines for evaluating data quality:
1. If operational definitions (or scoring procedures) are specified, do they clearly indicate the rules of measurement? Do the rules seem reasonable?

2. Does the report provide any evidence of the reliability of the data? Does the evidence come from the research sample itself, or is it based on other studies? If the latter, is it reasonable to believe that data quality would be the same for the research sample as for the reliability sample?

3. If there is evidence of reliability, which estimation method was used? Was this method appropriate? Should an alternative or additional method of reliability appraisal have been used? Is the reliability adequate?
4. If the report does not provide evidence of the reliability of the measures, are there any indications of efforts the researcher made to minimize errors of measurement?

5. Does the report offer any evidence of the validity of the measures? Does the evidence come from the research sample itself, or is it based on other studies? If the latter, is it reasonable to believe that data quality would be the same for the research sample as for the validation sample?

6. If there is evidence of validity, which validity approach was used? Was this approach appropriate? Should an alternative or additional method of validation have been used? Does the validity of the instrument appear to be adequate?
7. Were the research hypotheses supported? If not, might data quality play a role in the failure of the hypotheses to be confirmed?

Presentation and discussion of study results

Findings and discussion. Each research study report should contain the following elements:
1. Findings of the study: the presentation of the results in the form of empirical data or facts. The reporting of these data is an objective process; this is not the place to express opinions or reactions to the data.

- Findings are written in the past tense.
Presentation of findings:
A. Narrative presentation of findings
B. Tables
2. Discussion of findings: a much more subjective section of a research report than the presentation of the findings.

- The discussion allows the researcher to make interpretations of the findings; the findings are interpreted in light of the theoretical framework and within the context of the literature review.
- The researcher discusses aspects of the results that are consistent with previous research and theoretical explanations, as well as those that are not in agreement.
- The researcher also reports problems that occurred during the study that may have influenced the results.

3. Discussion of study hypotheses
The results of hypothesis testing can fall into one of three categories (illustrated in the sketch below):
A. The null hypothesis is not rejected.
B. The null hypothesis is rejected and the research hypothesis is supported.
C. The null hypothesis is rejected, but the results are in the opposite direction from those predicted by the research hypothesis.
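As a minimal sketch of these three outcomes, the decision logic for a directional research hypothesis might look like the following (invented data; SciPy is assumed to be available):

```python
# Classify the outcome of an independent-samples t-test into the three
# categories above, for the research hypothesis "treated > control".
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(50, 10, size=40)  # invented control-group scores
treated = rng.normal(56, 10, size=40)  # invented treatment-group scores

t, p = stats.ttest_ind(treated, control)
alpha = 0.05
if p >= alpha:
    print("A: the null hypothesis is not rejected")
elif t > 0:
    print("B: null rejected; the research hypothesis is supported")
else:
    print("C: null rejected, but in the direction opposite to the prediction")
```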

- Negative results can be as important as positive results.
4. Conclusions: the researcher's attempt to show what knowledge has been gained by the study, and also an attempt to generalize the findings. In writing conclusions, return to your study problem, purpose, and hypothesis:

- Was the study problem answered? Was the research purpose met? Was the research hypothesis supported? Was the theoretical framework supported?
5. Implications: the implications of a study contain the "shoulds" that result from the study, for example, "Nurse educators should…" or "Nurse clinicians should…".

6. Recommendations
7. Consideration of study limitations in future research. Some of the most common recommendations made about limitations are:
(a) Alteration in the sample
(b) Alteration in the instrument
(c) Control of variables
(d) Change in methodology