I/O Psychology Research Methods


I/O Psychology Research Methods

What is Science? Science: an approach that involves the understanding, prediction, and control of some phenomenon of interest. Scientific knowledge is: logical and concerned with understanding; empirical; communicable and precise; probabilistic (we can disprove, not prove); and objective/disinterested.

Goals of Science Example: we want to study absenteeism in an organization. Description: What is the current state of affairs? Prediction: What will happen in the future? Explanation: What is the cause of the phenomenon we’re interested in?

What is “research”? The systematic study of phenomena according to scientific principles: a set of procedures used to obtain empirical and verifiable information from which we then draw informed, educated conclusions.

The Empirical Research Process 1. Statement of the Problem 2. Design of the Research Study 3. Measurement of Variables 4. Analysis of Data 5. Interpretation/Conclusions

Step 1: Statement of the Problem Theory: a statement that explains the relationship among phenomena; it gives us a framework within which to conduct research. “There is nothing quite so practical as a good theory.” (Kurt Lewin) Two approaches: inductive – theory building; use data to derive theory. Deductive – theory testing; start with a theory and collect data to test it.

Step 1: Statement of the Problem Hypothesis: a testable statement about the status of a variable or the relationship among multiple variables. Must be falsifiable!

Step 1: Statement of the Problem Types of variables. Independent variables (IV): variables that are manipulated by the researcher. Dependent variables (DV): the outcomes of interest. (In non-experimental work these are often called predictors and criteria.) Confounding variables: uncontrolled extraneous variables that permit alternative explanations for the results of a study.

Moderator Variable A special type of IV that influences the relationship between two other variables: the moderator M affects the strength of the X → Y relationship. Example: gender and hiring rate, with M = type of job; the relationship between gender and hiring rate may change depending on the type of job individuals are applying for.
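A minimal sketch of how moderation is commonly tested: add an X × M interaction term to a regression model and check whether its coefficient differs from zero. The variable names and simulated data below are illustrative assumptions, not from the slides.

```python
import numpy as np

# Hypothetical illustration of moderation: the effect of X on Y depends on
# the level of a moderator M.  We test this by adding an X*M interaction
# term to an ordinary least-squares regression.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)          # predictor (e.g., an applicant characteristic)
m = rng.integers(0, 2, size=n)  # moderator (e.g., job type: 0 or 1)

# Simulated outcome: the slope of X is 0.2 for one job type and 0.8 for the other.
y = 0.2 * x + 0.6 * x * m + rng.normal(scale=0.5, size=n)

# Design matrix: intercept, X, M, and the interaction X*M.
X = np.column_stack([np.ones(n), x, m, x * m])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)

print("b0, b_x, b_m, b_interaction =", np.round(coefs, 2))
# A non-zero interaction coefficient indicates moderation:
# the X -> Y slope changes with the level of M.
```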

Mediator Variable A variable that accounts for the relation between the IV and the DV. Mediation implies a causal chain in which the IV causes the mediator, which in turn causes the DV (IV → MED → DV). Example: IV = negative feedback, MED = negative thoughts, DV = willingness to participate.
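A minimal sketch of the classic mediation logic under the same kind of illustrative assumptions: estimate the IV → mediator path (a) and the mediator → DV path controlling for the IV (b); the indirect effect is a × b. The names and simulated data are invented.

```python
import numpy as np

# Hypothetical sketch of mediation (IV -> MED -> DV).
rng = np.random.default_rng(1)
n = 300
iv = rng.normal(size=n)                          # e.g., negative feedback
med = 0.6 * iv + rng.normal(scale=0.5, size=n)   # e.g., negative thoughts
dv = 0.5 * med + rng.normal(scale=0.5, size=n)   # e.g., willingness to participate

def ols(y, X):
    """Return least-squares coefficients for y = X b."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)
a = ols(med, np.column_stack([ones, iv]))[1]       # path a: IV -> MED
full = ols(dv, np.column_stack([ones, iv, med]))   # DV regressed on IV and MED
b, c_prime = full[2], full[1]                      # path b and the direct effect

print(f"indirect effect (a*b) = {a * b:.2f}, direct effect = {c_prime:.2f}")
# A sizable indirect effect with a small direct effect is consistent with mediation.
```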

Moderator vs. Mediator A moderator variable is one that influences the strength of a relationship between two other variables. A mediator variable is one that explains the relationship between the two other variables.

Example You are an I/O psychologist working for an insurance company. You want to assess which of two training methods is more effective for training new secretaries. You give one group of secretaries on-the-job training and a booklet to study at home. You give the second group on-the-job training and have them watch a 30-minute video.

Step 2: Research Design A research design is the structure or architecture for the study: a plan for how to treat variables that can influence results so as to rule out alternative interpretations. Primary research methods: experimental (laboratory vs. field research), quasi-experimental, and non-experimental (observational, survey).

Step 2: Research Design Secondary research methods. Meta-analysis: a statistical method for combining and analyzing the results from many studies to draw a general conclusion about relationships among variables (p. 61). Qualitative research methods: rely on observation, interviews, case studies, and analysis of diaries to produce narrative descriptions of events or processes.
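As an illustration of the core meta-analytic idea, the sketch below combines correlations from several studies into a single estimate by weighting each study by its sample size. The study results are invented; real meta-analyses typically also correct for artifacts such as unreliability and range restriction.

```python
import numpy as np

# Illustrative sketch of combining effect sizes across studies.
r = np.array([0.25, 0.40, 0.15, 0.32])   # correlations reported by four (made-up) studies
n = np.array([120, 80, 300, 150])        # sample sizes of those studies

# Weight each study's correlation by its sample size.
weighted_mean_r = np.sum(n * r) / np.sum(n)
print(f"sample-size-weighted mean correlation = {weighted_mean_r:.2f}")
```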

Evaluating Research Design Internal validity (control): Does X cause Y? Lab studies eliminate distracting variables through experimental control; using statistical techniques to control for the influence of certain variables is statistical control. External validity (generalizability): Does the relation of X and Y hold in other settings and with other participants and stimuli?

Threats to Internal Validity History, instrumentation, selection, maturation, mortality/attrition, testing, experimenter bias, and awareness of being a subject.

Step 3: Measurement Goal: quantify the IV and DV. Psychological measurement is the process of quantifying variables (called constructs): “the process of assigning numerical values to represent individual differences, that is, variations among individuals on the attribute of interest.” A “measure” is any mechanism, procedure, or tool that purports to translate attribute differences into numerical values.

Step 3: Measurement Two classes of measured variables: categorical (or qualitative) variables differ in type but not amount; continuous (or quantitative) variables differ in amount.

Step 4: Data Analysis Statistics are what we use to summarize relationships among variables and to estimate the odds that they reflect more than mere chance. Descriptive statistics: summarize, organize, and describe a sample of data. Inferential statistics: used to make inferences from sample data to a larger population. Key topic: distributions.

Descriptive Statistics Measures of central tendency: mean, median, mode. Measures of variability: range, variance, SD.
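A quick sketch of these descriptive statistics using Python's standard library; the absenteeism numbers are invented for illustration.

```python
import statistics as stats

# Invented data: days absent for ten employees.
days_absent = [2, 0, 5, 3, 2, 8, 1, 2, 4, 3]

print("mean     =", stats.mean(days_absent))
print("median   =", stats.median(days_absent))
print("mode     =", stats.mode(days_absent))
print("range    =", max(days_absent) - min(days_absent))
print("variance =", round(stats.variance(days_absent), 2))  # sample variance
print("SD       =", round(stats.stdev(days_absent), 2))     # sample standard deviation
```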

Differences in Variance [Figure: normal distributions illustrating high variance vs. low variance.]

Inferential Statistics Compare a hypothesis to an alternative. Statistical significance: the likelihood that the observed difference would be obtained if the null hypothesis were true. Statistical power: the likelihood of finding a statistically significant difference when a true difference exists.
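A small sketch of an inferential test applied to the earlier training example: an independent-samples t-test comparing the booklet and video groups. The performance scores are invented, and SciPy is assumed to be available.

```python
from scipy import stats

# Invented post-training performance scores for the two groups.
booklet_group = [72, 68, 75, 70, 66, 74, 71, 69]
video_group   = [78, 74, 80, 73, 77, 79, 75, 76]

t, p = stats.ttest_ind(booklet_group, video_group)
print(f"t = {t:.2f}, p = {p:.4f}")
# If p falls below the chosen alpha level (commonly .05), the group
# difference is declared statistically significant.
```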

Correlation Correlation Correlation Used to assess the relationship between 2 variables Used to assess the relationship between 2 variables Represented by the correlation coefficient “r” Represented by the correlation coefficient “r” r can take on values from –1 to +1 r can take on values from –1 to +1 Size denotes the magnitude of the relationship Size denotes the magnitude of the relationship 0 means no relationship 0 means no relationship

Correlation and Regression Topics: scatterplots, the regression line, linear vs. non-linear relationships, multiple correlation, and correlation vs. causation.

Prediction of the DV with One IV Correlations allow us to make predictions. [Figure: predicting the DV from the IV with a regression line.]
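An illustrative sketch of prediction with one IV via simple linear regression; the selection-test scores and performance ratings below are invented.

```python
import numpy as np

# Invented data: selection-test scores (IV) and later job performance (DV).
test_score  = np.array([55, 60, 65, 70, 75, 80, 85, 90])
performance = np.array([3.1, 3.0, 3.6, 3.8, 4.0, 4.3, 4.2, 4.7])

# Fit the regression line DV = intercept + slope * IV.
slope, intercept = np.polyfit(test_score, performance, deg=1)

# Predict performance for a new applicant scoring 72 on the test.
predicted = intercept + slope * 72
print(f"predicted performance = {predicted:.2f}")
```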

Interpretation: Evaluating Measures How do we determine the usefulness of the information gathered from our measures? The answer: reliability evidence and validity evidence.

Interpretation: Evaluating Measures Reliability: the consistency or stability of a measure; a measure should yield a similar score each time it is given. We can get a reliable measure by reducing errors of measurement: any factor (random factors, practice effects, etc.) that affects obtained scores but is not related to the thing we want to measure.

Evaluating Measures: Reliability Test-retest (index of stability). Method: give the same test on two occasions and correlate the two sets of scores (the coefficient of stability). Error: anything that differentially influences scores across time for the same test. Issue: how long should the time interval be? Limitations: not good for tests that are supposed to assess change; not good for attributes that change quickly (e.g., mood); retesting is difficult and expensive; memory/practice effects are likely.
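A minimal sketch of a test-retest estimate: correlate scores from two administrations of the same measure. The scores below are invented.

```python
import numpy as np

# Invented scores from the same eight people tested on two occasions.
time1 = np.array([24, 30, 18, 27, 22, 35, 29, 26])
time2 = np.array([25, 28, 20, 27, 21, 34, 31, 25])

coefficient_of_stability = np.corrcoef(time1, time2)[0, 1]
print(f"test-retest reliability = {coefficient_of_stability:.2f}")
```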

Evaluating Measures: Reliability Equivalent forms (index of equivalence). Method: give two versions of a test and correlate the scores (the coefficient of equivalence); this reflects the extent to which the two versions measure the same concept in the same way. Issues: are the two forms really parallel? how long should the interval be? Limitations: difficult and expensive; testing time; a unique estimate for each interval.

Evaluating Measures: Reliability Internal consistency reliability. Method: take a single test and look at how well the items on the test relate to each other. Split-half: similar to alternate forms (e.g., odd vs. even items). Cronbach’s alpha: mathematically equivalent to the average of all possible split-half estimates. Limitations: only usable for multiple-item tests; some “tests” are not designed to be homogeneous; doesn’t assess stability over time.
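A small sketch of the standard Cronbach's alpha formula, alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores); the response matrix below is invented.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented responses: six people answering a 4-item scale (1-5 ratings).
scores = [[4, 5, 4, 4],
          [2, 2, 3, 2],
          [5, 4, 5, 5],
          [3, 3, 3, 4],
          [1, 2, 1, 2],
          [4, 4, 5, 4]]
print(f"alpha = {cronbach_alpha(scores):.2f}")
```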

Evaluating Measures: Reliability Inter-rater reliability. Method: two different raters rate the same targets and the ratings are correlated; the correlation reflects the degree of consistency among the ratings. Issue: reliability doesn’t imply accuracy. Limitations: requires informed, trained raters; ratings are not a good way to measure many attributes.

Interpretation: Evaluating Measures Validity: the accuracy of inferences made based on data; whether a measure accurately and completely represents what was intended to be measured. Validity is not a property of the test; it is a property of the inferences we make from the test scores.

Evaluating Measures: Validity Criterion-related (predictive and concurrent), content-related, and construct-related. Reliability is a necessary but not sufficient condition for validity.

Content Validity The extent to which a predictor provides a representative sample of the thing we’re measuring. Example: the first exam, whose content covers history, research methods, criterion theory, job analysis, and measurement in selection. Evidence: evaluation by subject-matter experts (SMEs).

Criterion-Related Validity The extent to which a predictor relates to a criterion. Evidence: a correlation, called the validity coefficient; a good validity coefficient is around .3 to .4. Two forms: concurrent validity and predictive validity.

Construct Validity The extent to which a test is an accurate representation of the construct it is trying to measure. Construct validity results from the slow accumulation of evidence (multiple methods). Evidence: content validity and criterion-related validity can provide support for construct validity, along with convergent validity and divergent (discriminant) validity.
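An illustrative sketch of convergent and discriminant evidence: a hypothetical new measure should correlate strongly with an established measure of the same construct and weakly with a measure of an unrelated construct. All scores below are invented.

```python
import numpy as np

# Invented scores for eight people on three measures.
new_measure    = np.array([3.2, 4.1, 2.5, 4.8, 3.9, 2.2, 4.4, 3.0])  # new conscientiousness scale
established    = np.array([3.0, 4.3, 2.8, 4.6, 3.7, 2.5, 4.5, 3.1])  # established conscientiousness scale
verbal_ability = np.array([24,  31,  29,  22,  35,  27,  30,  26 ])  # unrelated construct

print("convergent r   =", round(np.corrcoef(new_measure, established)[0, 1], 2))
print("discriminant r =", round(np.corrcoef(new_measure, verbal_ability)[0, 1], 2))
# High convergent and low discriminant correlations support construct validity.
```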

Step 5: Conclusions From Research You are making inferences! What if your inferences seem “wrong”? Possible reasons: the theory is wrong; the information (data) is bad; bad measurement; bad research design; bad sample; the analysis was wrong.

Step 5: Conclusions From Research Research is a cumulative process. Dissemination: conference presentations and journal publications. Other considerations: boundary conditions, generalizability, causation, and serendipity.

Research Ethics Informed consent, the welfare of subjects, and conflicting obligations to the organization and to the participants.