5 Policies
- Laptops
- Breaks?
- Food
- Contacting me
- Etiquette
- Individual/group assignments
- ROE
- End-of-class cleanup
6 Why do we do educational research?
- Helps educators understand educational processes and make professional decisions
- Provides information to policy groups to assist them with mandated changes in education
- Serves the information needs of concerned public, professional, and private organizations
- Reviews and interprets accumulated empirical evidence
- Is readily available
- Includes educators in the field in research projects
Research Process and Design (Umbach)
7 More importantly, why is it important for you to learn how to read, evaluate, and design research?
8 Principles of Scientific Evidence-Based Inquiry
Adapted from the National Research Council.
- Definition: evidence-based inquiry is the search for knowledge using systematically gathered empirical data
- Principle 1: pose significant questions that can be investigated empirically
9 Principles of Scientific Evidence-Based Inquiry
- Principle 2: link research to relevant theory or a conceptual framework
- Principle 3: use methods that allow direct investigation of the research question
- Principle 4: provide a coherent and explicit chain of reasoning
10 Principles of Scientific Evidence-Based Inquiry
- Principle 5: replicate/generalize or extend across studies
- Principle 6: disclose research to encourage professional scrutiny and critique
11 Research Disciplines
- Education
- Sociology
- Psychology
- Policy
- History
- Biography
- Management
- Practice-based
- Pedagogy
- Linguistics
- etc.
12 Classifying research
- Types: exploratory, descriptive, explanatory, predictive
- Use: pure, applied, evaluative, action/practitioner
- Classifying: primary vs. secondary; theoretical vs. empirical; quantitative vs. qualitative (vs. mixed methods); inductive vs. deductive
To be discussed in the following slides...
13 Types of research: exploratory
- Generates new ideas, concepts, or hypotheses
- Little or no prior knowledge
- Looks for clues or basic facts, settings, and concerns
- Creates a general picture of conditions
- Formulates and focuses questions for future research
14 Types of research: descriptive
- Provides a detailed, highly accurate picture
- Creates a set of categories or classifies types
- Reports on the background or context of a situation
- Little attempt to explain the results
15 Types of research: explanatory, predictive
Explanatory:
- Explains why something happens
- Looks for causal relationships between concepts
- Elaborates and enriches a theory's explanation, or extends a theory to new issues or topics
- Supports or refutes an explanation or prediction
- Determines which of several explanations is best
Predictive:
- Forecasts future phenomena, based on findings suggested by explanatory research
16 Use of research: pure vs. applied
Pure, basic, or academic research:
- Adds to the body of knowledge
- Contributes to theory
- Focuses on issues of importance to researchers
Applied research:
- Focuses on issues of importance to society
- Helps to understand the nature and sources of human problems
- Used to make practical decisions
17 Use of research: summative vs. formative evaluation
Summative evaluation:
- Determines the effectiveness of an intervention
- Focuses on the goals of the intervention
Formative evaluation:
- Aims at improving an intervention
- Focuses on providing recommendations for changes
18 Use of research: action research
- Solves problems in a program, organisation, or community
- Focuses on empowering participants to solve issues themselves
19 Classifying research: primary vs. secondary
Primary:
- Involves the collection and analysis of original data
Secondary:
- Finds existing data and (re)analyses it, e.g. census data, participation surveys
20 Classifying research: theoretical vs. empirical
Theoretical:
- Generation of new ideas through analysing existing theory and explanations
Empirical:
- Generation of new ideas through the collection and analysis of data
21 Deductive reasoning
theory → hypothesis → test hypothesis → accept/reject theory
23 The Research Process
Identify topic → Review the literature → Identify concepts & theory → Clarify research problem → Research design → Collection of data → Analyse data → Draw conclusion
24 Research Process (M & S)
1. Select a general problem
2. Conduct a literature review (preliminary search, later expanded to an exhaustive review)
3. Select a specific problem, research question, or hypothesis
4. Decide design and methodology
5. Collect data
6. Analyze and present data (statistical tables, integrative diagrams)
7. Interpret findings
8. State a conclusion/generalization about the problem
25 The Research Process: identify topic
- Background to the problem's context
- Why is your problem important?
- Who will benefit?
- Who will use your conclusions: policy / practice / research (and why they will use them)
26 The Research Process: identify topic — dyslexia and memory
Eleni Sakellariou (PhD candidate)
- Background: dyslexia appears to be linked to various aspects of memory
- Importance: understanding how memory interacts with dyslexia will assist teachers in helping students
27 The Research Process: review the literature
- General definitions
- General discussion of your issue and related topics
- Specific research that is related to your topic
- Existing work on your topic: who, why, where, when, findings, shortcomings
- General conclusions about work done to date
28 The Research Process: review the literature
For each article/study, examine:
- The central purpose of the study
- Information about the sample and subjects
- Key results that relate to your study
- How the article is relevant to your study
- How the study informs your methods
29 The Research Process: review the literature — dyslexia and memory
Literature: books
- Pickering, S. (2006). Working memory and education. Amsterdam: Elsevier Press.
- Miyake, A., & Shah, P. (1999). Models of working memory: Mechanisms of active maintenance and executive control. Cambridge, UK: Cambridge University Press.
- Torrance, M., & Jeffery, G. (1999). The cognitive demands of writing. Amsterdam: Amsterdam University Press.
- Kellogg, R. T. (1996). A model of working memory in writing. In C. M. Levy & S. Ransdell (Eds.), The science of writing: Theories, methods, individual differences, and applications. Mahwah, NJ: Erlbaum.
30 The Research Process: review the literature — dyslexia and memory
Literature: articles
- Baddeley, A. D. (2002). Is working memory still working? European Psychologist, 56.
- Swanson, H. L., & Berninger, V. (1996). Individual differences in children's working memory and writing skill. Journal of Experimental Child Psychology, 63.
- Berninger, V. W., Nielsen, K. H., Abbott, R. D., Wijsman, E., & Raskind, W. (2007). Writing problems in developmental dyslexia: Under-recognized and under-treated. Journal of School Psychology, 4.
- McCutchen, D. (1996). A capacity theory of writing: Working memory in composition. Educational Psychology Review, 8.
- Olive, T. (2004). Working memory in writing: Empirical evidence from the dual-task technique. European Psychologist, 9(1).
- Martin, R. C. (1993). Short-term memory and sentence processing: Evidence from neuropsychology. Memory & Cognition, 21.
31 The Research Process: identify concepts & theory
- Identify the concepts you are actually studying
- What theoretical background do they come from?
- Methods of data collection
- Validity of research instruments
- Sampling issues
32 The Research Process: identify concepts & theory — dyslexia and memory
Cognitive psychology:
- Short-term memory
- Working memory
- Phonological loop
- Visuospatial sketchpad
- Visual memory
- Central executive
33 The Research Process: clarify the research problem
Research question, research problem, or (null) hypothesis.
Dyslexia and memory: Is there a pattern in cognitive profile which influences the writing performance of dyslexic children?
34 The Research Process
Identify topic → Review the literature → Identify concepts & theory → Clarify research problem → Research design → Collection of data → Analyse data → Draw conclusion
35 The Research Process: research design
- Research philosophy/approach: inductive/deductive, positivism/interpretivism?
- Research purpose: exploratory/descriptive/explanatory/predictive
- Research strategy: experiment/survey/case study/grounded theory/ethnography
The design dictates why you do things and how you do things.
36 The Research Process: research design — dyslexia and memory
- The literature was relatively developed, therefore a deductive approach
- The topic is about cognitive processes, therefore it is relatively post-positivistic
37 The Research Process: collection of data
- What is your data?
- What is your sample?
- What is your sampling method?
- What is your collection method?
- What is your collection instrument?
- Timeline?
38 The Research Process: collection of data
- What is your data? Is it valid? Is it reliable?
- What is your sample? Does it need to be representative?
- What is your sampling method?
- What is your collection method? (surveys, interviews, observation; self-administered questionnaire)
- What is your collection instrument? (interview schedule/guide, checklist; pilot test)
- Timeline?
- Ethics? Privacy?
39 The Research Process: analyse data
Two stages:
1. Preparing the data: transcribing interviews; entering surveys into computer programs
2. Analysis: summary of responses; similarities; differences; relationships
40 The Research Process: analyse data
Two stages:
1. Preparing the data: transcribing interviews (record the interview, then transcribe); entering surveys into computer programs; what do you do with missing data?
2. Analysis: summary of responses (means/s.d.); similarities, differences, and relationships (chi-square, t-tests, ANOVA, correlation); content analysis (open/axial coding, pattern coding, thematic coding)
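As a concrete illustration of the analysis stage above, the chi-square statistic for a cross-tabulation of responses can be computed in a few lines. This is a minimal sketch, not part of the slides; the function name `chi_square` and the example counts are invented for illustration.

```python
def chi_square(table):
    """Chi-square statistic for a contingency table of observed counts.

    table: list of rows, e.g. [[10, 20], [20, 10]] for a 2x2 cross-tab.
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# e.g. responses cross-tabulated by group: chi_square([[10, 20], [20, 10]])
```

The resulting statistic would still need to be compared against a chi-square distribution with the appropriate degrees of freedom to obtain a p-value.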
41 The Research Process: draw conclusion
- Conclusions must stem from your data
- Links to other people's research
- Limitations of findings
- Applications of findings
42 The Research Process — Seven Phases
1. Select a general problem
2. Review the literature on the problem
3. Decide the specific research problem, question, or hypothesis
4. Determine the design and methodology
5. Collect data
6. Analyze data and present the results
7. Interpret the findings and state conclusions or a summary regarding the problem
Again, but this time we are going to look at research design.
43 Research Design
Research design describes how the study is conducted:
- What is the general plan?
- How is the research set up?
- What happens to the subjects?
- What are the methods of data collection?
44 Research Design
- Match the design to the question(s) being asked so as to best answer the question(s)
- Consider limitations and cautions in interpreting results from each design
- Analyze data in keeping with the research design
- Provide the most valid, accurate answers to the research questions
- Congruency between the research question and the research design selected to answer that question
- Implications related to the type of data analysis associated with specific research designs
45 Three Major Categories of Research Design
Quantitative:
- Experimental (true, quasi, single-subject)
- Nonexperimental (descriptive, comparative, correlational, ex post facto)
Qualitative:
- Any information that is not numerical in nature
- Data is 'rich' or 'thick'
Mixed methods:
- Exactly that, but only good if the qualitative and quantitative components are each valid taken separately, and their interrelationship is valid as well
46 Quantitative and Qualitative Research Approaches
Assumptions about the world:
- Quantitative: single reality (i.e., cause and effect, reduction to specific variables, tests of theories)
- Qualitative: multiple realities (i.e., multiple meanings of individual experiences; meanings are socially constructed)
Research purpose:
- Quantitative: establish relationships or explain causes of change
- Qualitative: understand social phenomena, explore a process, describe experiences, report stories
47 Quantitative and Qualitative Research Approaches
Research methods and process:
- Quantitative: established set of procedures and steps
- Qualitative: flexible, emergent design
Prototypical studies:
- Quantitative: experimental or correlational designs, designed to reduce bias, error, and extraneous variables
- Qualitative: takes bias and subjectivity into account
48 Quantitative and Qualitative Research Approaches
Researcher role:
- Quantitative: detached from the study to avoid bias
- Qualitative: immersed in the phenomenon being studied; participant observation
Importance of context in the study:
- Quantitative: aims to establish universal, context-free generalizations
- Qualitative: develops context-bound summaries
49 Elements of a Research Proposal
Introduction:
- Should capture the reader's interest and sell them on the idea that the study is worth doing
- Can serve as a standalone document that describes your study
Review of the literature:
- Summarizes and analyzes previous research
- Shows the relationship of the current study to what has been done
Method:
- Clearly describes how you plan to answer your research questions or test your hypotheses
50 The introduction should answer the following:
- What do you plan to study?
- Why is it important to study it?
- How do you plan to study it?
- Who do you plan to study?
51 The introduction is likely to include:
- The research problem
- Studies that have addressed the problem
- Deficiencies in those studies
- Importance of the proposed research
- Brief introduction to the theoretical framework
- Purpose statement
- Research questions and/or hypotheses (sometimes included in the literature review section)
- Brief description of the method (who? and how?)
- Limitations and delimitations
52 One model for the introduction (suggested by Creswell)
1. Research problem
2. Review of studies addressing the problem
3. Deficiencies of previous work
4. Importance of the study
5. Purpose of the study, research questions, and/or hypotheses
6. Brief statement of method
53 What research problem would you like to address in your proposal?
55 Qualitative Research
Qualitative research is an interdisciplinary, transdisciplinary, and sometimes counterdisciplinary field. It crosses the humanities and the social and physical sciences. Qualitative research is many things at the same time. It is multiparadigmatic in focus. Its practitioners are sensitive to the value of the multimethod approach. They are committed to the naturalistic perspective and to the interpretative understanding of human experience. At the same time, the field is inherently political and shaped by multiple ethical and political positions. (Nelson et al., 1992, p. 4)
56 Qualitative Research
'Qualitative research... involves finding out what people think, and how they feel - or at any rate, what they say they think and how they say they feel. This kind of information is subjective. It involves feelings and impressions, rather than numbers.' (Bellenger, Bernhardt, and Goldstucker, Qualitative Research in Marketing, American Marketing Association)
57 Qualitative Research
Qualitative research is multimethod in focus, involving an interpretative, naturalistic approach to its subject matter. Qualitative researchers study 'things' (people and their thoughts) in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them.
58 Qualitative Research
Qualitative research involves the studied use and collection of a variety of empirical materials - case study, personal experience, introspection, life story, interview, observational, historical, interactional, and visual texts - that describe routine and problematic moments and meanings in individuals' lives. Researchers deploy a wide range of interconnected methods, hoping always to get a better fix on the subject matter at hand.
60 Popularity of Qualitative Research
- Usually much cheaper than quantitative research
- No better way than qualitative research to understand in depth the motivations and feelings of consumers
- Qualitative research can improve the efficiency and effectiveness of quantitative research
61 Limitations of Qualitative Research
- Marketing successes and failures are based on small differences in the marketing mix; qualitative research doesn't distinguish these differences as well as quantitative research can
- Not representative of the population that is of interest to the researcher
- The multitude of individuals who, without formal training, profess to be experts in the field
63 Overview
- Stages of quantitative research
- Conceptualization and measurement
- Reliability
- Validity
- Main preoccupations of quantitative researchers
64 Criticisms of quantitative research
- Gap between the ideal and the actual
65 The Stages of Quantitative Research
1. Theory/hypothesis
2. Research design
3. Devise measures of concepts
4. Select site and sample
5. Collect data
6. Code and analyze data
7. Write up
This series of stages represents an 'ideal typical' account of the way quantitative research is carried out; each stage follows on logically from the last. As you learn these stages, think critically about why research in the social sciences does not often follow this model. Remember the contrast between deductive and inductive theorising: this model is an example of a deductive relationship between theory and evidence. The focus of much of the lecture is on the third stage, devising measures of concepts. Let's start with examples of concepts.
66 A basket of concepts...
Gender; glass ceiling; femininity/masculinity; body image; household division of labour; women and girls; boys and men; sexual harassment; patriarchy; feminism; gender inequality.
Notice that these are all concepts that are examined in courses about gender. Can you think of any other major concepts in Gender Studies that are missing from this slide? All social science disciplines and areas of study examine concepts.
67 Concepts and Conceptualization
- Concepts = 'categories for the organisation of ideas and observations' (Bulmer, 1984: 43)
- Concepts may provide explanations of social phenomena, or may represent things we want to explain
- Conceptualization = the process of specifying what is meant by a term
- E.g. sexual harassment is a concept that (1) may provide an explanation for why there are few women in some professions, and (2) may be something that we want to explain
- Theories specify the relationships between different concepts (e.g. sexual harassment and the glass ceiling)
Conceptualization is not that easy: we assume we all know what we mean by a concept. Example: 'girl' seems a simple concept. How would you define it? (A young female person.) But what is meant by 'young'? What is meant by 'female'?
68 Measurements...
- Delineate fine differences between people/cases
- Are consistent and reliable
- Give more precise estimates of the degree of relatedness between concepts
The challenge of quantitative research is to find valid, reliable ways to measure concepts. Some concepts can be measured quite directly: e.g. alcohol consumption can be measured by asking people to report the number of drinks they have in a week. Others are harder: a student in seminar asked whether the bullying in Columbine High School could be considered an example of racial discrimination. What is racial discrimination? How would you measure this concept?
69 Indicators of Concepts
- Produced by the operational definition of a concept; less directly quantifiable than measures
- Common-sense understandings of the form a concept might take
- Multiple-indicator measures: a concept may have different dimensions (example: 'commitment to work')
Example: SES = income, assets, education, and occupational prestige - four indicators of this concept, each of which must be measured. Before they can be measured, they must be further operationalized. Some indicators are more challenging to quantify than measures: e.g. for loneliness, look for the forms loneliness might take, such as feelings of sadness. Some concepts with multiple dimensions require multiple indicators.
70 How to measure the concept of 'keeping up' in a course?
- Direct measures?
- Indicators?
- Dimensions?
Can we research a concept without using any measures or indicators (e.g. poverty, body image, intelligence, etc.)? Yes: we can research the concept inductively by asking people what it means to them subjectively, which is an interpretivist way of theorizing, suited to qualitative research. Implicit in a quantitative measure or indicator is a theory of what that concept means.
71 Reliability
Reliability refers to the consistency of measures.
- Stability over time: test-retest method (correlation between measures taken on different occasions). E.g. an index of 'fear of crime' is reliable if it gives the same scores when measured at two points in time.
- Internal reliability: split-half method (correlation between measures on two halves of a scale); Cronbach's alpha. E.g. the shyness scale.
- Inter-observer consistency: e.g. a scale for measuring mental health; ratings should be consistent when different people assess the same client.
72 Types of reliability
Inter-rater reliability:
- Do two (or more) researchers see the same thing?
- Used frequently in qualitative research
- Our recent group observations in the Student Center employed inter-rater reliability
Test-retest reliability:
- Does a repeat study generate similar results?
- Results do not have to be identical, because of variations in population, sample, etc.
- Used in qualitative and quantitative research
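Inter-rater reliability is often quantified with Cohen's kappa, which corrects raw agreement for the agreement two raters would reach by chance. The sketch below is illustrative only (the function name and example codings are invented), but the formula is the standard one for two raters and categorical codes.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical codings."""
    n = len(rater_a)
    # Observed proportion of items the raters coded identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's category frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# e.g. two observers coding the same events: cohens_kappa(codes_a, codes_b)
```

A kappa of 1 means perfect agreement; 0 means agreement no better than chance.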
73 Internal reliability/consistency
How reliable are measures within one project? Used frequently for assessing the reliability of scales and typologies, but only appropriate for unidimensional constructs.
- Split-half reliability: randomly divide the measure's items in two and compare outcomes
- Cronbach's alpha: an average of all possible split-half scores
- Parallel-forms reliability: divide questions into two sets and administer each separately to the same sample
For all of these, the closer the score is to 1, the more reliable the scale.
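Cronbach's alpha has a closed-form expression in terms of item variances and the variance of respondents' scale totals, so it is easy to compute directly. This is a hedged sketch (the function name and data layout are my own), using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of totals):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    items: one list per scale item, each holding that item's scores
    across the same respondents, e.g. [[4, 3, 5], [4, 2, 5], ...].
    """
    k = len(items)
    # Each respondent's total score across all items
    totals = [sum(scores) for scores in zip(*items)]
    # Sum of the individual item variances
    sum_item_var = sum(pvariance(item) for item in items)
    return (k / (k - 1)) * (1 - sum_item_var / pvariance(totals))
```

Perfectly correlated items give alpha = 1; uncorrelated items push alpha toward 0.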
74 Inter-observer consistency
Agreement between different researchers.
75 Measurement Validity
Validity presupposes reliability (but not vice versa). Why is this?
- Face validity: e.g. number of drinks is, on its face, a valid measure of alcohol consumption. 'Do you love me?' 'Yes.' has face validity, but there may be other indicators that call the answer into question.
- Concurrent (criterion) validity: correspondence with other measures that are known to be associated with the concept being measured. E.g. to create a test of sales ability, you would assess the test's validity by comparing its results with actual sales.
- Construct validity: based on logical relationships among variables. E.g. marital satisfaction is logically related to marital fidelity (except in non-monogamous or polyamorous partnerships); a measure of marital satisfaction is valid when its results correlate with measures of marital fidelity.
- Convergent validity: measures of the same concept obtained through different methods should converge (i.e. give the same results). E.g. seminar participation as measured by (1) the seminar leader and (2) self-reports should converge.
Think of bathroom scales. If I am weighing my luggage for a flight with a 40-kilo limit but my scale reads in pounds, the measure is not valid. The scale may also be consistent (reliable) yet undercalibrated, always reporting 5 pounds less than my true weight: consistent, but not valid. There is a tension between reliability and validity, because concepts that lend themselves to reliable measurement are not necessarily the most valid. E.g. morale in different factories: counting the number of grievances put forward to a union is a very reliable measure, but is it the most valid way of knowing about morale? One could also do fieldwork in the factories (observation, talking to workers, getting to know the routines and the climate on the assembly line).
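Convergent (and concurrent) validity checks usually come down to correlating two sets of scores. A Pearson correlation can be sketched from first principles (illustrative only; the function name and the commented variable names like `leader_ratings` are invented for this example):

```python
from statistics import mean, pstdev

def pearson_r(x, y):
    """Pearson correlation between two measures of the same concept."""
    mx, my = mean(x), mean(y)
    # Covariance of the paired scores
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

# e.g. leader ratings vs. self-reports of seminar participation:
# pearson_r(leader_ratings, self_reports)
```

An r near +1 is evidence that the two methods converge on the same concept; an r near 0 suggests they are measuring different things.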
76 Two general types of validity
- Internal validity: the logic of the study design; accounting for alternative (or additional) explanations of causal relationships (if the study focuses on causal relationships)
- External validity: generalizable (quantitative) or transferable (qualitative)
77 Main Preoccupations of Quantitative Researchers
1. Measurement
- Can a concept be quantified?
- Comparisons between measures
- Changes in a variable over time
2. Causality
- Explanations of social phenomena
- Causal relationships between independent and dependent variables
- In cross-sectional designs, causality can only be inferred
Think critically about the difference between these and the priorities of qualitative researchers, who aim to understand, interpret, and discover subjective meanings. What kinds of topics or subject matter would be more suited to each set of aims?
78 3. Generalization
- Can the results be applied to individuals beyond the sample?
- Aims to generalize to a target population
- Requires a representative sample (random, probability sample)
79 4. Replication
- Detailed description of procedures allows other researchers to replicate the study
- Low incidence of published replications
80 Transferability
- Not all studies are intended to be generalizable to an entire population
- Refers to the ability to apply research results in another context or to inform other research
- Also refers to the ability of the research to connect the reader with the research
- Makes the study environment, respondents, and social phenomena 'come alive'
- Solicits comparisons between the reader's own experiences and the experiences described in the research
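The representative-sample requirement above is typically met with simple random sampling from a sampling frame, which in code amounts to drawing without replacement. A minimal sketch (the function name, seed parameter, and example frame are my own, not from the slides):

```python
import random

def simple_random_sample(population, n, seed=None):
    """Draw a simple random (probability) sample without replacement."""
    rng = random.Random(seed)  # a fixed seed only makes the draw reproducible
    return rng.sample(list(population), n)

# e.g. sample 50 student IDs from a sampling frame of 1,000:
# simple_random_sample(range(1000), 50)
```

Every member of the frame has an equal, known probability of selection, which is what licenses generalization from sample statistics to the target population.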
81 All of these measures...
...of validity and reliability are assessed after the research is conducted. It is frustrating to have to report that your measures were invalid or unreliable - but that is still a legitimate finding! Just as it is sometimes frustrating to have to report that you found no support for your hypothesis.
82 No way to know a priori
You can't know for certain how reliable or valid something is before you've conducted the research, unless you are using an instrument whose reliability/validity has previously been established. That's why so much time and effort is put into research design:
- Conceptualization
- Operationalization
- Reviewing past research
- Exploring theories
- Exploring methods
83 Pretesting and preliminary investigation
Both can increase reliability and validity, as well as improve the overall research design.
Pre-testing:
- Occurs after the research instrument/guidelines are established
- Involves administering your survey, using your observation guidelines in the field, or doing a few interviews with respondents or informants
- Analyzing the data generated and soliciting feedback from respondents about the instrument (if applicable)
- Revising measurements and the instrument
84 Preliminary investigation
- Often occurs prior to creating the research instrument/guidelines
- May involve talking informally with individuals from the target population or otherwise associated with the social phenomena
- May involve field observations
- May involve collecting and analyzing social artifacts associated with the research topic
85 Criticisms of Quantitative Research
- Failure to distinguish between objects in the natural world and social phenomena
- Artificial and false sense of precision and accuracy: a presumed connection between concepts and measures; respondents make different interpretations of questions and other research tools
Some powerful criticisms of the quantitative approach have been presented by those working in the qualitative tradition. Remember that quantitative and qualitative research strategies differ on ontological, epistemological, and methodological grounds.
86 Lack of external validity
- Reliance on instruments and measurements
- Little relevance to participants' everyday lives
- Variation in the meaning of concepts to each individual
87 Static view of social life
- Social life reduced to relationships between variables
- Ignores processes of human definition and interpretation (Blumer, 1956)
88 The Gap Between the Ideal and the Actual
- Quantitative research design is an ideal-typical approach
- Useful as a guide to good practice, but there is a discrepancy between the ideal type and the actual practice of social research
- Pragmatic concerns mean that researchers may not adhere rigidly to these principles
'Scientific' research always takes place in a social context, and researchers are constantly having to balance what is ideal with what is feasible (see Chapter 2).
89 How does quantitative research sometimes depart from the principles of good practice? Three examples…
90 1. Reverse operationalism
Quantitative research is usually deductive (operational definition of concepts), but measurements can sometimes lead to inductive theorizing (Bryman, 1988).
- Example: factor analysis, in which groups of indicators cluster together and suggest a common factor (e.g. personality trait research)
This is an example of the inevitable deviation from 'ideal typical' quantitative research: the operational definition of measurements can feed back into the loop of theory and data collection. As an illustration, consider Cattell's (1973) trait theory of personality, which identified sixteen basic 'personality factors' from clusters of other, surface-level forms of behaviour. Factor analysis is used with multiple-indicator measures to see whether they cluster together into factors.
91 2. Reliability and validity testing
- Published accounts of quantitative research rarely report evidence of reliability and validity (Podsakoff & Dalton, 1987)
- Researchers are primarily interested in the substantive content and findings of their research
- Tests of reliability and validity are often neglected
There is a low incidence of reporting reliability and validity test results in published social science articles because researchers are usually more interested in the substantive content of their data. The academic community is a social context in which research is produced, and academics are motivated as much by the values of status, reputation, and credibility as they are by a 'purist' commitment to research.
92 3. Sampling
- Good practice in quantitative research calls for probability sampling
- Sometimes it may not be possible to obtain a probability sample, due to lack of time, lack of resources, or the nature of the population
93 Peer Review
Despite the inevitable shortcomings of actual projects, peer review helps ensure that quantitative researchers remain committed to the principles of good practice.
94 Backup slides for general info
96 Topics Discussed in this Chapter
- Data collection
- Measuring instruments
- Terminology
- Interpreting data
- Types of instruments
- Technical issues: validity, reliability
- Selection of a test
97 Data Collection
Scientific inquiry requires the collection, analysis, and interpretation of data. Data are the pieces of information that are collected to examine the research topic. Issues related to the collection of this information are the focus of this chapter.
98 Data Collection: terminology related to data
- Constructs: abstractions that cannot be observed directly but are helpful when trying to explain behavior. Examples: intelligence, teacher effectiveness, self-concept.
Obj. 1.1 & 1.2
99 Data Collection: terminology (continued)
- Operational definition: the way a construct is observed and measured. Examples: Wechsler IQ test, Virgilio Teacher Effectiveness Inventory, Tennessee Self-Concept Scale.
- Variable: a construct that has been operationalized and has two or more values
Obj. 1.1 & 1.2
100 Data Collection Measurement scales Nominal – categories Gender, ethnicity, etc.Ordinal – ordered categoriesRank in class, order of finish, etc.Interval – equal intervalsTest scores, attitude scores, etc.Ratio – absolute zeroTime, height, weight, etc.Obj. 2.1
101 Data Collection
Types of variables: categorical or quantitative
Categorical variables reflect nominal scales and measure the presence of different qualities (e.g., gender, ethnicity, etc.)
Quantitative variables reflect ordinal, interval, or ratio scales and measure different quantities of a variable (e.g., test scores, self-esteem scores, etc.)
Obj. 2.2
102 Data Collection
Types of variables: independent or dependent
Independent variables are purported causes
Dependent variables are purported effects
Example: two instructional strategies, co-operative groups and traditional lectures, were used during a three-week social studies unit, and students' exam scores were analyzed for differences between the groups
The independent variable is the instructional approach (of which there are two levels)
The dependent variable is the students' achievement
Obj. 2.3
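The slide's example can be sketched in a few lines: the independent variable is the instructional approach (two levels), the dependent variable is the exam score, and the simplest comparison is the difference in group means. The scores below are invented for illustration:

```python
# Dependent variable: exam scores, grouped by the two levels of the
# independent variable (instructional approach). Data are hypothetical.
cooperative = [78, 85, 92, 88, 81]  # co-operative groups condition
lecture = [72, 75, 80, 77, 74]      # traditional lecture condition

mean_coop = sum(cooperative) / len(cooperative)
mean_lect = sum(lecture) / len(lecture)
difference = mean_coop - mean_lect  # effect on the dependent variable

print(mean_coop, mean_lect, difference)
```

In practice this mean difference would be followed by an inferential test (e.g., a t-test) before any conclusion is drawn about the independent variable's effect.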
103 Measurement Instruments
Important terms
Instrument – a tool used to collect data
Test – a formal, systematic procedure for gathering information
Assessment – the general process of collecting, synthesizing, and interpreting information
Measurement – the process of quantifying or scoring a subject's performance
Obj. 3.1 & 3.2
104 Measurement Instruments
Important terms (continued)
Cognitive tests – examining subjects' thoughts and thought processes
Affective tests – examining subjects' feelings, interests, attitudes, beliefs, etc.
Standardized tests – tests that are administered, scored, and interpreted in a consistent manner
Obj. 3.1
105 Measurement Instruments
Important terms (continued)
Selected-response item format – respondents select answers from a set of alternatives
Multiple choice
True-false
Matching
Supply-response item format – respondents construct answers
Short answer
Completion
Essay
Obj. 3.3 & 11.3
106 Measurement Instruments
Important terms (continued)
Individual tests – tests administered on an individual basis
Group tests – tests administered to a group of subjects at the same time
Performance assessments – assessments that focus on processes or products that have been created
Obj. 3.6
107 Measurement Instruments
Interpreting data
Raw scores – the actual score made on a test
Standard scores – statistical transformations of raw scores
Percentiles (0.00 – 99.9)
Stanines (1 – 9)
Normal Curve Equivalents (0.00 – 99.99)
Obj. 3.4
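The transformations named on the slide can be sketched with the usual textbook formulas: a z-score standardizes the raw score, a stanine is approximately 2z + 5 clipped to 1–9, and a Normal Curve Equivalent rescales z to mean 50 and standard deviation 21.06. The raw scores are invented:

```python
import statistics

raw = [55, 60, 65, 70, 75, 80, 85, 90, 95]  # hypothetical raw test scores
mean = statistics.mean(raw)
sd = statistics.stdev(raw)

# z-scores: raw scores re-expressed in standard-deviation units
z = [(x - mean) / sd for x in raw]

# stanines: 2z + 5, rounded and clipped to the 1-9 band
stanines = [max(1, min(9, round(zi * 2 + 5))) for zi in z]

# Normal Curve Equivalents: mean 50, standard deviation 21.06
nce = [zi * 21.06 + 50 for zi in z]

print(stanines)
```

A raw score at the group mean therefore lands at z = 0, stanine 5, and NCE 50 regardless of the original score scale.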
108 Measurement Instruments
Interpreting data (continued)
Norm-referenced – scores are interpreted relative to the scores of others taking the test
Criterion-referenced – scores are interpreted relative to a predetermined level of performance
Self-referenced – scores are interpreted relative to changes over time
Obj. 3.5
109 Measurement Instruments
Types of instruments
Cognitive – measuring intellectual processes such as thinking, memorizing, problem solving, analyzing, or reasoning
Achievement – measuring what students already know
Aptitude – measuring general mental ability, usually for predicting future performance
Obj. 4.1 & 4.2
110 Measurement Instruments
Types of instruments (continued)
Affective – assessing individuals' feelings, values, attitudes, beliefs, etc.
Typical affective characteristics of interest:
Values – deeply held beliefs about ideas, persons, or objects
Attitudes – dispositions that are favorable or unfavorable toward things
Interests – inclinations to seek out or participate in particular activities, objects, ideas, etc.
Personality – characteristics that represent a person's typical behaviors
Obj. 4.1 & 4.5
111 Measurement Instruments
Types of instruments (continued)
Affective (continued)
Scales used for responding to items on affective tests:
Likert – positive or negative statements to which subjects respond on scales such as strongly disagree, disagree, neutral, agree, or strongly agree
Semantic differential – bipolar adjectives (i.e., two opposite adjectives) with a scale between the pair
Dislike: ___ ___ ___ ___ ___ :Like
Rating scales – rankings based on how a subject would rate the trait of interest
Obj. 5.1
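Scoring a Likert scale usually means reverse-scoring the negatively worded statements so a high total always indicates the same direction of attitude. A minimal sketch, with invented items and an assumed 5-point scale:

```python
# Hypothetical responses on a 5-point scale:
# 1 = strongly disagree ... 5 = strongly agree
responses = {"item1": 4, "item2": 2, "item3": 5}
negative_items = {"item2"}  # negatively worded statements (assumed)

def likert_total(responses, negative_items, scale_max=5):
    total = 0
    for item, value in responses.items():
        if item in negative_items:
            # reverse-score: on a 1-5 scale, 2 becomes 4, 5 becomes 1, etc.
            value = scale_max + 1 - value
        total += value
    return total

print(likert_total(responses, negative_items))
```

Without reverse-scoring, agreement with a negative statement and agreement with a positive one would cancel each other out in the total.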
112 Measurement Instruments
Types of instruments (continued)
Affective (continued)
Scales used for responding to items on affective tests (continued):
Thurstone – statements related to the trait of interest to which subjects agree or disagree
Guttman – statements representing a uni-dimensional trait
Obj. 5.1
113 Measurement Instruments
Issues for cognitive, aptitude, or affective tests
Problems inherent in the use of self-report measures:
Bias – distortions of a respondent's performance or responses based on ethnicity, race, gender, language, etc.
Responses to affective test items:
Socially acceptable responses
Accuracy of responses
Response sets
Alternatives include the use of projective tests
Obj. 4.3 & 4.4