QNT 575 2009.

1 QNT 575 2009

2 Overview Introductions Admin Syllabus Review material Learning teams
Next assignment

3 Introductions Where do you work? Your degree? Reason for an MBA?
Anything else interesting…..

4 Syllabus Want a different approach?

5 Policies Laptops Breaks? Food Contacting Me Etiquette
Individual/group assignment ROE End of class cleanup

6 Why do we do educational research?
Helps educators understand educational processes; make professional decisions Provides information to policy groups to assist them with mandated changes in education Serves the information needs of concerned public, professional, and private organizations Reviews and interprets accumulated empirical evidence Is readily available Includes educators in the field in research projects Research Process and Design (Umbach)

7 Research Process and Design (Umbach)
More importantly, why is it important for you to learn how to read, evaluate, and design research? Research Process and Design (Umbach)

8 Principles of Scientific Evidence-Based Inquiry
Adapted from National Research Council Definition: evidence-based inquiry is the search for knowledge using systematically gathered empirical data Principle 1: pose significant questions that can be investigated empirically Research Process and Design (Umbach)

9 Principles of Scientific Evidence-Based Inquiry
Principle 2: link research to relevant theory or conceptual framework Principle 3: use methods that allow direct investigation of the research question Principle 4: provide a coherent and explicit chain of reasoning Research Process and Design (Umbach)

10 Principles of Scientific Evidence-Based Inquiry
Principle 5: replicate/generalize or extend across studies Principle 6: disclose research to encourage professional scrutiny and critique Research Process and Design (Umbach)

11 Research Disciplines Education Sociology Psychology Policy History
Biography Management Practice-based Pedagogy Linguistics etc.

12 Classifying research
Types: exploratory, descriptive, explanatory, predictive
Use: pure, applied, evaluative, action/practitioner
Classifying: primary vs. secondary; theoretical vs. empirical; quantitative vs. qualitative (vs. mixed methods); inductive vs. deductive
To be discussed in following slides….

13 Types of research exploratory
generate new ideas, concepts, or hypotheses little or no prior knowledge looks for clues or basic facts, settings, and concerns creates a general picture of conditions formulate and focus questions for future research

14 Types of research descriptive
provides a detailed, highly accurate picture create a set of categories or classify types report on the background or context of a situation little attempt to explain the results

15 Types of research explanatory predictive
explains why something happens look for causal relationships between concepts elaborate and enrich a theory’s explanation or extend a theory to new issues or topics support or refute an explanation or prediction determine which of several explanations is best predictive forecasts future phenomena, based on findings suggested by explanatory research

16 Use of research
pure, basic, or academic research: adds to the body of knowledge; contributes to theory; focuses on issues of importance to researchers. applied research: focuses on issues of importance to society; helps to understand the nature and sources of human problems; used to make practical decisions

17 Use of research summative evaluation formative evaluation
determines the effectiveness of an intervention; focuses on the goals of the intervention. formative evaluation: improves an intervention; focuses on providing recommendations for changes

18 Use of research action research
solves problems in a program, organisation or community focus on empowering participants to solve issues themselves

19 Classifying research primary secondary
involves the collection and analysis of original data secondary find existing data and (re)analyse it (e.g. census data, participation surveys)

20 Classifying research theoretical empirical
generation of new ideas through analysing existing theory and explanations. empirical generation of new ideas through the collection and analysis of data

21 Deductive reasoning
theory → hypothesis → test hypothesis → accept/reject theory

22 Inductive reasoning
observations → identify patterns → generalisations → theory

23 The Research Process
Identify topic → Review literature → Identify concepts & theory → Clarify research problem → Research design → Collection of data → Analyse data → Draw conclusion

24 Research Process (M & S)
Select a general problem → conduct literature review (preliminary search, later expanded; then an exhaustive review) → select specific problem, research question, or hypothesis → decide design and methodology → collect data → analyze and present data (statistical tables, integrative diagrams) → interpret findings → state conclusion/generalization about problem. More generally……

25 The Research Process background to the problem’s context
Identify topic background to the problem’s context why is your problem important who will benefit? who will use your conclusions policy/ practice/ research (also why they will use it)

26 The Research Process dyslexia and memory Identify topic background
Eleni Sakellariou (PhD Candidate) background dyslexia appears to be linked to various aspects of memory importance understanding how memory interacts with dyslexia will assist teachers in helping students

27 The Research Process general definitions
Review the literature general definitions general discussion of your issue and related topics specific research that is related to your topic existing work on your topic who, why, where, when, findings, shortcomings general conclusions about work done to date

28 The Research Process for each article/study examine: Review the
literature for each article/study examine: the central purpose of their study state information about the sample and subjects review key results that relate to your study how is this article of relevance to your study how does this study inform your methods

29 The Research Process Review the literature dyslexia and memory
Literature: Books Pickering, S. (2006). Working memory and education. Amsterdam: Elsevier Press. Miyake, A. & Shah, P. (1999). Models of Working Memory: Mechanisms of Active Maintenance and Executive Control. Cambridge, UK: Cambridge University Press. Torrance, M. & Jeffery, G. (1999). The cognitive demands of writing. Amsterdam: Amsterdam University Press. Kellogg, R. T. (1996). A model of working memory in writing. In C. M. Levy & S. Ransdell (Eds.), The science of writing: Theories, methods, individual differences, and applications. Mahwah, NJ: Erlbaum.

30 The Research Process Review the literature dyslexia and memory
Literature: Articles Baddeley, A. D. (2002). Is working memory still working? European Psychologist, 56, Swanson, H. L. & Berninger, V. (1996). Individual differences in children’s working memory and writing skill. Journal of Experimental Child Psychology, 63, Berninger, V. W., Nielsen, K.H., Abbott, R.D., Wijsman, E. & Raskind, W. (2007). Writing problems in developmental dyslexia: Under-recognized and under-treated. Journal of School Psychology, 4, McCutchen, D. (1996). A capacity theory of writing: Working memory in composition. Educational Psychology Review, 8, Olive, T. (2004). Working Memory in Writing: Empirical evidence from the Dual-Task Technique. European Psychologist, 9, 1, Martin, R. C. (1993). Short-term memory and sentence processing: Evidence from neuropsychology. Memory & Cognition, 21,

31 The Research Process Identify concepts & theory
identify the concepts you are actually studying what theoretical background do they come from? methods of data collection validity of research instruments sampling issues

32 The Research Process dyslexia and memory Identify concepts & theory
cognitive psychology short term memory phonological loop visual spatial sketchpad visual memory central executive working memory

33 The Research Process Clarify research problem
Research Question or Research Problem or (null) Hypothesis. dyslexia and memory: Is there a pattern in cognitive profile which influences the writing performance of dyslexic children?

34 The Research Process
Identify topic → Review literature → Identify concepts & theory → Clarify research problem → Research design → Collection of data → Analyse data → Draw conclusion

35 The Research Process Research design research philosophy / approach
inductive/deductive/positivism/interpretivism? research purpose exploratory/descriptive/explanatory/predictive research strategy experiment/survey/case study/grounded theory/ ethnography dictates: why you do things how you do things Research design

36 The Research Process dyslexia and memory Research design
the literature was relatively developed therefore a deductive approach the topic is about cognitive processes therefore it is relatively post-positivistic Research design

37 The Research Process data collection Collection of data
what is your data what is your sample what is your sampling method what is your collection method what is your collection instrument timeline? Collection of data

38 The Research Process data collection Collection of data
what is your data what is your sample (do they need to be representative?) what is your sampling method what is your collection method (surveys, interviews, observation; ethics? privacy?) what is your collection instrument (self-completion questionnaire, interview schedule/guide, checklist; pilot test: is it valid? is it reliable?) timeline?

39 The Research Process two stages Analyse data preparing the data
transcribing interviews entering surveys into computer programs analysis summary of responses similarities differences relationships Analyse data

40 The Research Process two stages Analyse data
preparing the data: transcribing interviews (record interview, then transcribe), entering surveys into computer programs (what do you do with missing data?) analysis: summary of responses (means/s.d.), similarities/differences/relationships (chi2, t-tests, ANOVA, correlation), content analysis (open/axial coding, pattern coding, thematic coding)
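The quantitative branch of the analysis stage above (summary of responses via means and standard deviations) can be sketched in a few lines of Python; the group names and survey scores below are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical responses to a 1-5 agreement item, split by group
responses = {
    "teachers": [4, 5, 3, 4, 4, 5],
    "students": [2, 3, 3, 4, 2, 3],
}

# Summary of responses: mean and sample standard deviation per group
for group, scores in responses.items():
    print(f"{group}: mean={mean(scores):.2f}, s.d.={stdev(scores):.2f}")
```

Summaries like this are usually the first step before formal tests (chi2, t-tests, ANOVA, correlation) are applied.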

41 The Research Process Conclusions must stem from your data
Links to other peoples research Limitations with findings Applications of findings Draw conclusion

42 The Research Process—Seven Phases
Select a general problem Review the literature on the problem Decide the specific research problem, question, or hypothesis Determine the design and methodology Collect data Analyze data and present the results Interpret the findings and state conclusions or summary regarding the problem Again, but we are going to look at research design Research Process and Design (Umbach)

43 Research Process and Design (Umbach)
Research Design Research design describes how the study was conducted What is the general plan? How is the research set up? What happens to the subjects? What were the methods of data collection? Research Process and Design (Umbach)

44 Research Process and Design (Umbach)
Research Design Match the design to the question(s) being asked so as to best answer the question(s) Consider limitations and cautions in interpreting results from each design Analyze data in keeping with research design Provide the most valid, accurate answers to research questions Congruency between the research question and the research design selected to answer that question Implications related to the type of data analysis with specific research designs Research Process and Design (Umbach)

45 Three Major Categories of Research Design
Quantitative Experimental (true, quasi, single-subject) Nonexperimental (descriptive, comparative, correlational, ex post facto) Qualitative any information that is not numerical in nature data is ‘rich’ or ‘thick’ Mixed Methods Mixed: exactly that, but only good if the quantitative and qualitative components taken separately are valid, and their interrelationship as well Research Process and Design (Umbach)

46 Quantitative and Qualitative Research Approaches
Assumptions about the world Quantitative—single reality (i.e., cause and effect, reduce to specific variables, test of theories) Qualitative—multiple reality (i.e., multiple meanings of individual experiences, meanings are socially constructed) Research purpose Quantitative—establish relationships or explain causes of change Qualitative—understand social phenomenon, explore a process, describe experiences, report stories Research Process and Design (Umbach)

47 Quantitative and Qualitative Research Approaches
Research methods and process Quantitative—established set of procedures and steps Qualitative—flexible design, emergent design Prototypical studies Quantitative—experimental or correlational designs, designed to reduce bias, error, and extraneous variables Qualitative—takes into account bias and subjectivity Research Process and Design (Umbach)

48 Quantitative and Qualitative Research Approaches
Researcher role Quantitative—detached from study to avoid bias Qualitative—immersed in phenomenon being studied; participant observation Importance of the context in the study Quantitative—aims to establish universal context-free generalizations Qualitative—develops context-bound summaries Research Process and Design (Umbach)

49 Elements of a research proposal
Introduction Should capture the reader’s interest and sell them on the idea that the study is worth doing Can serve as a standalone document that describes your study Review of the literature Summarizes and analyzes previous research Shows relationship of current study to what has been done Method Clearly describes how you plan to answer your research questions or test your hypotheses Research Process and Design (Umbach)

50 Introduction should answer the following:
What do you plan to study? Why is it important to study it? How do you plan to study it? Who do you plan to study? Research Process and Design (Umbach)

51 The introduction is likely to include:
The research problem Studies that have addressed the problem Deficiencies in the studies Importance of the proposed research Brief introduction to theoretical framework Purpose statement Research questions and/or hypotheses (sometimes included in the literature review section) Brief description of method (who? and how?) Limitations and delimitations Research Process and Design (Umbach)

52 One model for introduction (suggested by Creswell)
Research problem Review of studies addressing problem Deficiencies of previous work Importance of study Purpose of study, research questions, and/or hypotheses Brief statement of method Research Process and Design (Umbach)

53 Research Process and Design (Umbach)
What research problem would you like to address in your proposal? Research Process and Design (Umbach)

54 Qualitative Research

55 Qualitative Research Qualitative research is an interdisciplinary, transdisciplinary, and sometimes counterdisciplinary field. It crosses the humanities and the social and physical sciences. Qualitative research is many things at the same time. It is multiparadigmatic in focus. Its practitioners are sensitive to the value of the multimethod approach. They are committed to the naturalistic perspective, and to the interpretative understanding of human experience. At the same time, the field is inherently political and shaped by multiple ethical and political positions. Nelson et al. (1992, p. 4)

56 Qualitative Research ‘Qualitative Research…involves finding out what people think, and how they feel - or at any rate, what they say they think and how they say they feel. This kind of information is subjective. It involves feelings and impressions, rather than numbers’ Bellenger, Bernhardt and Goldstucker, Qualitative Research in Marketing, American Marketing Association

57 Qualitative Research Qualitative research is multimethod in focus, involving an interpretative, naturalistic approach to its subject matter. Qualitative Researchers study “things” (people and their thoughts) in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them.

58 Qualitative Research Qualitative research involves the studied use and collection of a variety of empirical materials - case study, personal experience, introspective, life story, interview, observational, historical, interactional, and visual texts - that describe routine and problematic moments and meanings in individuals’ lives. Qualitative researchers deploy a wide range of interconnected methods, hoping always to get a better fix on the subject matter at hand.

59 Qualitative vs. Quantitative

60 Popularity of Qualitative Research
Usually much cheaper than quantitative research No better way than qualitative research to understand in-depth the motivations and feelings of consumers Qualitative research can improve the efficiency and effectiveness of quantitative research

61 Limitations of Qualitative Research
Marketing successes and failures are based on small differences in the marketing mix. Qualitative research doesn’t distinguish these differences as well as quantitative research can. Not representative of the population that is of interest to the researcher The multitude of individuals who, without formal training, profess to be experts in the field

62 The Nature of Quantitative Research

63 Overview Stages of quantitative research
Conceptualization and measurement Reliability Validity Main preoccupations of quantitative researchers

64 Criticisms of quantitative research
Gap between the ideal and the actual

65 The Stages of Quantitative Research
Theory/hypothesis Research design Devise measures of concepts Select site and sample Collect data Code and analyze data Write up This series of stages represents an ‘ideal typical’ account of the way quantitative research is carried out. Each stage in the process follows on logically from the last. As you learn these stages, I would like you to also think critically about why research in the social sciences does not often follow this model. Remember the contrast between deductive and inductive theorising? This model is an example of a deductive relationship between theory and evidence. The focus of much of the lecture is on the third stage: devising measures of concepts. Let’s start with examples of concepts.

66 A basket of concepts…
Gender Femininity/Masculinity Body image Household division of labour Women and Girls Boys and Men Sexual harassment Glass ceiling Patriarchy Feminism Gender Inequality
Notice that these are all concepts that are examined in courses about gender. Can you think of any other major concepts in Gender Studies that are missing from this slide? All social science disciplines and areas of study examine concepts.

67 Concepts and Conceptualization
Concepts = ‘categories for the organisation of ideas and observations’ (Bulmer, 1984: 43) May provide explanations of social phenomena May represent things we want to explain Conceptualization = the process of specifying what is meant by a term E.g. Sexual harassment is a concept that (1) may provide an explanation for why there are few women in some professions, and (2) may be something that we want to explain. Theories specify the relationship between different concepts (e.g. sexual harassment and the glass ceiling) Conceptualization is not that easy… we assume we all know what we mean by a concept. Example: girl is a simple concept. How would you define it? (young female person) What is meant by young? What is meant by female?

68 Measurements. . . delineate fine differences between people/cases.
are consistent and reliable. are more precise estimates of the degree of relatedness between concepts. The challenge of quantitative research is to find valid, reliable ways to measure concepts. Some concepts can be measured quite directly, e.g. alcohol consumption can be measured directly by asking people to report the number of drinks they have in a week. Example: a student in seminar asked whether bullying in Columbine High School could be considered an example of racial discrimination. What is racial discrimination? How would you measure this concept?

69 Indicators of Concepts
Produced by the operational definition of a concept and are less directly quantifiable than measures Common-sense understandings of the form a concept might take Multiple-indicator measures: a concept may have different dimensions (example: ‘commitment to work’). Example: SES = income, assets, education, and occupational prestige (four indicators of this concept; each must be measured). Before indicators can be measured, they must be further operationalized. Some indicators are more challenging to quantify than measures, e.g. loneliness: look for the forms loneliness might take, e.g. feelings of sadness. Some concepts with multiple dimensions require multiple indicators.

70 How to measure the concept of ‘Keeping up’ in a course?
Direct measures? Indicators? Dimensions? Can we research a concept without using any measures or indicators? (e.g. poverty, body image, intelligence, etc.) Yes, we can research the concept inductively by asking people what these concepts mean to them subjectively, which is an interpretivist way of theorizing, suited to qualitative research. Implicit in a quantitative measure or indicator is a theory of what that concept means.

71 Reliability Stability over time Internal reliability
test-retest method (correlation between measures on different occasions) Internal reliability split-half method (correlation between measures on two halves of a scale) Cronbach’s alpha Reliability refers to the consistency of measures. Stability over time: e.g. an index of “fear of crime” is reliable if it gives the same scores when measured at two points in time. Internal reliability: e.g. the Shyness scale. Inter-observer consistency: e.g. a scale for measuring mental health; ratings should be consistent when different people assess the same client.
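The test-retest method above reduces to a correlation between scores from two measurement occasions; here is a minimal pure-Python sketch, with invented 'fear of crime' index scores for six respondents:

```python
from statistics import mean, stdev

def pearson_r(x, y):
    """Sample Pearson correlation between two lists of scores."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Hypothetical 'fear of crime' index, measured twice for the same respondents
time1 = [10, 14, 8, 12, 15, 9]
time2 = [11, 13, 9, 12, 14, 8]

r = pearson_r(time1, time2)  # close to 1 suggests a stable (reliable) measure
```

With these made-up scores r comes out around 0.94, which would usually be read as good test-retest reliability.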

72 Types of reliability Inter-rater reliability Test-retest reliability
Do two (or more) researchers see the same thing? Used frequently in qualitative research Our recent group observations in Student Center employed inter-rater reliability Test-retest reliability Does a repeat study generate similar results? Do not have to be identical because of variations in population, sample, etc. Used in qualitative and quantitative research

73 Internal reliability/consistency
How reliable are measures within one project? Used frequently for assessing reliability of scales and typologies, but only good for unidimensional constructs. Split-half reliability: randomly divide measure items and compare outcomes. Cronbach’s alpha: an average of all possible split-half scores. Parallel-forms reliability: divide questions into two and administer each separately to the same sample. For all, the closer the score is to 1, the more reliable the scale, etc.
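Cronbach's alpha, described above as an average of all possible split-half scores, is normally computed from the item variances and the variance of the total score; a minimal sketch, with invented scale data:

```python
from statistics import variance

def cronbach_alpha(items):
    """items: one list of scores per scale item, each covering the same respondents."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

# Hypothetical 3-item scale answered by five respondents (1-5 ratings)
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(round(cronbach_alpha(items), 3))  # → 0.864
```

Values closer to 1 indicate a more internally consistent scale; 0.7 is an often-cited rule-of-thumb floor.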

74 Inter-observer consistency
agreement between different researchers

75 Measurement Validity Face validity Concurrent validity
Construct validity Convergent validity Validity presupposes reliability (but not vice versa). Why is this? Face validity: e.g. number of drinks is a valid measure of alcohol consumption. “do you love me?” “Yes.” = face validity, but there may be other indicators that call this into question. Concurrent validity = correspondence with other measures that are known to be associated with the concept being measured (also called criterion validity), e.g. if you want to create a test of sales ability, you would assess the test’s validity by comparing the results with actual sales. Construct validity: based on logical relationships among variables, e.g. marital satisfaction is logically related to marital fidelity (except in non-monogamous or polyamorous partnerships); a measure of marital satisfaction is valid when results are correlated with measures of marital fidelity. Convergent validity: measures of the same concepts through different methods should converge (i.e. give the same results), e.g. self-reports and observations (seminar participation as measured by (1) the seminar leader and (2) self-reports of seminar participation should converge). Think of bathroom scales. If I am weighing my luggage to take a flight, the luggage must be less than 40 kilos, but my scale is in pounds: the measure is not valid. Also, the scale may be consistent (reliable), but it may be undercalibrated, so it will always tell me that I weigh 5 pounds less than I do. It is not a valid measure. There is a tension between reliability and validity because concepts that lend themselves to reliable measurement are not necessarily the most valid, e.g. morale in different factories. Counting the number of grievances put forward to a union is a very reliable measure. But is it the most valid way of knowing about morale? One could also do field work in the factories (observation, talking to workers, getting to know the routines and the climate on the assembly line).

76 Two general types of validity
Internal validity: the logic of the study design; accounting for alternative (or additional) explanations of causal relationships (if the study focuses on causal relationships). External validity: generalizable (quantitative) or transferable (qualitative).

77 Main Preoccupations of Quantitative Researchers
1. Measurement Can a concept be quantified? Comparisons between measures Changes in a variable over time 2. Causality Explanations of social phenomena Causal relationships between independent and dependent variables Causality can only be inferred in cross-sectional designs Think critically about the difference between these and the priorities of qualitative researchers, who aim to understand, interpret and discover subjective meanings. What kinds of topics or subject matter would be more suited to each set of aims?

78 3. Generalization Can the results be applied to individuals beyond the sample? Aims to generalize to target population Requires representative sample (random, probability sample)
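Drawing the representative (probability) sample mentioned above is straightforward with the standard library; the population here is a hypothetical sampling frame of 5,000 student IDs:

```python
import random

population = list(range(1, 5001))  # hypothetical sampling frame of 5,000 IDs

random.seed(42)  # fixed seed only so the draw is reproducible in this sketch
sample = random.sample(population, 200)  # simple random sample, no replacement

# every member of the frame had the same 200/5000 = 4% chance of selection
print(len(sample), len(set(sample)))  # → 200 200
```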

79 4. Replication Detailed description of procedures allows other researchers to replicate study Low incidence of published replications

80 Transferability Not all studies are intended to be generalizable to an entire population Refers to the ability to apply research results in another context or to inform other research Also refers to the ability of the research to connect the reader with the research Make study environment, respondents, social phenomena “come alive” Solicits comparisons between reader’s own experiences and experiences described in the research

81 All of these measures of validity and reliability are conducted after the research is conducted Frustrating to have to report that your measures were invalid or unreliable But that is still a legitimate finding! Just as frustrating sometimes to have to report you found no support for your hypothesis!

82 No way to know a priori Can’t know for certain how reliable or valid something is before you’ve conducted the research Unless you are using something that has reliability/validity previously established That’s why so much time and effort is put into research design Conceptualization Operationalization Reviewing past research Exploring theories Exploring methods

83 Pretesting and preliminary investigation
Can also increase reliability and validity As well as improving overall research design Pre-testing After research instrument/guidelines established Involves giving your survey, using your observation guidelines in the field, doing a few interviews with respondents or informants Analyzing data generated and soliciting feedback from respondents about the instrument (if applicable) Revising measurements, instrument

84 Preliminary investigation
Often occurs prior to creating research instrument/guidelines May talk informally with individuals from the target population or otherwise associated with social phenomena May do field observations May collect and analyze social artifacts associated with research topic

85 Criticisms of Quantitative Research
Failure to distinguish between objects in the natural world and social phenomena Artificial and false sense of precision and accuracy presumed connection between concepts and measures respondents make different interpretations of questions and other research tools Some powerful criticisms of the quantitative approach have been presented by those working in the qualitative tradition. Remember that quantitative and qualitative research strategies differ on ontological, epistemological and methodological grounds.

86 Lack of external validity
reliance on instruments and measurements little relevance to participants’ everyday lives variation in the meaning of concepts to each individual

87 Static view of social life
reduced to relationships between variables ignores processes of human definition and interpretation (Blumer, 1956)

88 The Gap Between the Ideal and the Actual
Quantitative research design is an ideal-typical approach Useful as a guide to good practice but there is a discrepancy between ideal type and actual practice of social research Pragmatic concerns mean that researchers may not adhere rigidly to these principles ‘Scientific’ research always takes place in a social context and researchers are constantly having to balance out what is ideal with what is feasible (see Chapter 2).

89 How does quantitative research sometimes depart from the principles of good practice?
Three examples…

90 1. Reverse operationalism
Quantitative research is usually deductive (operational definition of concepts), but measurements can sometimes lead to inductive theorizing (Bryman, 1988) example: factor analysis groups of indicators cluster together and suggest a common factor e.g. personality trait research Here is an example of the inevitable deviation from “ideal typical” quantitative research. The operational definition of measurements can often lead to inductive theorising and so feed back into the loop of theory and data collection. The author gives factor analysis as an example of this; as an illustration, you might want to discuss Cattell’s (1973) trait theory of personality, in which he identified sixteen basic ‘personality factors’ from clusters of other, surface level forms of behaviour. Factor analysis is used with multiple-indicator measures to see if they cluster together into factors

91 2. Reliability and validity testing
Published accounts of quantitative research rarely report evidence of reliability and validity (Podsakoff & Dalton, 1987) Researchers are primarily interested in the substantive content and findings of their research Tests of reliability and validity are often neglected There is a low incidence of reporting reliability and validity test results in published social science articles, because researchers are usually more interested in the substantive contents of their data. The academic community is a social context in which research is produced, and that academics are motivated as much by the values of status, reputation and credibility as they are by a ‘purist’ commitment to research.

92 3. Sampling Good practice in quantitative research calls for probability sampling Sometimes it may not be possible to obtain a probability sample due to lack of time, lack of resources, or the nature of the population.

93 Peer Reviews Despite the inevitable shortcomings of actual projects, peer review helps ensure that quantitative researchers remain committed to the principles of good practice.

94 Backup slides for general info
Research Process and Design (Umbach)

95 Educational Research

96 Topics Discussed in this Chapter
Data collection Measuring instruments Terminology Interpreting data Types of instruments Technical issues Validity Reliability Selection of a test

97 Data Collection Scientific inquiry requires the collection, analysis, and interpretation of data Data – the pieces of information that are collected to examine the research topic Issues related to the collection of this information are the focus of this chapter

98 Data Collection Terminology related to data
Constructs – abstractions that cannot be observed directly but are helpful when trying to explain behavior Intelligence Teacher effectiveness Self concept Obj. 1.1 & 1.2

99 Data Collection Data terminology (continued)
Operational definition – the ways by which constructs are observed and measured Wechsler IQ test Virgilio Teacher Effectiveness Inventory Tennessee Self-Concept Scale Variable – a construct that has been operationalized and has two or more values Obj. 1.1 & 1.2

100 Data Collection Measurement scales Nominal – categories
Gender, ethnicity, etc. Ordinal – ordered categories Rank in class, order of finish, etc. Interval – equal intervals Test scores, attitude scores, etc. Ratio – absolute zero Time, height, weight, etc. Obj. 2.1

101 Data Collection Types of variables Categorical or quantitative
Categorical variables reflect nominal scales and measure the presence of different qualities (e.g., gender, ethnicity, etc.) Quantitative variables reflect ordinal, interval, or ratio scales and measure different quantities of a variable (e.g., test scores, self-esteem scores, etc.) Obj. 2.2
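The link between measurement scale and variable type can be sketched in a few lines. This is an illustrative mapping, not library code; the study variables named here are assumptions for the example.

```python
# Hypothetical study variables mapped to their measurement scales
SCALE_OF = {
    "ethnicity": "nominal",        # categories only
    "class_rank": "ordinal",       # ordered categories
    "test_score": "interval",      # equal intervals, no true zero
    "reaction_time": "ratio",      # equal intervals with an absolute zero
}

def variable_type(scale):
    """Nominal scales yield categorical variables; ordinal, interval,
    and ratio scales yield quantitative variables."""
    return "categorical" if scale == "nominal" else "quantitative"

print({name: variable_type(scale) for name, scale in SCALE_OF.items()})
```

The classification matters in practice because it governs which statistics are meaningful, e.g. a mean is defensible for interval and ratio data but not for nominal categories.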

102 Data Collection Types of variables Independent or dependent
Independent variables are purported causes Dependent variables are purported effects Two instructional strategies, co-operative groups and traditional lectures, were used during a three week social studies unit. Students’ exam scores were analyzed for differences between the groups. The independent variable is the instructional approach (of which there are two levels) The dependent variable is the students’ achievement Obj. 2.3
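The two-group comparison described above is typically tested with an independent-samples t statistic. The sketch below computes Welch's t by hand; the exam scores for both groups are invented example data, not results from any real study.

```python
import numpy as np

# Hypothetical exam scores (dependent variable) for the two levels
# of the independent variable, instructional approach
cooperative = np.array([78, 85, 82, 90, 76, 88, 84, 79])
lecture     = np.array([72, 75, 80, 70, 74, 78, 71, 77])

# Welch's t statistic: difference in group means divided by the
# standard error of that difference (unequal variances allowed)
mean_diff = cooperative.mean() - lecture.mean()
se = np.sqrt(cooperative.var(ddof=1) / len(cooperative)
             + lecture.var(ddof=1) / len(lecture))
t = mean_diff / se
print(round(t, 2))
```

A large |t| suggests the difference in the dependent variable is unlikely to be due to chance alone; in a real analysis the statistic would be compared against a t distribution to obtain a p-value.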

103 Measurement Instruments
Important terms Instrument – a tool used to collect data Test – a formal, systematic procedure for gathering information Assessment – the general process of collecting, synthesizing, and interpreting information Measurement – the process of quantifying or scoring a subject’s performance Obj. 3.1 & 3.2

104 Measurement Instruments
Important terms (continued) Cognitive tests – examining subjects’ thoughts and thought processes Affective tests – examining subjects’ feelings, interests, attitudes, beliefs, etc. Standardized tests – tests that are administered, scored, and interpreted in a consistent manner Obj. 3.1

105 Measurement Instruments
Important terms (continued) Selected response item format – respondents select answers from a set of alternatives Multiple choice True-false Matching Supply response item format – respondents construct answers Short answer Completion Essay Obj. 3.3 & 11.3

106 Measurement Instruments
Important terms (continued) Individual tests – tests administered on an individual basis Group tests – tests administered to a group of subjects at the same time Performance assessments – assessments that focus on processes or products that have been created Obj. 3.6

107 Measurement Instruments
Interpreting data Raw scores – the actual score made on a test Standard scores – statistical transformations of raw scores Percentiles (1 – 99) Stanines (1 – 9) Normal Curve Equivalents (1 – 99) Obj. 3.4
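The transformation from raw scores to standard scores can be shown in a few lines. A minimal sketch, assuming a small invented set of raw test scores: raw scores are first converted to z-scores (standard deviations from the mean), which can then be rescaled to Normal Curve Equivalents (mean 50, standard deviation 21.06).

```python
import numpy as np

# Hypothetical raw test scores for seven students
raw = np.array([55, 62, 70, 74, 81, 88, 93], dtype=float)

# z-scores: how many standard deviations each raw score lies from the mean
z = (raw - raw.mean()) / raw.std(ddof=1)

# Normal Curve Equivalents: z rescaled to mean 50, SD 21.06
nce = 50 + 21.06 * z

print(np.round(z, 2))
```

Percentiles and stanines are further transformations of the same underlying standing; the point of all of them is to place a raw score relative to a distribution rather than report it in isolation.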

108 Measurement Instruments
Interpreting data (continued) Norm-referenced – scores are interpreted relative to the scores of others taking the test Criterion-referenced – scores are interpreted relative to a predetermined level of performance Self-referenced – scores are interpreted relative to changes over time Obj. 3.5

109 Measurement Instruments
Types of instruments Cognitive – measuring intellectual processes such as thinking, memorizing, problem solving, analyzing, or reasoning Achievement – measuring what students already know Aptitude – measuring general mental ability, usually for predicting future performance Obj. 4.1 & 4.2

110 Measurement Instruments
Types of instruments (continued) Affective – assessing individuals’ feelings, values, attitudes, beliefs, etc. Typical affective characteristics of interest Values – deeply held beliefs about ideas, persons, or objects Attitudes – dispositions that are favorable or unfavorable toward things Interests – inclinations to seek out or participate in particular activities, objects, ideas, etc. Personality – characteristics that represent a person’s typical behaviors Obj. 4.1 & 4.5

111 Measurement Instruments
Types of instruments (continued) Affective (continued) Scales used for responding to items on affective tests Likert Positive or negative statements to which subjects respond on scales such as strongly disagree, disagree, neutral, agree, or strongly agree Semantic differential Bipolar adjectives (i.e., two opposite adjectives) with a scale between each adjective Dislike: ___ ___ ___ ___ ___ :Like Rating scales – rankings based on how a subject would rate the trait of interest Obj. 5.1
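Scoring a Likert scale is straightforward but has one wrinkle worth showing: negatively worded statements must be reverse-coded before totals are computed. A minimal sketch with invented responses:

```python
# Map the standard five-point Likert labels to numeric values
RESPONSES = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
             "agree": 4, "strongly agree": 5}

def score_item(response, positive=True):
    """Positively worded items score 1-5 as given; negatively worded
    items are reverse-coded so that higher always means more favorable."""
    value = RESPONSES[response]
    return value if positive else 6 - value

# One hypothetical respondent: two positive statements, one negative
total = (score_item("agree")                        # 4
         + score_item("strongly agree")             # 5
         + score_item("disagree", positive=False))  # reverse-coded: 6 - 2 = 4
print(total)
```

Without reverse-coding, agreement with negative statements would cancel out agreement with positive ones and the total would not measure the underlying attitude.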

112 Measurement Instruments
Types of instruments (continued) Affective (continued) Scales used for responding to items on affective tests (continued) Thurstone – statements related to the trait of interest to which subjects agree or disagree Guttman – statements representing a uni-dimensional trait Obj. 5.1

113 Measurement Instruments
Issues for cognitive, aptitude, or affective tests Problems inherent in the use of self-report measures Bias – distortions of a respondent’s performance or responses based on ethnicity, race, gender, language, etc. Responses to affective test items Socially acceptable responses Accuracy of responses Response sets Alternatives include the use of projective tests Obj. 4.3, 4.4
