1 QNT 575 2009



2 Overview Introductions Admin Syllabus Review material Learning teams Next assignment

3 Introductions Where do you work? Your degree? Reason for an MBA? Anything else interesting…..

4 Syllabus Want a different approach?

5 Policies Laptops Breaks? Food Contacting Me Etiquette Individual/group assignment ROE End of class cleanup

6 Research Process and Design (Umbach) 6 Why do we do educational research? Helps educators understand educational processes; make professional decisions Provides information to policy groups to assist them with mandated changes in education Serves the information needs of concerned public, professional, and private organizations Reviews and interprets accumulated empirical evidence Is readily available Includes educators in the field in research projects

7 Research Process and Design (Umbach) 7 More importantly, why is it important for you to learn how to read, evaluate, and design research?

8 Research Process and Design (Umbach) 8 Principles of Scientific Evidence-Based Inquiry Adapted from National Research Council Definition: evidence-based inquiry is the search for knowledge using systematically gathered empirical data Principle 1: pose significant questions that can be investigated empirically

9 Research Process and Design (Umbach) 9 Principles of Scientific Evidence-Based Inquiry Principle 2: link research to relevant theory or conceptual framework Principle 3: use methods that allow direct investigation of the research question Principle 4: provide a coherent and explicit chain of reasoning

10 Research Process and Design (Umbach) 10 Principles of Scientific Evidence-Based Inquiry Principle 5: replicate/generalize or extend across studies Principle 6: disclose research to encourage professional scrutiny and critique

11 Research Disciplines Education Sociology Psychology Policy History Biography Management Practice-based Pedagogy Linguistics etc.

12 Classifying research Types exploratory, descriptive, explanatory, predictive Use pure, applied, evaluative, action/practitioner Classifying primary vs. secondary theoretical vs. empirical quantitative vs. qualitative (vs. mixed methods) inductive vs. deductive

13 Types of research exploratory generate new ideas, concepts, or hypotheses little or no prior knowledge looks for clues or basic facts, settings, and concerns creates a general picture of conditions formulate and focus questions for future research

14 Types of research descriptive provides a detailed, highly accurate picture create a set of categories or classify types report on the background or context of a situation little attempt to explain the results

15 Types of research explanatory explains why something happens looks for causal relationships between concepts elaborate and enrich a theory's explanation or extend a theory to new issues or topics support or refute an explanation or prediction determine which of several explanations is best predictive forecasts future phenomena, based on findings suggested by explanatory research

16 Use of research pure, basic, or academic research adds to the body of knowledge contributes to theory focus on issues of importance to researchers applied research focus on issues of importance to society helps to understand the nature and sources of human problems used to make practical decisions

17 Use of research summative evaluation determines the effectiveness of an intervention focus on the goals of the intervention formative evaluation improving an intervention focus on providing recommendations or changes

18 Use of research action research solves problems in a program, organisation or community focus on empowering participants to solve issues themselves

19 Classifying research primary involves the collection and analysis of original data secondary find existing data and (re)analyse census data participation surveys

20 Classifying research theoretical generation of new ideas through analysing existing theory and explanations. empirical generation of new ideas through the collection and analysis of data

21 Deductive theory → hypothesis → test hypothesis → accept/reject theory

22 Inductive observations → identify patterns → generalisations → theory

23 The Research Process Identify topic → Review the literature → Identify concepts & theory → Clarify research problem → Research design → Collection of data → Analyse data → Draw conclusion

24 24 Research Process (M & S) Select a general problem Conduct literature review Exhaustive review Preliminary search, later expanded Select specific problem, research question, or hypothesis Decide design and methodology Collect data Analyze and present data Interpret findings State conclusion/ generalization about problem Integrative diagrams Statistical tables

25 The Research Process background to the problem's context why is your problem important who will benefit? who will use your conclusions policy/practice/research (also why they will use it) Identify topic

26 The Research Process dyslexia and memory Eleni Sakellariou (PhD Candidate) background dyslexia appears to be linked to various aspects of memory importance understanding how memory interacts with dyslexia will assist teachers in helping students Identify topic

27 The Research Process general definitions general discussion of your issue and related topics specific research that is related to your topic existing work on your topic who, why, where, when, findings, shortcomings general conclusions about work done to date Review the literature

28 The Research Process for each article/study examine: the central purpose of their study state information about the sample and subjects review key results that relate to your study how is this article of relevance to your study how does this study inform your methods Review the literature

29 The Research Process dyslexia and memory Literature: Books Pickering, S. (2006). Working memory and education. Amsterdam: Elsevier Press. Miyake, A. & Shah, P. (1999). Models of Working Memory: Mechanisms of Active Maintenance and Executive Control. Cambridge, UK: Cambridge University Press. Torrance, M. & Jeffery, G. (1999). The cognitive demands of writing. Amsterdam: Amsterdam University Press. Kellogg, R. T. (1996). A model of working memory in writing. In C. M. Levy & S. Ransdell (Eds.), The science of writing: Theories, methods, individual differences, and applications. Mahwah, NJ: Erlbaum. Review the literature

30 The Research Process dyslexia and memory Literature: Articles Baddeley, A. D. (2002). Is working memory still working? European Psychologist, 56, Swanson, H. L. & Berninger, V. (1996). Individual differences in children's working memory and writing skill. Journal of Experimental Child Psychology, 63, Berninger, V. W., Nielsen, K.H., Abbott, R.D., Wijsman, E. & Raskind, W. (2007). Writing problems in developmental dyslexia: Under-recognized and under-treated. Journal of School Psychology, 4, McCutchen, D. (1996). A capacity theory of writing: Working memory in composition. Educational Psychology Review, 8, Olive, T. (2004). Working Memory in Writing: Empirical evidence from the Dual-Task Technique. European Psychologist, 9, 1, Martin, R. C. (1993). Short-term memory and sentence processing: Evidence from neuropsychology. Memory & Cognition, 21, Review the literature

31 The Research Process identify the concepts you are actually studying what theoretical background do they come from? methods of data collection validity of research instruments sampling issues Identify concepts & theory

32 The Research Process dyslexia and memory cognitive psychology short term memory phonological loop visual spatial sketchpad visual memory central executive working memory Identify concepts & theory

33 The Research Process Research Question or Research Problem or (null) Hypothesis dyslexia and memory Is there a pattern in cognitive profile that influences the writing performance of dyslexic children? Clarify research problem

34 The Research Process Identify topic → Review the literature → Identify concepts & theory → Clarify research problem → Research design → Collection of data → Analyse data → Draw conclusion

35 The Research Process research philosophy / approach inductive/deductive/positivism/interpretivism? research purpose exploratory/descriptive/explanatory/predictive research strategy experiment/survey/case study/grounded theory/ethnography dictates: why you do things how you do things Research design

36 The Research Process dyslexia and memory the literature was relatively developed therefore a deductive approach the topic is about cognitive processes therefore it is relatively post-positivistic Research design

37 The Research Process data collection what is your data what is your sample what is your sampling method what is your collection method what is your collection instrument timeline? Collection of data

38 The Research Process data collection what is your data what is your sample what is your sampling method what is your collection method what is your collection instrument timeline? ethics? privacy? Collection of data pilot test is it reliable is it valid do they need to be representative surveys, interviews, observation self, questionnaire interview schedule/guide, checklist

39 The Research Process two stages preparing the data transcribing interviews entering surveys into computer programs analysis summary of responses Analyse data similarities differences relationships

40 The Research Process two stages preparing the data transcribing interviews entering surveys into computer programs analysis summary of responses Analyse data record interview then transcribe what do you do with missing data content analysis open/axial coding pattern coding thematic coding means/s.d. chi2, t-tests ANOVA, correlation similarities differences relationships
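The quantitative options named on this slide (means/s.d., chi2, t-tests, ANOVA, correlation) can all be run with standard statistical libraries. Below is a minimal Python/SciPy sketch on made-up data; the scores, groups, and frequency table are purely hypothetical and not part of the course material.

```python
# Minimal sketch (assumed data): descriptive statistics, chi-square, one-way
# ANOVA, and a Pearson correlation on small, made-up data sets.
import numpy as np
from scipy import stats

scores_a = np.array([72, 85, 90, 66, 78, 81, 74])  # hypothetical survey scores, group A
scores_b = np.array([68, 79, 73, 70, 75, 71, 69])  # group B
scores_c = np.array([80, 83, 77, 88, 82, 79, 85])  # group C

# Means and sample standard deviations per group
for g in (scores_a, scores_b, scores_c):
    print(round(g.mean(), 2), round(g.std(ddof=1), 2))

# Chi-square test of independence on a 2x2 frequency table (e.g., yes/no by group)
table = np.array([[12, 3], [8, 7]])
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

# One-way ANOVA across the three groups
f_stat, f_p = stats.f_oneway(scores_a, scores_b, scores_c)

# Pearson correlation between two measures taken on the same respondents
hours = np.array([5, 9, 11, 3, 7, 8, 6])
r, r_p = stats.pearsonr(hours, scores_a)

print(chi2, chi_p, f_stat, f_p, r, r_p)
```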

41 The Research Process Conclusions must stem from your data Links to other people's research Limitations with findings Applications of findings Draw conclusion

42 Research Process and Design (Umbach) 42 The Research Process: Seven Phases 1. Select a general problem 2. Review the literature on the problem 3. Decide the specific research problem, question, or hypothesis 4. Determine the design and methodology 5. Collect data 6. Analyze data and present the results 7. Interpret the findings and state conclusions or summary regarding the problem

43 Research Process and Design (Umbach) 43 Research Design Research design describes how the study was conducted What is the general plan How the research is set up What happens to the subjects What were the methods of data collection

44 Research Process and Design (Umbach) 44 Research Design Match the design to the question(s) being asked so as to best answer the question(s) Consider limitations and cautions in interpreting results from each design Analyze data in keeping with research design Provide the most valid, accurate answers to research questions Congruency between the research question and the research design selected to answer that question Implications related to the type of data analysis with specific research designs

45 Research Process and Design (Umbach) 45 Three Major Categories of Research Design Quantitative Experimental (true, quasi, single-subject) Nonexperimental (descriptive, comparative, correlational, ex post facto) Qualitative any information that is not numerical in nature data is rich or thick Mixed Methods

46 Research Process and Design (Umbach) 46 Quantitative and Qualitative Research Approaches Assumptions about the world Quantitative: single reality (i.e., cause and effect, reduce to specific variables, test of theories) Qualitative: multiple realities (i.e., multiple meanings of individual experiences, meanings are socially constructed) Research purpose Quantitative: establish relationships or explain causes of change Qualitative: understand social phenomena, explore a process, describe experiences, report stories

47 Research Process and Design (Umbach) 47 Quantitative and Qualitative Research Approaches Research methods and process Quantitative: established set of procedures and steps Qualitative: flexible design, emergent design Prototypical studies Quantitative: experimental or correlational designs, designed to reduce bias, error, and extraneous variables Qualitative: takes into account bias and subjectivity

48 Research Process and Design (Umbach) 48 Quantitative and Qualitative Research Approaches Researcher role Quantitative: detached from study to avoid bias Qualitative: immersed in phenomenon being studied; participant observation Importance of the context in the study Quantitative: aims to establish universal context-free generalizations Qualitative: develops context-bound summaries

49 Research Process and Design (Umbach) 49 Elements of a research proposal Introduction Should capture the reader's interest and sell them on the idea that the study is worth doing Can serve as a standalone document that describes your study Review of the literature Summarizes and analyzes previous research Shows relationship of current study to what has been done Method Clearly describes how you plan to answer your research questions or test your hypotheses

50 Research Process and Design (Umbach) 50 Introduction should answer the following: What do you plan to study? Why is it important to study it? How do you plan to study it? Who do you plan to study?

51 Research Process and Design (Umbach) 51 The introduction is likely to include: The research problem Studies that have addressed the problem Deficiencies in the studies Importance of the proposed research Brief introduction to theoretical framework Purpose statement Research questions and/or hypotheses (sometimes included in the literature review section) Brief description of method (who? and how?) Limitations and delimitations

52 Research Process and Design (Umbach) 52 One model for introduction (suggested by Creswell) Research problem Review of studies addressing problem Deficiencies of previous work Importance of study Purpose of study, research questions, and/or hypotheses Brief statement of method

53 Research Process and Design (Umbach) 53 What research problem would you like to address in your proposal?

54 Qualitative Research

55 Qualitative research is an interdisciplinary, transdisciplinary, and sometimes counterdisciplinary field. It crosses the humanities and the social and physical sciences. Qualitative research is many things at the same time. It is multiparadigmatic in focus. Its practitioners are sensitive to the value of the multimethod approach. They are committed to the naturalistic perspective, and to the interpretative understanding of human experience. At the same time, the field is inherently political and shaped by multiple ethical and political positions. Nelson et al. (1992, p. 4)

56 Qualitative Research Qualitative Research…involves finding out what people think, and how they feel - or at any rate, what they say they think and how they say they feel. This kind of information is subjective. It involves feelings and impressions, rather than numbers Bellenger, Bernhardt and Goldstucker, Qualitative Research in Marketing, American Marketing Association

57 Qualitative Research Qualitative research is multimethod in focus, involving an interpretative, naturalistic approach to its subject matter. Qualitative Researchers study things (people and their thoughts) in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them.

58 Qualitative Research Qualitative research involves the studied use and collection of a variety of empirical materials - case study, personal experience, introspective, life story, interview, observational, historical, interactional, and visual texts - that describe routine and problematic moments and meanings in individuals' lives. Researchers deploy a wide range of interconnected methods, hoping always to get a better fix on the subject matter at hand.

59 Qualitative vs. Quantitative

60 Popularity of Qualitative Research 1 Usually much cheaper than quantitative research 2 No better way than qualitative research to understand in-depth the motivations and feelings of consumers 3 Qualitative research can improve the efficiency and effectiveness of quantitative research

61 Limitations of Qualitative Research 1 Marketing successes and failures are based on small differences in the marketing mix. Qualitative research doesn't distinguish these differences as well as quantitative research can. 2 Not representative of the population that is of interest to the researcher 3 The multitude of individuals who, without formal training, profess to be experts in the field

62 The Nature of Quantitative Research

63 Overview Stages of quantitative research Conceptualization and measurement Reliability Validity Main preoccupations of quantitative researchers

64 Criticisms of quantitative research Gap between the ideal and the actual

65 The Stages of Quantitative Research Theory/hypothesis Research design Devise measures of concepts Select site and sample Collect data Code and analyze data Write up

66 A basket of concepts… Femininity/Masculinity Household division of labour Body image Glass ceiling Gender Inequality Sexual harassment Gender Patriarchy Feminism Women and Girls Boys and Men

67 Concepts and Conceptualization Concepts = categories for the organisation of ideas and observations (Bulmer, 1984: 43) May provide explanations of social phenomena May represent things we want to explain Conceptualization = the process of specifying what is meant by a term

68 Measurements... delineate fine differences between people/cases. are consistent and reliable. provide more precise estimates of the degree of relatedness between concepts.

69 Indicators of Concepts Produced by the operational definition of a concept and are less directly quantifiable than measures Common-sense understandings of the form a concept might take Multiple-indicator measures: concept may have different dimensions example: commitment to work

70 How to measure the concept of Keeping up in a course? Direct measures? Indicators? Dimensions? Can we research a concept without using any measures or indicators? (e.g. poverty, body image, intelligence, etc.)

71 Reliability Stability over time test-retest method (correlation between measures on different occasions) Internal reliability split-half method (correlation between measures on two halves of a scale) Cronbach's alpha

72 Types of reliability Inter-rater reliability Do two (or more) researchers see the same thing? Used frequently in qualitative research Our recent group observations in the Student Center employed inter-rater reliability Test-retest reliability Does a repeat study generate similar results? Do not have to be identical because of variations in population, sample, etc. Used in qualitative and quantitative research

73 Internal reliability/consistency How reliable are measures within one project? Used frequently for assessing reliability of scales and typologies, but only good for unidimensional constructs Split-half reliability: randomly divide the measure's items into two halves and compare outcomes. Cronbach's alpha: an average of all possible split-half scores. Parallel-forms reliability: divide questions into two sets and administer each separately to the same sample. For all, the closer the score is to 1, the more reliable the scale.
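To make these internal-consistency measures concrete, here is a minimal Python sketch (not from the course) that computes Cronbach's alpha and a Spearman-Brown-corrected split-half coefficient for a made-up four-item scale; the response matrix is hypothetical.

```python
# Minimal sketch (assumed data): Cronbach's alpha and split-half reliability
# for a small 4-item scale; rows are respondents, columns are items.
import numpy as np
from scipy import stats

items = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 4, 3],
    [1, 2, 1, 2],
    [4, 4, 3, 4],
])

# Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of total score)
k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)
total_var = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Split-half reliability: correlate the two half-scale totals, then apply the
# Spearman-Brown correction to estimate reliability of the full-length scale.
odd_half = items[:, ::2].sum(axis=1)
even_half = items[:, 1::2].sum(axis=1)
r_half, _ = stats.pearsonr(odd_half, even_half)
split_half = (2 * r_half) / (1 + r_half)

print(round(alpha, 3), round(split_half, 3))  # both closer to 1 = more reliable
```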

74 Inter-observer consistency agreement between different researchers

75 Measurement Validity Face validity Concurrent validity Construct validity Convergent validity Validity presupposes reliability (but not vice versa). Why is this?

76 Two general types of validity Internal validity External validity The logic of the study design Accounting for alternative (or additional) explanations of causal relationships if study focuses on causal relationships Generalizable (quantitative) or transferable (qualitative)

77 Main Preoccupations of Quantitative Researchers 1. Measurement Can a concept be quantified? Comparisons between measures Changes in a variable over time 2. Causality Explanations of social phenomena Causal relationships between independent and dependent variables Inference only in cross-sectional designs

78 3. Generalization Can the results be applied to individuals beyond the sample? Aims to generalize to target population Requires representative sample (random, probability sample)
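A minimal sketch (assumed frame and sample size, not from the course) of drawing the kind of simple random, probability sample this slide calls for, so every member of the target population has an equal chance of selection.

```python
# Minimal sketch: simple random (probability) sample from a hypothetical sampling frame.
import random

random.seed(2009)  # fixed seed so the draw is reproducible
sampling_frame = [f"student_{i:04d}" for i in range(1, 1201)]  # 1,200 hypothetical students
sample = random.sample(sampling_frame, k=120)  # 10% simple random sample, without replacement
print(len(sample), sample[:5])
```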

79 4. Replication Detailed description of procedures allows other researchers to replicate study Low incidence of published replications

80 Transferability Not all studies are intended to be generalizable to an entire population Refers to the ability to apply research results in another context or to inform other research Also refers to the ability of the research to connect the reader with the research Make study environment, respondents, social phenomena come alive Solicits comparisons between readers' own experiences and experiences described in the research

81 All of these measures of validity and reliability are applied after the research is conducted Frustrating to have to report that your measures were invalid or unreliable But that is still a legitimate finding! Just as frustrating sometimes to have to report you found no support for your hypothesis!

82 No way to know a priori Can't know for certain how reliable or valid something is before you've conducted the research Unless you are using something that has reliability/validity previously established That's why so much time and effort is put into research design Conceptualization Operationalization Reviewing past research Exploring theories Exploring methods

83 Pretesting and preliminary investigation Can also increase reliability and validity As well as improving overall research design Pre-testing After research instrument/guidelines established Involves giving your survey, using your observation guidelines in the field, doing a few interviews with respondents or informants Analyzing data generated and soliciting feedback from respondents about instrument (if applicable) Revising measurements, instrument

84 Preliminary investigation Often occurs prior to creating research instrument/guidelines May talk informally with individuals from the target population or otherwise associated with social phenomena May do field observations May collect and analyze social artifacts associated with research topic

85 Criticisms of Quantitative Research Failure to distinguish between objects in the natural world and social phenomena Artificial and false sense of precision and accuracy presumed connection between concepts and measures respondents make different interpretations of questions and other research tools

86 Lack of external validity reliance on instruments and measurements little relevance to participants everyday lives variation in the meaning of concepts to each individual

87 Static view of social life reduced to relationships between variables ignores processes of human definition and interpretation (Blumer, 1956)

88 The Gap Between the Ideal and the Actual Quantitative research design is an ideal-typical approach Useful as a guide to good practice but there is a discrepancy between ideal type and actual practice of social research Pragmatic concerns mean that researchers may not adhere rigidly to these principles

89 How does quantitative research sometimes depart from the principles of good practice? Three examples…

90 1. Reverse operationalism Quantitative research is usually deductive (operational definition of concepts), but measurements can sometimes lead to inductive theorizing (Bryman, 1988) example: factor analysis groups of indicators cluster together and suggest a common factor e.g. personality trait research
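As a hedged illustration of the point above (indicators clustering together and suggesting a common factor), here is a minimal sketch using scikit-learn's FactorAnalysis on simulated data; the two latent traits and six indicators are invented for the example.

```python
# Minimal sketch: six made-up indicators, each driven mainly by one of two
# simulated latent traits; factor analysis recovers the two clusters.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 300
trait_1 = rng.normal(size=n)  # hypothetical latent trait (e.g., one personality dimension)
trait_2 = rng.normal(size=n)  # second hypothetical latent trait

X = np.column_stack([
    trait_1 + 0.3 * rng.normal(size=n),  # indicators 1-3 load on trait 1
    trait_1 + 0.3 * rng.normal(size=n),
    trait_1 + 0.3 * rng.normal(size=n),
    trait_2 + 0.3 * rng.normal(size=n),  # indicators 4-6 load on trait 2
    trait_2 + 0.3 * rng.normal(size=n),
    trait_2 + 0.3 * rng.normal(size=n),
])

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
print(np.round(fa.components_, 2))  # loadings show two distinct clusters of indicators
```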

91 2. Reliability and validity testing Published accounts of quantitative research rarely report evidence of reliability and validity (Podsakoff & Dalton, 1987) Researchers are primarily interested in the substantive content and findings of their research Tests of reliability and validity are often neglected

92 3. Sampling Good practice in quantitative research calls for probability sampling Sometimes it may not be possible to obtain a probability sample due to lack of time, lack of resources, or the nature of the population.

93 Peer Reviews Despite the inevitable shortcomings of actual projects, peer review helps ensure that quantitative researchers remain committed to the principles of good practice.

94 Backup slides for general info Research Process and Design (Umbach) 94

95 Educational Research

96 Topics Discussed in this Chapter Data collection Measuring instruments Terminology Interpreting data Types of instruments Technical issues Validity Reliability Selection of a test

97 Data Collection Scientific inquiry requires the collection, analysis, and interpretation of data Data – the pieces of information that are collected to examine the research topic Issues related to the collection of this information are the focus of this chapter

98 Data Collection Terminology related to data Constructs – abstractions that cannot be observed directly but are helpful when trying to explain behavior Intelligence Teacher effectiveness Self concept Obj. 1.1 & 1.2

99 Data Collection Data terminology (continued) Operational definition – the ways by which constructs are observed and measured Wechsler IQ test Virgilio Teacher Effectiveness Inventory Tennessee Self-Concept Scale Variable – a construct that has been operationalized and has two or more values Obj. 1.1 & 1.2

100 Data Collection Measurement scales Nominal – categories Gender, ethnicity, etc. Ordinal – ordered categories Rank in class, order of finish, etc. Interval – equal intervals Test scores, attitude scores, etc. Ratio – absolute zero Time, height, weight, etc. Obj. 2.1
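A small sketch (assumed, not from the text) of how the scale of measurement constrains which summary statistic is sensible: counts/mode for nominal, median for ordinal, mean for interval and ratio data; the variables and values are made up.

```python
# Minimal sketch with made-up values for each measurement scale.
from collections import Counter
from statistics import mean, median

gender = ["F", "M", "F", "F", "M"]               # nominal: report frequencies / mode
class_rank = [1, 2, 5, 3, 4]                     # ordinal: report the median rank
attitude_score = [88, 72, 95, 81, 77]            # interval: means are meaningful
height_cm = [162.0, 175.5, 158.2, 180.1, 169.4]  # ratio: means and ratios are meaningful

print(Counter(gender).most_common(1))
print(median(class_rank))
print(mean(attitude_score), mean(height_cm))
```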

101 Data Collection Types of variables Categorical or quantitative Categorical variables reflect nominal scales and measure the presence of different qualities (e.g., gender, ethnicity, etc.) Quantitative variables reflect ordinal, interval, or ratio scales and measure different quantities of a variable (e.g., test scores, self-esteem scores, etc.) Obj. 2.2

102 Data Collection Types of variables Independent or dependent Independent variables are purported causes Dependent variables are purported effects Two instructional strategies, co-operative groups and traditional lectures, were used during a three-week social studies unit. Students' exam scores were analyzed for differences between the groups. The independent variable is the instructional approach (of which there are two levels) The dependent variable is the students' achievement Obj. 2.3
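A minimal sketch (invented scores, not the study's data) of how this example maps onto an analysis: the instructional approach is the independent variable with two levels, exam score is the dependent variable, and an independent-samples t-test compares the two group means.

```python
# Minimal sketch: comparing the two levels of the independent variable on the
# dependent variable (exam score) with an independent-samples t-test.
from scipy import stats

cooperative_groups = [78, 85, 82, 90, 74, 88, 81]   # hypothetical exam scores
traditional_lecture = [72, 75, 80, 69, 77, 73, 70]

t_stat, p_value = stats.ttest_ind(cooperative_groups, traditional_lecture)
print(round(t_stat, 2), round(p_value, 3))
```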

103 Measurement Instruments Important terms Instrument – a tool used to collect data Test – a formal, systematic procedure for gathering information Assessment – the general process of collecting, synthesizing, and interpreting information Measurement – the process of quantifying or scoring a subject's performance Obj. 3.1 & 3.2

104 Measurement Instruments Important terms (continued) Cognitive tests – examining subjects' thoughts and thought processes Affective tests – examining subjects' feelings, interests, attitudes, beliefs, etc. Standardized tests – tests that are administered, scored, and interpreted in a consistent manner Obj. 3.1

105 Measurement Instruments Important terms (continued) Selected response item format – respondents select answers from a set of alternatives Multiple choice True-false Matching Supply response item format – respondents construct answers Short answer Completion Essay Obj. 3.3 & 11.3

106 Measurement Instruments Important terms (continued) Individual tests – tests administered on an individual basis Group tests – tests administered to a group of subjects at the same time Performance assessments – assessments that focus on processes or products that have been created Obj. 3.6

107 Measurement Instruments Interpreting data Raw scores – the actual score made on a test Standard scores – statistical transformations of raw scores Percentiles (0.00 – 99.9) Stanines (1 – 9) Normal Curve Equivalents (0.00 – 99.99) Obj. 3.4
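A minimal sketch (made-up raw scores) of turning raw scores into the standard scores listed above, using the common textbook conversions: z-scores, percentile ranks, stanines (approximately round(2z + 5), clipped to 1-9), and Normal Curve Equivalents (50 + 21.06z).

```python
# Minimal sketch: raw scores -> z-scores, percentile ranks, stanines, NCEs.
import numpy as np
from scipy import stats

raw = np.array([55, 62, 70, 48, 75, 66, 59, 81, 64, 58])

z = (raw - raw.mean()) / raw.std(ddof=1)                    # standard (z) scores
percentile = np.array([stats.percentileofscore(raw, x) for x in raw])
stanine = np.clip(np.round(2 * z + 5), 1, 9).astype(int)    # common approximation
nce = 50 + 21.06 * z                                        # Normal Curve Equivalents

for row in zip(raw, np.round(z, 2), np.round(percentile, 1), stanine, np.round(nce, 1)):
    print(row)
```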

108 Measurement Instruments Interpreting data (continued) Norm-referenced – scores are interpreted relative to the scores of others taking the test Criterion-referenced – scores are interpreted relative to a predetermined level of performance Self-referenced – scores are interpreted relative to changes over time Obj. 3.5

109 Measurement Instruments Types of instruments Cognitive – measuring intellectual processes such as thinking, memorizing, problem solving, analyzing, or reasoning Achievement – measuring what students already know Aptitude – measuring general mental ability, usually for predicting future performance Obj. 4.1 & 4.2

110 Measurement Instruments Types of instruments (continued) Affective – assessing individuals' feelings, values, attitudes, beliefs, etc. Typical affective characteristics of interest Values – deeply held beliefs about ideas, persons, or objects Attitudes – dispositions that are favorable or unfavorable toward things Interests – inclinations to seek out or participate in particular activities, objects, ideas, etc. Personality – characteristics that represent a person's typical behaviors Obj. 4.1 & 4.5

111 Measurement Instruments Types of instruments (continued) Affective (continued) Scales used for responding to items on affective tests Likert Positive or negative statements to which subjects respond on scales such as strongly disagree, disagree, neutral, agree, or strongly agree Semantic differential Bipolar adjectives (i.e., two opposite adjectives) with a scale between each adjective Dislike: ___ ___ ___ ___ ___ :Like Rating scales – rankings based on how a subject would rate the trait of interest Obj. 5.1

112 Measurement Instruments Types of instruments (continued) Affective (continued) Scales used for responding to items on affective tests (continued) Thurstone – statements related to the trait of interest to which subjects agree or disagree Guttman – statements representing a uni-dimensional trait Obj. 5.1
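A minimal sketch (hypothetical items and responses) of scoring the Likert-type scale described two slides above: responses coded 1 (strongly disagree) to 5 (strongly agree), negatively worded items reverse-scored, and item scores summed into a total scale score.

```python
# Minimal sketch: scoring a 4-item Likert scale; rows are respondents.
import numpy as np

responses = np.array([
    [5, 4, 2, 5],
    [3, 3, 3, 4],
    [4, 5, 1, 5],
])
negatively_worded = [2]  # assume the third item is a negative statement

scored = responses.copy()
scored[:, negatively_worded] = 6 - scored[:, negatively_worded]  # reverse-score: 1<->5, 2<->4
scale_scores = scored.sum(axis=1)
print(scale_scores)  # one summed scale score per respondent
```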

113 Measurement Instruments Issues for cognitive, aptitude, or affective tests Problems inherent in the use of self-report measures Bias – distortions of a respondent's performance or responses based on ethnicity, race, gender, language, etc. Responses to affective test items Socially acceptable responses Accuracy of responses Response sets Alternatives include the use of projective tests Obj. 4.3, 4.4
