Notes on Research Proposals
Components of the Research Proposal Problem Description/Statement Research Objectives Importance/Benefits of the Study Literature Review Research Design / Data Analysis Deliverables Schedule [Facilities and Special Resources] References Budget (Appendix)
Problem Statement Review the discussion from Week 2 on problem statements.
Purpose of the Problem Statement Represents the reasons/motivation behind your proposal (based on the specific domain of study). It specifies the conditions you want to change or the gaps in existing knowledge you intend to fill (this is the specification of the research problem). Should be supported by evidence. Specifies your hypothesis that suggests a solution to the problem. Shows your familiarity with prior research on the topic and why it needs to be extended. Even if the problem is obvious, your reviewers want to know how clearly you can state it.
Guidelines for writing a good abstract/problem statement All should have the following elements in this order: 1. State the general case / problem 2. Describe what others have done 3. What’s missing / where is the gap in knowledge? 4. Describe the proposed solution or research objectives/questions 5. Specify one or more specific hypotheses –Should include specific metrics/measurements –Discuss how their validation addresses the research questions 6. Specific results (or research design, if it is a proposal)
Purpose of the Research Objectives Section Specify the outcome of your project, the end product(s) Keep your objectives –Specific – indicate precisely what you intend to change through your project –Measurable – what you accept as proof of project success –Logical – how each objective contributes systematically to achieving your overall goal
Research Objectives Flows naturally from the problem statement –state your hypotheses clearly –give the reader a concrete, achievable goal Verify the consistency of the proposal –check to see that each objective is discussed in the research design, data analysis and results sections
Literature Review Recent or historically significant research studies Always refer to the original source Discuss how the literature applies, show the weaknesses in the design, discuss how you would avoid similar problems How is your idea different/better?
Importance/Benefits of the Study Importance of doing the study now What are the potential impacts on –Research in the area –Applications of the research if successful –Broader impact (in other areas, on society, in education, etc.) If you find this difficult to write, then most likely you have not understood the problem
Research Design What you are going to do in technical terms. –May contain many subsections –Be specific about what research methodology you will use and why –Provide details of your proposed solutions to the problem and sub-problems –Provide information for tasks such as sample selection, data collection, instrumentation, validation, procedures, ethical requirements
Schedule & Deliverables Include the major phases of the project exploratory studies, data analysis, report generation Critical Path Method (CPM) of scheduling may help Deliverables: –Measurement instruments –Algorithms –Computer programs / prototypes –Comparative evaluation –Other technical reports
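As a sketch of how the Critical Path Method can support schedule planning, the following minimal Python example computes the overall project length from a task list. All task names, durations, and dependencies below are invented for illustration, not taken from any real proposal:

```python
from functools import lru_cache

# Hypothetical project tasks: name -> (duration in weeks, prerequisites).
TASKS = {
    "literature review": (4, ()),
    "exploratory study": (3, ("literature review",)),
    "data collection":   (6, ("exploratory study",)),
    "data analysis":     (4, ("data collection",)),
    "report":            (2, ("data analysis", "literature review")),
}

@lru_cache(maxsize=None)
def earliest_finish(task):
    """Earliest finish = own duration + latest earliest finish among prerequisites."""
    duration, prereqs = TASKS[task]
    return duration + max((earliest_finish(p) for p in prereqs), default=0)

# The project length is the earliest finish of the final deliverable; the
# chain of tasks realizing that maximum is the critical path.
print(earliest_finish("report"))  # 4+3+6+4+2 = 19 weeks
```

The same longest-path computation, drawn as a time-and-task chart, gives the milestones reviewers expect in the schedule section.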
Budget and Resources Itemized Budget –Access to special systems or computers –Infrastructure needs –Costs of surveys, user studies, etc. –Cost of travel if related to research design Provide a Budget Narrative This part is usually an appendix.
Proposal Characteristics Straightforward document –No extraneous or irrelevant material Don’t tell us why you became interested in the topic –The first words you write are the most important ones Not a literary production –Clear, sharp and precise – economy of words; no rambling sentences Clearly organized –Outlined with proper use of headings and subheadings
Suggested Organization Title, Abstract, Keywords (problem statement) Introduction and Overview –Background information; problem description in context –Hypotheses and objectives –Assumptions and delimitations –Importance and benefits Related Work/Literature Review Research Design and Methodology Plan of Work and Outcomes (deliverables, schedule) Conclusions and Future Work References Budget (appendix)
Weaknesses in Research Proposals Research Problem –unfocused –unimportant (done before!) –overly complex –limited relevance
Weaknesses in Research Proposals Research Design –so vague it prevents evaluation –data inappropriate or impossible to obtain –procedures inappropriate for the problem –threats to validity –lack of reliable measures –lacking controls
A Sample Research Proposal Read (and study) the sample proposal in Chapter 5 of Practical Research Fill in the critique in Chapter 12 for this proposal. –Since the critique is designed for a REPORT, simply change the tense for most questions: Is the sample size adequate? -> Will the sample size be adequate?
Guide to Writing the Research Proposal
5 Key Questions to Answer in Your Problem Statement Does your problem statement: –Demonstrate a precise understanding of the problem you are attempting to solve? –Clearly convey the focus of your project early in the narrative? –Indicate the relationship of your project to a larger set of problems and justify why your particular focus has been chosen? –Demonstrate that your problem is feasible to solve? –Make others want to read further?
5 Key Questions to Answer for Purpose and Objectives Does this section –Clearly describe your project’s objective, hypotheses and/or research question? –Avoid burying them in a morass of narrative? –Demonstrate that your objectives are important, significant and timely? –Include objectives that comprehensively describe the intended outcomes of the project? –State objectives, hypotheses or questions in a way that they can be evaluated or tested later?
Writing Tips for Objectives Section Don’t confuse your objectives (ends) with your methods (means). A good objective emphasizes what will be done, whereas a method explains why or how it will be done. Include goals (ultimate) and objectives (immediate)
Purpose of the Research Design Describes your project activities in detail Indicates how your objective will be accomplished Description should include the sequence, flow, and interrelationship of activities It should discuss the risks of your method, and indicate why your success is probable Relate what is unique about your approach.
Data Analysis Data analysis is essentially a four-step process 1. Identify precisely what will be evaluated. If you wrote measurable objectives, you already know. 2. Determine the methods used to evaluate each objective. More precisely, you will need to describe the information you will need and how you propose to collect it. 3. Specify the analyses you plan to make and the data you need to collect. Your design may be simply to observe the behavior of a particular population, or something more complex like a rigorous experimental and multiple-control-group design. 4. Summarize the resulting data analyses and indicate their use. Consider mock data tables that show what your resulting data might look like.
Key Questions to Answer for Research Design/Data Analysis Does the research design and data analysis section –Describe why analysis is needed in the project? –Clearly identify the purpose of your analysis? –Demonstrate that an appropriate analysis procedure is included for each project objective? –Provide a general organizational plan or model? –Demonstrate what information will be needed to complete the analysis, the potential sources, and the instruments that will be used to collect it?
Writing Tips for Research Design Begin with your objectives Describe the precise steps you will follow to carry out each objective, including what will be done, and who will do it. Keep asking and answering the “What’s next?” question. Once you have determined the sequence of events, cast the major milestones into a time-and-task chart
Scientific Writing Prosaic – clear, accurate, but not dull Economy – every sentence necessary, but not to the point of over-condensing Egoless – you are writing for the readers, not yourself
Scientific Tone Objective and accurate To inform, not entertain Do not over-qualify (hedging every claim with caveats and cautions) Never use idioms like “crop up”, “lose track”, “it turned out that”, etc. Use examples if they aid in clarification
Scientific Motivation Brief summaries at the beginning and end of each section The connection between one paragraph and the next should be obvious Make sure your reader has sufficient knowledge to understand what follows
Other Writing Issues The upper hand – inclusion of offhand remarks like “…this is a straightforward application…” Obfuscation – the aim is to give an impression of having done something without actually claiming to have done it Analogies – only worthwhile if they significantly reduce the work of understanding; most of the time bad analogies lead the reader astray
Writing Issues Straw men – an indefensible hypothesis posed for the sole purpose of being demolished –“it can be argued that databases do not require indexes” Also used to contrast a new idea with some impossibly bad alternative, to put the new idea in a favorable light
Unsubstantiated Claims Example: –Most users prefer the graphical style of interface. –Better: We believe that … Example: –Another possibility would be a disk-based method, but this approach is unlikely to be successful. –Better: Another …, but our experience suggests that …
References and Citation Up-to-date Relevant (no padding) Original source First order: books and journal articles Second order: conference articles Third order: technical reports No private communications or forums (material cannot be accessed or verified); if you must, cite in a footnote, not in the bibliography Do not cite support for common knowledge
Reference and Citation Carefully relate your new work to existing work; show how your work builds on previous knowledge and how it differs from other relevant results. References demonstrate claims of novelty, show knowledge of the research area, and give pointers to background reading.
Citation Style References should not be anonymous –Not “Other work has …” but “Marsden has …” In self-references, readers should know that you are using yourself to support your argument, not independent authorities Avoid unnecessary discussion of references: “Several authors …”, “we cite …”
Citation style Ordinal-number style, name-and-date style, superscripted ordinal numbers, and string labels. Use any one, but use one! Entries ordered –by appearance of citation –alphabetically
Quotation Text from another source If short – enclose in double quotes If long – set aside in an indented block Long quotations and reproductions of full material (algorithms, figures) may require permission from the publisher and from the author of the original Use of quotes for other reasons is not recommended
Acknowledgements Anyone who made a contribution Advice, proofreading, technical support, funding sources Don’t list your family, unless they really contributed to the scientific content
Ethics Don’t –Present opinions as fact –Distort truths –Plagiarize –Imply that previously published results are original –Papers available on the internet: an author may post an informal version that is later accepted as a formal publication; it is expected that the informal version will then be removed
Notes on Research Design You have decided –What the problem is –What the study goals are –Why it is important for you to do the study Now you will construct the research design which describes what you are going to do in technical terms.
General Structure of Research Proposals
Research Design Is a plan for selecting the sources and types of information used to answer the research question. Is a framework for specifying the relationships among the study’s variables Is a blueprint that outlines each procedure from the hypothesis to the analysis of data.
Research Design The research design will provide information for tasks such as Sample selection and size Data collection method Instrumentation Procedures Ethical requirements Rejected alternative designs
Classification of Research Designs Exploratory or formal Observational or communication based Experimental or ex post facto Descriptive or causal Cross-sectional or longitudinal Case or statistical study Field, laboratory or simulation
Exploratory or Formal Exploratory studies tend toward loose structures with the objective of discovering future research tasks –Goal – to develop hypotheses or questions for further research A formal study begins where the exploration leaves off, starting from the hypothesis or research question –Goal – test the hypothesis or answer the research question posed
Observational or Communication Based Observational studies – the researcher inspects the activities of a subject or the nature of some material without attempting to elicit responses from anyone. Communication-based studies – the researcher questions the subjects and collects responses by personal or impersonal means.
Experimental or Ex Post Facto In an experiment the researcher attempts to control and/or manipulate the variables in the study. Experimentation provides the most powerful support possible for a hypothesis of causation. With an ex post facto design, investigators have no control over the variables in the sense of being able to manipulate them; they report only what has happened or what is happening. It is important that researchers do not influence the variables.
Descriptive or Causal If the research is concerned with finding out who, what, where, when or how much, then the study is descriptive. If it is concerned with finding out why – how one variable produces changes in another – then it is causal.
Cross-sectional or Longitudinal Cross-sectional are carried out once and represent a snapshot of one point in time. Longitudinal are repeated over an extended period
Case or Statistical Study Statistical studies are designed for breadth rather than depth. They attempt to capture a population’s characteristics by making inferences from a sample’s characteristics. Case studies – full contextual analysis of fewer events or conditions and their interrelations. (Remember that a universal can be falsified by a single counter-instance)
Field, Laboratory or Simulation Designs differ in the actual environmental conditions under which the research is conducted
Quantitative v. Qualitative Approaches Categorize research studies into two broad categories Quantitative – relationships among measured variables, for the purpose of explaining, predicting and controlling phenomena Qualitative – answer questions about the complex nature of phenomena, with the purpose of describing and understanding them from the participants’ point of view
The Validity of Your Method Accuracy, meaningfulness, and credibility Most important questions: –Does the study have sufficient controls to ensure that the conclusions we draw are truly warranted by the data? (internal validity) –Can we use what we have observed in the research situation to make generalizations about the world beyond that specific situation? (external validity)
Strategies to reduce internal validity problems Controlled laboratory study A double-blind experiment Unobtrusive measures (to see where people use the library, look at worn flooring) Triangulation – multiple sources of data
Strategies to enhance external validity A real-life setting – artificial settings may be quite dissimilar from real-life circumstances Representative sample Replication in a different context
Formal Notion of Validity “The best available approximation to the truth of a given proposition, inference, or conclusion” Source: Research Methods Knowledgebase
Types of Validity Conclusion Validity: –Is there a relationship between the two variables? Internal Validity: –Assuming that there is a relationship, is it a causal one? Construct Validity: –Assuming that there is a causal relationship, can we claim that the program reflected our construct of the program and that our measure reflected well our idea of the construct of the measure? External Validity: –Can we generalize the (causal) effect to other settings, domains, persons, places or times?
Types of Validity [figure omitted] Source: Research Methods Knowledgebase
Statement of the Problem Goal: establishes the setting of the problem and the hypothesis Additional information needed to comprehend fully the meaning of the problem: scope, definitions, assumptions
Hypotheses A tentative proposition formulated for empirical testing A means for guiding and directing –kinds of data to be collected –analysis and interpretation Hypotheses have nothing to do with proof; acceptance or rejection is dependent on the data
Examples of Hypotheses Error-based pruning reduces the size of decision trees (as measured in the number of nodes) without decreasing accuracy (as measured by error rate) The use of relevance feedback in an information retrieval system results in more effective information discovery by users (as measured in terms of time to task completion) The proposed approach for generating item recommendations based on association rule discovery on purchase histories results in more accurate predictions of future purchases when compared to the baseline approach. [From a recent Google experiment] Longer documents tend to be ranked more accurately than shorter documents because their topics can be estimated with lower variance.
Rejecting the Hypothesis Often researchers set out to disprove an opposite/competing hypothesis Example: We believe that test strategy A uncovers more faults than test strategy B. So our hypothesis will be that –Programmers using test strategy A will uncover more faults than programmers using test strategy B for the same program.
Rejecting the Hypothesis However, we cannot actually prove this hypothesis; instead, we will try to disprove an opposite hypothesis –There will be no difference in the fault detection rate of programmers using test strategy A and those using test strategy B for the same program.
Types of Hypotheses Existential –An entity or phenomenon exists (perhaps with a specified frequency) –“Atoms contain uncharged subatomic particles (neutrons)” Compositional –An entity or phenomenon consists of a number of related parts or components (perhaps with a specified frequency) –“Atoms consist of protons, electrons, and neutrons.” –“All decision tree algorithms can be divided into a growing phase and a pruning phase.”
Types of Hypotheses Correlational –Two measurable quantities have a specified association –“An element’s atomic weight and its properties are correlated.” –“The size of a decision tree constructed using error-based pruning grows linearly with the size of the training set.” Causal –A given behavior has a specified causal mechanism –“The low reactivity of noble gases is caused by their full outer shell of valence electrons.” –“The use of relevance feedback results in more effective information discovery by users”
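A correlational hypothesis like the decision-tree example above can be checked by computing a correlation coefficient over measurements. The sketch below computes Pearson’s r in plain Python; the training-set and tree sizes are invented for illustration, not real experimental data:

```python
import statistics

# Hypothetical measurements: training-set size vs. resulting
# decision-tree size (node count). Numbers are illustrative only.
train_sizes = [100, 200, 400, 800, 1600]
tree_sizes  = [15, 28, 55, 110, 230]

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(train_sizes, tree_sizes)
print(r > 0.99)  # near-perfect linear association in this mock data
```

A value of r near 1 is evidence for the specified linear association; stating the metric (node count) in the hypothesis is what makes this test possible.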
Rejecting the Hypothesis If there is a significant difference in the fault detection rate, we can reject the “no difference” hypothesis and, by default, support our research hypothesis. The “no difference” hypothesis is called the null hypothesis.
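One way to test a “no difference” null hypothesis without distributional assumptions is a permutation test: if the group labels are meaningless, random relabelings should produce differences as large as the observed one reasonably often. The fault-detection rates below are invented for illustration:

```python
import random
import statistics

# Hypothetical fault-detection rates (faults found per hour) for two
# groups of programmers; the numbers are illustrative, not real data.
strategy_a = [4.1, 3.8, 4.5, 4.9, 4.2, 4.6, 3.9, 4.4]
strategy_b = [3.2, 3.5, 2.9, 3.8, 3.1, 3.4, 3.0, 3.6]

observed = statistics.mean(strategy_a) - statistics.mean(strategy_b)

# Permutation test: shuffle the pooled data and count how often a random
# relabeling yields a difference at least as large as the observed one.
random.seed(0)
pooled = strategy_a + strategy_b
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:8]) - statistics.mean(pooled[8:])
    if diff >= observed:
        count += 1
p_value = count / trials

# A small p-value lets us reject the null hypothesis and, by default,
# support the research hypothesis that strategy A uncovers more faults.
print(p_value < 0.05)
```

The same logic underlies the classical t-test; the permutation version is shown here only because it is easy to state and verify directly.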
Recall: Falsifiability Falsifiability is the logical possibility that an assertion can be shown to be false by evidence Does not mean “false.” Instead, if a falsifiable proposition is false, its falsehood can be shown by experimentation, proof, or simulation. There are different degrees of falsifiability What makes a hypothesis unfalsifiable? –Vagueness – the theory does not predict any particular experimental outcome –Complexity/Generality – the theory “explains” any experimental result –Special pleading – traditional experimental methods are claimed not to apply
Delimiting the Research This is what the researcher does not want to do in the project –Should be stated clearly and explicitly. What will be done is part of the problem statement.
Experiments Studies involving intervention by the researcher beyond that required for measurement Usually, the researcher manipulates some variable in a setting and observes how it affects the subject (cause and effect) There is at least one independent variable and one dependent variable
Independent Variable Variable the researcher manipulates For our hypothesis concerning test strategies, we may take a sample of software engineers and randomly assign each to one of two groups: one using test strategy A and the other test strategy B. Later we compare the fault detection rate in the two groups. –We are manipulating the test strategy, thus it is the independent variable
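The random assignment described above can be sketched in a few lines; the engineer names are hypothetical placeholders, and the fixed seed is only for reproducibility of the sketch:

```python
import random

# Hypothetical sample of software engineers to be randomly assigned.
engineers = [f"engineer_{i}" for i in range(10)]

random.seed(42)           # fixed seed so the sketch is reproducible
random.shuffle(engineers)  # random order removes selection bias

# Split the shuffled sample in half: one group per test strategy.
half = len(engineers) // 2
group_a = engineers[:half]   # will use test strategy A
group_b = engineers[half:]   # will use test strategy B

print(len(group_a), len(group_b))  # 5 5
```

Because assignment is random rather than self-selected, any later difference in fault detection rate between the groups can be attributed to the manipulated test strategy rather than to pre-existing differences.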
Dependent Variable Variable that is potentially influenced by the independent variable in our last example, the dependent variable is fault detection rate Presumably the fault detection rate is influenced by test strategy applied there can be more than one dependent variable
Conducting an Experiment Seven activities –select relevant variables –specify the level(s) of treatment –control the experimental environment –choose the experimental design –select and assign the subjects –pilot-test, revise, and test –analyze the data
Select the Relevant Variables Translate our problem into the hypothesis that best states the objectives of the research how concepts are transformed into variables to make them measurable and subject to testing research question: –Does a product presentation that describes product benefits in the introduction lead to improved retention of the product knowledge?
The Speculation Product presentations in which the benefits module is placed in the introduction of a 12-minute message produce better retention of product knowledge than those where the benefits module is placed in the conclusion.
Researcher’s Challenge Select variables that are the best operational representations of the original concepts. –Sales presentation, product benefits retention, product knowledge Determine how many variables to test –constrained by budget, the time allocated, the availability of appropriate controls, and the number of subjects
Researcher’s Challenge Select or design appropriate measures/metrics for them –Requires a thorough review of the available literature and instruments –Adapt them to the unique needs of the research situation
Choosing an Experimental Design Experimental designs are unique to the experimental method statistical plans to designate relationships between experimental treatments and the experimenter’s observations improve the probability that the observed change in the dependent variable was caused by the manipulation of the independent variable
Validity in Measurements Validity: the extent to which an instrument measures what it is supposed to measure –E.g., thermometer -> temperature –E.g., IQ test -> intelligence? –E.g., CPU time -> algorithm complexity or efficiency
Reliability of Measurement Reliability: accuracy and consistency by which the instrument can perform measurement –Accuracy exists only if there is consistency (not necessarily the other way around) –Need to measure more than once –Reliability is a necessary but not sufficient condition for validity
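The point that reliability is necessary but not sufficient for validity can be illustrated with a small simulation: a consistently biased instrument is reliable yet not valid, while an unbiased but noisy one is not even reliable. All numbers below are invented for illustration:

```python
import random
import statistics

random.seed(1)
true_temp = 20.0  # hypothetical true value being measured

# Instrument 1: consistent but systematically off by +3 degrees.
# Small spread -> reliable; large bias -> not valid.
biased = [true_temp + 3.0 + random.gauss(0, 0.05) for _ in range(100)]

# Instrument 2: unbiased on average but very noisy.
# Large spread -> not reliable, so no single reading can be trusted.
noisy = [true_temp + random.gauss(0, 4.0) for _ in range(100)]

print(statistics.stdev(biased) < statistics.stdev(noisy))   # True: biased instrument is consistent
print(abs(statistics.mean(biased) - true_temp) > 1.0)       # True: yet systematically wrong
```

Measuring more than once, as the slide recommends, is exactly what exposes the difference: repetition reveals spread (reliability), while comparison against a known reference reveals bias (validity).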