1 National Science Foundation Project Evaluation and Assessment
Dennis W. Sunal, Susan E. Thomas
Alabama Science Teaching and Learning Center, College of Education, University of Alabama
2 Evaluation and Assessment of National Science Foundation Projects
Outline:
Consider your present approach and difficulties toward evaluation
Consider NSF's perception of evaluation difficulties
Explore the process and skills associated with proficiently planning a project evaluation
Investigate difficulties with a case study example
Create your own evaluation plan
3 Group Activity:
Form small groups of two or three.
Group members will list "project evaluation" difficulties they have recognized in the past.
Create a question from one of the listed items: "What is the difficulty?"
4 Difficulties with NSF Project Evaluation
Whole Audience Activity:
Consider the question: What are the critical areas that define "evaluation difficulties" in your projects?
5 What Common Difficulties Has NSF Recognized with Project Evaluations?
6 NSF-Diagnosed Difficulties
Lack of funding planned for evaluation
Timeframe too short (e.g., outcomes extend beyond the life of the project, such as following a graduate student for years)
Use of only quantitative methods
Lack of knowledge of evaluation techniques
Did not use available information about what has worked in other projects
Data collection not extended over time
Ambiguous requirements and/or outcomes
7 Evaluation
Evaluation has varied definitions. The accepted definition for NSF projects: "Systematic investigation of the worth or merit of an object…" (Joint Committee on Standards for Educational Evaluation)
8 NSF Rationale and Purpose of Conducting Project Evaluation
Develop a diverse, internationally competitive, and globally engaged workforce of scientists, engineers, and well-prepared citizens
Enable discoveries across the frontiers of science and engineering connected to learning, innovations, and service to society
Provide broadly accessible, state-of-the-art information bases and shared research and education tools
9 How NSF Thinks About Evaluation
A component that is an integral part of the research and development process
It is not something that comes at the end of the project
It is a continuous process that begins during planning
Evaluation is performed regularly and iteratively during the project and is completed at the end of the project
Different questions are appropriate at different phases of the project
10 Evaluation Can Be Accomplished in Different Ways
There is no single model that can be universally applied
Data gathered may be quantitative, qualitative, or both
11 Important Criteria for Evaluation
Focus on an important question about what is being accomplished and learned in the project
Emphasize gathering data that can be used to identify necessary mid-project changes
Plan a strong evaluation design (e.g., with comparison groups and well-chosen samples) that clearly addresses the main questions and rules out threats to validity
Use sound data collection instruments appropriate to the questions addressed
Establish procedures to assure the evaluation is carried out objectively and sources of bias are eliminated
Use data analysis appropriate to the questions asked and the data collection methodologies being used
Set a reasonable budget given the size of the project, about 5 to 10 percent
12 NSF Expects
Grantee will clearly lay out an evaluation plan in the proposal
Refine the plan after the award
Include in the final report a separate section on the evaluation, its purpose, and what was found
In some cases a separate report or interim reports may be expected
13 Example RFP: Faculty Early Career Development
Proposal Content
A. Project Summary: Summarize the integrated education and research activities of the plan.
B. Project Description: Provide results from prior NSF support. Provide a specific proposal for activities over 5 years that will build a firm foundation for a lifetime of integrated contributions to research and education.
14 Plan for Development Should Include:
The objectives and significance of the research and education activities
The relationship of the research to current knowledge in the field
An outline of the plan, including evaluation of the educational activities on a yearly basis
The relationship of the plan to career goals and objectives
A summary of prior research and educational accomplishments
15 NSF Merit Review Broader Impacts Criterion: Representative Activities
NSF criteria relate to 1) intellectual merit and 2) broader impacts.
Criteria: Broader Impacts of the Proposed Activity
Does the activity promote discovery, understanding, teaching, training, and learning?
Does the proposed activity include participants from underrepresented groups?
Does it enhance the infrastructure for research and education?
Will the results be disseminated broadly to enhance scientific and technological understanding?
What are the benefits of the proposed activity to society?
16 1. Advance discovery and understanding while promoting teaching, training, and learning
Integrate research activities into the teaching of science at all levels
Involve students
Participate in recruiting and professional development of teachers
Develop research-based educational materials and databases
Partner researchers and educators
Integrate graduate and undergraduate students
Develop, adapt, or disseminate effective models and pedagogic approaches to teaching
17 2. Broaden participation of underrepresented groups
Establish research and education collaborations with students and teachers
Include students from underrepresented groups
Make visits and presentations on school campuses
Mentor early-career scientists and engineers
Participate in developing new approaches to engage underserved individuals
Participate in conferences, workshops, and field activities
18 3. Enhance infrastructure for research and education
Identify and establish collaborations between disciplines and institutions
Stimulate and support next-generation instrumentation and research and education platforms
Maintain and modernize shared research and education infrastructure
Upgrade the computation and computing infrastructure
Develop activities that ensure multi-user facilities are sites of research and mentoring
19 4. Broaden dissemination to enhance scientific and technological understanding
Partner with museums, science centers, and others to develop exhibits
Involve the public and industry in research and education activities
Give presentations to the broader community
Make data available in a timely manner
Publish in diverse media
Present research and education results in formats useful to policy makers
Participate in multi- and interdisciplinary conferences, workshops, and research activities
Integrate research with education activities
20 5. Benefits to society
Demonstrate linkage between discovery and societal benefit through application of research and education results
Partner with others on projects to integrate research into broader programs
Analyze, interpret, and synthesize research and education results in formats useful to non-scientists
21 Planning Evaluation
Necessary to assess understanding of a project's goals, objectives, strategies, and timelines.
The criteria can also indicate a baseline for measuring success.
22 Planning Evaluation Addresses the Following:
Why was the project developed?
Who are the stakeholders?
What do the stakeholders want to know?
What activities and strategies will address the identified problem?
Where will the program be located?
How long will the program operate?
How much does it cost in relation to outcomes?
What measurable outcomes are to be achieved?
How will data be collected?
23 Two Kinds of Evaluation
Program Evaluation determines the value of a collection of projects (e.g., the Alabama DOE-EPSCoR Program).
Project Evaluation focuses on an individual project and its many components funded under the umbrella of the program (research plan components, educational plan components). It answers a limited number of questions.
24 Types of Evaluation
Formative Evaluation: Implementation, Progress
Summative Evaluation: near the end of a major milestone, or at the end of a project
25 Formative Evaluation
Purpose: assess initial and ongoing project activities
Done regularly at several points throughout the developmental life of the project
Main goal is to check, monitor, and improve: to see if activities are being conducted and components are progressing toward project goals
26 Implementation Evaluation
An early check to see if all elements are in place
Assesses whether the project is being conducted as planned
Done early, several times during a life cycle
Cannot evaluate outcomes or impact unless you are sure components are operating according to plan
Sample question guides:
Were appropriate students selected? Was the make-up of the group consistent with NSF's goal of a more diverse workforce?
Were appropriate recruitment strategies used?
Do activities and strategies match those described in the plan?
27 Progress Evaluation
Assesses progress in meeting goals
Collects information to learn if benchmarks of progress were met, what impact activities have had, and whether there were unexpected developments
Useful throughout the life of the project
Can contribute to summative evaluation
Sample question guides:
Are participants moving toward goals and improving their understanding of the research process?
Are the numbers of students reached increasing?
Is progress sufficient to reach goals?
28 Summative Evaluation
Purpose: assess the quality and impact of a fully mature project, answering:
Was the project successful?
To what extent did the project meet the overall goals?
What components were the most effective?
Were the results worth the project's cost?
Is the project replicable?
29 Characteristics
Collects information about outcomes and impacts and the processes, strategies, and activities that led to them
Needed for decision making: disseminate, continue probationary status, modify, or discontinue
Important to have an external evaluator who is seen as objective and unbiased, or an internal evaluation with an outside agent's review of the design and findings
Consider unexpected outcomes
30 Evaluation Compared to Other Data-Gathering Activities
Evaluation differs from other types of activities that provide information on accountability
Different information serves different purposes
31 Formative vs. Summative
"When the cook tastes the soup, that is formative; when the guest tastes the soup, that is summative."
32 The Evaluation Process
Steps in conducting an evaluation (six phases):
1. Develop a conceptual model of the program and identify key evaluation points
2. Develop evaluation questions and define measurable outcomes
3. Develop an evaluation design
4. Collect data
5. Analyze data
6. Report findings
33 Conditions to Be Met: Common Ways Evaluations Fail
Information gathered is not perceived as valuable or useful (wrong questions asked)
Information gathered is not seen as credible or convincing (wrong techniques used)
Report is late or not understandable (does not contribute to the decision-making process)
34 1. Develop a Conceptual Model
Start with a conceptual model to which an evaluation design is applied; in the case below, a "logic model" is applied. Identify program components and show expected connections among them:
Inputs → Activities → Short-Term Outcomes → Long-Term Outcomes
35 Inputs (Examples): resource streams
NSF funds
Local and state funds
Other partnerships
In-kind contributions
Activities (Examples): services, materials, and actions that characterize project goals
Recruit traditionally underrepresented students
Infrastructure development
Provision of extended standards-based professional development
Public outreach
Mentoring by senior scientists
36 Short-Term Outcomes (Examples)
Effective use of new materials
Numbers of people, products, or institutions reached (e.g., 17 students mentored)
Changes resulting from experience (e.g., impact on choice of major of research RAs)
Long-Term Outcomes (Examples): broader, more enduring impact
Changes in instructional practice leading to enhanced student learning and performance
Selecting a career in NSF-related research activity
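A logic model like the one above can be captured as a simple data structure so that each stage can be tracked during the evaluation. This is a minimal sketch; the field names and example entries are illustrative assumptions drawn from the examples above, not an NSF-prescribed format.

```python
# Minimal, illustrative logic model for a hypothetical NSF project.
# Field names and entries are assumptions for demonstration only.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list = field(default_factory=list)
    activities: list = field(default_factory=list)
    short_term_outcomes: list = field(default_factory=list)
    long_term_outcomes: list = field(default_factory=list)

model = LogicModel(
    inputs=["NSF funds", "In-kind contributions"],
    activities=["Recruit underrepresented students", "Mentoring by senior scientists"],
    short_term_outcomes=["17 students mentored"],
    long_term_outcomes=["Students select careers in NSF-related research"],
)

# Walk the chain Inputs -> Activities -> Short-Term -> Long-Term and
# report how many entries each stage of the model currently has.
for stage in ("inputs", "activities", "short_term_outcomes", "long_term_outcomes"):
    entries = getattr(model, stage)
    print(f"{stage}: {len(entries)} item(s)")
```

Laying the model out this way makes the expected connections between stages explicit and makes it easy to notice when a stage (say, long-term outcomes) has not yet been defined.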
37 Next steps:
Determine, review, and/or clarify the timeline
Identify critical achievements and the times by which they need to be met
38 2. Question Development
Identify key stakeholders and audiences early to help shape questions. Multiple audiences exist (scientists, NSF, students, administration, community, …).
Formulate potential evaluation questions of interest, considering stakeholders and audiences.
Define outcomes in measurable terms, including criteria for success.
Determine feasibility; prioritize and eliminate questions.
39 Questions to Consider When Developing an Evaluation Approach
Who is the information for, and who will use the findings?
What kinds of information are needed?
How is the information to be used?
When is the information needed?
What resources are available to conduct the evaluation?
Given the answers to the preceding questions, what methods are appropriate?
40 2. …Defining Measurable Outcomes
1. Briefly describe the purpose of the project.
2. State it in terms of a general goal.
3. State an objective to be evaluated as clearly as you can.
4. Can this objective be broken down further?
5. Is the objective measurable? If not, restate it.
6. Once you have completed the above steps, go back to step 3 and write the next objective. Continue with steps 4, 5, and 6.
41 3. Develop an Evaluation Plan
Select a methodological approach and data collection instruments
Quantitative and qualitative approaches lead to different questions asked, timeframes, skills needed, and types of data seen as credible
Determine who will be studied and when: sampling, use of comparison groups, timing, sequencing, frequency of data collection, and cost
42 4. Data Collection
Obtain necessary clearance and permission; consider the needs and sensitivities of the respondents.
Make sure your data collectors are adequately trained and will operate in an objective, unbiased manner.
Obtain data from as many members of your sample as possible.
Cause as little disruption as possible to the ongoing effort.
43 4. Data Collection Sources and Techniques
Checklists or inventories
Rating scales
Semantic differentials
Questionnaires
Interviews
Written responses
Samples of work
Tests
Observations
Audiotapes
Videotapes
Time-lapse photographs
44 5. Analysis of Data
Check raw data and prepare data for analysis.
Conduct initial analysis based on the evaluation plan.
Conduct additional analysis based on the initial results.
Integrate and synthesize findings.
Develop conclusions regarding what the data show.
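As a concrete illustration of the first two steps above, here is a short sketch of an initial analysis pass over questionnaire ratings. The data and the 1-5 rating scale are hypothetical, invented for demonstration only.

```python
# Hypothetical 1-5 questionnaire ratings from project participants;
# None marks a missing response. Values are invented for illustration.
ratings = [4, 5, 3, 4, 2, 5, 4, None, 3, 4]

# Step 1: check raw data and prepare it -- drop missing responses.
clean = [r for r in ratings if r is not None]

# Step 2: initial analysis based on the evaluation plan --
# simple descriptive statistics on the cleaned responses.
mean_rating = sum(clean) / len(clean)
print(f"n = {len(clean)}, mean rating = {mean_rating:.2f}")
```

Additional analysis (step 3) would then follow from what these initial results show, for example comparing subgroups or tracking the mean across collection points.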
45 6. Reporting
Final reports typically include six major sections:
Background
Evaluation study questions
Evaluation procedures
Data analysis
Findings
Conclusions (and recommendations)
46 a. Background
The background section describes:
The problem or needs addressed
A literature review (if relevant)
The stakeholders and their information needs
The participants
The project's objectives
The activities and components
Location and planned longevity of the project
The project's expected measurable outcomes
47 b. Evaluation Study Questions
Describes and lists the questions the evaluation addressed, based on:
The need for specific information
The stakeholders
48 c. Evaluation Procedures
This section describes the groups and types of data collected and the instruments used for the data collection activities. For example:
Data for identified critical indicators
Ratings obtained in questionnaires and interviews
Descriptions of activities from observations of key instrumental components of the project
Examination of extant data records
49 d. Data Analysis
Describes the techniques used to analyze the data collected
Describes the various stages of analysis that were implemented
Describes checks that were carried out to make sure the data were free of as many confounding factors as possible
Contains a discussion of the techniques used
50 e. Findings
Presents the results of the analysis described previously
Organized in terms of the questions presented in the section on evaluation study questions
Provides a summary that presents the major conclusions
51 f. Conclusions
Reports the findings with more broad-based and summative statements
Statements must relate to the findings of the project's evaluation questions and to the goals of the overall program
Sometimes includes recommendations for NSF or for others undertaking projects similar in goals, focus, and scope
Recommendations must be based solely on robust, data-based findings, not on anecdotal evidence
52 Other Sections
Abstract: a summary of the study and its findings, presented in approximately one half page of text
Executive summary: a summary, which may be as long as 4 to 10 pages, that provides an overview of the evaluation, its findings, and implications
53 Formal Report Outline
Summary sections:
Abstract
Executive summary
Background:
Problems or needs addressed
Literature review
Stakeholders and their information needs
Participants
Project's objectives
Activities and components
54 Location and planned longevity of the project
Resources used to implement the project
Constraints
Evaluation study questions:
Questions addressed by the study
Questions that could not be addressed by the study
Evaluation procedures:
Sample: selection procedures; representativeness of the sample
Data collection: methods; instruments
55 Summary matrix:
Evaluation questions
Variables
Data-gathering approaches
Respondents
Data collection schedule
Findings:
Results of the analysis organized by study questions
Conclusions:
Broad-based, summative statements
Recommendations, when applicable
56 Disseminating the Information
Consider what various groups need to know and the best manner for communicating information to them.
Audiences:
Funding sources and potential funding sources
Others involved with similar projects or areas of research
Community members, especially those who are directly involved with the project or might be involved
Members of the business or political community, etc.
57 Finding an Evaluator
University setting: contact department chairs for availability of staff skilled in project evaluation
Independent contractors: department chairs, phone book, state departments, private foundations (e.g., the Kellogg Foundation in Michigan), and other local colleges and universities will be cognizant of available services
Contact other researchers or peruse research and evaluation reports
58 Overview of Quantitative and Qualitative Data Collection Methods
59 Data Collection Methods: Some Tips and Comparisons
Theoretical issues
Practical issues
Using the mixed-method approach
60 Theoretical Issues
The value of the types of data
The relative scientific rigor of the data
Basic, underlying philosophies of evaluation
61 Practical Issues
Credibility of findings
Staff skills
Costs
Time constraints
62 Using the Mixed-Method Approach
Methodology: Qualitative → Quantitative → Qualitative
Data collection approach: Exploratory work → Survey → Personal interview, focus group
63 Review and Comparison of Selected Techniques
Surveys
Interviews
Focus groups
Observations
Tests
Other methods: document studies, key informants, case studies
64 Surveys
Advantages:
Good for gathering descriptive data
Can cover a wide range of topics
Are relatively inexpensive to use
Can be analyzed using a variety of existing software
Disadvantages:
Self-report may lead to biased reporting
Data may provide a general picture but lack depth
May not provide adequate information on context
65 Interviews
Use interviews to answer the following questions:
What does the program look and feel like to the participants? To other stakeholders?
What do stakeholders know about the project?
What thoughts do stakeholders knowledgeable about the program have concerning program operations, processes, and outcomes?
What are the participants' and stakeholders' expectations?
What features of the project are most salient to the participants?
What changes do participants perceive in themselves as a result of their involvement in the project?
66 Interviews
Advantages:
Usually yield the richest data, details, and new insights
Permit face-to-face contact with respondents
Provide opportunity to explore topics in depth
Allow the interviewer to experience the affective as well as cognitive aspects of responses
Allow the interviewer to explain or help clarify questions, increasing the likelihood of useful responses
Disadvantages:
Expensive and time-consuming
Need well-qualified, highly trained interviewers
Interviewee may distort information through recall error, selective perception, or desire to please the interviewer
Flexibility can result in inconsistencies across interviews
Volume of information is very large; may be difficult to transcribe and reduce data
67 Focus Groups
When to use focus groups:
Identifying and defining problems in project implementation
Pre-testing topics or ideas
Identifying project strengths, weaknesses, and recommendations
Assisting with interpretation of quantitative findings
Obtaining perceptions of project outcomes and impacts
Generating new ideas
68 Observations
Advantages:
Provide direct information about behavior of individuals and groups
Permit the evaluator to enter into and understand the situation/context
Provide good opportunities for identifying unanticipated outcomes
Exist in natural, unstructured, and flexible settings
Disadvantages:
Expensive and time-consuming
Need well-qualified, highly trained observers; may need to be content experts
May affect behavior of participants
Selective perception of the observer may distort data
Behavior or set of behaviors observed may be atypical
69 Tests
Advantages:
Provide objective information on what the test taker knows and can do
Can be constructed to match a given curriculum or set of skills
Can be scored in a straightforward manner
Are accepted by the public as a credible indicator of learning
Disadvantages:
May be oversimplified and superficial
May be very time-consuming
May be biased against some groups of test takers
May be subject to corruption via coaching or cheating
70 Other Methods – Document Studies
Advantages:
Available locally
Inexpensive
Grounded in the setting and language in which they occur
Useful for determining value, interest, positions, political climate, public attitudes
Provide information on historical trends or sequences
Provide opportunity for study of trends over time
Unobtrusive
Disadvantages:
May be incomplete
May be inaccurate or of questionable authenticity
Locating suitable documents may pose challenges
Analysis may be time-consuming and access may be difficult
71 Other Methods – Key Informants
Advantages:
Information concerning causes, reasons, and/or best approaches is gathered from an "insider" point of view
Advice/feedback increases credibility of the study; provides a pipeline to pivotal groups
May have the side benefit of solidifying relationships among evaluators, clients, participants, and other stakeholders
Disadvantages:
Time required to select informants and get their commitment may be substantial
Relationship between evaluator and informants may influence the type of data obtained
Informants may interject their own biases and impressions
Disagreements among individuals may be hard to resolve
72 Other Methods – Case Studies
Advantages:
Provide a rich picture of what is happening, as seen through the eyes of many individuals
Allow a thorough exploration of interactions between treatment and contextual factors
Can help explain changes or facilitating factors that might otherwise not emerge from the data
Disadvantages:
Require a sophisticated and well-trained data collection and reporting team
Can be costly in terms of the demands on time and resources
Individual cases may be over-interpreted or over-generalized
73 Evaluation Application Activity
Group Activity:
Form small groups and assign roles.
All group members should read the "Sample Proposal Outline" on the next set of slides.
Develop an evaluation plan for the NSF project, considering the question: "What should be considered?"
74 Proposal Outline
Sample Research Proposal, CAREER: Fundamental Micromechanics and Materials Dynamics of Thermal Barrier Coating Systems Containing Multiple Layers
A. Research Plan
1. Introduction
General background
General definitions
Explanation of procedures
Benefits
Objectives:
Characterize the dynamics of the process
Monitoring techniques
Development of models
75 2. Proposed Research
Introduction
Experimental approach
Materials
Micro-structural characterization
Mechanical properties
Micro-mechanical characterization – nano-indentation
Bulk mechanical properties
Residual stresses and techniques
Modeling
3. Prior Research Accomplishments
4. Significance and Impact of Research
5. Industrial Interest
76 B. Education Plan
1. Objectives
Enhance the undergraduate curriculum
Encourage the best undergraduate students to pursue graduate studies
Increase diversity by attracting underrepresented minority students
77 2. Education Activities
The education of undergraduate and graduate students in materials and mechanical characterization and laboratory report preparation
Encourage undergraduates to pursue graduate work
Actively recruit undergraduate minority students
3. Teaching Activities
4. Teaching & Education Accomplishments
78 References
NOVA Web Site (NASA Opportunities for Visionary Academics)
NSF, Division of Research, Evaluation and Communication (2002). The 2002 User-Friendly Handbook for Project Evaluation.
Henson, K. (2004). Grant Writing in Higher Education. Boston: Pearson.
Knowles, C. (2002). The First-Time Grant Writer's Guide to Success. Thousand Oaks, CA: Corwin Press.
Burke, M. (2002). Simplified Grant Writing. Thousand Oaks, CA: Corwin Press.
NSF (2004). A Guide for Proposal Writing (NSF 04-016). Criteria for evaluation: intellectual merit and broader impacts.
79 NSF: The 2002 User-Friendly Handbook for Project Evaluation, a basic guide to quantitative and qualitative evaluation methods for educational projects
NSF: User-Friendly Handbook for Mixed Method Evaluations, a monograph "initiated to provide more information on qualitative [evaluation] techniques and ... how they can be combined effectively with quantitative measures"
Online Evaluation Resource Library (OERL) for NSF's Directorate for Education and Human Resources, a collection of evaluation plans, instruments, reports, glossaries of evaluation terminology, and best practices, with guidance for adapting and implementing evaluation resources
Field-Tested Learning Assessment Guide (FLAG) for Science, Math, Engineering, and Technology Instructors, a collection of "broadly applicable, self-contained modular classroom assessment techniques and discipline-specific tools for ... instructors interested in new approaches to evaluating student learning, attitudes, and performance."