 Most of the information presented in this workshop represents the presenters' opinions and not an official NSF position
 Local facilitators will provide the link to the workshop slides at the completion of the webinar
 Participants may ask questions by "raising their virtual hand" during a question session. We will call on selected sites and enable their microphone so that the question can be asked
 Responses will be collected from a few sites at the end of each Exercise. At the start of the Exercise, we will identify these sites in the Chat Box and then call on them one at a time to provide their responses

 Learning must build on prior knowledge
◦ Some knowledge correct
◦ Some knowledge incorrect – misconceptions
 Learning is
◦ Connecting new knowledge to prior knowledge
◦ Correcting misconceptions
 Learning requires engagement
◦ Actively recalling prior knowledge
◦ Sharing new knowledge
◦ Forming a new understanding

 Effective learning activities
◦ Recall prior knowledge -- actively, explicitly
◦ Connect new concepts to existing ones
◦ Challenge and alter misconceptions
 Active & collaborative processes
◦ Think individually
◦ Share with partner
◦ Report to local and virtual groups
◦ Learn from program directors' responses

 Coordinate the local activities
 Watch the time
◦ Allow for think, share, and report phases
◦ Reconvene on time -- 1 min warning slide
 Ensure the individual think phase is devoted to thinking and not talking
 Coordinate the asking of questions by local participants and reporting local responses to exercises

 Long Exercise ~6 min total
◦ Think individually ~2 min
◦ Share with a partner ~2 min
◦ Report in local group ~2 min
 Short Exercise ~4 min total
◦ Think individually ~2 min
◦ Report in local group ~2 min
 Individual Exercise ~2 min
 Questions
 Reports to Virtual Group

The session will enable you to collaborate more effectively with evaluation experts in preparing credible and comprehensive project evaluation plans; it will not make you an evaluation expert.

After the session, participants should be able to:
 Discuss the importance of goals, outcomes, and questions in the evaluation process
◦ Cognitive and affective outcomes
 Describe several types of evaluation tools
◦ Advantages, limitations, and appropriateness
 Discuss data interpretation issues
◦ Variability, alternative explanations
 Develop an evaluation plan in collaboration with an evaluator
◦ Outline a first draft of an evaluation plan

 The terms evaluation and assessment have many meanings
◦ One definition
▪ Assessment is gathering evidence
▪ Evaluation is interpreting data and making value judgments
 Examples of evaluation and assessment
◦ Individual's performance (grading)
◦ Program's effectiveness (ABET and regional accreditation)
◦ Project's progress and success (monitoring and validating)
 This session addresses project evaluation
◦ May involve evaluating individual and group performance – but in the context of the project
 Project evaluation
◦ Formative – monitoring progress to improve the approach
◦ Summative – characterizing and documenting final accomplishments

Project Goals, Expected Outcomes, and Evaluation Questions

 Effective evaluation starts with carefully defined project goals and expected outcomes
 Goals and expected outcomes relate to:
◦ Project management
▪ Initiating or completing an activity
▪ Finishing a "product"
◦ Student behavior
▪ Modifying a learning outcome
▪ Modifying an attitude or a perception

 Goals provide overarching statements of project intention: What is your overall ambition? What do you hope to achieve?
 Expected outcomes identify specific observable or measurable results for each goal: How will achieving your "intention" be reflected by changes in student behavior? How will it change their learning and their attitudes?

 Goals → Expected outcomes
 Expected outcomes → Evaluation questions
 Questions form the basis of the evaluation process
 The evaluation process consists of the collection and interpretation of data to answer evaluation questions

 Read the abstract -- goal statement removed
 Suggest two plausible goals
◦ One on student learning (cognitive behavior)
◦ One on some other aspect of student behavior (affective behavior)
 Focus on what will happen to the students
◦ Do not focus on what the instructor will do
 Long Exercise ~6 min
◦ Think individually ~2 min
◦ Share with a partner ~2 min
◦ Report in local group ~2 min
 Watch time and reconvene after 6 min
 Use THINK time to think – no discussion
 Selected local facilitators report to virtual group

The goal of the project is ……

The project is developing computer-based instructional modules for statics and mechanics of materials. The project uses 3D rendering and animation software, in which the user manipulates virtual 3D objects in much the same manner as they would physical objects. Tools being developed enable instructors to realistically include external forces and internal reactions on 3D objects as topics are being explained during lectures. Exercises are being developed for students to be able to communicate with peers and instructors through real-time voice and text interactions. The project is being evaluated by … The project is being disseminated through … The broader impacts of the project are …

Non-engineers should substitute:
◦ "Organic chemistry" for "statics and mechanics of materials"
◦ "Interactions" for "external forces and internal reactions"

One Minute

GOAL: To improve conceptual understanding and processing skills
 In the context of the course
◦ Draw free-body diagrams for textbook problems
◦ Solve 3-D textbook problems
◦ Describe the effect(s) of external forces on a solid object orally
 In a broader context
◦ Solve out-of-context problems
◦ Visualize 3-D problems
◦ Communicate technical problems orally
◦ Improve critical thinking skills
◦ Enhance intellectual development

GOAL: To improve
◦ Self-confidence
◦ Attitude about engineering as a career

 Write one expected measurable outcome for each of the following goals:
◦ Improve the students' understanding of the fundamental concepts in statics (cognitive)
◦ Improve the students' self-confidence (affective)
 Individual exercise ~2 minutes
◦ Individually write a response

One Minute

Understanding of the fundamentals
◦ Students will be better able to:
▪ Describe all parameters, variables, and elemental relationships
▪ Describe the governing laws
▪ Describe the effects of changing some variable in a simple problem, e.g.:
   - Changes in the frictional force on a block when the angle of an inclined plane changes
   - Changes in the forces in the members of a simple three-element truss when the connecting angles change
Self-Confidence
◦ Students will:
▪ Do more of the homework
▪ Have less test anxiety
▪ Express more confidence in their solutions
▪ Be more willing to discuss their solutions

 Write an evaluation question for these expected measurable outcomes:
◦ Understanding of the fundamentals: Students will be better able to describe the effects of changing some variable in a simple problem
◦ Self-Confidence: Students will express more confidence in their solutions
 Individually identify a question for each
 Report to the group

One Minute

Understanding of the fundamentals
◦ Are the students better able to describe the effects of changing some variable in a simple problem?
◦ Are the students better able to describe the effects of changing some variable in a simple problem as a result of the intervention?
Self-Confidence
◦ Do the students express more confidence in their solutions?
◦ Do the students express more confidence in their solutions as a result of the intervention?

Tools for Evaluating Learning Outcomes

 Surveys
◦ Forced-choice or open-ended responses
 Concept Inventories
◦ Multiple-choice questions to measure conceptual understanding
 Rubrics for analyzing student products
◦ Guides for scoring student reports, tests, etc.
 Interviews
◦ Structured (fixed questions) or in-depth (free flowing)
 Focus groups
◦ Like interviews but with group interaction
 Observations
◦ Actually monitor and evaluate behavior

Olds et al., JEE 94:13, 2005; NSF's Evaluation Handbook

Surveys  Efficient  Accuracy depends on subject’s honesty  Difficult to develop reliable and valid survey  Low response rate threatens reliability, validity & interpretation Observations  Time & labor intensive  Inter-rater reliability must be established  Captures behavior that subjects are unlikely to report  Useful for observable behavior Olds et al, JEE 94:13, 2005

 Use interviews to answer these questions:
◦ What does the program look and feel like?
◦ What do stakeholders know about the project?
◦ What are stakeholders' and participants' expectations?
◦ What features are most salient?
◦ What changes do participants perceive in themselves?

The 2002 User Friendly Handbook for Project Evaluation, NSF publication REC

 Originated in physics -- the Force Concept Inventory (FCI)
 Several are being developed in engineering fields
 A series of multiple-choice questions
◦ Questions involve a single concept
▪ Formulas, calculations, or problem-solving skills not required
◦ Possible answers include distractors
▪ Common errors -- misconceptions
 Developing a CI is involved
◦ Identify misconceptions and distractors
◦ Develop, test, and refine questions
◦ Establish validity and reliability of the tool
◦ Language is a major issue
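Concept inventory results are often summarized with the class-average normalized gain, <g> = (post% - pre%) / (100% - pre%), i.e., the fraction of the possible pre-to-post improvement actually achieved. The sketch below is only an illustration in plain Python; the answer key and student responses are invented, and a real inventory would have far more items and students.

```python
# Illustrative sketch: score a pre/post concept inventory and compute the
# class-average normalized gain, <g> = (post% - pre%) / (100% - pre%).
# The answer key and response lists are made up for demonstration only.

def score(responses, key):
    """Return the percent of items answered correctly."""
    correct = sum(r == k for r, k in zip(responses, key))
    return 100.0 * correct / len(key)

def normalized_gain(pre_pct, post_pct):
    """Fraction of the possible improvement achieved by the class."""
    if pre_pct >= 100.0:
        return 0.0
    return (post_pct - pre_pct) / (100.0 - pre_pct)

key = ["B", "D", "A", "C", "B"]            # hypothetical 5-item inventory
pre = [["B", "A", "A", "C", "D"],          # one row per student (pretest)
       ["C", "D", "A", "B", "B"]]
post = [["B", "D", "A", "C", "D"],         # same students (posttest)
        ["B", "D", "A", "C", "B"]]

pre_mean = sum(score(s, key) for s in pre) / len(pre)
post_mean = sum(score(s, key) for s in post) / len(post)
print(f"pre = {pre_mean:.0f}%, post = {post_mean:.0f}%, "
      f"gain = {normalized_gain(pre_mean, post_mean):.2f}")
```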

 Pittsburgh Freshman Engineering Survey
◦ Questions about perception
▪ Confidence in their skills in chemistry, communications, engineering, etc.
▪ Impressions about engineering as a precise science, as a lucrative profession, etc.
 Validated using alternate approaches:
◦ Item analysis
◦ Verbal protocol elicitation
◦ Factor analysis
 Compared results for students who stayed in engineering to those who left

Besterfield-Sacre et al., JEE 86:37, 1997
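Item analysis of a survey usually includes an internal-consistency estimate such as Cronbach's alpha. A minimal sketch follows, assuming a small made-up matrix of 1-to-5 Likert responses (real instruments need many more respondents before the number means much):

```python
# Illustrative sketch: Cronbach's alpha as an internal-consistency check.
# alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
# The response matrix below is invented for demonstration only.

import numpy as np

def cronbach_alpha(items):
    """items: respondents x items array of numeric responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

responses = [[4, 5, 4, 3],    # each row: one respondent's answers, 1-5 scale
             [2, 3, 2, 2],
             [5, 5, 4, 4],
             [3, 3, 3, 2],
             [4, 4, 5, 4]]
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```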

 Levels of Intellectual Development
◦ Students see knowledge, beliefs, and authority in different ways
▪ "Knowledge is absolute" versus "Knowledge is contextual"
 Tools
◦ Measure of Intellectual Development (MID)
◦ Measure of Epistemological Reflection (MER)
◦ Learning Environment Preferences (LEP)

Felder et al., JEE 94:57, 2005

 Suppose you were considering an existing tool (e.g., a concept inventory) for use in your project's evaluation of learning outcomes
 What questions would you consider in deciding if the tool is appropriate?
 Long Exercise ~6 min
◦ Think individually ~2 min
◦ Share with a partner ~2 min
◦ Report in local group ~2 min
 Watch time and reconvene after 6 min
 Use THINK time to think – no discussion
 Selected local facilitators report to virtual group

One Minute

 Nature of the tool
◦ Is the tool relevant to what was taught?
◦ Is the tool competency based?
◦ Is the tool conceptual or procedural?
 Prior validation of the tool
◦ Has the tool been tested?
◦ Is there information concerning its reliability and validity?
◦ Has it been compared to other tools?
◦ Is it sensitive? Does it discriminate between a novice and an expert?
 Experience of others with the tool
◦ Has the tool been used by others besides the developer? At other sites? With other populations?
◦ Is there normative data?

Questions: Hold up your "virtual hand" to ask a question.

 Data suggest that the understanding of Concept #2 increased
 One interpretation is that the intervention caused the change
 List some alternative explanations
◦ Confounding factors
◦ Other factors that could explain the change
 Individual Exercise ~2 min
◦ Individually write a response

One Minute

 Students learned the concept out of class (e.g., in another course or in study groups with students not in the course)
 Students answered with what they thought the instructor wanted rather than what they believed or "knew"
 An external event distorted the pretest data
 The instrument was unreliable
 Other changes in the course, and not the intervention, were responsible for the improvement
 The characteristics of the groups were not similar

 Data suggest that the understanding of the concept tested by Q1 did not improve
 One interpretation is that the intervention did cause a change that was masked by other factors
 Think about alternative explanations
 How would these alternative explanations (confounding factors) differ from the previous list?
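Ordinary score variability is itself one of the factors to consider: with a small class, an apparent gain (or an apparently masked one) can sit entirely within the noise. A minimal sketch of one common check follows, assuming matched pre/post scores for the same students (the numbers are invented) and using SciPy's paired t-test. A small p-value only argues against chance variation; it says nothing about the other confounding explanations listed above.

```python
# Illustrative sketch: could an apparent pre-to-post change be explained by
# ordinary variability alone? Paired t-test on per-student scores (made up).

from scipy import stats

pre_scores = [45, 60, 55, 40, 70, 50, 65, 55]    # same students, percent correct
post_scores = [55, 65, 60, 50, 75, 60, 70, 60]

mean_gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
result = stats.ttest_rel(post_scores, pre_scores)  # paired comparison

print(f"mean gain = {mean_gain:.1f} points")
print(f"paired t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```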

Evaluation Plan

 List the topics that need to be addressed in the evaluation plan
 Long Exercise ~6 min
◦ Think individually ~2 min
◦ Share with a partner ~2 min
◦ Report in local group ~2 min
 Watch time and reconvene after 6 min
 Use THINK time to think – no discussion
 Selected local facilitators report to virtual group

One Minute

 Name & qualifications of the evaluation expert
◦ Get the evaluator involved early in the proposal development phase
 Goals, outcomes, and evaluation questions
 Instruments for evaluating each outcome
 Protocols defining when and how data will be collected
 Analysis & interpretation procedures
 Confounding factors & approaches for minimizing their impact
 Formative evaluation techniques for monitoring and improving the project as it evolves
 Summative evaluation techniques for characterizing the accomplishments of the completed project

 Workshop on Evaluation of Educational Development Projects
 NSF's User Friendly Handbook for Project Evaluation
 Online Evaluation Resource Library (OERL)
 Field-Tested Learning Assessment Guide (FLAG)
 Student Assessment of Their Learning Gains (SALG)
 Science education literature

Identify the most interesting, important, or surprising ideas you encountered in the workshop on dealing with project evaluation

Questions: Hold up your "virtual hand" to ask a question.

BREAK: 15 min

BREAK: 1 min

 NSF proposals are evaluated using two review criteria
◦ Intellectual merit
◦ Broader impacts
 Most proposals
◦ Intellectual merit done fairly well
◦ Broader impacts done poorly

 To increase the community's ability to design projects that respond effectively to NSF's broader impacts criterion

 At the end of the workshop, participants should be able to:
◦ List categories for broader impacts
◦ List activities for each category
◦ Evaluate a proposed broader impacts plan
◦ Develop an effective broader impacts plan

Broader Impacts: Categories and Activities

TASK:
◦ What does NSF mean by broader impacts?
 Individual Exercise ~2 min
◦ Individually write a response

One Minute

 Every NSF solicitation has a set of questions that provide context for the broader impacts criterion
 Suggested questions are a guide for considering broader impacts
 Suggested questions are NOT
◦ A complete list of "requirements"
◦ Applicable to every proposal
◦ An official checklist

Will the project…  Advance discovery - promote teaching & learning?  Broaden participation of underrepresented groups?  Enhance the infrastructure?  Include broad dissemination?  Benefit society? NOTE: Broader impacts includes more than broadening participation

Will the project…  Involve a significant effort to facilitate adaptation at other sites?  Contribute to the understanding of STEM education?  Help build and diversify the STEM education community?  Have a broad impact on STEM education in an area of recognized need or opportunity?  Have the potential to contribute to a paradigm shift in undergraduate STEM education?

TASK:  Identify activities that “broadly disseminate results to enhance scientific and technological understanding”  Pay special attention to activities that will help transport the approach to other sites  Long Exercise min ◦ Think individually ~2 min ◦ Share with a partner ~2 min ◦ Report in local group ---- ~2 min  Watch time and reconvene after 6 min  Use THINK time to think – no discussion  Selected local facilitators report to virtual group 61

One Minute

 Dissemination to the general public
◦ Applies to research and education development proposals
◦ See handout
 Dissemination to peers (other instructors)
◦ Education projects should include strategies for:
▪ Making other instructors aware of material and methods
▪ Enabling other instructors to use material and methods

 Partner with museums, nature centers, science centers, and similar institutions to develop exhibits in science, math, and engineering
 Involve the public or industry, where possible, in research and education activities
 Give science and engineering presentations to the broader community (e.g., at museums and libraries, on radio shows, and in other such venues)
 Make data available in a timely manner by means of databases, digital libraries, or other venues such as CD-ROMs

 Publish in diverse media (e.g., non-technical literature, websites, CD-ROMs, press kits) to reach broad audiences
 Present research and education results in formats useful to policy-makers, members of Congress, industry, and broad audiences
 Participate in multi- and interdisciplinary conferences, workshops, and research activities
 Integrate research with education activities in order to communicate in a broader context

 Standard approaches
◦ Post material on a website
◦ Present papers at conferences
◦ Publish journal articles
 Consider other approaches
◦ NSDL
◦ Specialty websites and list servers (e.g., Connexions)
◦ Targeting and involving a specific sub-population
◦ Commercialization of products
◦ Beta test sites
 Focus on active rather than passive approaches

Questions: Hold up your "virtual hand" to ask a question.

Reviewing a Project's Broader Impacts

 Review the Project Summary & the excerpts from the Project Description
 Assume the proposal is a TUES Type 1 with a $200K budget and a 3-year duration, and that the technical merit is considered to be meritorious
 Write the broader impacts section of a review
◦ Identify strengths and weaknesses
◦ Use a bullet format
 (Extra) Long Exercise ~8 min
◦ Think individually ~4 min
◦ Share with a partner ~2 min
◦ Report in local group ~2 min
 Watch time and reconvene after 8 min
 Use THINK time to think – no discussion
 Selected local facilitators report to virtual group

One Minute

 Scope of activities
◦ Overall: very inclusive and good
◦ Well done, but "standard things"
◦ Did not address the issue of quality
◦ No clear-cut plan
◦ Activities not justified by research base
 Dissemination
◦ Limited to standard channels
◦ Perfunctory

 Industrial advisory committee a strength
 Collaboration with other higher education institutions
◦ Institutions appear to be quite diverse but use of diversity not explicit
◦ Interactions not clearly explained
◦ Sends mixed message – raises questions about effectiveness of partnership
 High school outreach
◦ Real commitment not evident
◦ Passive -- not proactive
◦ High school counselors and teachers not involved

 Modules are versatile
 Broader (societal) benefits
◦ Need for materials not well described
◦ Value of the product not explained
◦ Not clear who will benefit and how much
 Assessment of broader impacts not addressed

TASK:
◦ Identify desirable features of a broader impacts plan or strategy
▪ General aspects or characteristics
 Long Exercise ~6 min
◦ Think individually ~2 min
◦ Share with a partner ~2 min
◦ Report in local group ~2 min
 Watch time and reconvene after 6 min
 Use THINK time to think – no discussion
 Selected local facilitators report to virtual group

One Minute

 Include a strategy to achieve impact
◦ Have a well-defined set of expected outcomes
◦ Make results meaningful and valuable
◦ Make consistent with technical project tasks
◦ Have a detailed plan for activities
◦ Provide a rationale to justify activities
◦ Include evaluation of impacts
◦ Have a well-stated relationship to the audience or audiences

WRAP-UP

 Use and build on NSF suggestions
◦ List of categories in solicitations
◦ Representative activities on website
▪ Not a comprehensive checklist
▪ Expand on these -- be creative
 Develop activities to show impact
 Integrate and align with other project activities

 Help reviewers (and NSF program officers)
◦ Provide sufficient detail
▪ Include goals, objectives, strategy, evaluation
◦ Make broader impacts obvious
▪ Easy to find
▪ Easy to relate to NSF criterion

 Make broader impacts credible
◦ Realistic and believable
▪ Include appropriate funds in budget
◦ Make broader impacts consistent with
▪ Project's scope and objectives
▪ Institution's or College's mission and culture
▪ PI's interest and experience
 Assure agreement between content of Project Summary and Project Description

Identify the most interesting, important, or surprising ideas you encountered in the workshop on dealing with broader impacts

 Grant Proposal Guide
 Proposal & Award Policies & Procedures Guide
 Broader Impacts Activities

Questions: Hold up your "virtual hand" to ask a question.

 To download a copy of the presentation, go to:
 Please complete the assessment survey; go to: