The NSF Course, Curriculum, and Laboratory Improvement (CCLI) Program
"Cutting Edge" Early Career Workshop, The College of William and Mary, June 2008
Jill Singer
Division of Undergraduate Education
Directorate for Education & Human Resources
National Science Foundation
Email: jksinger@nsf.gov
Outline of Talk
Programs in DUE
The CCLI Program
Advice and Resources
What Happens to Your Proposal?
Questions
NSF web site (www.nsf.gov)
Directorate for Education & Human Resources (EHR)
Course, Curriculum, and Laboratory Improvement (CCLI)
DUE's broadest, most innovative program
Purpose of the program: to improve the quality of STEM education for all students by targeting activities affecting learning environments, course content, curricula, and educational practices
Supports projects at all levels of undergraduate education
Supports activities in classroom, laboratory, and field settings
CCLI: Three Scales of Projects
Phase 1 projects (small grants): up to $150,000 ($200,000 when 4-year and 2-year schools collaborate); 1 to 3 years; can occur at a single institution with primarily local impact
Phase 2 projects (medium grants): up to $500,000; 2 to 4 years; build on smaller-scale proven ideas; diverse users at several institutions
Phase 3 projects (large grants): up to $2,000,000; 3 to 5 years; combine proven results and mature products; involve several diverse institutions
Current CCLI program solicitation: NSF 07-543
New program announcement for 2008-09
CCLI “Cycle of Innovation”: project components
Creating New Learning Materials and Teaching Strategies
Implementing Educational Innovations
Developing Faculty Expertise
Assessing Learning and Evaluating Innovations
Research on Undergraduate STEM Teaching and Learning
CCLI - Creating New Learning Materials and Teaching Strategies
Phase 1 projects can focus on piloting new educational materials and instructional methodologies; Phase 2 projects on larger-scale development, broad testing, and assessment
Similar to the old “proof-of-concept” and “full development” CCLI-EMD projects, respectively
Phase 1 projects can focus on outcomes at a single site, but must include assessment and community engagement
Can be combined with other components, especially faculty development in Phase 2
CCLI - Developing Faculty Expertise
Methods that enable faculty to gain expertise
May range from short-term workshops to sustained activities
Foster new communities of scientists in undergraduate education
Cost-effective professional development for a diverse group of faculty, leading to implementation
May be combined with other components, especially materials development and assessment
Excellent opportunities exist for you to participate in regional and national workshops
CCLI - Implementing Educational Innovations
Approximately equivalent to the old CCLI-A&I track (generally Phase 1 projects)
Projects must result in improved STEM education at the local institution by implementing exemplary materials, laboratory experiences, or educational practices developed and tested at other institutions
CCLI implementation projects should stand as models for broader adaptation in the community
Proposals may request funds in any budget category supported by NSF, including instrumentation
CCLI - Assessing Learning and Evaluating Innovations
Design and test new assessment and evaluation tools and processes
Apply new and existing tools to conduct broad-based assessments
Must span multiple projects and be of general interest
CCLI - Conducting Research on STEM Teaching and Learning
Develop new research on teaching and learning
Synthesize previous results and theories
Practical focus: testable new ideas with impact on STEM educational practices
May be combined with other components
Ways CCLI Can Support Undergraduate Research (UGR) Activities
Acquisition of research-quality equipment and its integration into undergraduate courses: labs can be constructed that integrate advanced equipment, prepare students for research, and draw on faculty research expertise
Incorporation of inquiry-based projects into laboratory courses
Partnerships with local research and informal education institutions
Service learning can provide relevant problems while addressing the needs of the local community
Examples of CCLI Projects with UGR
PI: Jeanette Jerz, DePauw University. “Enhancing Student Understanding of Environmental Systems with Ion Chromatography”, NSF #0311211
PI: David Gonzales, Fort Lewis College. “Enhancing Science Education and Undergraduate Research Through Geochemical Studies Using ICP-OES”, NSF #0310902
PI: Jeff Ryan, University of South Florida. “Preparing Undergraduates for Research: Examining the Use of Remote Instrumentation in Earth and Planetary Science Classrooms”, NSF #0633077
Human Subjects and the IRB (Institutional Review Board)
Projects collecting data from or on students or faculty members are considered to involve human subjects and require IRB review
The proposal should indicate IRB status on the cover page: Exempt, Approved, or Pending
Grants will require an official statement from the IRB (not from the PI) declaring the research exempt or approved
See the “Human Subjects” section in the GPG
NOTE: For CCLI, IRB approval usually is obtained during award negotiations
Important Features of Successful CCLI Projects
Quality, Relevance, and Impact
Student Focus
Use of and Contribution to the STEM Education Knowledge Base
STEM Education Community-Building
Expected Measurable Outcomes
Project Evaluation
Quality, Relevance, and Impact
Innovative: state-of-the-art products, processes, and ideas; the latest technology in laboratories and classrooms
Broad implications for STEM education, even for projects that involve a local implementation
Advance knowledge and understanding within the discipline and within STEM education in general
Student Focus
Focus on student learning
Project activities linked to STEM learning
Consistent with the nature of today’s students
Reflect the students’ perspective
Student input in the design of the project
STEM Education Knowledge Base
Reflect high-quality science, technology, engineering, and mathematics
Rationale and methods derived from the existing STEM education knowledge base
An effective approach for adding the results to the knowledge base
Community-Building
Include interactions with:
  Investigators working on similar or related approaches in the PI’s discipline and others
  Experts in evaluation, educational psychology, or other similar fields
Benefit from the knowledge and experience of others
Engage experts in the development and evaluation of the educational innovation
Expected Measurable Outcomes
Goals and objectives translated into expected measurable outcomes
Project specific
Some expected measurable outcomes on:
  Student learning
  Contributions to the knowledge base
  Community building
Used to monitor progress, guide the project, and evaluate its ultimate impact
Project Evaluation
Include strategies for:
  Monitoring the project as it evolves
  Evaluating the project’s effectiveness when completed
Based on the project-specific expected measurable outcomes
Appropriate for the scope of the project
Lessons From Prior Rounds of the Program
Phase 1 is an open competition with many new players; Phase 2 requires substantial demonstrated preliminary work; Phase 3 is for projects of national scope from an experienced team
The program for 2009 may include minor to major changes: read the solicitation!
Funding and Deadlines
Expect to fund, across all disciplines:
  130 Phase 1 projects
  45 Phase 2 projects
  4-6 Phase 3 projects
Proposal deadlines:
  Phase 2 and Phase 3: January 2009
  Phase 1: May 2009
Note: the solicitation is still being prepared and the above dates are subject to change
Resources for Models and Examples
Journal of Geoscience Education
CUR “Quarterly”
“Cutting Edge” Workshops (a CCLI Phase 3 project)
NSF Award Search (http://nsf.gov/awardsearch/): search by program or key word(s)
Program pages often include a link to recent awards (abstracts)
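For pulling many award abstracts at once rather than paging through the interactive search form, the sketch below queries NSF's public awards web service. It is a minimal example under stated assumptions: the api.nsf.gov endpoint, the keyword/printFields parameters, and the response field names come from the later public Awards API and are not part of the talk itself. Award numbers cited earlier (e.g., 0633077) can be retrieved the same way by passing an id parameter instead of a keyword.

```python
"""Minimal sketch: keyword search against NSF's public awards web service.

Assumes the api.nsf.gov/services/v1/awards.json endpoint, its query
parameters, and the response field names; the talk only references the
interactive search page at nsf.gov/awardsearch.
"""
import json
import urllib.parse
import urllib.request

BASE_URL = "https://api.nsf.gov/services/v1/awards.json"  # assumed endpoint


def search_awards(keyword: str, fields: str = "id,title,piLastName,awardeeName"):
    """Return a list of award records matching a free-text keyword."""
    params = urllib.parse.urlencode({"keyword": keyword, "printFields": fields})
    with urllib.request.urlopen(f"{BASE_URL}?{params}") as resp:
        data = json.load(resp)
    # Awards are returned under response -> award (assumed structure).
    return data.get("response", {}).get("award", [])


if __name__ == "__main__":
    # Example: look for undergraduate-research-oriented CCLI awards.
    for award in search_awards("CCLI undergraduate research")[:10]:
        print(award.get("id"), award.get("title"))
```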
Merit Review Criteria: Intellectual Merit of the Proposed Activity
How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields?
How well qualified is the proposer to conduct the project?
How well conceived and organized is the proposed activity?
Is there sufficient access to resources?
Merit Review Criteria: Broader Impacts of the Proposed Activity
How well does the proposed activity advance discovery and understanding while promoting teaching, training, and learning?
How well does the proposed activity broaden the participation of underrepresented groups?
To what extent will it enhance the infrastructure for research and education?
Will the results be disseminated broadly to enhance scientific and technological understanding?
What may be the benefits of the proposed activity to society?
Writing a Proposal: Getting Started
Start EARLY
Get acquainted with FastLane
Read the program solicitation and follow the guidelines
Learn about recent DUE awards using PIRS
Become an NSF reviewer
Contact a program officer (e-mail is best) to discuss your idea; this may cause you to refine the idea and may keep you from applying to the wrong program
Program officers in DUE: check the solicitations
Formatting, FastLane, and Grants.gov
NSF proposal format requirements:
  15 single-spaced pages
  Specified type fonts required
  Intellectual Merit and Broader Impacts explicit in the Project Summary
FastLane submission:
  Web-based software; access from any browser
  Mature, well-supported system for NSF
  Accepts many file types and converts them to .pdf
Grants.gov:
  Stand-alone software downloaded to a local computer
  May eventually be used for any Federal agency
  Still under development and does not support all NSF processes (for example, collaborative proposals)
  Accepts only .pdf files
What Happens to Your Proposal?
Submission of the proposal via FastLane
Proposals are reviewed by mail and/or panels of faculty within the discipline(s)
A minimum of three persons outside NSF review each proposal (DUE primarily uses panels)
For proposals reviewed by a panel, individual reviews and a panel summary are prepared for each proposal
An NSF program staff member attends the panel discussion
The program officer assigned to manage the proposal’s review considers the advice of reviewers and formulates a recommendation
Negotiations may be necessary to address reviewers’ comments, budget issues, and other concerns
What Happens to Your Proposal? (cont.)
NSF is striving to tell applicants whether their proposals have been declined or recommended for funding within six months
Verbatim copies of reviews, not including the identities of the reviewers, are provided to the PI
Proposals recommended for funding are forwarded to the Division of Grants and Agreements for review
Only Grants and Agreements Officers may make awards
Notification of the award is made to the submitting organization by a DGA Officer
How to Really Learn About Programs and Process
Become a reviewer for the proposals submitted to the program:
  Give us a business card
  Send e-mail to the lead or disciplinary program officer
  Your name will be added to the database of potential reviewers
We want to use many new reviewers each year, especially for Phase 1