
1 SRT Review Panel Charge Discipline Scientists for Geospace Science NASA Headquarters

2 Introductions
NASA Headquarters staff
–NASA Discipline Scientists are here to answer questions dealing with review content and procedure
NASA Peer Review Services (NPRS)
–Susan leads the support effort and is in charge of capturing the reviews in the system and of other support services
–See Monique for questions about the computers, projectors, etc.
–See Ellen for any questions on travel/reimbursements
–The NPRS support suite is located in the Crystal VII Room
You can check out the NPRS-provided laptops tonight to work on your reviews

3 Why are we all here?
NRA 03-OSS-01 solicited proposals for funding to begin in Fiscal Year 2004 for the Sun-Earth Connection's SR&T and LCAS programs
The NRA describing this opportunity resulted in the submission of 92 proposals
–The 92 proposals requested a total of $8.8M
–~$1.6M is available
–The average first-year request is $96K
–Requests are ~5 times the available funding
–~20% of the SR&T proposals can be funded
Broad-based community input (panel and mail-in) is the primary basis for selecting the proposals of highest merit

4 Review Panel Goals
Select those proposals of the highest scientific and technical merit that are consistent with the SRT Program described in the ROSS
–Scientific impact
–Technical feasibility
Scientific impact includes evidence for a significant and tangible advance toward the Sun-Earth Connection (SEC) Division goals:
–Understand the response of magnetospheres and atmospheres to external and internal drivers
–Discover how magnetic fields are created and evolve and how charged particles are accelerated
–Understand coupling across multiple scale lengths and its generality in plasma systems

5 Selection of Investigations
This panel is composed of unbiased, objective, and knowledgeable representatives of the Geospace portion of the Sun-Earth Connection Division peer science community
The consensus evaluation form you generate in the next couple of days is the first and most important step in the proposal selection process

6 Proposal Selection Process

7 Philosophy
–It is the PI's privilege to choose the question to address
–It is the PI's obligation to explain the scientific impact of the question, along with sufficient details of an appropriate methodology

8 Conflict of Interest
NASA takes conflict of interest seriously:
–No panel member is PI or Co-I on any proposal competing for funds in this competition
–Panelists from the same institution as the PI or a Co-I on any proposal will not be in the room while that proposal is being discussed by the panel
–Panelists should identify other conflicting relationships with PIs or Co-Is (e.g., former advisor-student)
–The executive secretary is responsible for noting conflicts before discussion of a given proposal begins and for keeping an official log of conflicts

9 Confidentiality
NASA takes the confidentiality of this process very seriously:
–NASA holds reviewer identity and panel deliberations in strictest confidence
Panelists should also preserve the confidentiality of the process:
–Avoid statements in a review that could identify you
–Leave the material associated with the review behind
–Details of the review should not be discussed with anyone outside of the process

10 Consensus Review Form (DIFFERENT FROM THE WEB FORM)
Proposal Summary: in your own words, a concise statement of
–The science question being addressed
–The methodology used to answer the science question
Proposal Evaluation: your evaluation of
1. Scientific impact: How important is the question?
2. Technical feasibility: Is the method clear, and will it solve the problem?
3. Closure: Will it make a substantial contribution toward the resolution of the question?
** Capture relevant material from the mail-in reviews
SCIENTIFIC and TECHNICAL STRENGTHS
–Bulleted encapsulations (1 or 2 sentences per bullet) of the main strengths discussed in the Proposal Evaluation section above (importance, methodology, closure, etc.)

11 Consensus Review Form (continued)
SCIENTIFIC and TECHNICAL WEAKNESSES
–Bulleted encapsulations (1 or 2 sentences per bullet) of the main weaknesses discussed in the Proposal Evaluation section above (lack of importance, defects in methodology, lack of closure, etc.)
Rationale for rating
–A few sentences summarizing the importance of the question, the feasibility of the method, and the most important strengths or weaknesses
Example rationale: This proposal addresses several compelling questions about the downward auroral current region. The answer to any of these would represent a major step toward a better understanding of magnetosphere-ionosphere interaction. The methodology is sound and robust, since it relies on data of proven high quality and completeness. The proposal is therefore rated EXCELLENT.

12 Consensus Review Form (continued)
OVERALL GRADE: Excellent / Very Good / Good / Poor (mark one with an X)
Additional Comments
This section is for noting programmatic information that may be important to NASA but is separate from the assessment of scientific merit. Examples:
–The proposal seeks financial support for two collaborators who are not named on the proposal cover sheet
–The budget includes $14,000 per year for computer equipment and support that is not well justified in the text
–The PI's current and pending support did not identify the sources of the funding
–The proposal was poorly organized and appeared to be thrown together at the last moment
–The proposal would be better suited to the LWS program

13 The Process
The reviews will be created and submitted on a file server. There is a folder on the server for each proposal. The folder will have:
1. A compiled file containing all of the reviews received up to yesterday
2. A blank Consensus Review Form
3. An empty file for taking notes, if you wish
Cut and paste or type new text into the appropriate place in the consensus form.
Add a new version number (-v1, -v2, -v3, etc.) to the file name each time you modify a review file.
Just before we break Wednesday night, print the latest version and change the version number to -Wed.
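The "-v1, -v2, -v3" renaming convention described above can be sketched in code. This is an illustrative Python snippet only, not part of the panel's tooling, and the file names are hypothetical:

```python
import re
from pathlib import Path

def next_version(path: Path) -> Path:
    """Return the file name with the version suffix bumped (-v1 -> -v2, etc.)."""
    m = re.search(r"-v(\d+)$", path.stem)
    if m:
        new_stem = path.stem[: m.start()] + f"-v{int(m.group(1)) + 1}"
    else:
        # No version suffix yet: this is the first modification.
        new_stem = path.stem + "-v1"
    return path.with_name(new_stem + path.suffix)

# Hypothetical file names; the real consensus forms live on the panel file server.
print(next_version(Path("consensus-form.doc")))     # consensus-form-v1.doc
print(next_version(Path("consensus-form-v2.doc")))  # consensus-form-v3.doc
```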

14 Mail-In Reviews
The Consensus Review you produce is that of the panel, taking into account relevant information from the mail-in reviews.
–4 mail-in reviews have been solicited for each proposal
–Some mail-ins may be conflicted: it is not possible to select enough knowledgeable reviewers by choosing only from people who are not associated with these proposals (~350 collaborators: PIs, Co-Is, etc.)
–The mail-in reviewers are identified on their reviews
–The most important information on the mail-in reviews is the strengths and weaknesses
–The average grade on mail-in reviews is generally higher than the final panel grade

15 Recycled Proposals
3 of the 37 proposals appear similar to proposals we reviewed last year. Presenting reviewers will get a copy of the previous proposal and panel review.
–The previous review should be treated as additional information, similar to a mail-in review

16 Rating Definitions
EXCELLENT
–Addresses compelling or fundamental scientific questions that are consistent with the goals and objectives of the Sun-Earth Connection Division
–AND uses a well-defined, feasible, and appropriate methodology that is likely to produce substantial progress toward the achievement of the goals
VERY GOOD
–Addresses relevant and fundamental scientific questions, BUT there are important questions about the methodology, OR
–Has an appropriate and sound methodology and addresses important though not compelling questions
GOOD
–Addresses unclear or peripheral scientific questions, OR does not have clear methodologies, even though the results may be of some interest
POOR
–Proposals that are seriously flawed

17 Funding Priorities
As a rule of thumb:
–EXCELLENT proposals should be funded NOW
–VERY GOOD proposals should be funded IF possible
–GOOD proposals should be funded IF unlimited funds were available
–POOR proposals should NOT be funded, even if unlimited funding were available

18 ITM Funding
Mail-in reviews indicate:
–8% Excellent
–36% E/VG
–31% VG
–22% VG/G
–3% Good
FY 2004 funding is ~$1.6M; 92 proposals with an average proposal cost of ~$96K => ~20% selection => the competitive range is E and E/VG
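The ~20% selection rate follows directly from the numbers quoted on the slides (92 proposals, ~$96K average first-year request, ~$1.6M available). A quick sanity check of that arithmetic:

```python
# Figures taken from the slides (NRA 03-OSS-01, FY 2004).
available = 1.6e6      # ~$1.6M in available funding
n_proposals = 92
avg_request = 96e3     # ~$96K average first-year request

total_requested = n_proposals * avg_request
oversubscription = total_requested / available
selection_rate = available / total_requested

print(f"total requested: ${total_requested / 1e6:.1f}M")  # ~$8.8M
print(f"oversubscription: ~{oversubscription:.1f}x")      # ~5.5x, i.e. roughly 5 times
print(f"selection rate: ~{selection_rate:.0%}")           # ~18%, i.e. roughly 20%
```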

19 HINTS for Effective Reviews
–Use one-on-one conference time to identify disagreements and issues rather than to resolve them
–Don't worry about the rating of the proposal on the first cut
–Your audiences are (1) PIs, (2) Discipline Scientists, and (3) the SEC Division Director
–Ask for and volunteer help on proposals where extra expertise is needed
–Remember that you are refereeing a proposal, not a paper
–We need a clear statement of the most significant strengths and weaknesses rather than a list of quibbles
–Include examples to illustrate general judgments
–Require sufficient explanation in the proposal that you are comfortable that the PI has considered possible pitfalls or weaknesses

20 Hints (continued)
In enumerating STRENGTHS:
–Comment on the relative importance of the science goal and its relevance to SEC
–Give a succinct statement, in your own words, of the expected result
–Address positive aspects of the method
–Note the likelihood of substantial progress
In discussing WEAKNESSES:
–Distinguish between major and minor weaknesses
–Point out the lack of a clear science question or of relevance to SEC
–Comment when it is not clear what science progress might be expected
–Specify the impact of the weakness on the anticipated result
–Comment on lack of uniqueness of the method
–Point out obstacles to closure
–Comment when a technical description is inadequate

21 Hints (continued)
–Document as clearly, precisely, and in as much detail as possible
–Remember that the text in the consensus review is the only admissible evidence of your effort
–Let the proposal speak for itself, e.g., "The proposal (not the PI) failed to demonstrate..." or "The proposal does not contain sufficient description..."
–The Supporting panelist should act as a secretary and document the panel's comments while the Presenting panelist leads the discussion
–The Supporting panelist can take notes directly in the consensus review form file during the presentation, in a blank file in the same folder, or the old-fashioned way (on paper)
–Stop and write often; memories fade fast

22 Things to Avoid
–Reading between the lines: your judgment of the scientific merit should be based on the material in the proposal, not on your faith in the PI
–Referring to competing proposals in the main body of the review; put that on the back page
–Using the first person in the text, e.g., "I could not figure out what this proposal is all about." Rather, write "The proposal was not clear in its goals."

23 Tentative Agenda
Conferencing: 3-4 hours (7-8 proposals @ 1/2 hour per proposal)
Presentations: 15 minutes on first reading, 5 on second reading
1) Tue 8:30: Welcome/Panel Instructions
2) Tue 9:30: Presenting and Supporting panelists confer
3) Tue 11:30: Present first few proposals
4) Tue 1:00: Presenting and Supporting panelists confer again
5) Tue 3:00: Begin 1st round of presentations to panel
6) Tue 7:00: Break for the day
7) Wed 3:00: Begin 2nd round of presentations
8) Wed 7:00: Break for the day
9) Thu 1:00: Discuss review process in the bar

24 Reasonableness of Cost
–Are the costs reasonable for the quality and amount of research?
–If costs are out of line with the work proposed, note this on the Other Factors (last) page
–We need specific comments on 3 proposals that are significantly higher in cost than average (more than one sigma above average)
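The one-sigma screen above is straightforward to compute. The sketch below uses made-up first-year costs (the real figures come from the 92 proposal budgets) and flags any proposal more than one standard deviation above the mean:

```python
import statistics

# Hypothetical first-year costs in $K; real values come from the proposal budgets.
costs = {"P-01": 96, "P-02": 88, "P-03": 150, "P-04": 92, "P-05": 101, "P-06": 210}

mean = statistics.mean(costs.values())
sigma = statistics.stdev(costs.values())  # sample standard deviation

# Proposals more than one sigma above the average need specific cost comments.
flagged = [pid for pid, cost in costs.items() if cost > mean + sigma]
print(flagged)  # ['P-06']
```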

