1 MARC Technical Assistance Workshop MORE Division, NIGMS February 13, 2009 NIH, DHHS

2 MARC-U*STAR: Introduction & Case Studies

3 MARC-U*STAR is an INSTITUTIONAL research training program that provides an opportunity to develop the research and academic skills of students and the training capabilities of the grantee institution. The program emphasizes:
- Institutional impact/improvement
- Curricular reform (quantitative sciences)
- Activities that increase the development of students in preparation for research careers
- Summer research internships at research-intensive institutions

4 What will be the delta or difference at your institution with a MARC-U*STAR program?

5 Case Study Instructions
Read Case Study 1 & discuss in groups for 20 minutes. List: Strengths, Weaknesses, Gaps, Recommendations
General discussion of Case Study 1
Read Case Study 2 & discuss in groups for 20 minutes. List: Strengths, Weaknesses, Gaps, Recommendations
General discussion of Case Study 2

6 Case Study I Whowahr U (WU) is a moderate-sized liberal arts college with a student population that is 85% underrepresented minority (43% African American, 40% Hispanic, 2% Native American, 10% Asian and 5% Caucasian). WU enrolls 1100 students who express an interest in the sciences (Chemistry, Biology, Physics and Computer Science). The number of BA degrees awarded by the combined departments has averaged 95 per year for the past five years. On average, 2 alumni per year graduate with a PhD from prestigious universities around the country. Under a new Dean of Science, who is the PI of the application, the college plans to improve its reputation as a scholarly institution. Faculty will be judged on their scholarly productivity as well as their teaching. The faculty are interested in research, but their current teaching workload limits their time for research. The Dean would like the college to become more selective, have a better graduation rate, and send more students on to postgraduate training. WU proposes to use the MARC U*STAR program to motivate student interest and preparation for research careers. Having a MARC program will provide financial support that will help the college recruit stronger students.

7 Case Study I (continued) The specific objectives of the proposed MARC U*STAR program are to:
- Support 6 junior and 6 senior honors students with stipends and a strong research experience with college faculty who are outstanding mentors
- Provide the students with mentored research experiences at intramural and extramural sites
- Provide an enriched curriculum with an interdisciplinary strength in the neurosciences, with special MARC courses developed for MARC trainees
- Expose the MARC trainees to role models and provide career guidance through a seminar series
- Have at least 33% of the MARC graduates enroll in graduate school and go on to receive a PhD

8 Group Discussion of Case Study I Strengths:
- Diverse student body/URM pool
- Large pool of science students
- Administrative support - Dean as PI
- Career guidance for MARC students
- Some measurable objectives
- Some prior success/PhD track record
- Enriched curriculum
- Meets mission of MARC program, including students getting research experiences
- Faculty judged on scholarly activities

9 Group Discussion of Case Study I Weaknesses:
- Using MARC as financial support to help the institution recruit 'better' students; the Dean wants to recruit 'better' students for improved graduation rates (instead of focusing on improvement for existing students)
- No institutional impact/improvement demonstrated
- What is called "Objectives" is not - these are really activities
- Some objectives are not measurable
- No evidence of administrative support
- Tone is institution-centered instead of student-centered (addressing needs of the institution instead of students)
- Faculty teaching load limits time for research/research training, although academic-year/intramural research is proposed for students
- Low proposed outcomes - only 33% to go on to PhD programs - questionable. Is this an improvement over baseline? Not clear.
- No goal stated; abstract is not logical, no overall BIG PICTURE
- No data on quality of student pool
- Numbers listed in abstract are not informative (no baseline; "from X to Y in Z amount of time" not stated)
- Given the past PhD track record at the institution, the number of 12 MARC slots seems too high
- Increasing graduation rates of current students is not the focus
- No mention of the required summer research training experience

10 Group Discussion of Case Study I Gaps:
- No baseline data in proper context
- No developmental training plan
Recommendation: NOT RECOMMENDED FOR FUNDING

11 Case Study II Whatsa Matter University (WMU) is a major research institution offering both undergraduate and graduate programs. WMU enrolls close to 15,000 students. Its student profile is 15% Hispanic, 13% African American, 10% Hawaiian and/or Samoan, 20% Asian American, 40% Caucasian non-Hispanic, and 2% foreign students with visas. The university student body includes 1% students with disabilities, 65% female and 35% male. The academic departments in science include the traditional science departments (biology, chemistry and physics) and social behavioral sciences (psychology and anthropology) as well as a school of engineering, a law school, a school of medicine, and a school of public health. The faculty in these areas are funded by several government agencies and foundations such as the National Science Foundation (NSF), Kellogg, and the Howard Hughes Medical Institute (HHMI). The institution's portfolio of student development programs includes: STEP, Louis Stokes Alliance for Minority Participation, IGERT and GK12 Programs from the NSF, undergraduate and graduate HHMI programs, a Bridges to the Doctorate Program, an IMSD Program from MORE/NIGMS, and five T32 research training grants from NIGMS.

12 Case Study II (continued) WMU students also have the opportunity to compete for EPA, NASA and USDA fellowships on a regular basis. Most of the researchers who have R01-type funding support students as research associates on their grants. Whatsa Matter U science students complete the undergraduate degree in approximately five years. WMU annually graduates approximately 1,800 students, of which 5% are science undergraduate majors and 8% are science doctorates. WMU graduates 2% underrepresented students at the undergraduate level, and 1% of them pursue or are pursuing PhD degrees in science fields. At the graduate level, Whatsa Matter U awards 0.5% of doctorates in science to minorities and foreign students.

13 Group Discussion of Case Study II Strengths:
- Good research environment (faculty with R01 grants); students may have opportunity to engage in research
- Have data
- Institutional commitment can be inferred from other URM student support programs
- Potential student numbers from different schools
- Diverse student body/URM pool
Weaknesses:
- Many student development programs, but not good outcomes in sending URMs to PhD programs
- Don't know baseline
- Low % of URM students going into PhD programs
- No plan and no MGOs (Measurable Goals and Objectives)
- Low retention and low graduation rates
- The word "MARC" does not appear in the abstract - questionable
- Focus on graduate training and not undergraduate

14 Group Discussion of Case Study II Gaps:
- Need to show synergy with other existing student development programs
- Need to specifically state what MARC is going to do for the institution
- If the demonstrated baseline is "0," it is hard to show a gap
Recommendation: NOT RECOMMENDED FOR FUNDING

15 MARC’s “10 MUST Haves” 1. Institutional Setting 2. Institutional Past Training Record 3. Institutional Commitment 4. Program Director 5. Research Training Environment 6. Recruitment & Student Development Plan 7. Skills Development Pre-MARC 8. Skills Development MARC 9. Responsible Conduct of Research Training 10. Evaluation and Tracking

16 1. Institutional Setting (Baseline Data)
- # URMs in science departments
- # of honors URMs
- # of junior/senior honors URMs
- # URMs graduating per year
- # URMs enrolled in PhD or MD/PhD
- # URMs enrolled in MD/other professional
- # URMs enrolled in postbacc
ALL URMs in Participating Departments
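As a rough illustration, the baseline counts listed above could be tallied from institutional records with a short script. The record fields used here (`urm`, `honors`, `year`, `post_grad`) are hypothetical, chosen for the sketch, and are not prescribed by the MARC guidelines:

```python
# Hypothetical student records; field names are illustrative only.
students = [
    {"name": "A", "urm": True,  "dept": "Biology",   "honors": True,  "year": "junior", "post_grad": "PhD"},
    {"name": "B", "urm": True,  "dept": "Chemistry", "honors": False, "year": "senior", "post_grad": "MD"},
    {"name": "C", "urm": False, "dept": "Physics",   "honors": True,  "year": "senior", "post_grad": None},
]

# Restrict to URM students in the participating science departments.
urm = [s for s in students if s["urm"]]

# Tally the baseline categories from slide 16.
baseline = {
    "urm_in_science": len(urm),
    "urm_honors": sum(1 for s in urm if s["honors"]),
    "urm_jr_sr_honors": sum(1 for s in urm if s["honors"] and s["year"] in ("junior", "senior")),
    "urm_phd_or_mdphd": sum(1 for s in urm if s["post_grad"] in ("PhD", "MD/PhD")),
    "urm_md_or_professional": sum(1 for s in urm if s["post_grad"] == "MD"),
}
print(baseline)
```

The same dictionary of counts, computed per year, gives the "# URMs graduating per year" trend reviewers look for.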

17 2. Institutional Past Training Record
MARC applicants must describe the past 5-year record of the institution in sending URMs to science PhD programs.
Competing renewal MARC applicants must also describe MARC alumni outcomes (suggested table in application).

18 Sample Table for Renewal Applications
MARC PROGRAM CUMULATIVE REPORT - Outcomes
Columns: Progress Report (YR1, YR2, YR3, YR4, YR5) | Current Non-Competing Continuation Reporting Period (200X-200Y)
Rows:
- Number of trainee slots awarded
- Number of trainees appointed: Junior
- Number of trainees appointed: Senior
- Number of trainees graduating with BS or BA
- Number of trainees enrolled in Ph.D. programs
- Number of trainees enrolled in MD/Ph.D. programs
- Number of trainees enrolled in MD programs
- Number of trainees enrolled in MS programs
- Number of trainees in post-bacc programs
- Number of trainees in other professional degree programs
- Number of trainees in teaching positions
- Number of publications by MARC trainees

19 3. Institutional Commitment Documented commitment to proposed research training program’s goals and assurance that the institution intends the MARC program to be an integral part of its research and research training endeavor (financial or otherwise)

20 4. Program Director
- Must be a full-time faculty member or administrator
- Possesses scientific background, leadership, research training experience and administrative capabilities
- Responsible for overall direction, management, administration, and evaluation of the program

21 5. Research Training Environment
- Extramural research - summer requirement at a T32 or like institution required
- Intramural research - if research-intensive (RI) environment with active research faculty, OR partnerships w/ nearby RI (T32) institutions and/or "Research Classroom" training

22 Community for Advanced Graduate Training (CAGT) https://cagt.nigms.nih.gov/ An online "matching service" exclusively for MARC-U*STAR students and NIGMS predoctoral T32 programs, for the MARC students' extramural summer research experience and/or graduate school (Ph.D.) training.

23 6. Recruitment & Student Dev. Plan
- Recruitment and development plan for enhancing the pool of potential trainees (prefreshmen, freshmen, & sophomores)
- A plan for trainee selection

24 7. Skills Development Pre-MARC
Program must develop the skills of pre-trainees (pre-freshmen, freshmen, sophomores) via workshops, etc.
8. Skills Development MARC Trainees
Program must develop the skills of MARC trainees:
- research
- critical thinking/problem solving
- communication
- career guidance

25 9. Responsible Conduct of Research (RCR)
Mandatory RCR training.
"NIH does not establish specific curricula or formal requirements, all programs are encouraged to consider instruction in the following areas: conflict of interest, responsible authorship, policies for handling misconduct, data management, data sharing, and policies regarding the use of human and animal subjects. Within the context of training in scientific integrity, it is also beneficial to discuss the relationship and the specific responsibilities of the institution and the graduate students or post-doctorates appointed to the program"

26 10. Evaluation and Tracking
- Evaluation should be for your institution, not for NIH
- Evaluation that's in line with measurable goals and objectives - did you get expected outcomes? If not, what would you change?
- Tracking - a 10-year tracking system to follow MARC trainee alumni
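A long-term alumni tracking system can be as simple as one record per trainee with dated status updates, which can then be rolled up into the outcome counts a renewal application reports. This is a minimal sketch with assumed field names, not an NIH-prescribed format:

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical alumni record; the schema is an assumption for this sketch.
@dataclass
class Alumnus:
    name: str
    cohort_year: int
    # year -> status, e.g. {2011: "PhD program"}; updated over the 10-year window
    updates: dict = field(default_factory=dict)

    def latest_status(self):
        """Most recent known status, or 'unknown' if never updated."""
        return self.updates[max(self.updates)] if self.updates else "unknown"

roster = [
    Alumnus("Trainee 1", 2009, {2010: "post-bacc", 2011: "PhD program"}),
    Alumnus("Trainee 2", 2009, {2010: "MS program"}),
]

# Outcome snapshot for the cumulative report: count alumni by latest status.
snapshot = Counter(a.latest_status() for a in roster)
print(snapshot)
```

The yearly `updates` history also answers the evaluation question "is current performance different from the past?" without any extra bookkeeping.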

27 Budget
No cap; 5-year grant award.
Allowable costs:
- Stipends (~$11K/yr/trainee), partial tuition & fees
- Trainee travel (meetings and summer research)
- Summer per diem ($931/mo. + travel)

28 Unallowable Costs
- Stipends to pre-trainees
- Funds to support more than the awarded number of trainees
- Recruitment activities
- Faculty research
- Faculty payment for mentoring
- Workshops for specific tests (GRE)

29 Training Related Expenses
- Activities to strengthen the pool: pre-freshmen, freshmen, & sophomores (e.g., curricular improvement)
- Costs for workshops for faculty development (pedagogical)
- Evaluation
- Workshops to improve student critical thinking skills

30 MARC-U*STAR: Grant Writing Tips

31 Preparation of a MARC-USTAR Training Grant Application

32 Program Expectations
- Increase in the baccalaureate retention rate as a result of pre-MARC training
- Increase in the graduation rate of URM students from MARC-supported schools
- Increase in the number of URM students, both from the program and the institution, who obtain BS degrees and enroll in Ph.D. programs (institutional impact)

33 PROGRAM EXPECTATIONS
- Increased academic preparation as a result of interdisciplinary instruction in the quantitative sciences to teach about biological phenomena
- Increased collaboration between MARC-supported institutions and research-intensive institutions
- Exposure of MARC trainees to research during the academic year

34 Need a Good Plan
Conduct a self-analysis and gather baseline data!!
- What are the institutional needs?
- What is your long-range goal?
- What are your specific goals and measurable objectives?
- What activities will help your institution achieve these objectives?

35 Needs Statement
The needs statement is the difference between what is and what should be, and what your program will do to close this gap.

36 Rationale
- Describe the problem or need
- Explain the program's long-range goal
- Identify institutional commitment
- Put the program in the context of institutional needs/program objectives (e.g., student retention, scholastic achievement/GPA/GRE scores, interest in research)
- Review relevant literature that underlies your plan

37 Important Steps in Preparing a Competitive Grant Application
- read the program announcement carefully
- Read the Program Announcement Carefully
- READ THE PROGRAM ANNOUNCEMENT CAREFULLY
- Read the correct program announcement (know the right program)
- Read the most current program announcement
- Read all of the instructions in the program announcement
- FOLLOW all of the instructions in the program announcement

38 Sequence of Proposal Topics for Reading
- Title Page and Abstract (Description)
- Specific goals and measurable objectives
- Institutional background and need
- Rationale for literature review
- Progress report (competing renewals)
- Administration of the program
- Plans to achieve objectives/activities
- Evaluation plan
- Budget

39 Sequence of Topics for Proposal Development
- Needs statement
- Rationale and literature review
- Specific measurable goals and objectives vis-a-vis current institutional productivity
- Plans to achieve measurable goals and objectives
- Evaluation plan
- Progress report
- Administration, budget, and biographical sketches
- Budget
- Description (Abstract)

40 Specific Goals & Measurable Objectives
- State the long-range goal of the program
- State each specific goal or measurable objective, and state how it is connected to the long-range goal
- Be brief and focused

41 Objectives Achieved Through Activities
Restate each objective and describe:
- The intervention activities to achieve each objective
- The anticipated impact of each activity
- Who will implement the plan
- Possible pitfalls and solutions
- Alternative approaches
- Timeline for interventions

42 Presentation of Data
- Present data in figures, graphs, tables, or text
- Place figures, tables, and graphs close to where they are referred to in the text
- Make all figures, tables, and graphs clearly legible
- Make a SINGLE point with each figure, graph, or table
- Avoid irrelevant information

43 Training Plan: Summary
- Make sure the long-range goal is clear
- Specific goals and measurable objectives are statements of end results. They are not a means to an end.
- The activities proposed are the means to achieve your specific goals and measurable objectives.

44 Training Plan: Summary
- Clearly explain the need
- Provide baseline data
- Explain the rationale for objectives and activities
- Cite literature to support your choices
- Consider alternative approaches or strategies

45 Training Plan: Summary
- Use plain English
- Be convincing
- List intellectual and physical resources
- Provide a timeline for implementation
- The proposal should flow logically from section to section, i.e., activities = logical extensions of specific goals/measurable objectives

46 Administration
Advisory committee: not required but highly recommended. Possible roles include program direction, selection of students and faculty mentors, and preparing the application. Define the role(s) of the committee clearly.
Trainers, mentors, & other key personnel: define the roles clearly and provide credentials.

47 Other Parts of the Application
- Title Page
- Description of the Proposal (Abstract)
- Budget
- Biographical Sketch of key personnel
- Institutional Resources

48 Description
- Write the narrative for the description last - it details and summarizes the objectives, rationale, the plan, and the anticipated outcomes
- It should be succinct and motivating - most often it is the first section to be read, and it is also the most often read section of a proposal

49 Budget
- Should never drive the proposal
- Justify all personnel with respect to effort and expertise
- Any equipment request must be congruent with the resource statement and must stem from the proposed activities
- Faculty mentoring is an unallowable cost
- JUSTIFY, JUSTIFY, JUSTIFY

50 Biographical Sketches
- Document credentials accurately
- Document aspects of training and expertise that are relevant to the application
- Include only relevant and full citations in the bibliography

51 Common Reasons for Failure
- Failure to use your resources wisely: if the institution has limited or no research capacity, don't propose to put students in research labs on your campus. Find alternatives.
- Lack of clear and well-defined measurable objectives: "having a seminar series" is not a measurable objective, and neither is "creating an atmosphere of science"
- Missing or inadequate baseline data: reviewers need to know your starting point and what will change as a result of your proposed program

52 MARC-U*STAR: Program Evaluation

53 Evaluation: What is it?
Program evaluations are individual, systematic studies that use objective measurement and analysis to answer specific questions about how well a program is working. - GAO/GGD Program Evaluation
Program evaluation and the tracking of students are not the same thing.

54 Program Evaluation Answers Questions Like…. Does it work? How well does it work? Does it do what we want it to? Does it work for the reasons we think it does? Is it cost effective? Are the benefits worth it? What are the unintended consequences?

55 Why bother? Supports continuous program improvement Increases understanding of the program – how are activities and strategies linked to results? Leads to improved planning and management Provides shared understanding of program

56 Guidelines for Conducting Successful Evaluations
- Invest heavily in planning early on
- Use knowledgeable, experienced evaluators (usually social scientists)
- Integrate evaluation into ongoing activities of the program

57 Typical Evaluations
Needs Assessment - What is the nature & extent of the issues the program should address? (Planning phase)
Process Evaluation - Is the program being conducted & producing output as planned? How can the process be improved?
Outcome Evaluation - To what extent have the program's goals been met?

58 Sample Study Questions
Needs Assessment:
- What problem is the program attempting to address?
- Whom does this program serve; to what extent are their needs met?
- What should be the documented goals of the program?
Process Evaluation:
- Is the program being implemented as planned? If not, why?
- How could the program's processes be improved?
- Has the program achieved recognized standards of performance?

59 Sample Study Questions
Outcome Evaluation:
- To what extent has the program achieved its goals?
- Is the current performance different from the past?
- Has the program been more successful than a comparable program?
- Which characteristics/activities are most related to success?
- What are the intended/unintended effects of the program?

60 Why should you care?
If you don't know where you're going, any road will take you there. - Lewis Carroll, Alice in Wonderland
(Illustration by Sir John Tenniel, Adelaide, 2004)

61 Key Steps in Evaluation 1. Engage stakeholders 2. Describe the program 3. Focus the evaluation design 4. Gather credible evidence 5. Justify conclusions (present data, analysis used, and findings) 6. Ensure use and share lessons

62 1. Engage Stakeholders Who are the stakeholders? Those involved in program operations, those affected by the program operations, and primary users of evaluation results

63 “I think you should be more explicit here in Step Two.” By Sidney Harris, Copyright 2007, The New Yorker

64 2. Describe the program
- What are the goals and specific aims of the program?
- What problem or need is it designed to address?
- What are the measurable objectives?
- What are the strategies to achieve the objectives?
- What are the expected effects?
- What are the resources and activities?
- How is the program supposed to work?

65 Activity ≠ Program You cannot evaluate a program by assessing only an activity Remember:

66 Model of a Training Program
Resources (Inputs) - what is invested: faculty & staff, money, equipment & technology, research base
Activities (Outputs) - what is done: workshops & seminars, mentoring by a faculty member, training in scientific methods
Impact (Outcomes) - what are the changes or benefits:
- Short term: knowledge, skills, attitudes
- Intermediate: behaviors, practices
- Long term: enter PhD program

67 3. Focus the evaluation design
- What do you want to know? (key questions)
- Who will be involved in or affected by the evaluation, or will use the findings? (stakeholders)
- To focus an evaluation, consider its purpose, uses, questions, methods, roles, budgets, deliverables, etc.
An evaluation cannot answer all questions for all stakeholders.

68 4. Gather credible evidence
- Evidence must be believable, trustworthy, and relevant
- Select methodological approach & data collection instruments
- Determine who is studied and when

69 5. "Justify" conclusions
Consider data:
- Analysis and synthesis - determine findings
- Interpretation - what do findings mean?
- Judgments - what is the value of findings based on accepted standards?
- Recommendations - what claims can be made? What are the limitations of your design?

70 An evaluation plan should include:
- Program description with baseline data
- Purpose & rationale for evaluation
- Evaluation design
- Data collection & analyses
- Products of evaluation & their use
- Project management
- Budget estimate

71 If you remember nothing else… Evaluation is a tool to help you make decisions about program management

72 MARC-U*STAR: Review

73 February 2009
Paper Applications (currently T, F, P and K12)
Forms: Revised PHS 398 (NIH Guide NOT-OD). The 11/2007 version must be used for all applications for submission/receipt dates on/after May 25.
Format: supply all requested materials within page limits. Follow general appendix guidelines (NIH Guide NOT-OD) and ones specific to the program announcement.
NO paper appendices are accepted; if sending a paper application, appendices MUST be on CDs. Relevant material only, such as large tables, survey instruments, and publications that are NOT available online. NO catalogs, lengthy reports, or material that should be in the body of the application.

74 February 2009
Electronic Applications (currently all R and most K mechanisms)
Grants.gov:
- Electronic application via this site will soon be mandatory for all funding mechanisms: T and K12 transitions tentatively scheduled for September 2009; F transition tentatively scheduled for August 2009; P transition indefinite at this point
- Hosts standardized federal forms SF424 (R&R) and agency-specific forms (PHS 398)
- Checks the application for federal-wide requirements
eRA Commons:
- Retrieves the data from Grants.gov and checks the application against NIH-specific requirements
- Allows applicants to electronically track the status of submissions and to receive/transmit application and award information

75 Online Resources: Overview of Electronic Submission; Frequently Asked Questions; Avoiding Common Errors; Training Resources, Videos, Quick Reference Materials.

76 Finding Help. Grants.gov Contact Center. Toll-free: Hours: Mon-Fri, 7 a.m. to 9 p.m. EST. Support for: Grants.gov Registration, Mac Issues, Adobe Forms. eRA Commons Help Desk. Phone: Hours: Mon-Fri, 7 a.m. to 8 p.m. EST. Online Help Ticket: Support for: Commons Registration, Application Status, Post-submission Questions.

77 Only a single amendment accepted (NIH Guide NOT-OD ). For original new applications (never submitted) and competing renewal applications submitted on/after January 25, 2009, only one amendment to the original application will be accepted. If funding is not received after two submissions, the program should be substantially redesigned rather than altered in response to previous reviews. Scoring changes (NIH Guide NOT-OD , NOT-OD ): Each application submitted on/after January 25, 2009 will receive an individual score on a 9-point rating scale for each individual review criterion. Each application that is discussed in the review meeting will receive a final impact/priority score on a 9-point rating scale, which will be the average of the overall impact/priority scores given by the eligible review committee members. For additional information on peer review changes, visit: Enhancing Peer Review Update.
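The averaging step above can be sketched in a few lines. The scores below are purely hypothetical; only the averaging rule comes from the slide:

```python
# Hypothetical illustration of the final impact/priority score: the average
# of the 9-point scores assigned by the eligible review committee members.
member_scores = [2, 3, 2, 4, 3]  # hypothetical individual scores on the 1-9 scale

final_score = sum(member_scores) / len(member_scores)
print(final_score)  # 2.8
```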

78 Timeline for Application and NIH Contacts
Timeframe: Activity
Submission
+ 2 months: Referral
+ 4 months: Review
+ months: Summary Statement
+ 7 months: Council
+ 8 months: Funding Decisions
+ 9 months: Award, or Not
*NIH Contact: Scientific Review Officer (SRO) or Program Officer (PO)

79 Review of Research Training and Educational Development Program Applications. Review Panel: organized by an SRO in the NIGMS Office of Scientific Review; a Special Emphasis Panel or Standing Committee; temporary committee members included for particular expertise. Reviewer characteristics: experience with multiple education levels; involvement with research training programs; educators, researchers, and institutional administrators.

80 Writing Tips. Content: Read the program announcement and ensure that your application contains the necessary elements. Successful submission through Grants.gov and eRA Commons does not mean appropriate responsiveness to the program announcement. Context: Present the institutional framework and environment of your program. Be realistic in your program’s goals.

81 Writing Tips. Comprehensive: Address all of the requirements of the program announcement; e.g., if you don’t have institutional baseline data, explain how you plan to obtain it. Consistent: Data in tables and text should match; match the justification to budget items.

82 Writing Tips. Clear: Don’t bury important information in appendices or expansive prose, and don’t expect reviewers to “read between the lines.” Current: e.g., make sure faculty biosketches are up to date; e.g., provide data on current and prior students.

83 MARC-U*STAR: Grants Management

84 Managing Your MARC Grant Grants Administration Branch, NIGMS, NIH, DHHS

85 Managing Your MARC Grant. Use the following resources to help you manage your MARC Training Program: The Terms and Conditions stated in the Notice of Award for your grant. The MARC Program Announcement (PAR ): http://grants.nih.gov/grants/guide/pa-files/PAR html. The NIH Grants Policy Statement, available at: Answers to MARC “Frequently Asked Questions” on the NIGMS website. Your institution’s Grants & Contracts Office.

86 A Typical Year in the life of a MARC Program June 1st MARCs the start of the budget period!

87 A Typical Year in the life of a MARC Program. Time to appoint your MARC students! You must submit a signed/completed PHS 2271 “Statement of Training Appointment” form for each student you appoint. Appointments are made in 12-month increments: no trainee may be appointed for less than 12 months in their initial appointment unless you receive prior approval from NIGMS, and trainees should not be appointed unless they plan to complete two years of training. The maximum number of trainees you can appoint is stated in Section IV of your Notice of Award.

88 A Typical Year in the life of a MARC Program. Time to appoint your MARC students! Common reasons why 2271 appointment forms are rejected: the period of appointment is not acceptable; stipend amounts in Section 20 are wrong; signatures are missing; the maximum number of appointments is exceeded.

89 A Typical Year in the life of a MARC Program. Time to appoint your MARC students! All MARC students are to be appointed for a consecutive 24-month period (12 months at a time). However, if an appointment is going to be less than 12 months (e.g., a student leaves the program), then the stipend must be prorated according to the following table:
MARC STIPEND LEVEL COMPUTATION CHART
1 YR: 10,956   11 MO: 10,043   10 MO: 9,130   9 MO: 8,217
8 MO: 7,304    7 MO: 6,391     6 MO: 5,478    5 MO: 4,565
4 MO: 3,652    3 MO: 2,739     2 MO: 1,826    1 MO: 913
(3 WK, 2 WK, 1 WK, and 1 DAY amounts: )
* Round off fractional amounts when reporting stipend levels
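The monthly figures in the chart work out to one-twelfth of the annual stipend per month. Here is a minimal sketch of that proration, assuming simple months/12 proration rounded to whole dollars; the function name and structure are illustrative, not part of any NIH system:

```python
ANNUAL_STIPEND = 10_956  # MARC stipend level from the chart above

def prorated_stipend(months: int) -> int:
    """Prorate the annual stipend for an appointment shorter than 12 months."""
    if not 1 <= months <= 12:
        raise ValueError("months must be between 1 and 12")
    # Round off fractional amounts, as the chart's footnote instructs
    return round(ANNUAL_STIPEND * months / 12)

# e.g., a student who leaves the program after 9 months:
print(prorated_stipend(9))  # 8217, matching the 9 MO column
```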

90 A Typical Year in the life of a MARC Program. Time to appoint your MARC students! Mail the PHS 2271 appointment forms to: Grants and Council Operations, NIH/NIGMS, 45 Center Drive, MSC 6200, Building 45, Room 2AS.55, Bethesda, MD. Or fax the forms to “ATTN: Grants and Council Operations” at (301)

91 A Typical Year in the life of a MARC Program. When Autumn is here, the deadline to submit your progress report is near! Progress reports are due by November 1 of each year. Use the PHS 2590 progress report kit: READ THE INSTRUCTIONS CAREFULLY!

92 A Typical Year in the life of a MARC Program. Progress Reports: Which Forms? Face Page (Form Page 1): Mandatory. Budget Page (Form Page 2): Mandatory. NRSA Additional Budget Page 2: Mandatory. Budget Justification (Form Page 3): Mandatory. Progress Report Narrative (Form Page 5): Mandatory. NRSA Data Table 12A: Mandatory. MARC Formatted Tables: Optional. Checklist (Form Page 6): Mandatory.

93 A Typical Year in the life of a MARC Program. When Autumn is here, the deadline to submit your progress report is near! One copy of the progress report must be mailed to: Division of Extramural Activities Support, OER, National Institutes of Health, 6705 Rockledge Drive, Room 2207, MSC 7987, Bethesda, MD (US Postal Service & Express mail) or Bethesda, MD (UPS/FedEx & other couriers).

94 A Typical Year in the life of a MARC Program. Progress Reports: Any Advice? Make sure that you include valid IRB and/or IACUC approval dates if students are participating in projects involving such research. IRB approval dates are valid up to 1 year before the budget period start date; IACUC approval dates are valid up to 3 years before the budget period start date. Include a section describing the instructional activities in the responsible conduct of research. And always...
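Those two validity windows can be sketched as a simple date check. This is a hypothetical helper, assuming "valid up to N years before" means the approval date falls within N years before the budget period start:

```python
from datetime import date

# Approval-date validity windows described on the slide: IRB approvals are
# good for 1 year, IACUC approvals for 3 years, counted back from the
# budget period start date.
VALID_YEARS = {"IRB": 1, "IACUC": 3}

def approval_is_valid(approval_date: date, budget_start: date, kind: str) -> bool:
    earliest = budget_start.replace(year=budget_start.year - VALID_YEARS[kind])
    return earliest <= approval_date <= budget_start

# e.g., for a budget period starting June 1:
print(approval_is_valid(date(2008, 9, 15), date(2009, 6, 1), "IRB"))    # True
print(approval_is_valid(date(2008, 3, 15), date(2009, 6, 1), "IRB"))    # False
print(approval_is_valid(date(2007, 3, 15), date(2009, 6, 1), "IACUC"))  # True
```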

95 BE HONEST!! 18 U.S.C (a) Except as otherwise provided in this section, whoever, in any matter within the jurisdiction of the executive, legislative, or judicial branch of the Government of the United States, knowingly and willfully— (1) falsifies, conceals, or covers up by any trick, scheme, or device a material fact; (2) makes any materially false, fictitious, or fraudulent statement or representation; or (3) makes or uses any false writing or document knowing the same to contain any materially false, fictitious, or fraudulent statement or entry; shall be fined under this title, imprisoned not more than 5 years….

96 A Typical Year in the life of a MARC Program. When Spring is here, it’s time to get the Summer Research Experience in gear! Prior to sending students to the SRE, you must send a list of the students to NIGMS through your institution’s business office. See Section IV of your Notice of Award for specific details on what information must be included.

97 A Typical Year in the life of a MARC Program. One final note: A Financial Status Report (FSR) is due no later than 90 days after the end of each budget period. The FSR must be submitted electronically via the eRA Commons.

98 I Need Help! Who Should I Call? Before you contact NIGMS staff, check out the following resources, as they will likely have the information you’re looking for: The Terms and Conditions stated in the Notice of Award for your grant. The MARC Program Announcement (PAR ): http://grants.nih.gov/grants/guide/pa-files/PAR html. The NIH Grants Policy Statement, available at: Answers to MARC “Frequently Asked Questions” on the NIGMS website. Your institution’s Grants & Contracts Office.

99 I Need Help! Who Should I Call?... And if you still can’t find the answer: Contact the NIGMS Grants Management Specialist (policy/budget issues) or Scientific Program Officer (policy/technical issues) listed in the last section of the Notice of Award for your grant. * If you send an e-mail, include your grant number (e.g., GM012345).

100 Contacts. MARC Program: Dr. Toliver, Dr. Drew. Evaluation: Dr. Juliana Blome. Review: Dr. Mona Trempe. Grants Management: Mr. Robert Altieri.

