1 John Sutton Principal Investigators Meeting – MSP FY 12 Washington, DC December 16, 2013

2 Any opinions, suggestions, and conclusions or recommendations expressed in this presentation are those of the presenter and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content. This project is funded through the NSF Research, Evaluation and Technical Assistance (RETA) program (DRL 1238120).

3 Professional Learning Network for Mathematics and Science Partnership Projects
• Learn and Share: Challenges and Successes
• Improve Skills
• Engage in Reflective Evaluation

4 The goal of the TEAMS project is to strengthen the quality of MSP project evaluations and build evaluators' capacity by sharpening their skills in evaluation design, methodology, analysis, and reporting.

5 Promoting MSP Effectiveness Through Evaluation
MSP projects represent a major federal effort to support advancements in science, technology, engineering, and mathematics (STEM) disciplines and careers. Recognizing the vital role of evaluation in this national effort, NSF MSP projects have an obligation to ensure that their project evaluations are designed and conducted rigorously.

6 Promoting MSP Effectiveness Through Evaluation
Regardless of funding source, project evaluation plays a vital role in every Mathematics and Science Partnership (MSP) project by:
• Assessing the degree to which projects attain their goals and objectives;
• Advancing the field by sharing lessons learned and evaluation findings; and
• Improving the overall effectiveness of the project through formative evaluation.

7 Promoting MSP Effectiveness Through Evaluation
Technical Evaluation Assistance in Mathematics and Science (TEAMS):
• Fosters increased understanding of evaluation design and implementation, in particular new and innovative methodologies.
• Promotes the use of longitudinal data systems in MSP evaluations.
• Strengthens the role of evaluation as a means of improving project effectiveness and contributing to the knowledge of the field.

8 Meeting the Needs of MSP Evaluation
Technical Evaluation Assistance in Mathematics and Science (TEAMS):
• Works closely with NSF staff to develop and implement strategies to encourage innovation and increased rigor in MSP evaluations.
• Conducts ongoing needs assessment to identify issues that pose challenges for the work of evaluators of MSP projects.
• Offers no-cost technical assistance to address these issues and challenges.
• Provides venues for MSP evaluators and project staff to share strategies and findings from MSP evaluations.

9 Meeting the Needs of MSP Evaluation
Evaluation Approaches
Often, external evaluations provide:
• Formative feedback to improve projects and suggest mid-course corrections
• Summative reporting of project outcomes and impacts
• Project monitoring for accountability

10 Resources to Inform Evaluation
Institute of Education Sciences, U.S. Department of Education, and National Science Foundation. (2013). Common Guidelines for Education Research and Development. Washington, DC: IES and NSF.
Frechtling, J. (2010). 2010 User-Friendly Handbook for Project Evaluation. REC 99-12175. Arlington, VA: National Science Foundation.

11 Resources to Inform Evaluation
Heck, D. J., & Minner, D. D. (2010). Technical report: Standards of evidence for empirical research, math and science partnership knowledge management and dissemination. Chapel Hill, NC: Horizon Research, Inc.
Guthrie, Wamae, Diepeveen, Wooding, & Grant. (2013). Measuring research: A guide to research evaluation frameworks and tools. RAND Europe.

12 Meeting the Needs of MSP Evaluation
Research Types
• Foundational Research
• Early-stage or Exploratory Research
• Design and Development Research
• Efficacy Research
• Effectiveness Research
• Scale-Up Research
Each of these types of research has a different evaluation purpose and requires a different evaluation approach.

13 Meeting the Needs of MSP Evaluation
Measuring Research: Key Rationales
• Advocacy: Demonstrate the benefits of supporting research, enhance understanding of research and its processes among policymakers and the public, and make the case for policy and practice change.
• Accountability: Show that money and other resources have been used efficiently and effectively, and hold researchers accountable.
• Analysis: Understand how and why research is effective and how it can be better supported, feeding into research strategy and decision-making by providing a stronger evidence base.
• Allocation: Determine where best to allocate funds in the future, making the best possible use of limited funding.

14 Meeting the Needs of MSP Evaluation
Standards of Evidence
Specify indicators for empirical evidence in six domains:
1. Adequate documentation
2. Internal validity
3. Analytic precision
4. Generalizability/external validity
5. Overall fit
6. Warrants for claims

15 Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey 11/2013

16 Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey 11/2013
Challenge Posed for Each Aspect of Evaluation
• Instrumentation (38%)
• Theory of Action and Logic Model (27%)
• Establishing Comparison Groups (24%)
• Evaluation Design (24%)
• Sampling (19%)
• Measurable Outcomes and Evaluation Questions (19%)
• Data Analysis Methodology (16%)
• Data Collection (16%)
• Reporting (14%)

17 Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey 11/2013
Other Evaluation Challenges
• Instruments
  o Instruments for Science and Engineering
  o Instruments Aligned to State Standards
  o Instruments Aligned to Content of MSP
• Valid and Reliable Performance Tasks
• Classroom Observation Protocols

18 Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey 11/2013
Where Additional Assistance Is Needed
• Comparison Groups in Rural Settings
• Random Groups/Comparison Groups
• Large Enough Sample Size/Strategies for Random Selection
• Evaluation Design and Measurable Outcomes for New Projects
• Data Collection/Statewide Task
• Excessive Evaluation of Students and Teachers

19 Meeting the Needs of MSP Evaluation
Strategic Plan Tasks
• Task 1: Intranet – Project internal storage and retrieval structure
• Task 2: Website – teams.mspnet.org
• Task 3: Outreach – Ongoing communications
• Task 4: National Advisory Board – Guidance and review
• Task 5: Help Desk – Quick response to queries
• Task 6: Document Review – Identify commonalities and develop resources
• Task 7: Webinars – Topics to inform

20 Meeting the Needs of MSP Evaluation
Strategic Plan Tasks
• Task 8: Communities of Practice – Guided discussions around evaluation topics
• Task 9: Direct Technical Assistance – Strategies and activities at the project level
• Task 10: National Conferences – Presentations to inform others' work
• Task 11: Annual Meeting – Focus on evaluation
• Task 12: Data Sources – Information about data sets and their utility
• Task 13: Instrument Review – Share information about what is being used, by whom, and for what

21 Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 3: Outreach
• Principal Investigators receive TEAMS communications so they know what resources and technical assistance are available.
• Identify additional resources, templates, processes, and measures being used by the project for sharing with other MSP project PIs and evaluators.
• Communicate with TEAMS regarding specific project needs for information and technical assistance.
Task 5: Help Desk
• Encourage project staff and evaluators to pose queries for TEAMS to respond to.
Task 6: Document Review
• Based on PI review of reports, especially challenges identified by the evaluator, contact TEAMS staff for follow-up resources or technical assistance.

22 Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 7: Webinars
• Invitations are sent to PIs and evaluators to participate in webinars.
• Identify topics for which webinars can be prepared and provided, and communicate them to TEAMS.
• Encourage your evaluator and project staff to present in or participate in offered webinars.
Task 8: Communities of Practice
• Based on PI review of reports, especially challenges and needs identified by the individual project, recommend possible topics to TEAMS staff.
• Consider participating and encourage project staff and the evaluator to participate in discussions.

23 Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 9: Direct Technical Assistance
• Based on insights and familiarity with the individual project, including review of reports, contact TEAMS staff for follow-up with specific technical assistance and resources.
• Identify evaluation topics for which technical assistance could be provided to project staff and evaluators.
Task 10: National Conferences
• Share information with TEAMS about upcoming presentations from your project, especially if related to evaluation.
• TEAMS staff can help post presentations to share interesting findings from the project.

24 Tier Definitions
Tier 1 – Group: Evaluators and researchers of projects other than NSF- and ED-funded MSP projects. Services: Access to a website that provides links to available evaluation research and resources, research briefs, and other TEAMS publications.
Tier 2 – Group: Evaluators of NSF- and ED-funded MSP projects and external evaluators of other projects. Services: Help Desk services (Task 5); webinars (Task 7); communities of practice (Task 8).
Tier 3 – Group: Evaluators of NSF-funded MSP projects. Services: Annual Conference (Task 11).
Tier 4 – Group: Evaluators of NSF-funded MSP projects that are confronting specific challenges. Services: Communities of practice specifically for Tier 4 projects with common needs (Tasks 8 & 9); direct technical assistance (Task 9).

25 Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 11: TEAMS Annual Meeting
• Help identify changes in project staff.
• Help identify specific projects to highlight and participate.
• Help promote participation in meetings (allow resources to be used for this purpose).
Task 12: Data Sources
• Identify projects that are using public databases in their reporting.
• Share information about projects asking about the use of public databases.

26 Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
Task 13: Instrument Review
• Contact TEAMS with queries regarding specific instruments for specific uses.
• Share information with TEAMS about challenges encountered with instruments.
• Identify and share unique instruments being used in the project.
• Consider using instruments from other projects as appropriate.

27 Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance
In summary, Principal Investigators can:
• Identify needs;
• Share information between projects and TEAMS;
• Encourage involvement;
• Facilitate communication; and
• Promote high-quality evaluation approaches.

28 Meeting the Needs of MSP Evaluation
Website (http://teams.mspnet.org) and Help Desk

29 Meeting the Needs of MSP Evaluation
Website (http://teams.mspnet.org) and Help Desk

30 Meeting the Needs of MSP Evaluation
Instruments: Considerations
Using measures of established quality vs. alignment to the specific goals/approaches of the project:
  o Internally developed and piloted instruments
  o Externally developed and validated instruments
  o Collection and analysis of teacher work from the PD

31 Meeting the Needs of MSP Evaluation
Instruments: Benefits
• Internally developed instruments can help demonstrate that results were what was intended and promised.
• Externally validated instruments can help demonstrate that findings are credible and more broadly important.
• Use of multiple instruments provides triangulation of data for findings.
• Use of internally developed instruments and teacher work samples can help in refining the program and informing providers about participants' learning.

32 Meeting the Needs of MSP Evaluation
Instruments: Lessons Learned
• As evaluation informs the project and the project evolves, instrument changes are sometimes required.
• Modify instruments (adding and/or removing items over time) and align data sets after modifications to keep up with evolving project needs.
• Add new instruments or remove instruments when the initial instrumentation isn't providing appropriate data (e.g., on teacher knowledge).
• Verify instrument validity and reliability after modifications and include that information in reports.

33 Meeting the Needs of MSP Evaluation
Develop a Conceptual Model of the Project and Identify Key Evaluation Points
Theory of Action
• Why This/Hypothesis
  o Based on interpretation of current research
• Describes the experience of the intended audience
  o Cognitively or behaviorally
• Expected Outcome
  o If This/Then This

34 Meeting the Needs of MSP Evaluation
Develop a Conceptual Model of the Project and Identify Key Evaluation Points
Model Components
• Inputs
• Activities
• Outputs
• Short-term Outcomes
• Long-term Outcomes
• Contextual Factors
(One way to capture these components as a simple data structure is sketched below.)
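A minimal sketch, assuming Python is acceptable for illustration, of how the model components above can be captured as a simple data structure so that each evaluation point can be tied to a specific output or outcome. All field names and example entries are hypothetical, not taken from TEAMS or any MSP project.

from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    # One list per component named on the slide above
    inputs: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    short_term_outcomes: List[str] = field(default_factory=list)
    long_term_outcomes: List[str] = field(default_factory=list)
    contextual_factors: List[str] = field(default_factory=list)

# Illustrative (made-up) MSP-style entries
model = LogicModel(
    inputs=["MSP funding", "University STEM faculty", "Partner districts"],
    activities=["Summer content institutes", "School-year coaching"],
    outputs=["80 teachers complete 100+ hours of professional development"],
    short_term_outcomes=["Gains in teacher content knowledge"],
    long_term_outcomes=["Improved student achievement in science"],
    contextual_factors=["District staff turnover", "New state standards"],
)
print(model.short_term_outcomes)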

35 Meeting the Needs of MSP Evaluation
Example of Logic Model

36 Meeting the Needs of MSP Evaluation
Develop an Evaluation Plan
Steps
• Determine what type of design is required to answer the questions posed
• Select a methodological approach and data collection instruments
• Select a comparison group (a simple random-assignment sketch follows below)
• Decide the timing, sequencing, and frequency of data collection
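A minimal sketch, assuming simple random assignment is feasible, of one way to form a comparison group: split a list of volunteer schools at random into treatment and comparison halves. The school names and the fixed seed are illustrative assumptions, not a TEAMS procedure.

import random

schools = [f"School {i:02d}" for i in range(1, 21)]  # 20 volunteer schools

random.seed(2013)       # fixed seed so the assignment can be reproduced
random.shuffle(schools)
half = len(schools) // 2
treatment_group = sorted(schools[:half])
comparison_group = sorted(schools[half:])

print("Treatment: ", treatment_group)
print("Comparison:", comparison_group)

In practice many MSP evaluations cannot randomize and instead match comparison schools on demographics and prior achievement; the sketch only illustrates the mechanics of a random split.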

37 Meeting the Needs of MSP Evaluation
Develop Evaluation Questions and Define Measurable Outcomes
Steps
• Identify key stakeholders and audiences
• Formulate potential evaluation questions of interest to the stakeholders and audiences
• Define outcomes in measurable terms
• Prioritize and eliminate questions

38 Meeting the Needs of MSP Evaluation
Conducting the Data Collection
Considerations
• Obtain necessary clearances and permission.
• Consider the needs and sensitivities of the respondents.
• Make sure your data collectors are adequately trained and will operate in an objective, unbiased manner.
• Obtain data from as many members of your sample as possible.
• Cause as little disruption as possible to the ongoing effort.

39 Meeting the Needs of MSP Evaluation
Analyzing the Data
Considerations
• Check the raw data and prepare them for analysis.
• Conduct initial analysis based on the evaluation plan.
• Conduct additional analyses based on the initial results.
• Integrate and synthesize findings.
(A minimal data-preparation and analysis sketch follows below.)
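A minimal sketch, assuming a pre/post teacher assessment with treatment and comparison groups, of the first two considerations above: checking and preparing the raw data, then running an initial analysis from the evaluation plan. The file name, column names, and grouping variable are assumptions for illustration only.

import pandas as pd
from scipy import stats

df = pd.read_csv("teacher_content_scores.csv")   # hypothetical data file

# Check the raw data: summary statistics, duplicates, and missing values
print(df.describe())
print("Duplicate records:", df.duplicated(subset="teacher_id").sum())
df = df.dropna(subset=["pre_score", "post_score"])

# Initial analysis per the evaluation plan: compare score gains by group
df["gain"] = df["post_score"] - df["pre_score"]
print(df.groupby("group")["gain"].agg(["count", "mean", "std"]))

t, p = stats.ttest_ind(df.loc[df.group == "treatment", "gain"],
                       df.loc[df.group == "comparison", "gain"],
                       equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3f}")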

40 Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions
Analytic Precision
Description: The extent to which the findings of a study were generated from systematic, transparent, accurate, and thorough analyses.
Indicators:
• Measurement Validity/Logic of Research Process
• Reliable Measures/Trustworthy Techniques
• Appropriate and Systematic Analysis

41 Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions
Analytic Precision (continued)
Description: The extent to which the findings of a study were generated from systematic, transparent, accurate, and thorough analyses.
Indicators:
• Unit of Analysis Issues
• Power
• Effect Size (a brief effect-size and power sketch follows below)
• Multiple Instruments
• Multiple Respondents
• All Results
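A minimal sketch of how the Effect Size and Power indicators above are commonly quantified: compute Cohen's d from two groups of scores, then estimate the per-group sample size needed to detect an effect of that size. The scores are made-up numbers, not data from any MSP project, and statsmodels is assumed to be available.

import numpy as np
from statsmodels.stats.power import TTestIndPower

treatment = np.array([78, 85, 91, 74, 88, 82, 79, 90])
comparison = np.array([72, 80, 76, 70, 84, 75, 78, 73])

# Cohen's d using the pooled standard deviation
n1, n2 = len(treatment), len(comparison)
pooled_sd = np.sqrt(((n1 - 1) * treatment.var(ddof=1) +
                     (n2 - 1) * comparison.var(ddof=1)) / (n1 + n2 - 2))
d = (treatment.mean() - comparison.mean()) / pooled_sd
print(f"Cohen's d = {d:.2f}")

# Per-group n required to detect d with 80% power at alpha = .05
solver = TTestIndPower()
n_needed = solver.solve_power(effect_size=d, power=0.8, alpha=0.05)
print(f"Required n per group: {n_needed:.0f}")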

42 Meeting the Needs of MSP Evaluation
Reporting the Findings
Considerations
• Background (context, sites, intervention, etc.)
• Evaluation study questions
• Evaluation procedures (description of measures used and their purposes)
• Study sites and sample demographics
• Data collection (administration, participant counts, timelines for acquiring data, etc.)
• Data analyses (what methods for what measures, limitations, missing data, etc.)
• Findings
• Conclusions (and recommendations)

43 Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions
Generalizability/External Validity
Description: The extent to which you can come to conclusions about one thing (e.g., a population) based on information about another (e.g., a sample).
Indicators:
• Findings for Whom
• Generalizable to Population or Theory
• Generalizable to Different Contexts

44 Meeting the Needs of MSP Evaluation
Disseminate the Information
Considerations
• The funding source(s)
• Potential funding sources
• Others involved with similar projects or areas of research
• Community members, especially those who are directly involved with the project or might be involved
• Members of the business or political community, etc.

45 Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions
Warrants for Claims
Description: The extent to which the data interpretation, conclusions, and recommendations are justifiable based on the evidence presented.
Indicators:
• Limitations
• Decay and Delay of the Effect
• Efficacy
• Conclusions/Implications Logically Drawn from Findings

46 Meeting the Needs of MSP Evaluation
Evaluation Topics and Components to Consider
• Evaluation Design Component: Development of a conceptual model (logic model) of the program
  o Develop logic model
  o Identify contextual conditions
• Evaluation Design Component: Development of evaluation questions and measurable outcomes
  o Articulate goals clearly
  o Define multiple achievement outcomes

47 Meeting the Needs of MSP Evaluation
Evaluation Topics and Components to Consider
• Evaluation Design Component: Development of the evaluation design
  o Address shifting project and evaluation priorities
• Evaluation Design Component: Collection of data
  o Format measures (hard-copy, electronic, etc.) and schedule administration
  o Display data effectively
  o Data management

48 Meeting the Needs of MSP Evaluation
Evaluation Topics and Components to Consider
• Evaluation Design Component: Analysis of data
  o Conduct appropriate data analyses to respond to evaluation questions
• Evaluation Design Component: Provision of information to interested audiences
  o Report intended impact on various populations
  o Report findings to different audiences

49 Meeting the Needs of MSP Evaluation
Ongoing Needs Assessment
At your tables, please write down one or two anticipated evaluation challenges and/or areas in which your project may need assistance with project/program evaluation.

50 Meeting the Needs of MSP Evaluation
What Questions Do You Have Regarding TEAMS?
TEAMS contact information: teams.mspnet.org

51 TEAMS Contacts
John T. Sutton, PI
RMC Research Corporation
633 17th Street, Suite 2100
Denver, CO 80202-1620
Phone: 303-825-3636
Toll Free: 800-922-3636
Fax: 303-825-1626
Email: sutton@rmcdenver.com

Dave Weaver, Co-PI
RMC Research Corporation
111 SW Columbia Street, Suite 1030
Portland, OR 97201-5883
Phone: 503-223-8248
Toll Free: 800-788-1887
Fax: 503-223-8399
Email: dweaver@rmccorp.com

