
1 How did we do? Insights from a CLIMAS pilot evaluation. Dan Ferguson, Anne Browning-Aiken, Gregg Garfin, Dan McDonald, Jennifer Rice, Marta Stuart. Climate Prediction Application Science Workshop, March 7, 2008

2 Overview
CLIMAS/RISA
Purpose of the project
Evaluation project team
Process and methods
Who is participating
Research/evaluation questions
Where we are in the process

3 CLIMAS/RISA
The Climate Assessment for the Southwest (CLIMAS) is one of 8 currently funded Regional Integrated Sciences and Assessments (RISA) programs

4 CLIMAS mission/mode
We both conduct climate research and work iteratively with stakeholders, partners, and collaborators to provide timely, pertinent, and (hopefully) useful climate information, tools, and services (or access to them) to those who need them to make decisions.

5 CLIMAS team
Program is 10 years old; the cast of characters changes through time
Currently 10 investigators + affiliate investigators + grad students + core office staff
HQ at University of Arizona, but currently have an investigator (Deborah Bathke) at New Mexico State University
Highly interdisciplinary: anthropology, climatology, decision-support system development, geography, hydroclimatology, Latin American studies, paleoclimatology, resource economics

6 Purpose(s) of the evaluation project

7 Purpose of the evaluation project
Broad evaluation of the RISA model as expressed by CLIMAS
–Not an evaluation of a particular product, information source, etc., but rather a first crack at an overall evaluation of CLIMAS
–Roughly bounded in time: 2002-2007
Looking for key insights about penetration of information; perceived salience, credibility, and legitimacy* of CLIMAS; and changes in knowledge, behavior, and understanding as a result of interactions with CLIMAS
*After: Cash, D., W. Clark, et al. (2002). Salience, Credibility, Legitimacy and Boundaries: Linking Research, Assessment and Decision Making. Cambridge, MA: John F. Kennedy School of Government, Harvard University. 24 pp.

8 Purpose of the evaluation project (cont.)
Input to the CLIMAS program manager and investigators
Input to the Climate Program Office and the other RISAs
Input to NIDIS as it develops

9 Evaluation Team

10 Evaluation team
Mixed team:
–two members directly affiliated with CLIMAS (Ferguson and Garfin)
–four members not previously affiliated with CLIMAS (Browning-Aiken, McDonald, Rice, Stewart)

11 Evaluation team roles
Garfin = encyclopedia of CLIMAS
Ferguson = lead investigator/coordinator, but not conducting data collection
Browning-Aiken + Rice = interviews
McDonald = survey
Stewart, Browning-Aiken, Rice = focus groups

12 Evaluation methods and team process

13 Methods
Survey (online)
Interviews (primarily telephone)
Focus groups will follow the survey and interviews; focus groups will be used to probe deeper into issues and ideas that emerge from the survey and interview results

14 Survey
Multiple iterations involving the whole team
Piloted the survey with ~15 colleagues
–Teased out obvious issues
–Included required IRB disclaimer language, but hid it behind a click
Used a professional web team to develop/build
–Very fast turnaround, reliable product, able to customize and troubleshoot
–Helped us understand options, e.g., email login

15 Team Process
Develop research questions based on the broad strokes of the proposal
Utilize the whole team to develop the research questions + all data collection instruments
–Collaboratively and iteratively developing and refining data collection instruments = a very good thing

16 Who is participating?

17 A sample of organizations with whom we work
Arizona Department of Environmental Quality; Arizona Department of Transportation; Arizona Department of Water Resources; Arizona Division of Emergency Management; Arizona State University; Bureau of Land Management; Bureau of Reclamation; California Department of Water Resources; Central Arizona Project; Clark County; Cornell University; Desert Research Institute; Environmental Defense; Environmental Protection Agency; Maricopa County; Matrix Consulting Group, Inc.; Mohave County; Pacific Institute; National Agroclimate Information Service; National Climatic Data Center; National Drought Mitigation Center; National Interagency Coordination Center; National Interagency Fire Center; National Environmental Satellite, Data, and Information Service; National Park Service; National Resources Conservation Service; National Weather Service; National Wildlife Federation; Navajo Nation; New Mexico Department of Agriculture; New Mexico Office of the State Engineer; New Mexico Rural Water Association; New Mexico State University; Northern Arizona University; Northwest Interagency Coordination Center; Western Governors' Association; Pima Association of Governments; Pima County; Pinal County; Salt River Project; San Carlos Apache Tribe; Sandia National Laboratories; Santa Cruz County; Sonoran Institute; The Nature Conservancy; United States Department of Agriculture; United States Geological Survey; University of Arizona; University of California, Irvine; University of Montana; University of New Mexico; US Fish and Wildlife Service; USDA Forest Service
Basic stats of evaluation participants
–~150 people will be contacted
–Representing > 50 organizations
–~25-35 interviews
–~120 people surveyed

18 Our spectrum of relationships
Communication: e.g., receive the Southwest Climate Outlook, e-mail updates, or other publications; call or e-mail CLIMAS team members with specific questions
Consultancy: e.g., ask for an expert speaker for a workshop/meeting; seek consultation on project development
Partner: e.g., co-sponsor an event; be invited to speak at a meeting or workshop
Collaboration: e.g., ongoing or long-lasting research collaborations; long-term engagement to address a particular issue

19 Research Questions

20 Research/evaluation questions
Is CLIMAS achieving the overall RISA goals of being responsive, stakeholder-oriented, and use-inspired?
Is CLIMAS perceived as salient, credible, and legitimate?

21 Research/evaluation questions
Is CLIMAS perceived by collaborating organizations as a reliable and responsive partner?
What are the outcomes (short- and medium-term) that result from interactions with CLIMAS?
How is CLIMAS accessed, and is it reaching populations in need of climate information?

22 Lessons learned (so far), or what I know today that I didn't really know in October but probably should have

23 Lessons learned so far (common sense warning)
Try to keep track of your stakeholders
A mixed team (inside/outside the program) has worked out very well
Use the whole team to develop/refine research questions and instruments
Hiring pros to develop the web survey interface and database = a major time/headache saver
Take the time to pilot a survey; big return on small investment
Institutional Review Board, oye
Understand that sometimes evaluation is intervention

24 Where we are in the process
Interviews beginning this week
Survey link is being distributed over the next week
Focus groups will follow, probably late April/May

25 Then what?
White paper aimed at RISA, NIDIS, and other programs and organizations grappling with similar issues
Peer-reviewed publication
Better grip on next steps for ongoing evaluation

26 Questions?
Dan Ferguson
University of Arizona/CLIMAS
dferg@email.arizona.edu
http://www.climas.arizona.edu/

