ADVISORY BOARD MEETING 2013 Doing a Forest Governance Assessment: Practical Tips and Tricks Ken Rosenbaum, Sylvan Environmental Consultants.

Today we will look at three topics:
1. The simplest way to carry out an assessment: take a data collection tool that someone else has designed and adapt it to your needs.
2. How to design your own data collection tool (or modify an existing tool) if you can't find one that suits your needs.
3. Good practices in collecting forest governance data.

But first a brief recap from Webinar 1: What is a forest governance assessment? An assessment is an attempt to measure forest governance. Assessments can be used to diagnose problems, compare conditions, or monitor efforts to change.

FOUR WELL-DOCUMENTED TOOLS AIMED AT FORESTS OR NATURAL RESOURCES
- The PROFOR Diagnostic
- The USAID Sustainable Conservation Approaches in Priority Ecosystems (SCAPES) governance assessment tool
- The World Resources Institute's (WRI's) Governance of Forests Initiative indicators
- The Indonesia Participatory Governance Assessment (PGA) for REDD+

The PROFOR Diagnostic Tool

- Useful to diagnose problems, identify priority areas for reform, or set a baseline for future monitoring
- Based on the FAO–PROFOR Framework
- Uses a set of up to 130 indicators (English, French, and Russian versions available)
- Scored by stakeholders in workshop(s)
- Validated by key informants
- Complete "how to" guide available on the PROFOR.info website
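The slide notes that the PROFOR tool has stakeholders score indicators in a workshop. As a rough illustration of what tallying those scores might look like, here is a minimal sketch; the indicator name, the 1–5 scale, and the choice to summarize with the median are all assumptions for the example, not the tool's documented procedure.

```python
# Hypothetical sketch: summarizing stakeholder workshop scores for one
# indicator. All names and values below are invented for illustration.
from statistics import median

# Scores given by six workshop participants on an assumed 1-5 scale.
scores = {"indicator_042": [3, 4, 4, 2, 5, 4]}

# Summarize each indicator with the median of the individual scores.
summary = {ind: median(vals) for ind, vals in scores.items()}
print(summary)  # {'indicator_042': 4.0}
```

A median is used here only because it is robust to one or two outlier scores; a real assessment would document and justify its aggregation rule.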

The USAID SCAPES Tool

- Developed by the Wildlife Conservation Society for USAID
- Designed to work at the landscape level, looking at natural resource management generally
- Based on its own framework for analyzing governance, built around the pillars of Legitimacy, Capacity, and Power
- A workshop, bringing together representatives of stakeholders or other key informants or experts, identifies key groups and their influence, strengths, and weaknesses
- Full how-to guide available on frameweb.org

The Indonesia PGA

- Developed in Indonesia with the support of UNDP/UN-REDD as part of REDD+ readiness
- Not presented as a tool for general use, but as a report on what Indonesia did
- Highly participatory, steered by an advisory group of experts in close contact with stakeholders
- Used 117 indicators
- Gathered data locally and regionally via a combination of methods, including document reviews, key informant interviews, and focus groups
- Report available on the undp.org site

WRI Governance of Forests Initiative

- An approach piloted in Brazil, Indonesia, and Cameroon
- Uses 122 indicators
- WRI has published separate publications on (1) the indicators and (2) methods to score them
- Scoring can draw on desk reviews, key informant interviews, focus groups, participant observation, testing of systems, etc.

I'd Rather Design It Myself!
1. Refine your scope: exactly what to measure
2. Identify sources of data
3. Select data collection methods
4. Develop "tools": interview protocols, questionnaires, sampling plans, etc.
5. Write out a data collection plan or manual

Refine your scope: What to Measure
- Didn't we set the scope in our work plan? Yes, and we'll start with that, but we need more detail now
- Look at what other assessments have done
- Decide how detailed you need to be
- Decide whether to use a narrative description of scope or criteria and indicators
- Set out your scope in writing

Identify Sources of Data
- Written materials: past assessments, official publications, unofficial government documents, laws, budgets, media reports, academic studies, etc.
- People: officials, academics, experts, stakeholders, etc., who can be reached through interviews, focus groups, workshops, surveys, etc.
- Physical evidence: less commonly used in governance assessments; can include forest site evaluations, testing of government functions, and inspection of boundary markers

Identify Methods
Now you have a sense of where the information might be; how will you collect and validate it?
- Desk reviews: a good way to tap existing documents; low cost; but no "new" information
- Expert analysis: can use multiple experts to add depth to desk reviews, provide opinion, and validate other methods; but experts may be biased
- Surveys: produce lots of data and are repeatable; but costly, may not be in-depth, and may have bias
- Key informants: may tap rich sources of information at low cost; but may be biased and not easily replicated

More methods
- Focus groups: more people means a broader perspective than one-on-one interviews or surveys, but some less assertive people may not be heard
- Workshops: offer even broader perspectives and promote communication among stakeholders, but can be expensive; it is hard to get a balance of stakeholders, and it takes planning and skill to extract information from participants

Develop tools
- Desk reviews: will you gather existing data or perform new analysis?
- Experts: what kinds of experts? What terms of reference?
- Key informants: how to select them? What protocol for interviews?
- Surveys: sampling issues; question design
- Focus groups and workshops: sampling issues; how to convene? How to capture (code) responses?
Get help from social scientists!
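For the sampling issues mentioned above, the simplest design is a simple random sample from a list of candidates. The sketch below shows what that draw looks like; the stakeholder list and sample size are invented, and real assessments often need stratified or clustered designs, which is exactly where the social scientists come in.

```python
# Hypothetical illustration: drawing a simple random sample of survey
# respondents from a stakeholder list. Names and the sample size of 30
# are invented for the example.
import random

# An assumed sampling frame of 200 candidate respondents.
stakeholders = [f"stakeholder_{i}" for i in range(1, 201)]

random.seed(42)  # fix the seed so the draw can be documented and reproduced
sample = random.sample(stakeholders, k=30)  # draw without replacement

print(len(sample))       # 30
print(len(set(sample)))  # 30 -- no respondent appears twice
```

Recording the seed (or the drawn list itself) is a small transparency habit that makes the selection defensible later.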

Write it down
- A data collection plan: for use by the managers of the data collection effort. It may be just a refinement of the work plan; if it is more, check it against the work plan, including the timeline and budget, and adjust as needed
- A data collection manual: for use by the people in the field who are actually collecting data. Not always necessary, but in a large effort it helps with consistency

Vet & validate your methods
Consult experts and stakeholders about your design effort. Do it formally or informally. Be transparent! Transparency now leads to greater credibility later.

Good Practices in Data Collection
- Assembling a data collection team
- Going out and getting your data
- Assuring data quality

Assembling a team
- This step varies, depending on the methods and the size of the effort
- You may need just a few people who take on multiple roles, or you may need managers, logistics coordinators, researchers/field people, data managers, etc.
- Larger teams need more documentation and training
- Team members should be trustworthy, capable, and unbiased

Collecting data
- Interviewing, facilitation, and survey administration are all teachable skills. Be sure your team is using good, consistent techniques.
- Give careful thought to coding: your team should capture data in a complete and consistent manner.
- Be aware of ethical issues: be transparent and truthful; get informed consent; respect privacy and confidentiality; keep people safe from harm; guard the integrity of the data.

Quality assurance
When data come in from the field:
- Edit: be sure data entries or notes are complete and readable; get back to the collectors if not
- Clean: flag data that stand out or suggest an entry error; look for missing or duplicated entries; etc.
- Verify: spot-check that data were actually collected, and that data were not miscopied or garbled in transmission
- Triangulate: confirm against other sources
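The editing and cleaning steps above can be partly automated once data are in hand. The sketch below flags two common problems, duplicate records and out-of-range scores; the field names and the assumed 1–5 scoring scale are invented for illustration, not taken from any of the tools discussed.

```python
# A minimal sketch (not from the presentation) of the "clean" step:
# flag duplicate record ids and indicator scores outside the expected
# range, so a human can review them. Field names and the 1-5 scale
# are assumptions.
records = [
    {"id": 1, "indicator": "tenure_clarity", "score": 3},
    {"id": 2, "indicator": "tenure_clarity", "score": 7},   # out of range
    {"id": 3, "indicator": "law_enforcement", "score": 4},
    {"id": 3, "indicator": "law_enforcement", "score": 4},  # duplicate id
]

def clean(rows, lo=1, hi=5):
    """Return (valid, flagged): repeated ids and out-of-range scores are flagged."""
    seen, valid, flagged = set(), [], []
    for row in rows:
        if row["id"] in seen:
            flagged.append((row, "duplicate id"))
        elif not lo <= row["score"] <= hi:
            flagged.append((row, "score out of range"))
        else:
            seen.add(row["id"])
            valid.append(row)
    return valid, flagged

valid, flagged = clean(records)
print(len(valid), len(flagged))  # 2 2 -- two clean rows, two flagged for review
```

Flagged rows go back to the collectors (the "edit" step) rather than being silently dropped, which preserves the audit trail.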

WHERE WE HAVE BEEN & WILL HEAD NEXT
Webinar 1 (January 20) covered the definition of forest governance and how to plan an assessment. A recording is available online.
Webinar 3 (March 26) will cover analyzing your data, making recommendations, and getting your results out to the right audiences. We will also talk about setting the stage for any assessments that might follow.

For More Guidance:
- The FAO–PROFOR Framework: search the Internet for "Framework for Assessing and Monitoring Forest Governance"
- The PROFOR–FAO Guide to Good Practices: search the Internet for "Forest Governance Data Collection and Analysis"

THE REFORM CHALLENGE
"If you want to make enemies, try to change something." — Woodrow Wilson, President of the United States, speaking in 1916.

THANKS FOR LISTENING
For more information contact:
Nalin Kishor, Ph.D.
Sr. Natural Resources Economist
PROFOR Forests Team, GENDR
The World Bank
1818 H St., N.W.
Washington, D.C.