Presentation on theme: "Building Evaluation Capacity (BEC) Beatriz Chu Clewell, Urban Institute Patricia B. Campbell, Campbell-Kibler Associates, Inc."— Presentation transcript:

1 Building Evaluation Capacity (BEC) Beatriz Chu Clewell, Urban Institute Patricia B. Campbell, Campbell-Kibler Associates, Inc.

2 BEC: The Background Evaluation Capacity Building (ECB) is a system for enabling organizations and agencies to develop the mechanisms and structures that facilitate evaluation and meet accountability requirements. ECB differs from mainstream evaluation in being continuous and sustained rather than episodic. It is context-dependent; it operates on multiple levels; it is flexible in responding to multiple purposes, requiring continuous adjustments and refinements; and it draws on a wide variety of evaluation approaches and methodologies (Stockdale et al., 2002).

3 BEC: The Project The goal of BEC was to develop a model to build evaluation capacity in three different organizations: the National Science Foundation (NSF), the National Institutes of Health (NIH), and the GE Foundation. More specifically, the project’s intent was to test the feasibility of developing models to facilitate the collection of cross-project evaluation data for programs within these organizations that focus on increasing the diversity of the STEM workforce.

4 BEC Guide I: Designing A Cross-Project Evaluation
- Evaluation design & identification of program goals
- Construction of logic models
- The evaluation approach, including:
  - Generation of evaluation questions
  - Setting of indicators
- Integration of evaluation questions and indicators
- Measurement strategies, including:
  - Selection of appropriate measures
  - The role of demographic variables
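As a rough illustration of the "integration of evaluation questions and indicators" step, the sketch below pairs evaluation questions with indicators and measures in a single structure. The questions, indicators, and measures shown are invented examples, not taken from Guide I.

```python
# Minimal sketch: one structure linking evaluation questions to
# indicators and measures (all entries below are invented examples).
evaluation_plan = {
    "Does the program increase participation of underrepresented students in STEM?": {
        "indicators": ["underrepresented-student enrollment in STEM majors"],
        "measures": ["institutional enrollment records, disaggregated by race/ethnicity and sex"],
    },
    "Do participants persist to degree completion?": {
        "indicators": ["retention and graduation rates, participants vs. comparison group"],
        "measures": ["registrar data, collected annually"],
    },
}

for question, detail in evaluation_plan.items():
    print(question)
    for indicator in detail["indicators"]:
        print("  indicator:", indicator)
    for measure in detail["measures"]:
        print("  measure:  ", measure)
```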

5 Highlights from Guide I: Constructing a Logic Model
The basic components of a simplified logic model are:
1. Inputs (resources invested)
2. Outputs (activities implemented using the resources)
3. Outcomes/impact (results)
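To make the three components concrete, here is a minimal sketch that represents a logic model as a simple data structure; the class name and example entries are invented for illustration, not drawn from Guide I.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """Hypothetical container for the three basic logic-model components."""
    inputs: List[str] = field(default_factory=list)    # resources invested
    outputs: List[str] = field(default_factory=list)   # activities implemented using the resources
    outcomes: List[str] = field(default_factory=list)  # results/impact

# Invented example for a STEM-diversity project:
model = LogicModel(
    inputs=["program funding", "faculty mentors"],
    outputs=["summer research placements", "mentoring workshops"],
    outcomes=["more underrepresented students completing STEM degrees"],
)
print(model)
```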

6 Highlights from Guide I: Constructing a Logic Model (continued…)

7 Highlights from Guide I: Questions & Indicators

8 BEC Guide II: Collecting and Using Cross-Project Evaluation Data
- The strengths and weaknesses of various formats that can be used in data collection
- Data collection scheduling
- Data quality and methods of ensuring it
- Data unique to individual projects
- Confidentiality and the protection of human subjects in data collection
- Ways of building data collection capacity among projects
- Rationales, sources, and measures of comparison data
- Issues inherent in the reporting and displaying of data
- The uses to which data might be put
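The sketch below illustrates one way cross-project data collection and a basic data-quality check could look in practice: every project reports the same minimal set of fields so records can be pooled, and incomplete records are flagged. The template fields and sample records are invented and are not part of Guide II.

```python
import csv
from io import StringIO

# Hypothetical common reporting template (field names are invented).
REQUIRED_FIELDS = ["project_id", "reporting_year", "participant_id",
                   "race_ethnicity", "sex", "education_level", "outcome"]

# Two invented sample records; the second has missing values.
sample = StringIO(
    "project_id,reporting_year,participant_id,race_ethnicity,sex,education_level,outcome\n"
    "P01,2007,0001,Hispanic,F,Undergraduate,Retained\n"
    "P02,2007,0002,,M,Graduate,\n"
)

for row in csv.DictReader(sample):
    # Simple data-quality check: flag records with missing required values.
    missing = [f for f in REQUIRED_FIELDS if not row.get(f)]
    if missing:
        print(f"Record {row['participant_id']} is missing: {', '.join(missing)}")
```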

9 Highlights From Guide II: Data Collection Formats

10 Highlights From Guide II: Available Information on Comparison Databases
- URL
- Availability: Public Access / Restricted Use (fees/permission needed)
- Data Format: Web Download / Other Electronic
- Student Demographic Variables: Race/Ethnicity, Sex, Disability, Citizenship
- Data Level: National, State, Institution, Student
- Student Population: Pre-College, College, Graduate School, Employment
- Survey Population: First Year; Most Recent Year Available
- Other Variables: Attitudes, Course-taking, Degrees, Employment, etc.
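The categories above amount to a catalog entry describing each candidate comparison database. The sketch below records one such entry as a plain dictionary; the structure and the example values for IPEDS are illustrative, not taken from Guide II.

```python
# Hypothetical catalog entry for one candidate comparison database,
# organized around the categories listed above (values are illustrative).
comparison_database = {
    "name": "Integrated Postsecondary Education Data System (IPEDS)",
    "url": "https://nces.ed.gov/ipeds/",
    "availability": "public access",            # vs. restricted use (fees/permission needed)
    "data_format": "web download",              # vs. other electronic
    "demographic_variables": ["race/ethnicity", "sex"],
    "data_level": ["national", "state", "institution"],
    "student_population": ["college", "graduate school"],
    "other_variables": ["enrollment", "degrees"],
}

for category, value in comparison_database.items():
    print(f"{category}: {value}")
```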

11 Highlights From Guide II: Making Comparisons

12 Other Sources of Comparison Data
- The WebCASPAR database (http://caspar.nsf.gov) provides free access to institution-level data on students from surveys such as the Integrated Postsecondary Education Data System (IPEDS) and the Survey of Earned Doctorates.
- The Engineering Workforce Commission (http://www.ewc-online.org/) provides institution-level data (for members) on bachelor's, master's, and doctorate enrollees and recipients, by sex and race/ethnicity for US students and by sex for foreign students.
- Comparison institutions can be selected from the Carnegie Foundation for the Advancement of Teaching's website (http://www.carnegiefoundation.org/classifications/) based on Carnegie Classification, location, private/public designation, size, and profit/nonprofit status.

13 Some Web-based Sources of Resources
- OERL, the Online Evaluation Resource Library: http://oerl.sri.com/home.html
- User Friendly Guide to Program Evaluation: http://www.nsf.gov/pubs/2002/nsf02057/start.htm
- AGEP, Collecting, Analyzing and Displaying Data: http://www.nsfagep.org/CollectingAnalyzingDisplayingData.pdf
- American Evaluation Association: http://www.eval.org/resources.asp

14 Download the Guides at http://www.urban.org/publications/411651.html, or search Google for "Building Evaluation Capacity Campbell".

