Evaluation Research Applying Research Skills for Everyday Practical Purposes
Evaluation Research We’re talking “applied research” now. Ideas that should come to mind: –Applied research is a lot like engineering –Evaluation is like inspection of a structure –Field testing a product –This is a “high demand” area
Evaluation Research Evaluation research is appealing because it helps us scientifically analyze whether organizations are doing what they intend to do. Motive: –If you were considering donating to the Red Cross for hurricane relief, what questions would you have for the Red Cross? –If you were a CEO, what questions would you have of your employees regarding your company? –If you were a judge trying to ensure your probation department helps prevent juvenile delinquency, what questions would you have of the probation department?
Evaluation Research Evaluation research begins with an understanding of organizations. There are stakeholders who are invested in an organization because they want to provide something to the world or because they are getting something out of the organization. –Philanthropists –Board Members –Politicians –Contractors –Managers –Employees Stakeholders are typically hierarchically arranged and form the support behind an organization or program.
Evaluation Research So who is likely to call for an Evaluation? Why? Who is likely to work with (or against) the researcher doing the evaluation? Stakeholders who control funding are most likely to request evaluations. –Reasons others wouldn’t? The most ideological of supporters and those getting income and prestige from programs are more likely to interfere. An evaluability assessment can help determine whether a proper evaluation can even be conducted.
Evaluation Research Inputs: The resources, raw materials, clients, and staff that go into a program. Program Process: The complete treatment or service delivered by a program. Outputs: The services delivered or new products produced by the program process. Outcomes: The impact of the program process on the cases processed or the state of the target population or the social conditions that a program is expected to have changed. Stakeholders are involved in each realm, have diverse interests, and are invested to various degrees.
Evaluation Research For example, an HIV education program: Inputs: MSW, assistant, HIV education manuals, videos; schools provide kids. Program Process: Program provides in-school lectures detailing disease process, risky behaviors, and HIV prevention tactics. Outputs: Some number of teens attend programs at some number of schools. Outcomes: Initial: kids know more about HIV. Intermediate: kids influence others to avoid risky behaviors. Long-term: disease-free adults and lower HIV infection in the community.
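The four realms of the HIV education example can be sketched as a simple data structure. A minimal illustration in Python; the contents come from the slide, but representing the logic model as a dictionary is purely an illustrative assumption:

```python
# Hypothetical sketch: the HIV education program's logic model as a dict.
# Field contents are from the slide; the structure itself is an assumption.
logic_model = {
    "inputs": ["MSW", "assistant", "HIV education manuals", "videos",
               "kids provided by schools"],
    "program_process": ("In-school lectures detailing disease process, "
                        "risky behaviors, and HIV prevention tactics"),
    "outputs": "Some number of teens attend programs at some number of schools",
    "outcomes": {
        "initial": "Kids know more about HIV",
        "intermediate": "Kids influence others to avoid risky behaviors",
        "long_term": "Disease-free adults and lower HIV infection in community",
    },
}

# Each evaluation focus discussed later maps onto one realm of this model.
for realm in logic_model:
    print(realm)
```

Making the realms explicit like this is what lets an evaluator ask, realm by realm, which stakeholders are involved and what evidence each realm would require.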
Evaluation Research Understanding this model of organizational production, one can see the foci of evaluation studies. Assessments of: –Needs for the Program –Program Design and Theory –Program Process and Implementation –Program Outcome/Impact –Program Cost and Efficiency At the end of this lecture, you should see that these form a hierarchy in which proper assessment at each level depends in part on information provided at the previous level. Inputs → Program Process → Outputs → Outcomes
Evaluation Research Okay!
Evaluation Research Needs Assessment Typical Questions (“Typical Questions” are from Evaluation, Seventh Edition by Rossi, Lipsey, and Freeman, pp.): What are the nature and magnitude of the problem to be addressed? What are the characteristics of the population in need? What are the needs of the population? What services are needed? How much service is needed, over what time period? What service delivery arrangements are needed to provide those services to the population?
Evaluation Research Approaches to Needs Assessment: –Determine what “need” is –Allow stakeholders or others to define the problem –Search for evidence of the problem or lack thereof –Commonly demanded of social service agencies, but also requested by agencies concerned that their clients have unmet needs –Multiple methods can be employed E.g., how would one determine whether there is a need for crime reduction programs in San José?
Evaluation Research Assessment of Program Theory Typical Questions: What clientele should be served? What services should be provided? What are the best delivery systems for the services? How can the program identify, recruit, and sustain the intended clientele? How should the program be organized? What resources are necessary and appropriate for the program?
Evaluation Research Approaches to Program Theory Assessment: –Programs have: Impact Theory (a causal theory, such as Media Campaign → HIV Knowledge/Awareness → Healthier Habits) –Articulated Program Theory –Implicit Program Theory Service Utilization Plan: Refers to assumptions about how and why intended clients will become engaged in the program and follow through to the end, experiencing impact as impact theory suggests Organizational Plan: Refers to coordination of program functions and activities and acquisition of the necessary human, financial, and physical resources to implement plans.
Evaluation Research Approaches to Program Theory Assessment: –Evaluator must determine whether these are sensible given needs, program goals, local context, common sense, prevailing social science literature, best practices, staffing, available resources, etc. –Without understanding program theory, one may do black box evaluation but there will be little understanding as to why a program works or does not work. –Is failure an implementation failure or a theory failure?
Evaluation Research Are you getting it???
Evaluation Research Assessment of Program Process Typical Questions: Are administrative and service objectives being met? Are the intended services being delivered to the intended persons? Are there needy persons whom the program is not reaching? Once in service, do sufficient numbers of clients complete service? Are the clients satisfied with the services? Are administrative, organizational, and personnel functions handled well? When program process is evaluated for the purpose of shaping and refining ongoing program operations, the assessment is called a “Formative Evaluation.”
Evaluation Research Approaches to Process Evaluation: –What an organization is supposed to do and what it actually does are two different things. Process evaluation reveals how a plan is actually implemented. –Is delivery of service adequate, uniform? –Keeping detailed data on the process allows researchers to say for whom the program is effective, for whom it is not, and sometimes why. –Researchers gather and analyze many indicators such as program records, participant surveys, community surveys, participation intensity, and qualitative observations, interviews, and focus groups. –Analysis is often statistical and qualitative.
Evaluation Research Outcomes/Impact Assessment Typical Questions: Are the outcome goals and objectives being achieved? Do the services have beneficial effects on the recipients? Do the services have adverse side effects on the recipients? Are some recipients affected more by the services than others? Is the problem or situation the services are intended to address made better?
Evaluation Research Approaches to Impact Analysis: The general approach is to compare what happens after programs are implemented with what would have happened without the programs. An experimental approach is generally preferred in impact analysis because a program is conceptualized as a manipulation of an independent variable that is intended to cause change in a dependent variable. E.g., Program Theory: Education → Knowledge → Less Risky Behavior (from Evaluation, Seventh Edition by Rossi, Lipsey, and Freeman, p. 207).
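The experimental logic above can be sketched in a few lines: compare the outcome for a randomly assigned program group against a control group standing in for "what would have happened without the program." All scores below are invented for illustration:

```python
# Minimal sketch of impact estimation via an experimental comparison.
# Hypothetical post-program HIV-knowledge test scores (0-100), invented data.
from statistics import mean

program_group = [72, 68, 75, 80, 66, 71, 77, 74]  # received the education program
control_group = [61, 58, 65, 63, 60, 59, 67, 62]  # randomly assigned to no program

# Estimated impact = what happened (program group) minus the
# counterfactual stand-in (control group).
impact = mean(program_group) - mean(control_group)
print(f"Estimated impact: {impact:.1f} points")  # -> Estimated impact: 11.0 points
```

A real impact assessment would of course add a significance test and attend to randomization, attrition, and measurement; this sketch only shows the core comparison that makes the design "experimental."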
Evaluation Research Efficiency Assessment Typical Questions: Are resources used efficiently? Is the cost reasonable in relation to the magnitude of the benefits? Would alternative approaches yield equivalent benefits at less cost?
Evaluation Research Approaches to Efficiency Analysis: –Cost-benefit analysis: Compares program costs to the economic value of program benefits (focus is on the ratio of dollar cost to dollar benefit) –Cost-effectiveness analysis: Compares program costs to actual program outcomes (focus is on what you got for your money) –What are the actual costs (easier to measure), and what is the dollar value of the tangible and intangible outcomes and impact? –Many assumptions are made, so they must be specified. In addition, good reports alter the assumptions to see how sensitive the determinations are to those alterations. –Given the vagaries of social outcomes (e.g., it is hard to put a dollar value on keeping a mentally handicapped child happy), social service evaluations often concentrate on cost-effectiveness analyses.
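The arithmetic behind the two approaches, and the sensitivity check on assumptions, can be made concrete with a small sketch. Every figure here is an invented assumption for illustration only:

```python
# Hypothetical efficiency analysis of an HIV-prevention program.
# All dollar figures and outcome counts are invented assumptions.
program_cost = 250_000.0        # actual dollar cost (easier to measure)
infections_averted = 10         # measured program outcome
value_per_averted = 40_000.0    # ASSUMED dollar value of one averted infection

# Cost-benefit: dollar benefits relative to dollar cost.
benefit = infections_averted * value_per_averted
cb_ratio = benefit / program_cost
print(f"Benefit-cost ratio: {cb_ratio:.2f}")  # 400,000 / 250,000 -> 1.60

# Cost-effectiveness: dollars spent per unit of outcome, with no
# dollar value placed on the outcome itself.
cost_per_outcome = program_cost / infections_averted
print(f"Cost per infection averted: ${cost_per_outcome:,.0f}")  # -> $25,000

# Sensitivity analysis: alter the monetization assumption and see how
# much the determination changes.
for value in (20_000.0, 40_000.0, 60_000.0):
    ratio = infections_averted * value / program_cost
    print(f"assumed value ${value:,.0f} -> benefit-cost ratio {ratio:.2f}")
```

Note that the cost-effectiveness figure survives even if the monetization assumption is contested, which is why social service evaluations, where outcomes resist dollar valuation, tend to report it instead of a benefit-cost ratio.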
Evaluation Research So do you want to be one of these folks? Institute of Organizational and Program Evaluation Research The Institute of Organizational and Program Evaluation Research (IOPER) is an organized research unit of the School of Behavioral and Organizational Sciences (SBOS) at Claremont Graduate University. The mission of IOPER is to provide services and to conduct research to improve the effectiveness of a wide range of programs and organizations. Using “state-of-the-art” scientific knowledge and methodologies, IOPER has provided applied research, evaluation, and organizational consulting services to more than 100 different organizations in the past decade.