
Using the Guiding Principles for Evaluators to Improve Your Practice
Meredith Stocking, Atlanta-Area Evaluation Association
Note: This presentation has been adapted in part from the AEA's Guiding Principles Training Packet.

Objectives
o Increase knowledge of the AEA Guiding Principles for Evaluators (GP)
o Analyze the Guiding Principles in a program evaluation context
o Consider how the Guiding Principles can be used to inform your evaluation practice

What are ethical problems in evaluation anyway?
o Issues of moral accountability that involve doing the "right" or "wrong" thing
o Often involve the welfare of others
o Choices between unfavorable alternatives

AEA's Development of the Guiding Principles for Evaluators
1986: Founding of the American Evaluation Association
1994: Original five Guiding Principles for Evaluators developed and ratified
2002–2003: GP reviewed and updated
2004: Revised GP endorsed through referendum of AEA membership

Format and Purpose of the Guiding Principles for Evaluators
Format of the Guiding Principles (long version):
o 10 assumptions
o 5 principles with supporting sub-principles
o 2 pages of background
Purpose of the Guiding Principles:
o Promote ethical evaluation practice
o Foster continuing professional development
o Stimulate discussion within and outside evaluation

Assumptions Behind the Guiding Principles for Evaluators
The Guiding Principles:
o Proactively guide everyday practice
o Cover all kinds of evaluation
o Do not apply in every situation
o Are not independent, but overlap
o Sometimes conflict
o Were developed in the context of Western cultures

Principle A: Systematic Inquiry
Evaluators conduct systematic, data-based inquiries:
o Adhere to the highest technical standards
o Explore strengths and shortcomings of evaluation questions and approaches
o Communicate approaches, methods, and limitations accurately

Principle B: Competence
Evaluators provide competent performance to stakeholders:
o Possess appropriate skills and experience
o Demonstrate cultural competence
o Practice within limits of competence
o Continually improve competencies

Principle C: Integrity/Honesty
Evaluators display honesty and integrity and attempt to ensure them throughout the entire evaluation process:
o Negotiate honestly with clients and stakeholders
o Disclose values, interests, and conflicts of interest
o Represent methods, data, and findings accurately
o Disclose the source of the request and the financial support for the evaluation

Principle D: Respect for People
Evaluators respect the security, dignity, and self-worth of all stakeholders:
o Understand the evaluation context
o Get informed consent and protect confidentiality
o Maximize benefits and minimize harm
o Foster social equity
o Respect differences among stakeholders

Principle E: Responsibilities for General and Public Welfare
Evaluators take into account general and public interests:
o Include a full range of stakeholders
o Examine assumptions and potential side effects
o Balance client and stakeholder needs
o Allow all relevant stakeholders access to findings in understandable forms

Case Study: Evaluating the Health Care Collaborative

Background: The Health Care Collaborative (HCC), a signature program of a local nonprofit organization, is designed to increase the delivery of primary health care services to residents of a low-income, underserved neighborhood. The HCC uses trained neighborhood residents as outreach health workers to raise awareness of health issues among residents and to give them options for accessing health care. Health care providers who are collaboration partners deliver a range of services to program participants. HCC's principal funder recently approached the nonprofit's board of directors to request an external evaluation of the program. Margaret, the nonprofit's director, is meeting with Jane, a former board member and professor at the local university, to ask her to consider taking on the evaluation.

Margaret: Jane, it's been such a long time! It's great to see you! How have things been going?

Jane: Everything's great! I am really busy leading a large multi-site evaluation with one of my colleagues, and still teaching a few evaluation classes. What can I do for you today?

Margaret: Well, our main funder just asked us for an external evaluation of our Health Care Collaborative program.

Jane: Yes, that's a fabulous initiative. What do they want to know?

Continued on the next slide.

Margaret: They are saying that they need more information than the program's reporting system alone can provide, principally about how the neighborhood residents and program participants view HCC, and how the program is or is not meeting identified service needs.

Jane: That sounds pretty straightforward. Have you issued an RFP?

Margaret: Well, you know that we always have such a positive response to the work we do, but the program has been subject to recent criticism from some new neighborhood residents. We are really relying on the funder to renew our grant next year, and we don't want the results of the evaluation to be skewed by the opinion of a vocal few. We'd really like to work with someone we trust who is familiar with our track record.

Jane: Honestly, Margaret, you are a great friend, but I am really slammed right now. I don't think I can take on another project.

Margaret: What if you designed the evaluation and provided some oversight, but recruited one or two graduate students to do the data collection and analysis?

Jane: I suppose that may be an option. Can you give me an idea of the current composition of the community and the stakeholders that would be part of the evaluation team? I need to think about whether I have any students who would be a good fit. It sounds like you're going to need fairly broad community involvement given the information needs of the funder.

Margaret: Well, we only have about eight months to complete the evaluation and our budget is fairly tight, so we may need to limit stakeholder participation on the evaluation team. The make-up of the community has been in flux lately. The African American and Latino populations are still large, but the neighborhood has seen a lot of East African refugees arrive, as well as some immigrants from Eastern European nations. It's becoming more and more diverse.

Continued on the next slide.

A few days later: Jane is talking with one of her best graduate students, Alisa, about possibly working on the HCC evaluation. Alisa did some data analysis for Jane last semester, and Jane knows she does good work. Alisa also has the advantage of being fluent in Spanish.

Alisa: The project sounds really interesting. And I am actually looking for a subject for my Master's thesis. This could be a perfect topic! What kind of design are you thinking of?

Jane: Probably a mixed-methods approach. We could do some surveys of participants, program staff, and health care providers. Also, one of the things they want to know is how beneficiaries and other folks in the neighborhood feel about the program, so a few focus groups would be good. Do you have any experience conducting focus groups?

Alisa: I could probably wing it. I took a workshop on qualitative methods one time. I mean, it's not like we are talking about rigorous experimental methods or anything. How difficult can it be?

Jane: I suppose it's a nice opportunity for you to get some practice. It's just a small nonprofit program, and I don't imagine the stakes of the evaluation are too high. Besides, I don't have any other students with your language skills. When are you available to get started?

Alisa: I could start as soon as you'd like. But I would have to do all the data collection during the day. I have all night classes this semester.

Continued on the next slide.

After data collection: Alisa and Margaret meet to discuss the initial findings and plans for data dissemination. Alisa will report back to Jane, who is unable to make the meeting due to travel for her other evaluation project.

Margaret: So, Alisa, how did data collection go? What are you finding?

Alisa: I ended up conducting three focus groups: one for senior citizens, another for adult non-senior males, and the third for adult non-senior females. I am still in the process of analyzing the survey and focus group data, but all in all participants are overwhelmingly positive and satisfied with your services. And this is probably no news to you, but the program serves a disproportionate number of Hispanic adults compared to the neighborhood's composition. I think it may be important to consider why other racial and ethnic groups aren't taking advantage of your services in greater numbers. But since we didn't involve non-participants in the study, we really can't say.

Margaret: The board and the funder will be very happy to hear about the positive results. I am planning on giving the main evaluation briefing at next month's meeting of the HCC board, to which the funder will be invited. We can discuss the issue of how to better engage other racial and ethnic groups with the program staff at a later date. It's really a programming issue and not something the funder needs to worry about.

Alisa: Really?

Margaret: Yeah. But don't worry about it. We will definitely use the information.

Alisa: OK. Hey, there was one more thing I wanted to ask you about. Jane and I wanted to do a presentation on this evaluation at our national evaluation conference next month. Would that be OK with you? Of course, we will be sure to protect the confidentiality of all of the participants.

Margaret: That's a great idea! I am really happy that you are getting so much out of this opportunity. And I heard from Jane that your thesis is really looking nice. I would love to get a copy of it when you are done.

Alisa: Sure thing!

Discuss!

Food for Thought
o What elements of competence are at stake? How do omissions in competence affect the ability to meet the principle of systematic inquiry? What about respect for people?
o In what ways does the evaluation address potential weaknesses or criticisms of convenience in gathering data as opposed to systematic methods?
o How would you assess the honesty and integrity of those involved in the evaluation? If you were part of this evaluation, what would you do to resolve or handle issues related to integrity?
o Is there anything problematic about the way that stakeholders were included in the evaluation activities? How does the level and kind of stakeholder involvement relate to the principles of respect for people and public interest?
o Is the evaluation reporting adequate? What about the use of findings? To what extent does this evaluation attempt to foster social equity?

Discussion
o What ethical dilemmas have you encountered in your own evaluation practice? How did you handle the situation?
o How can you use the Guiding Principles as you design and conduct your own evaluations?

"Not knowing what constitutes best practice is incompetence. Knowing what best practice is, but not knowing how to achieve it, may be inexperience. Knowingly not following best practice, when one knows how to achieve it, is unethical."
Smith, N. L. (2002). An analysis of ethical challenges in evaluation. American Journal of Evaluation, 23(2), 199–206.

THANK YOU!