Leukemia & Lymphoma Society (LLS) Information Resource Center (IRC): Planning for a Service Program Evaluation - Case Study (Public Administration 522)


Leukemia & Lymphoma Society (LLS) Information Resource Center (IRC): Planning for a Service Program Evaluation - Case Study. Public Administration 522. Presented by Ervina Castillo-Newton, John Toadlena, and Scott Atole, October 16, 2012

Background on Service Program Case Study
Leukemia & Lymphoma Society (LLS) Information Resource Center (IRC)
 “World’s largest voluntary health organization dedicated to funding blood cancer research, education, and patient services”
 Private, non-profit service organization
 Funded by individual and corporate contributions
 68 chapters across the U.S. and Canada
 Master’s-level information specialists respond to 80,000 inquiries annually from patients and others about leukemia, lymphoma, myeloma, and other blood cancers
 Provide phone-based information, guidance, and support
Scott Atole, UNM-Main, PADM 522

Background on Service Program Case Study (continued)
 Evaluation of the Information Resource Center (IRC) – initial contact
 Program and academic interest and expertise match
 Successful response to the RFP
 Initial discussions covered the questions identified in the RFP response; additional details expected in the planning process
 RFP: pilot a patient navigation model for IRC follow-up services
 Navigate health care systems
 Overcome barriers
 Evaluation of a service enhancement: scheduled phone-call follow-up to selected callers. Need identified by program staff
 LLS is service oriented and committed to the evaluation process
 Evaluates new programs and conducts ongoing evaluations
 Internal funding for this evaluation

Background on Service Program Case Study (continued)
 Initial planning: 3 months (to launch)
 Report provided 6 months later
 LLS moved to fund a second, larger evaluation of patient navigation
 EPIC model (steps before the pilot study)

Assess the Context
Understanding the Political & Organizational Environment
 Pre-project: consulted various sources
 Internal/external people
 Internet/LLS website
 LLS promotional items/collateral
 RFP
 LLS mission statement
 LLS organizational structure (background materials/site visit)
 Commitment to the organizational cause
 Political environment associated with advocacy and organizational commitment
 Funding: individual/corporate
 LLS operations (private-sector accountability)
 How will the evaluation findings be used? Future funding from the LLS Board of Trustees; internally competitive
 Location of the IRC department within the LLS structure

Assess the Context (continued)
Understanding the Political & Organizational Environment
 LLS values program evaluation
 Objective data to support the IRC program enhancement
 IRC – immediate program context
 7 full-time information specialists
 80,000 phone calls annually
 Navigation model – innovative among national programs
 Navigation offered to an identified subgroup (2 specialists)
 National organization
 68 chapters – IRC links to local chapters
 Need for consumer-friendly cancer information
 Potential for collaboration: a “community of cancer information service professionals”

Assess the Context (continued)
Define the Relationship Between Evaluator & Sponsor
 Internal/external evaluator and funding
 LLS: external evaluator, internal funding
 Findings were reported to the LLS Board of Trustees
 Primary point of contact – LLS Vice President of Patient Services & Disease Programs (includes the IRC)
 Advantages/disadvantages: resistance to internal/external pressures, pressure to produce positive findings, objectivity, credibility, timelines (learning/building the relationship), potential biases
 Clarify roles and the relationship early
 Evaluator and sponsor roles
 Funding arrangements
 LLS: participatory relationship in planning and implementation
 LLS – 2 managers
 IRC – 1 supervisor (ongoing liaison) and 1 staff member
 Ongoing education – responsibilities
 LLS staff – provide feedback
 Evaluator – drafts, technical assistance (research process), assess stakeholder knowledge
 Evaluator skill
 Relevant expertise, experience, and interest
 Qualitative vs. quantitative

Assess the Context (continued)
Determine the Level of the Evaluation
 Leukemia & Lymphoma Society (LLS) chapters are local
 The Information Resource Center (IRC) is national; seek to create the connection with local chapters
 LLS target population:
 Newly diagnosed patients
 Those seeking information on clinical trials
 Patients residing in Wyoming
 Reasoning for the target population selection (where will this enhancement be most helpful?):
 Limited LLS resources to offer the program to all callers
 Navigation approach – reduction in health disparities for vulnerable populations
 Wyoming – no local chapters
 Outcome vs. process evaluation
 Primary focus on outcomes
 IRC program enhancement – include and document the process

Gather Reconnaissance
Specify Evaluation Use
 LLS: findings will be used for program sustainability and for the continuation and amount of future funding
 Evaluations are used to make decisions
 Important to understand the implications of negative findings
 Will this program be duplicated? Will the literature/results be made available nationally?
Validate Perspectives
 Communications with nearly every level of LLS/IRC
 Evaluation is welcomed
 Supportive of the project
 Important to understand that it is a program evaluation

Engage Stakeholders
 Potential stakeholders for a service program
 Two main reasons why the evaluator must engage stakeholders in evaluation planning
 Identify and invite stakeholders
 Establish rapport
 Participants involved in the evaluation planning
 The “sponsor” role
 Define stakeholder roles and a structure for their input
 Importance of engaging a few stakeholder groups
 Survey design
John Toadlena, UNM-Gallup, PADM 522

Engage Stakeholders (continued)
 Establish group processes for ongoing stakeholder involvement
 The communication process
 Methods used
 Evaluation plan timeline
 Preliminary reports

Describe the Program
 Establish the stakeholder group
 Become immersed in the nature of the program to be evaluated
 “Framework for Program Evaluation in Public Health”
 Understanding the IRC
 Conceptualize the program theory or rationale
 Usefulness of logic models and conceptual models
 Evaluation Planning Matrix (EPM) (Holden and Zimmerman, pp. 78-79)
 Lists the evaluation questions considered
 Classifies each as a short-term, intermediate, or long-term outcome and identifies relevant data sources corresponding to each question
 Outcome surveys and questions
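As one way to picture the Evaluation Planning Matrix described above, a minimal sketch in Python. The evaluation questions, outcome levels, and data sources below are hypothetical illustrations, not the actual matrix from Holden and Zimmerman:

```python
# Illustrative EPM: each row pairs an evaluation question with its
# outcome level and the data sources that would answer it.
# All entries are made-up examples for this case-study setting.
epm = [
    {"question": "Do callers report reduced barriers to care?",
     "outcome_level": "short-term",
     "data_sources": ["follow-up phone survey"]},
    {"question": "Do callers enroll in clinical trials at higher rates?",
     "outcome_level": "intermediate",
     "data_sources": ["follow-up phone survey", "IRC call records"]},
]

def questions_by_level(matrix, level):
    """Return the evaluation questions filed under one outcome level."""
    return [row["question"] for row in matrix if row["outcome_level"] == level]

print(questions_by_level(epm, "short-term"))
# → ['Do callers report reduced barriers to care?']
```

Keeping the matrix as explicit rows like this makes it easy to check, question by question, that every short-term, intermediate, and long-term outcome has at least one data source before data collection begins.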

Focus the Evaluation
 Focusing the evaluation
 The planning process
 Study design team
 Preliminary evaluation questions
 Theoretical frameworks
 Ensure feasibility
 The focus on feasibility
 Control or comparison group
 Assess the potential data collection burden
 Data collection burden was an issue on two levels:
 How much surveying could the participants tolerate?
 What kinds of data collection operations could the IRC take on?
 Sample size
 Draw conclusions from the results
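The sample-size question raised above can be sketched with a standard two-proportion power calculation for a comparison-group design. The 50% vs. 65% outcome rates and the alpha/power defaults below are hypothetical planning numbers, not figures from the case study:

```python
from math import sqrt, ceil
from statistics import NormalDist

def two_proportion_sample_size(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size needed to detect a difference between two
    proportions (e.g. an outcome rate among standard-service callers
    vs. navigation-intervention callers) with a two-sided z test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_b = NormalDist().inv_cdf(power)          # critical value for power
    pbar = (p1 + p2) / 2                       # pooled proportion
    numerator = (z_a * sqrt(2 * pbar * (1 - pbar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hypothetical planning scenario: 50% vs. 65% of callers reporting
# the target outcome, 5% significance, 80% power.
print(two_proportion_sample_size(0.50, 0.65))
# → 170 per group
```

A calculation like this makes the data-collection-burden trade-off concrete: if the required group sizes exceed what the IRC's call volume and staff can support, the design team must accept a larger detectable effect or a longer enrollment period.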

Lessons Learned
 Service programs are unique
 Start with the mission statement
 Size and scope of the activities
 Evaluation
 Attitudes
 Stakeholders’ knowledge
 Stakeholders
 Expertise and knowledge
 Minimizes resistance
 Planning process
 Dissemination
 Goals
 This case – positive and knowledgeable
Ervina Castillo-Newton, UNM-WS, PADM 522

Conclusion
Comparison Group
 Based on a comparison of callers:
 Receiving standard service through the IRC
 Enrolled in the intervention
 Information specialists
 Qualified
 Trained
 Experienced
 Outcomes