Evaluation and Evidence: Applying guidelines for health IT evaluation in developed and developing countries


Evaluation and Evidence: Applying guidelines for health IT evaluation in developed and developing countries
Nicolette de Keizer, Michael Rigby, Elske Ammenwerth, Pirkko Nykänen, Hamish Fraser, Tom Oluoch
IMIA WG Technology Assessment & Quality Development / EFMI WG Assessment of Health Information Systems
Medinfo 2013, Copenhagen, 21 August 2013

Evaluation and Evidence
– Health informatics systems are intended to improve the quality of care
– Some health informatics systems disrupt current processes and work practices, with unintended events and unintended side effects
  – E.g. a CDSS on CD4 testing in HIV patients:
    – not used because of a lack of electric power
    – not used because it was time-consuming
    – not used because of disagreement with the guideline
    – more focus on CD4 testing resulted in improved CD4 testing, but also increased unnecessary testing -> higher costs

Evidence-Based Health Informatics
– Implementation needs to be based on evidence, not simply on aspiration or promises
– To provide evidence, it is necessary to evaluate systems in practice and assess the overall effects, costs and outcomes
– The setting is important

Evidence-Based Health Informatics
Importance:
– Theme of this year's IMIA Yearbook
– WHO Bellagio meeting in 2011
– Collaboration between IMIA and WHO to promote and practise an evidence-based approach, especially in developing countries

Work of the EFMI and IMIA WGs
To reach evidence-based health informatics:
– GEP-HI: generic principles of health informatics evaluation
– STARE-HI: a standard for reporting results from an evaluation study
– EVALDB: a database of evaluation studies

Objective of this Workshop
– Share generic principles of health informatics evaluation, for developed and developing countries
– Case study from a developing country

Content
10.35  An Overview of Evaluation Approaches – M Rigby
10.50  The Guideline for Good Evaluation Practice in Health Informatics (GEP-HI) – P Nykänen
11.05  Challenges in Finding the Evidence – E Ammenwerth
11.15  Key Value Issues from the Bellagio Meeting on Global eHealth Evaluation – H Fraser
11.30  A Practical Example of Evaluation in a Developing Country Context – T Oluoch
       Discussion – M Rigby / N de Keizer

The IMIA work in Context Michael Rigby and Siobhan Melia

Principles and Plurality
– IMIA work is consensus- and evidence-driven
– Other evaluation innovators have worked in other settings
– Many different services and settings are covered by health informatics
– There is value in looking at other methodologies alongside GEP-HI

Yusof et al – HOT-fit (2008)
– Focuses on three dimensions: Human, Organisation, Technology
– Assesses how the application 'fits' the local need

Yusof et al – HOT-fit
– The three dimensions are important
– Accommodates the importance of context, not least the level of development
– Each of the three dimensions is itself multi-dimensional
– Challenges: metrics and analysis; assessment of 'fit'
– Compatible with GEP-HI, within method selection

AHRQ Evaluation Toolkit (2009)
– From the United States Agency for Healthcare Research and Quality
– The Evaluation Toolkit was developed to provide step-by-step guidance for developing evaluation plans for health information technology projects

AHRQ Evaluation Toolkit
– Assists evaluators to define:
  – the goals for evaluation
  – what is important to stakeholders
  – what needs to be measured to satisfy stakeholders
  – what is realistic and feasible to measure
  – how to measure these items
– Covers methods, but not study management
– Compatible with GEP-HI, within method selection

Model for ASsessment of Telemedicine (MAST)
– Currently being finalised / validated
– Result of a European Commission project
– Purpose is to describe the effectiveness and contribution to quality of care of telemedicine applications, and to produce a basis for decision making
– So far applied to real-time telecare and telemonitoring only

Model for ASsessment of Telemedicine (MAST)
– A preceding analysis, then assessment against seven categories:
  – Health problem and characteristics of the application
  – Safety
  – Clinical effectiveness
  – Patient perspectives
  – Economic aspects
  – Organisational aspects
  – Socio-cultural, ethical and legal aspects
– Potential for transferability

Model for ASsessment of Telemedicine (MAST)
– Follows an HTA format, so it is familiar to policy makers
– Does not cover management of the evaluation
– Strong interest in influencing policy makers

Socio-Technical Approach – STAT-HI (2010)
– Seeks to address the need for a socio-technical approach (early writers: Lorenzi, Berg, Ash)
– Not yet widely used
– Claims compatibility with GEP-HI

Summary
– There have been a number of conceptual models for health informatics evaluation
– Mainly created in the developed world, but generic
– Each has strengths and weaknesses
– Only GEP-HI covers the full cycle, including evaluation project management
– Core objectives: seeking evidence, and making evidence available where needed


Using the Guideline for Good Evaluation Practice in Health Informatics (GEP-HI) for Success
P Nykänen, J Brender – GEP-HI Good Evaluation Practice Guideline, MEDINFO2013 Workshop

The guideline:
– provides guidance for the systematic design, execution and management of an evaluation study in the health informatics domain
– is applicable to different types of health informatics evaluation studies, independent of whether the object of the study is an IT application, a method, a service or a practice
– has been developed through a consensus-seeking process by evaluation experts, collecting feedback from the health informatics community in workshops and through an evaluation mailing list


GEP-HI guideline
1. Preliminary outline – starting question
2. Study design – preliminary design
3. Operationalisation of methods – methodological approach, methodical aspects
4. Project planning – detailed planning
5. Execution of the evaluation study – accomplishment of the planned evaluation study
6. Completion of the evaluation study – accounting, archiving and reporting of evaluation study results
(A minimal illustrative sketch of tracking these phases follows.)
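As a purely illustrative aid, and not part of the guideline itself, the sketch below models these six phases and their "formal acceptance to proceed to the next phase" gates (spelled out on the following slides) as a minimal checklist; the phase names come from the slide, while the tracking logic is an assumption made for illustration.

```python
# Minimal sketch of tracking GEP-HI phases with their formal acceptance gates.
# Phase names are taken from the slide; the tracking logic is illustrative only.
PHASES = [
    "Preliminary outline",
    "Study design",
    "Operationalisation of methods",
    "Project planning",
    "Execution of the evaluation study",
    "Completion of the evaluation study",
]

def next_phase(approved):
    """Given the set of phases formally approved so far, return the phase to work on next."""
    for phase in PHASES:
        if phase not in approved:
            return phase
    return None  # all phases completed

approved_so_far = {"Preliminary outline", "Study design"}
print(next_phase(approved_so_far))  # -> "Operationalisation of methods"
```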

Phase 1: Preliminary outline
1.1 The information need – THE key issue
1.2 Primary audience
1.3 Identification of the study funding party(ies)
1.4 The organisational context of the evaluation study
1.5 Identification of stakeholders
1.6 Identification of required expertise
1.7 A first sketch of the health care setting
1.8 First exploration of evaluation methods to be used
1.9 Ethical, moral and legal issues
1.10 Budget
1.11 Exploring the restrictions of study execution and publication
1.12 Result of Study Exploration
1.13 Formal acceptance to proceed to the next phase
Notes:
– Identify those who directly or indirectly will be affected by the study itself, by its anticipated outcome, or by the system that is being evaluated
– The selected methods have to be tightly correlated with the information need and its qualities and the study objectives, taking the study setting constraints into account
– Which issues are relevant? What are the restrictions?

Phase 2: Study design
2.1 Elaboration of the detailed rationale and objectives for the study
2.2 Key evaluation issues/questions
2.3 Budget
2.4 Establishment of the study team
2.5 Stakeholder analysis / social network analysis
2.6 Study methods
2.7 Organisational setting, the study context
2.8 Technical setting, the study context
2.9 Participants from the organisational setting
2.10 Material and practical resources
2.11 Time and timing
2.12 Risk analysis
2.13 Ethical, moral and legal issues
2.14 Strategy for reporting and disseminating the results
2.15 Result of Study Design
2.16 Formal acceptance to proceed to the next phase
Notes:
– Get a rich picture of Who's Who, and their potential motives
– Get a rich picture of the organisation
– Candidate methods, and preferred ones
– Place the planned activities in a calendar
– Data protection and security principles and laws need to be obeyed
– Have a strategy, policy and principles been formulated for how to handle problematic issues?
– Seek approval to go on as now planned

Phase 3: Operationalisation of methods
3.1 Study type
3.2 Approach
3.3 Assumptions
3.4 Pitfalls and perils
3.5 Expertise
3.6 Frame of reference
3.7 Timing
3.8 Justification of the methodological approach
3.9 Outcome measures
3.10 Quality control on data (measures)
3.11 Participants
3.12 Study flow
3.13 Result of Operationalisation of Methods
3.14 Ethical, moral and legal issues
3.15 Approval of Operationalisation of Methods
Notes:
– Which candidates in your set-up? Quantitative versus qualitative, subjective versus objective, formative versus summative, etc.
– Limitations of the methods (scope, accuracy, specificity, application range)? Of the hidden assumptions?
– Which? How to get it? Can you get it?
– Have the actual measures been tested as regards syntactic and semantic validity, are the calculations working properly, have the questionnaires been tested for validity, etc.?
– Get approvals, consents and permissions, prepare dedicated forms, etc.
(A minimal data-quality check illustrating 3.10 is sketched below.)
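To make item 3.10 (quality control on data) concrete, here is a minimal, hypothetical sketch of a syntactic and semantic validity check on outcome-measure records; the field names, plausibility ranges and study dates are illustrative assumptions, not taken from GEP-HI or from any actual study.

```python
from datetime import date

# Hypothetical outcome-measure records: one row per CD4 test result.
records = [
    {"patient_id": "P001", "test_date": date(2012, 5, 3), "cd4_count": 412},
    {"patient_id": "P002", "test_date": date(2013, 1, 15), "cd4_count": -5},   # semantically invalid
    {"patient_id": "P003", "test_date": None, "cd4_count": 250},               # syntactically incomplete
]

def check_record(rec, study_start=date(2012, 1, 1), study_end=date(2013, 8, 31)):
    """Return a list of data-quality problems for one record (empty list = clean)."""
    problems = []
    # Syntactic validity: required fields must be present and of the right type.
    if not isinstance(rec.get("test_date"), date):
        problems.append("missing or malformed test_date")
    if not isinstance(rec.get("cd4_count"), int):
        problems.append("missing or malformed cd4_count")
    # Semantic validity: values must be plausible for the study setting.
    if isinstance(rec.get("cd4_count"), int) and not (0 <= rec["cd4_count"] <= 3000):
        problems.append(f"implausible cd4_count {rec['cd4_count']}")
    if isinstance(rec.get("test_date"), date) and not (study_start <= rec["test_date"] <= study_end):
        problems.append("test_date outside study period")
    return problems

for rec in records:
    issues = check_record(rec)
    if issues:
        print(rec["patient_id"], "->", "; ".join(issues))
```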

Phase 4: Project planning
4.1 Project management
4.2 Evaluation activity mapping
4.3 Quality management
4.4 Risk management
4.5 Recruitment of necessary staff
4.6 Inform relevant stakeholders
4.7 Result of Project Planning
4.8 Approval of Project Planning
Notes:
– Prepare a complete script
– Prepare these … as for all other projects
– Who, when, how – get it done!

Phase 5: Execution of the evaluation study
5.1 Establishment of the frame of reference
5.2 Undertake the study and collect data
5.3 Quality control of findings
5.4 Interpretation of observations
5.5 Observation of changes
5.6 Continuous project management, quality management and risk management
5.7 Regular reports
5.8 Final result of Evaluation Study Implementation
Notes:
– If your method requires a frame of reference, now is the time to acquire it
– Explore unexpected events and unexpected observations to find causal relations
– Ordinary project management activities
– A report summarizing the main information on the objectives, the setting, the conditions, the method, the outcome data and information, and the conclusions

Phase 6: Completion of the evaluation study
6.1 Accountings
6.2 Reports and publications
6.3 Archiving
6.4 Reporting guidelines
6.5 Reporting scope
6.6 Reporting message
6.7 Authorship
6.8 Ethical and moral aspects
6.9 Preparation of reports / publications
Notes:
– … if accountings of finances are needed
– What, where, who, how, and for how long
– For publicly available evaluation reports, use the STARE-HI guidelines
– ICMJE: authorship in order of substantial intellectual contributions

Nykänen P, Brender J, Talmon J, de Keizer N, Rigby M, Beuscart-Zephir MC, Ammenwerth E. Guideline for good evaluation practice in health informatics. Int J Med Inform 2011; 80. Also available at: iig.umit.at/efmi


Challenges in Finding the Evidence – How do we know what we know?
Univ.-Prof. Dr. Elske Ammenwerth, Institute of Health Informatics, UMIT, Hall in Tirol
EFMI WG on Assessment of Health Information Systems

Motivation
While conducting systematic reviews on health IT impact, we found that important information is missing from many study papers, e.g.:
– details of the health IT
– clinical context, clinical processes
– patterns of IT usage
Ammenwerth E, Schnell-Inderst P, Siebert U. Vision and Challenges of Evidence-Based Health Informatics. Int J Med Inform 2010; 79: e83-e88.
These study publications … are useless!

STARE-HI Guidelines
– A guideline on how to structure the publication of a health IT evaluation study
– Recommended by journals, e.g. Methods Inf Med, Int J Med Inform and ACI

STARE-HI Guidelines Elaboration
– An additional paper with examples and explanations was just published in ACI (24 July)

Health IT Evaluation Database (EVALDB)
– Contains around … abstracts of health IT evaluation papers
– All papers are classified and searchable, e.g. according to:
  – type of system
  – clinical setting
  – evaluation methods
  – evaluation criteria
(An illustrative sketch of such classification follows.)
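As an illustration of this kind of classification and search, the sketch below shows a minimal, hypothetical representation of classified study records and a filter over them; the field names and example entries are invented for illustration and do not reflect the actual EVALDB schema.

```python
from dataclasses import dataclass

@dataclass
class EvaluationStudy:
    """One classified abstract; fields mirror the classification axes named on the slide."""
    title: str
    system_type: str          # e.g. "CDSS", "EMR", "telemonitoring"
    clinical_setting: str     # e.g. "HIV clinic", "ICU"
    evaluation_method: str    # e.g. "RCT", "before-after", "qualitative"
    evaluation_criteria: str  # e.g. "quality of care", "costs", "usage"

# Invented example records, purely for illustration.
studies = [
    EvaluationStudy("CDSS for CD4 test ordering", "CDSS", "HIV clinic", "comparative", "quality of care"),
    EvaluationStudy("ICU PDMS workload study", "EMR", "ICU", "before-after", "usage"),
]

def search(studies, **criteria):
    """Return studies whose classification matches all given axis=value pairs."""
    return [s for s in studies if all(getattr(s, axis) == value for axis, value in criteria.items())]

print([s.title for s in search(studies, system_type="CDSS", clinical_setting="HIV clinic")])
```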


Health IT Evaluation Database
– Only < 2% of evaluation studies come from developing countries
– There is a need for more health IT evaluation studies in this special context!

IMIA/EFMI WG
For more information on the work of the IMIA WG and EFMI WG on health IT evaluation, see e.g. iig.umit.at/efmi or contact the working groups.


Call to Action on Global eHealth Evaluation
Dr Hamish Fraser, Assistant Professor, Division of Global Health Equity, Brigham and Women's Hospital and Harvard Medical School; Senior Advisor in Medical Informatics, Partners In Health

Call to Action on Global eHealth Evaluation
Consensus Statement of the WHO Global eHealth Evaluation Meeting, Bellagio, September 2011:
"To improve health and reduce health inequalities, rigorous evaluation of eHealth is necessary to generate evidence and promote the appropriate integration and use of technologies."

Partners In Health & WHO with support from the Rockefeller Foundation

Impact of eHealth
– Used appropriately, eHealth has the potential to catalyze, support and monitor health improvements at scale, and to accelerate achievement of national and global development goals, including the United Nations Millennium Development Goals
– Used improperly, eHealth may divert valuable resources and even cause harm

Core Principles
1) Core principles underlie the structure, content, and delivery of an eHealth system, independent of the rapidly changing technology used.
2) High-quality data collection, communication and use are central to the benefits of eHealth systems.
3) Evaluating eHealth both demonstrates its impact and fosters a culture that values evidence and uses it to inform improvements in eHealth deployments.

Core Principles
4) To ensure the greatest benefit from eHealth and enhance sustainability and scale, eHealth evaluations should recognize and address the needs of all key constituencies.
5) Evidence is needed to demonstrate the costs and benefits of eHealth implementations, and to maximize eHealth's beneficial impact on health system performance and population health.

Core Principles
6) The value of a complete evaluation program is enhanced through research that is attuned to the differing requirements throughout the life-course of the project, whether at needs assessment, pilot, facility-level, or regional and national scale-up stages.
7) Independent and objective outcome-focused evaluation represents the ideal of impact evaluation.

Core Principles
8) Country alignment and commitment to a clear eHealth vision, plan, and evaluation strategy is essential.
9) Improving the eHealth evidence base requires not only increased numbers of studies but also improved quality of eHealth research studies.

Bellagio Call to Action
– Develop a repository of tools, studies, frameworks and teaching materials (mHealth Alliance HUB; mHealth Evidence)
– Refine frameworks (GEP-HI, PRISM, KDS, and others)
– Create training courses on evaluation in developing countries (MIT-Harvard course)
– Advocate to funders to require evaluation in eHealth projects

Barriers to evaluation
– Not seen as a priority:
  – "eHealth is just infrastructure"
  – Not aware of the challenges in effective use/impact
  – Uptake or usage is the only measure of interest
  – More important to "get implementation done"
– Not sure how to do it:
  – Think all studies must be big and expensive
  – Not familiar with the range of evaluation methods
  – Studies that are done are of poor quality
– Threatened: don't want to look bad…
– Need resources/prioritization from funders

Conclusions
– Large investment in global eHealth to date – billions of dollars!
– Particular need to monitor day-to-day system performance and activity
– When measuring the impact of eHealth, what are the alternatives and control groups?
– How the role of e/mHealth differs in resource-poor environments

Plan of action
– Study the real needs of healthcare and patients
– Build great software tools
– Rigorously optimize those tools in the field
– Evaluate their impact and costs, initial and long term
– Poor data wastes everyone's time and money!


Evaluating Health Information Systems in Developing Countries
Tom Oluoch, US Centers for Disease Control and Prevention, Division of Global HIV/AIDS – Kenya
Medinfo Conference, August 20-23, 2013
Center for Global Health, Division of Global HIV/AIDS

Outline
– Introduction
  – Implementing EMRs in developing countries and in Kenya
  – Need for evidence-based investment in EMRs and CDSS
– Methods
  – Guideline for Good Evaluation Practice in Health Informatics (GEP-HI)
– Results
  – Selected applications of GEP-HI
  – Lessons learnt and challenges
– Conclusion
  – Appropriateness and applicability of GEP-HI in evaluations in developing countries

Introduction (i)
– EMRs have been shown to improve the quality of care.
– The majority of the evaluations have been conducted in developed countries.
– A recent systematic review on the evaluation of EMR-based CDSS in developing countries found only 12 published studies.
– Evaluations are needed because of unique challenges in developing countries:
  – weak infrastructure (erratic power supply, poor Internet connectivity, inadequate computers)
  – low ICT literacy among health workers, policies, etc.
  – managing chronic conditions (e.g. HIV) requires managing complex longitudinal data

Introduction (ii)
– It is essential that future investments in eHealth projects in developing countries are guided by evidence of what works and what does not.
– There is a need for well-designed, well-executed and well-documented evaluations in developing countries (incl. Kenya).
– CDC-Kenya, in collaboration with the Kenya Medical Research Institute (KEMRI), has developed evaluation protocols guided by the GEP-HI principles.
– Some evaluations are supervised by the University of Amsterdam.

Evaluation Questions
– Do EMRs + CDSS improve HIV care?
– Does SNOMED CT-based recording of opportunistic infections improve the quality of registration and the quality of HIV care?

Guideline for Good Evaluation Practice in Health Informatics (GEP-HI)
– Provides guidance for planning and implementing evaluation studies in health informatics
– Consists of 5 iterative phases:
  – Preliminary outline
  – First study design
  – Operationalization of methods
  – Detailed study plan & project plan
  – Evaluation study implementation

Preliminary Outline
– Information need: Do EMRs + CDSS improve HIV care?
– Context: clinical decision support in rural HIV clinics
– Restrictions: restriction on the use of PEPFAR funds for rigorous scientific studies
– Budget: discussed with KEMRI/CDC leadership; studies to ride on the existing EMR scale-up
– Ethical, moral and legal issues: institutional ethical regulations referenced
– Sponsor: KEMRI and CDC, using PEPFAR resources

First study design
– Organizational and technical settings: participating clinics; computer locations
– Participants: investigators & roles; clinicians; data team; programmers
– Risk analysis: patients; institutions
– Reporting: dissemination plan; audiences (local, national, global)
– Study constraints: design restrictions; infrastructure; low ICT literacy

Operationalization of methods
– Assumptions: guidelines used uniformly
– Outcome measures: initially broad (e.g. quality of care), refined to specific and measurable
– Study flow: designed to replicate the regular workflow
– Approach: operational research incorporating service delivery and evaluation

Detailed Study Plan and Project Plan
– Project management: defined project management structure; resources & timelines
– Communication strategy: channels of communication among the study team; feedback to sponsors; progress reports
– Roles of participants: revised list of investigators; redefined roles
– Risk management: mitigation of risks; procedures for responding to adverse events; contact persons

Evaluation Study Implementation
– Observe changes: changes were documented and minor adjustments made (no violation of the protocol)
– Continue project management: review progress against pre-set timelines and milestones
– Regular reports: progress reports to sponsors; IRB report for continuation
– Interpret observations: interpret events and take action

CDSS Study
– To evaluate the effectiveness of the alert-based CDSS for:
  – timely ordering of CD4 tests according to the Kenya MOH and WHO guidelines
  – timely initiation of ART according to the Kenya MOH and WHO guidelines
– Study sites: 20 HIV clinics in Siaya County
– Design: prospective, comparative
  – 10 sites with EMR + CDSS
  – 10 sites with EMR only
– Current status: data collection in final stages; preliminary data review in progress
– Dissemination: 2 manuscripts on baseline data in IRB clearance; additional manuscripts planned on the prospective study; all manuscripts will be based on STARE-HI
(A simplified illustration of such an alert rule follows.)
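For illustration only, the sketch below shows the kind of rule an alert-based CDSS for CD4 test ordering might implement; the six-month retesting interval is an assumed placeholder, not the actual Kenya MOH/WHO guideline logic or the logic implemented in this study.

```python
from datetime import date, timedelta

# Illustrative assumption: flag a patient if no CD4 result exists, or if the last
# CD4 result is older than a retesting interval (6 months here, chosen for the sketch).
RETEST_INTERVAL = timedelta(days=182)

def cd4_alert(last_cd4_date, today):
    """Return an alert message if a CD4 test appears due, else None."""
    if last_cd4_date is None:
        return "No CD4 result on record: order CD4 test"
    if today - last_cd4_date > RETEST_INTERVAL:
        return f"Last CD4 on {last_cd4_date}: retest due"
    return None

# Example: a patient last tested about eight months ago triggers the alert.
print(cd4_alert(date(2013, 1, 2), date(2013, 8, 21)))
```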

SNOMED CT study
– To determine whether SNOMED CT-based registration of opportunistic infections (OIs) improves:
  – data quality
  – accuracy of WHO staging
  – appropriate ART initiation based on WHO staging
– Study site: a teaching and referral hospital's Maternal & Child Health clinic, western Kenya
– Design: pre-post, with an external control
– Current status: data collection yet to start; system design and development in progress
(A simplified illustration of coded OI registration follows.)
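To illustrate what coded registration of opportunistic infections can look like, here is a minimal, hypothetical sketch; the concept identifiers and the WHO-stage hints are placeholders for illustration and are not verified SNOMED CT codes or the study's actual terminology binding.

```python
# Hypothetical terminology table: free-text OI labels mapped to coded concepts.
# The concept IDs below are placeholders, NOT verified SNOMED CT identifiers.
OI_CONCEPTS = {
    "oral candidiasis":       {"concept_id": "C-0001", "who_stage_hint": 3},
    "pulmonary tuberculosis": {"concept_id": "C-0002", "who_stage_hint": 3},
    "kaposi sarcoma":         {"concept_id": "C-0003", "who_stage_hint": 4},
}

def register_oi(free_text):
    """Return a coded OI record, or None if the term is not in the value set."""
    entry = OI_CONCEPTS.get(free_text.strip().lower())
    if entry is None:
        return None  # would prompt the clinician to pick from the coded value set
    return {"term": free_text, **entry}

record = register_oi("Pulmonary tuberculosis")
print(record)
# Coded entries make it possible to derive a suggested WHO stage consistently,
# instead of re-interpreting free text at analysis time.
```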

Lessons learnt
– GEP-HI is very elaborate and comprehensive.
– Components are iterative and some happen in parallel; practical implementation may not always follow the order in which steps are listed in GEP-HI.
– We omitted steps that were not critical for our studies (e.g. identification of the sponsor and social network analysis).
– Although some of the steps are repeated in several phases, we were able to complete them in a single phase, while others took several iterations to complete.
– Working with more than one IRB, sponsor or collaborator may require modification of some steps.

Conclusion
– GEP-HI acts as an important checklist for key activities in evaluation studies, from protocol development to pre-publication stages.
– We found it practical in our resource-limited setting in a developing country.
