 11:45 – 12:30  From IHE Profiles to conformance testing, closing the implementation gap  Helping the implementers, testing tools, connectathons  12:30.

Slides:



Advertisements
Similar presentations
Integrating the Healthcare Enterprise IHE Overview Keith W. Boone Interoperability Architect, GE Healthcare Co-chair, IHE Patient Care Coordination PC.
Advertisements

Integrating the Healthcare Enterprise
Sept 13-15, 2004IHE Interoperability Workshop 1 Integrating the Healthcare Enterprise Patient Identifier Cross-referencing for MPI (PIX) Profile Mike Henderson.
The Connectathon: IHEs Conformance Testing Process Presented by: Mike Nusbaum & Mike Glickman IHE Interoperability Showcase Planning Committee January.
September, 2005What IHE Delivers VA Success Story – Image Acquisition using IHE Scheduled Workflow June 6~7, 2006 Peter Kuzmak, Andrew Casertano, and.
June - September Part II: 2010 NA Connectathon Participants Part II: What 2010 NA Connectathon Participants need to know: Connectathon.
Oct, 2010IHE Orientation-TurMIA 1 INTEGRATING THE HEALTHCARE ENTERPISE (IHE) Orientation Workshop(2) TurkMIA Conference-10 Charles Parisot, IHE-Europe.
Sept 13-15, 2004IHE Interoperability Workshop 1 Integrating the Healthcare Enterprise Post-Processing Workflow Sanjay Jain Co-Chair, Radiology Planning.
September, 2005What IHE Delivers 1 Purchasing & Integrating Radiology Systems Using IHE: A Tutorial & A Real-world Case Kevin ODonnell, Cor Loef, John.
IHE Canada Workshop – Sept What IHE Delivers 1 Kevin ODonnell Toshiba Medical Systems IHE Structure & Concepts.
Pathfinding Session: Device Integration IHE North America Webinar Series 2008 Todd Cooper Patient Care Device Domain Breakthrough Solutions Foundry, Inc.
Sept 13-15, 2004IHE Interoperability Workshop 1 Integrating the Healthcare Enterprise IHE Tools for Users and Integrators: Connectathon, Integration Statements.
Connectathon Overview IHE Workshop 2007 Chris Carr RSNA.
June 28-29, 2005IHE Interoperability Workshop 1 Integrating the Healthcare Enterprise Cross-enterprise Document Sharing for Imaging (XDS-I) Rita Noumeir.
September, 2005What IHE Delivers How to purchase eye care systems using IHE IHE Showcase at the AAO Conference November 2006 Andrew Casertano, Peter Kuzmak,
Cultural Heritage in REGional NETworks REGNET Project Meeting Content Group
By Rick Clements Software Testing 101 By Rick Clements
HL7 V2 Implementation Guide Authoring Tool Proposal
18 Copyright © 2005, Oracle. All rights reserved. Distributing Modular Applications: Introduction to Web Services.
DICOM Structured Reporting Workshop - March 2000 Structured Reporting and the IHE Year 2 Technical Framework Simon Wail, PhD Marconi Medical Systems.
Week 2 The Object-Oriented Approach to Requirements
Configuration management
Software change management
Chapter 5 – Enterprise Analysis
Effectively applying ISO9001:2000 clauses 6 and 7.
© Telcordia Technologies 2004 – All Rights Reserved AETG Web Service Advanced Features AETG is a service mark of Telcordia Technologies. Telcordia Technologies.
Welcome, Panel of Examiner and Process Development Members! Washington State Quality Award PEPD #1 Training 2008.
Quality Manual for Interoperability Testing Morten Bruun-Rasmussen Presented by Milan Zoric, ETSI.
Chapter 10 Software Testing
Acceptance Testing.
1. 2 Captaris Workflow Microsoft SharePoint User Group 16 May 2006.
Test Management Eric Poiseau Inria, Rennes. Purpose  Provide support for the management of the connectathon process from registration to results  Provide.
IHE Profile Proposal: Dynamic Configuration Management October, 2013.
1 Standards Adoption Process Testing at Connectathons IHE Demonstrations Products with IHE Timely access to information Document Use Case Requirements.
September, 2005What IHE Delivers 1 Karen Witting IBM Cross-Community: Peer- to-Peer sharing of healthcare information.
Introduction Peter Dolog dolog [at] cs [dot] aau [dot] dk Intelligent Web and Information Systems September 9, 2010.
HIMAA Symposium 2008, Canberra 1 Integrating the Healthcare Enterprise Klaus Veil Manager - IHE Connectathon and Interoperability Showcase 2008 Chairman.
S&I Framework Testing HL7 V2 Lab Results Interface and RI Pilot Robert Snelick National Institute of Standards and Technology June 23 rd, 2011 Contact:
1 G2 and ActiveSheets Paul Roe QUT Yes Australia!
IHE Testing Tools An overview. The past (and current)  Mesa Tools  In house testing for Vendors  C++, Perl  Kudu  Connectathon management tool :
The HITCH project: Cooperation between EuroRec and IHE Pascal Coorevits EuroRec 2010 Annual Conference June 18 th 2010.
Eye Care Domain Connectathon 2014: Preparation & Processes Lynn Felhofer IHE Technical Project Manager.
Web Development Process Description
Antilope – Testing tools Milan Zoric, ETSI Presented by Karima Bourquard, IHE.
Connectathon Preparation Steps Lynn Felhofer – IHE Technical Project Manager Sarah Willis-Garcia – IHE USA.
Connectathon Registration IHE Test Cycle Lynn Felhofer / Steve Moore (MIR) Eric Poiseau (INRIA)
Integrating the Healthcare Enterprise Presentation and short explanation of the developmental tools Eric Poiseau Laboratoire IDM Faculté de Médecine Université.
14 Publishing a Web Site Section 14.1 Identify the technical needs of a Web server Evaluate Web hosts Compare and contrast internal and external Web hosting.
Federal Health Architecture How to Prepare for an HIE Connectathon Adeola Odunlami, Senior Solutions Architect Health and Civilian Solutions Division 1.
Afdasfdasfd Adfasdfasfd asd Software Inventory Connectathon Manager Training Steve Moore Mallinckrodt Institute of Radiology.
Annual Cycle Review for Returning Participants IHE Test Cycle Lynn Felhofer / Steve Moore (MIR) Eric Poiseau (INRIA)
2-3 Feb 2009Webinar for CAT Participants1 Connectathon Organization Eric Poiseau, IHE Europe Technical Project Manager INRIA Eric Poiseau, IHE Europe Technical.
3 Feb 2011Webinar for CAT Participants1 Connectathon Organization Eric Poiseau, IHE Europe Technical Project Manager INRIA Eric Poiseau, IHE Europe Technical.
Integrating the Healthcare Enterprise Presentation of some development tools of some development tools Eric Poiseau IHE Europe Technical Projet Manager.
Configuration Management (CM)
IHE Global Collaborative Strategy for Testing and Tools Cor Loef/Chris Carr IHE International Testing and Tools Committee.
IHE IT Infrastructure: The Value Proposition HIMSS 2003 Joining the IHE in its New Enterprise Initiatives.
IHE-Europe – Use Case Based Approach to eHealth Interoperability Peter Künecke, SIEMENS Medical Solutions IHE-Europe „vendor“ co-chair Integrating the.
IHE International Meeting Gazelle Project Steve Moore, MIR Eric Poiseau, INRIA.
IHE Marketing and Implementation Resources IHE Marketing and Implementation Resources Chris Carr Director of Informatics, Radiological Society of North.
June 28-29, 2005IHE Interoperability Workshop 1 Integrating the Healthcare Enterprise IHE Resources to Facilitate Implementation Kevin O’Donnell Toshiba.
Overview of Pre-Connectathon Testing Lynn Felhofer – IHE NA Connectathon Manager.
Requirements for the Testing Cycle IHE Test Cycle Lynn Felhofer / Steve Moore (MIR) Eric Poiseau (INRIA)
Integrating the Healthcare Enterprise IHE Plans for Multi-domain Testing and Demonstrations Steve Moore Technical Project Manager (ITI, Rad)
Helping the Cause of Medical Device Interoperability Through Standards- based Test Tools DoC/NIST John J. Garguilo January 25,
Sept 13-15, 2004IHE Interoperability Workshop 1 Integrating the Healthcare Enterprise IHE Conformance: Connectathons, Integration Statements & RFPs Kevin.
Testing Process: Tools, Responsibilities, Connectathon Steve Moore Mallinckrodt Institute of Radiology.
E-commerce Architecture Ayşe Başar Bener. Client Server Architecture E-commerce is based on client/ server architecture –Client processes requesting service.
Connectathon 2009 Gazelle: HL7 V2 EVS, PIX Tests Agents, Automated Testing Project plans for Connectathon 2009 (February 23 rd -27 th 2009 ) November 14.
Connectathon Organization
Presentation transcript:

 11:45 – 12:30  From IHE Profiles to conformance testing, closing the implementation gap  Helping the implementers, testing tools, connectathons  12:30 – 13:30 Lunch Break  13: :00  How to use IHE resources: hands on experience  Technical Frameworks: navigating, Q&A  Test tools: finding, using, configuring  Participating in the testing process

IHE Resources Eric Poiseau, INRIA, IHE Europe technical manager Charles Parisot, GE, IHE Europe

Connectathon

History

Connectathon  Started in 1998 in Chicago within the RSNA HQ  Europe started in 2001  Japan in 2003  China and Australia now also in the process

Charenton-le-Pont 2001  11 companies  18 systems  40 m2  30 participants Nantes, Nov 2007 – Formation IHE France

Paris 2002  33 companies  57 systems  130 m2  100 participants

Aachen 2003  43 companies  74 systems  350 m2  135 participants

Padova 2004  46 companies  78 systems  600 m2  180 participants

Noordwijkerhout 2005  75 companies  99 systems  800 m2  250 participants

Barcelona 2006  67 companies  117 systems  1500 m2  +250 participants

Berlin 2007  Companies  systems  1500 m2  +300 participants

Oxford 2008  83 companies  112 systems  1500 m2  300 participants

C.A.T. Participation in Europe (growth chart over the venues: Paris, Paris, Aachen, Padova, Noordwijkerhout, Barcelona, Berlin, Oxford)

Purpose  Test the implementation of the integration profile within product  Verify that the vendors did a good job  Verify that what the committees invented makes sense !  Verify that the text is clear enough  Verify that that the committee did not miss anything  Build a community of …

Computer geeks…

…who like to enjoy locally brewed beers

From the vendor perspective  Unique opportunity for vendors to test their implementations of the IHE integration profiles  Controlled environment  Customer is not present!  Not in a clinical production environment  Specialists available  From SDOs  From peer companies  Bugs are identified and most of the time fixed!  Connectathon Result Matrix

But…  Testing is sub-optimal  Only a part of all the possible tests are performed  A system successful at the connectathon is not guaranteed to be error free !!!!  We do not do certification !

From the IHE perspective  Feedback from the vendor community  Did the committee do a good job?  Did the developed integration profile respond to a demand from the vendors?

European C.A.T  We have now reached cruising speed  NA and EU C.A.T are very alike  C.A.T used as an IHE promotion tool  Workshops in parallel to the C.A.T  Berlin: ITEG  Oxford  Vienna

C.A.T. Model

The IHE testing process (diagram) 22/05/08 Projet IHE-Dev Inria Rennes  Vendors implement profile actors from the IHE Technical Framework (profile specifications), test against the testing tools developed by the project management team (user sponsors), and run in-house testing; the project management team approves test logs; systems then go through the connectathon (testing results), on to demonstrations (vendor-sponsored exhibits), and finally to deployed systems, each product carrying an Integration Statement.

Pre-connectathon

 Registration  See what can be tested  Exchange of configuration parameters  IP addresses  AE Titles  Assigning authorities  OIDs  Certificates  Affinity domain specification

Pre-connectathon  Mesa testing  In-house testing for vendors to get ready  Vendors return logs  Upon log return participation to C.A.T is accepted

At connectathon

6-7 Feb 2008 Participant Workshop  Connectathon Testing  3 types of tests to be performed  No peer tests  Peer-to-peer tests  Workflow tests

No Peer Tests  Calibration tests (CPI):  Screen calibration  Printer calibration  Scrutiny tests  Verify that the objects created are « valid »  Provide peers with samples

Peer To Peer Tests (P2P)  Test subsections of a workflow between 2 vendors  Preparation for workflow tests  Vendors choose when to run them  Vendors select their peers  Not to be run with other systems from the same company

Workflow Tests  Test an entire workflow that may combine more than one integration profile  We have a schedule; vendors need to be ready at the time of the test  We have a list of difficulties to check  Some tests can run in 15 minutes  Some will require more than an hour  No second-chance tests

5 days  Monday morning till 11 am:  Set-up time  Till Friday noon:  Free peer-to-peer and no-peer testing  From Wednesday till Friday noon:  Directed workflow testing

Monitors  Volunteers  Independent from vendors  Standards specialists  Verify tests  Act as moderators between vendors

Results  Failure are not reported  To be successful  Each peer to peer test needs to be verified with at least 3 peers  There are some exceptions  A vendor may fail for an actor but pass for the others

Connectathon Results  IHE does not report failures  Public results only at the company level  IHE will never tell you which system participated in the connectathon  Vendors have access to their own test results.

Connectathon Results Browser

Connectathon Results Browser

Connectathon Results Browser

What does it mean?  The company was successful at the connectathon for the actor/integration profile combination  Results do not guarantee product conformity  This is the role of the « IHE Integration Statements »

IHE Integration Statement

Participation Fees  First system: € 2750  Other systems: € 2850  Per domain: € 750  Covers:  Infrastructure: room, power, monitors, internet…  Lunch and coffee breaks for 2 engineers during 5 days

Next Connectathon  Where: Remise, Vienna, Austria  When: Monday 20th April to Friday 24th April 2009  Registration: November 1st – January 7th 2009  Announcement to be released soon

CAT : conclusion

C.A.T : Conclusion  It’s not a certification process  Unique opportunity for vendors to test and discuss  Seems to be useful, as shown by increased participation over the years  Sure, it needs improvement…  … but we are working on it

Testing

Before we start  Impossible to test everything  What we do not test  Design  Performance (load)  What we are looking for  Interoperability  Conformity

Conformance / Interoperability (diagram)  Conformance testing checks Implementation A (Vendor A) and Implementation B (Vendor B) each against the specifications/standards; interoperability testing checks Implementation A and Implementation B against each other.

Conformance Testing (1/2)  Is unit testing  Tests a single ‘part’ of a device  Tests against well-specified requirements  For conformance to the requirements of the specified and referenced standards  Usually limited to one requirement per test  Tests at a 'low' level  At the protocol (message/behaviour) level  Requires a test system (and executable test cases)  Can be expensive; tests performed under ideal conditions

Conformance Testing (2/2)  High control and observability  Means we can explicitly test error behaviour  Can provoke and test non-normal (but legitimate) scenarios  Can be extended to include robustness tests  Can be automated and tests are repeatable  Conformance Testing is DEEP and NARROW  Thorough and accurate but limited in scope  Gives a high-level of confidence that key components of a device or system are working as they were specified and designed to do

Limitations of Conformance Testing  Does not prove end-to-end functionality (interoperability) between communicating systems  Conformance-tested implementations may still not interoperate  This is often a specification problem rather than a testing problem! Need minimum requirements or profiles  Does not test a complete system  Tests individual system components, not the whole  A system is often greater than the sum of its parts!  Does not test functionality  Does not test the user’s ‘perception’ of the system  Standardised conformance tests do not include proprietary ‘aspects’  Though this may well be done by a manufacturer with its own conformance tests for proprietary requirements

Interoperability Testing  Is system testing  Tests a complete device or a collection of devices  Shows that (two) devices interoperate  within a limited scenario !  Tests at a ‘high’ level (as perceived by users)  Tests the ‘whole’, not the parts  Tests functionality  Does not necessarily require a test system  Uses existing interfaces (standard/proprietary)  Interoperability Testing is BROAD and SHALLOW  Less thorough but wide in scope  Gives a high-level of confidence that devices (or components in a system) will interoperate with other devices (components)

Limitations of Interoperability Testing  Does not prove interoperability with other implementations with which no testing has been done  A may interoperate with B and B may interoperate with C. But it doesn’t necessarily follow that A will interoperate with C.  Combinatorial explosion  Does not prove that a device is conformant  Interoperable devices may still interoperate even though they are non-conformant  Cannot explicitly test error behaviour or unusual scenarios  Or other conditions that may need to be forced (lack of controllability)  Has limited coverage (does not fully exercise the device)  Not usually automated and may not be repeatable

Conformance or Interoperability  Both are needed!  Complementary, not competitive  ETSI: « While it is not absolutely necessary to undertake both types of testing, the combined application of both techniques gives a greatly increased confidence in the tested product and its chances of interoperating with other similar products. »

Conclusion  Need to have conformance testing in the IHE Testing process  Important to perform conformance testing in advance of the connectathon  Interoperability testing takes place during the connectathon.  Need to perform conformance testing as well during the connectathon.

IHE Resources

Technical Frameworks  One per Domain  They are the reference, the tools are not !  Written and reviewed by Vendors and Users  Freely available on

Organization of the TF  Volume 1  Description of the integration profiles and actors  Dependencies between actors and integration profiles  Use cases  Volumes 2 and following  Description of the transactions with references to the standards used

TF Life Cycle  Every year:  New integration profiles  Change proposals  Integration profiles proposed as supplements  Public Comment  Trial Implementation  Final Text  Once in Final Text, integration into the main document  No concept of version

TF Navigation (Kudu)  The IHE Connectathon management tool (Kudu) needs to know about the IHE concepts  Concepts in a database  PHP script to navigate among the concepts  URL :  Warning: the TF is the reference; this is another view of the official document

Wiki   A lot of information  Committee planning / minutes   Wiki for connectathon organization and management  Code exchange  XDS implementation page  …

Mesa tools

Mesa Tools  m m  First generation of tools  Used for pre-connectathon testing  More focused on conformance than interoperability

Mesa Tools Installation  Available for Windows and Linux (easier to use on Linux)  Needs  Database installation  Perl  Contains an HL7 listener, an initiator, and DICOM tools  Set of Perl scripts to run scenarios  Set of data to run the scenarios

Mesa Tools Output  Set of tests to run based on the system’s pedigree  Each test gathers the exchanged messages  A script evaluates each captured message for specific content  Output is a text file: “Passed” or “Failed”
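The evaluation step can be pictured with a small sketch. This is illustrative only: the field checked (PID-3) and the exact pass/fail wording are assumptions, and the real Mesa evaluation is done by Perl scripts, not this Python.

```python
# Sketch of a Mesa-style evaluation step (illustrative; the checked
# requirement and wording are hypothetical, not the actual Mesa scripts).

def check_captured_message(message: str) -> str:
    """Evaluate one captured HL7 v2 message for specific content.

    The (invented) requirement here: the PID segment must carry a
    patient identifier in PID-3.
    """
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            # PID-3 is the field at index 3 in the pipe-delimited segment.
            if len(fields) > 3 and fields[3].strip():
                return "Passed"
            return "Failed"
    return "Failed"  # no PID segment was captured at all


message = (
    "MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|200801011200||ADT^A01|MSG001|P|2.3.1\r"
    "PID|1||12345^^^HOSP||DOE^JOHN"
)
print(check_captured_message(message))  # Passed
```

A real test run would write this verdict into the log file that vendors return before the connectathon.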

Mesa Tools Limitations  Need to install the entire tool set even for testing a single integration profile  Scripts more or less require a clean SUT context  Not easy to use… not easy to maintain

Kudu

Connectathon management  Registration process  Pre-connectathon testing management  Pre-connectathon configuration exchange  Connectathon test management  Connectathon results management

Kudu  Used by in  North America  Europe  Japan  China  Australia  Helps harmonizing the testing process

Kudu drawbacks  Designed for the Connectathon, not for usage by vendors  PHP scripts: scalability may not hold  Designed for “interoperability testing”, not “conformance testing”

Gazelle

Gazelle = MESA + Kudu  Proposal to combine MESA and Kudu  2nd generation of tools  Avoid 1st-generation design errors  Target more use cases  Allow scalability  More developers  Better software, better coverage  Improved support

Gazelle Requirements

Objectives  Improve the overall quality of testing  Conformance and Interoperability  Broaden the use of the application  Build a framework for Healthcare interoperability testing

5 Use Cases  Connectathon  Virtual Connectathon  Company Internal Testing tool  Healthcare Enterprise Testing tool  Governmental organizations

Requirements  Synchronous testing of multiple systems  Multilingual  Scalable

Gazelle Architecture

Gazelle Architecture (diagram)  The Gazelle control system drives a test engine, actor simulators, and a proxy sitting on the network between the systems under test; configuration info and feedback are exchanged with the systems under test, captured traffic is sent to the external validation services, and test scenarios are stored in a database.

System under test  More than one system can be tested at the same time  One SUT, many simulators (~Mesa)  Many SUTs, no simulators (~Kudu)  SUT management  Web application to provide instructions

Database  Model of TF concepts  Storage of test related information  Assertion to be tested  Ideally provided by the IHE technical committees Test Management

EVS External Validation Services

 Part of the Gazelle architecture  Web services following a common definition  Perform “validation”  HL7  DICOM  CDA
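The idea of web services "following a common definition" can be sketched as a shared interface that every validator implements. The names and the toy validator below are illustrative assumptions, not the actual Gazelle EVS contract:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field


@dataclass
class ValidationReport:
    """Result returned by any EVS: a verdict plus error details."""
    valid: bool
    errors: list = field(default_factory=list)


class ExternalValidationService(ABC):
    """Common interface each EVS would expose (hypothetical, not the real WSDL)."""

    @abstractmethod
    def validate(self, document: str) -> ValidationReport:
        ...


class EmptyDocumentEVS(ExternalValidationService):
    """Toy validator standing in for the HL7 / DICOM / CDA services."""

    def validate(self, document: str) -> ValidationReport:
        if not document.strip():
            return ValidationReport(False, ["document is empty"])
        return ValidationReport(True)


report = EmptyDocumentEVS().validate("MSH|^~\\&|A|B")
print(report.valid)  # True
```

Because every service answers through the same shape of report, the proxy and test engine can call any of them interchangeably.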

DICOM EVS  2 services available  DVTK-based service  Hosted by MIR: http://gazelle-blue.wustl.edu:8090/service1?wsdl  Dicom3tools-based service  Hosted by MIR: http://gazelle-red.wustl.edu:8080/axis2/services/service1

CDA EVS  Service proposed by NIST  GUI: …validation/validation.html  WSDL: …ebService?wsdl

HL7 EVS  NIST EVS  INRIA EVS: …InriaHL7EVS-ejb/Hl7GazelleMessageValidation?wsdl

CDA : How does it work?  Use of Schematron  NIST wrote Schematron rules for the IHE integration profiles  The EVS performs Schematron validation of the sent document
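The Schematron approach boils down to a list of XPath assertions evaluated against the submitted XML. The sketch below mimics that idea in plain Python; the two rules are invented examples, not the NIST schematrons, and real Schematron supports far richer XPath than `ElementTree` does.

```python
import xml.etree.ElementTree as ET

# Each rule pairs an XPath test with the message reported on failure
# (invented assertions for illustration, not the actual NIST rules).
RULES = [
    (".//patient/id", "patient must carry an id element"),
    (".//code", "document must carry a code element"),
]


def validate_document(xml_text: str) -> list:
    """Return the failure messages of all rules the document violates."""
    root = ET.fromstring(xml_text)
    return [msg for path, msg in RULES if root.find(path) is None]


doc = "<ClinicalDocument><code/><patient><id/></patient></ClinicalDocument>"
print(validate_document(doc))  # []
```

An empty list means the document passed every assertion, which is essentially what a Schematron validation report conveys.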

DICOM : How does it work?  Checks for conformance to the DICOM standard  No IHE specifics there  Use of MTOM for transport of the large DICOM objects

HL7 : How does it work?  HL7 message profiles  INRIA wrote message profiles for the IHE integration profiles  The EVS uses the HL7 message profile as a reference to validate messages  Validates the “syntax” of the message
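A message profile acts as the reference the validator compares each message against. The sketch below is a heavily simplified stand-in: a real HL7 message profile also constrains fields, data types, and cardinalities, not just segment order, and the names here are illustrative.

```python
# Hypothetical, minimal "profile": just an ordered list of required segments.
PROFILE = {"required_segments": ["MSH", "EVN", "PID"]}


def validate_against_profile(message: str, profile=PROFILE) -> list:
    """Return a list of 'syntax' errors for an HL7 v2 message."""
    # Segment IDs are the text before the first pipe of each \r-separated line.
    segments = [s.split("|", 1)[0] for s in message.split("\r") if s]
    errors, pos = [], 0
    for required in profile["required_segments"]:
        if required in segments[pos:]:
            # Advance past the matched segment to enforce ordering.
            pos = segments.index(required, pos) + 1
        else:
            errors.append(f"missing or out-of-order segment: {required}")
    return errors


good = "MSH|^~\\&|A|B\rEVN|A01\rPID|1||123"
print(validate_against_profile(good))  # []
```

The real EVS reports richer diagnostics, but the principle is the same: the message either matches the profile or the deviations are listed.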

HL7 Message profiles  In order to check HL7 message conformity, a reference document is needed  Usage of the HL7 message profiling mechanism  Re-engineering of the TF and production of HL7 message profiles for the existing transactions  See : … or get the source from the forge: http://sumo.irisa.fr/TFforge  HL7 message profiles  HL7 message samples

EVS GUI Tool  Tool for easy EVS interfacing  Java Web Start:  Allows the user to select a file and get validation  For HL7, can work as a proxy

Simulators

Actor Simulators  IHE actors with a Web service interface for control by Gazelle  We are currently working on the API  Configuration  Control  Feedback  Re-use of existing software  Need to adapt it to fit the API

Proxy

 Used to capture messages exchanged between systems so that the EVS can perform validation  Use of Mirth during the Connectathon  Need an infrastructure that supports a large number of systems  Use of Nule HL7 for single-SUT testing
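HL7 v2 traffic on the wire is normally MLLP-framed, so a proxy has to unwrap each frame before handing the message to an EVS and re-wrap it before forwarding. The framing bytes below come from the MLLP specification; the surrounding logic is a schematic sketch, not the Mirth configuration used at the connectathon.

```python
# MLLP framing: <VT> message <FS><CR>  (0x0B ... 0x1C 0x0D)
VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"


def unwrap_mllp(frame: bytes) -> bytes:
    """Strip MLLP framing so the inner HL7 message can go to the EVS."""
    if not (frame.startswith(VT) and frame.endswith(FS + CR)):
        raise ValueError("not a complete MLLP frame")
    return frame[1:-2]


def wrap_mllp(message: bytes) -> bytes:
    """Re-frame the message before forwarding it to the receiving SUT."""
    return VT + message + FS + CR


frame = wrap_mllp(b"MSH|^~\\&|A|B")
captured = unwrap_mllp(frame)  # this copy would be sent to the EVS
print(captured)
```

In the connectathon setup, Mirth channels play this role for every SUT pair while also queueing the captured copies for validation.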

Proxy experiment in Oxford  Successful usage of Mirth during the Oxford C.A.T.  Used as a proxy for HL7 messages  Kudu hacked to get channel generation and reporting of validation  Both the NIST and INRIA EVS used by the proxy

Proxy Environment Overview (diagram)  Messages between SUT1 and SUT2 pass through the proxy; a daemon moves the captured messages (mirth_input) to the EVS for hl7_message_validation.

Test Engine

 Controls the simulators  Controls the proxy  Uses BPEL for test definition and orchestration  The SUT cannot be controlled directly  The user is guided through a Web GUI

Participants  3 IHE Regions  North America : MIR  Europe : INRIA  Japan : Shizuoka University  DVTK  NIST  Tiani-Spirit  David Clunie  Offis

Roadmap  DB model redesign  EVS API Definition  Finalize licence  EVS at Chicago connectathon  DICOM  EVS at Oxford connectathon  HL7, DICOM, CDA  Proxy for HL7 messages  Registration with Gazelle  API for Simulators  API for TestEngine  Test cases in gazelle  PIX PDQ SWF LTW

Project Management  Testing and Tool Committee  Overview of IHE testing activities  Choice of the licenses  Testing Management Group  Project Management  Eric and Steve

Licensing  Agreement on an open-source license  Final choice of the license still under discussion  Licensing does not concern tools developed by third parties  Typically EVS, simulators

Thanks Contact :