1  11:45 – 12:30  From IHE Profiles to conformance testing, closing the implementation gap  Helping the implementers, testing tools, connectathons  12:30 – 13:30 Lunch Break  13:30 - 15:00  How to use IHE resources: hands on experience  Technical Frameworks: navigating, Q&A  Test tools: finding, using, configuring  Participating in the testing process

2 IHE Resources • Eric Poiseau, INRIA, IHE Europe technical manager • Charles Parisot, GE, IHE Europe

3 Connectathon

4 History

5 Connectathon • Started in 1998 in Chicago at the RSNA headquarters • Europe started in 2001 • Japan in 2003 • China and Australia are now also in the process of starting

6 Charenton-le-Pont 2001 • 11 companies • 18 systems • 40 m² • 30 participants

7 Paris 2002 • 33 companies • 57 systems • 130 m² • 100 participants

8 Aachen 2003 • 43 companies • 74 systems • 350 m² • 135 participants

9 Padova 2004 • 46 companies • 78 systems • 600 m² • 180 participants

10 Noordwijkerhout 2005 • 75 companies • 99 systems • 800 m² • 250 participants

11 Barcelona 2006 • 67 companies • 117 systems • 1500 m² • 250+ participants

12 Berlin 2007 • companies • systems • 1500 m² • 300+ participants

13 Oxford 2008 • 83 companies • 112 systems • 1500 m² • 300 participants

14 C.A.T. participation in Europe (chart): Paris, Paris, Aachen, Padova, Noordwijkerhout, Barcelona, Berlin, Oxford

15 Purpose • Test the implementation of the integration profiles within products • Verify that the vendors did a good job • Verify that what the committees invented makes sense! • Verify that the text is clear enough • Verify that the committee did not miss anything • Build a community of …

16 Computer geeks…

17 …who like to enjoy locally brewed beers

18 From the vendor perspective • A unique opportunity for vendors to test their implementations of the IHE integration profiles • Controlled environment: the customer is not present! Not in a clinical production environment • Specialists available, from the SDOs and from peer companies • Bugs are identified and most of the time fixed! • Connectathon result matrix: http://sumo.irisa.fr/con_result

19 But… • Testing is sub-optimal: only a part of all the possible tests are performed • A system successful at the connectathon is not guaranteed to be error-free! • We do not do certification!

20 From the IHE perspective • Feedback from the vendor community • Did the committee do a good job? • Did the developed integration profile respond to a demand from the vendors?

21 European C.A.T. • We have now reached cruising speed • The NA and EU C.A.T.s are very alike • The C.A.T. is used as an IHE promotion tool • Workshops are held in parallel to the C.A.T.: Berlin (ITEG), Oxford, Vienna

22 C.A.T. Model

23 The IHE testing process (diagram) • The IHE Technical Framework (profile specifications) drives both the development of testing tools and the vendors' implementation of profile actors • Vendors perform in-house testing, then test at the connectathon; the project management team approves the test logs and publishes the testing results • Successful implementations move on to sponsored demonstrations (exhibits) and to deployed systems, shipped as products with an IHE integration statement

24 Pre-connectathon

25 Registration • See what can be tested • Exchange of configuration parameters: IP addresses, AE titles, assigning authorities, OIDs, certificates • Affinity domain specification (see the configuration sketch below)
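
To make "exchange of configuration parameters" concrete, here is a minimal sketch in Python of the kind of record one system might publish before the connectathon; every field name and value below is an illustrative assumption, not an actual Kudu or Gazelle data model.

    # Hypothetical pre-connectathon configuration record; all values are made up.
    system_configuration = {
        "system": "ACME_PACS_1",
        "ip_address": "10.0.12.34",
        "dicom": {
            "ae_title": "ACME_PACS",          # DICOM Application Entity title
            "port": 104,
        },
        "hl7": {
            "receiving_application": "ACME_PACS",
            "receiving_facility": "ACME_HOSPITAL",
            "port": 2575,
        },
        "assigning_authority": {
            "namespace_id": "ACME",
            "universal_id": "1.2.3.4.5.6.7",  # an OID, illustrative only
            "universal_id_type": "ISO",
        },
        "tls_certificate": "acme_pacs_1.pem", # certificate for secured transactions
        "xds_affinity_domain": "CAT-TEST-DOMAIN",
    }

    if __name__ == "__main__":
        # A peer reads this record to know where to send its messages.
        print(system_configuration["dicom"]["ae_title"], system_configuration["ip_address"])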

26 Pre-connectathon • Mesa testing: in-house testing for vendors to get ready • Vendors return their logs • Participation in the C.A.T. is accepted once the logs have been returned

27 At connectathon

28 Connectathon testing • 3 types of tests to be performed: no-peer tests, peer-to-peer tests, workflow tests

29 No-peer tests • Calibration tests (CPI): screen calibration, printer calibration • Scrutiny tests: verify that the objects created are « valid », provide peers with samples

30 Peer-to-peer tests (P2P) • Test subsections of a workflow between 2 vendors • Preparation for the workflow tests • Vendors choose when to run them • Vendors select their peers • Not to be run with other systems from the same company

31 Workflow tests • Test an entire workflow that may combine more than one integration profile • We have a schedule; vendors need to be ready at the time of the test • We have a list of difficulties to check • Some tests can run in 15 minutes, some will require more than an hour • No second-chance test

32 5 days • Monday morning until 11 am: set-up time • Until Friday noon: free peer-to-peer and no-peer testing • From Wednesday until Friday noon: directed workflow testing

33 Monitors • Volunteers • Independent from vendors • Standards specialists • Verify tests • Act as moderators between vendors

34 Results • Failures are not reported • To be successful, each peer-to-peer test needs to be verified with at least 3 peers (there are some exceptions) • A vendor may fail for one actor but pass for the others

35 Connectathon results • IHE does not report failures • Public results are given only at the company level • IHE will never tell you which system participated in the connectathon • Vendors have access to their own test results

36 Connectathon Results Browser

37 Connectathon Results Browser

38 Connectathon Results Browser

39 What does it mean? • The company was successful at the connectathon for the actor/integration profile combination • Results do not guarantee product conformity • That is the role of the « IHE integration statements »

40 IHE Integration Statement

41 Participation fees • First system: € 2750 • Other systems: € 2850 • Per domain: € 750 • Covers: infrastructure (room, power, monitors, internet…), lunch and coffee breaks for 2 engineers during 5 days

42 Next connectathon • Where: Remise, Vienna, Austria (http://www.koop-kundenweb.at/remise/) • When: Monday 20th April to Friday 24th April 2009 • Registration: November 1st – January 7th 2009 • Announcement to be released soon

43 C.A.T.: conclusion

44 C.A.T.: conclusion • It is not a certification process • A unique opportunity for vendors to test and discuss • Seems to be useful, as shown by the increased participation over the years • Sure, it needs improvement… • …but we are working on it

45 Testing

46 Before we start • Impossible to test everything • What we do not test: design, performance (load) • What we are looking for: interoperability, conformity

47 Conformance / interoperability (diagram) • Implementation A (vendor A) and implementation B (vendor B) are both built against the same specifications/standards • Conformance testing checks each implementation against the specifications • Interoperability testing checks implementation A against implementation B

48 Conformance testing (1/2) • Is unit testing • Tests a single 'part' of a device • Tests against well-specified requirements, for conformance to the requirements of the specified and referenced standards • Usually limited to one requirement per test • Tests at a 'low' level, i.e. at the protocol (message/behaviour) level • Requires a test system (and executable test cases) • Can be expensive; tests are performed under ideal conditions

49 Conformance testing (2/2) • High control and observability means we can explicitly test error behaviour • Can provoke and test non-normal (but legitimate) scenarios • Can be extended to include robustness tests • Can be automated, and tests are repeatable • Conformance testing is DEEP and NARROW: thorough and accurate but limited in scope • Gives a high level of confidence that key components of a device or system are working as they were specified and designed to do

50 Limitations of conformance testing • Does not prove end-to-end functionality (interoperability) between communicating systems: conformance-tested implementations may still not interoperate; this is often a specification problem rather than a testing problem (need minimum requirements or profiles) • Does not test a complete system: it tests individual system components, not the whole, and a system is often greater than the sum of its parts! • Does not test functionality • Does not test the user's 'perception' of the system • Standardised conformance tests do not include proprietary 'aspects', though this may well be done by a manufacturer with its own conformance tests for proprietary requirements

51 Interoperability testing • Is system testing • Tests a complete device or a collection of devices • Shows that (two) devices interoperate, within a limited scenario! • Tests at a 'high' level (as perceived by users) • Tests the 'whole', not the parts • Tests functionality • Does not necessarily require a test system; uses existing interfaces (standard/proprietary) • Interoperability testing is BROAD and SHALLOW: less thorough but wide in scope • Gives a high level of confidence that devices (or components in a system) will interoperate with other devices (components)

52 Limitations of interoperability testing • Does not prove interoperability with other implementations with which no testing has been done: A may interoperate with B and B may interoperate with C, but it does not necessarily follow that A will interoperate with C (combinatorial explosion) • Does not prove that a device is conformant: devices may still interoperate even though they are non-conformant • Cannot explicitly test error behaviour, unusual scenarios, or other conditions that may need to be forced (lack of controllability) • Has limited coverage (does not fully exercise the device) • Not usually automated and may not be repeatable

53 Conformance or interoperability • Both are needed! • Complementary, not competitive • ETSI: « While it is not absolutely necessary to undertake both types of testing, the combined application of both techniques gives a greatly increased confidence in the tested product and its chances of interoperating with other similar products. »

54 Conclusion • Conformance testing is needed in the IHE testing process • It is important to perform conformance testing in advance of the connectathon • Interoperability testing takes place during the connectathon • Conformance testing needs to be performed during the connectathon as well

55 IHE Resources

56 Technical Frameworks • One per domain • They are the reference; the tools are not! • Written and reviewed by vendors and users • Freely available at http://www.ihe.net

57 Organization of the TF • Volume 1: description of the integration profiles and actors, dependencies between actors and integration profiles, use cases • Volume 2 and following volumes: description of the transactions, with references to the standards used

58 TF life cycle • Every year: new integration profiles, change proposals • Integration profiles are proposed as supplements: public comment, trial implementation, final text • Once in final text, the supplement is integrated into the main document • No concept of version

59 TF navigation (Kudu) • The IHE connectathon management tool (Kudu) needs to know about the IHE concepts • Concepts are stored in a database • A PHP script allows navigating among the concepts • URL: http://sumo.irisa.fr/TF • Warning: the TF is the reference; this is only another view of the official document

60 Wiki • http://wiki.ihe.net : a lot of information, committee planning / minutes • http://ihewiki.wustl.edu/ : wiki for connectathon organization and management, code exchange, XDS implementation page, …

61 Mesa tools

62 Mesa Tools • http://ihedoc.wustl.edu/mesasoftware/index.htm • First generation of tools • Used for pre-connectathon testing • More focused on conformance than on interoperability

63 Mesa Tools installation • Available for Windows and Linux (easier to use on Linux) • Requires a database installation and Perl • Contains an HL7 listener and initiator, and DICOM tools • A set of Perl scripts to run the scenarios • A set of data to run the scenarios

64 Mesa Tools output • The set of tests to run is selected based on the system's pedigree • Each test gathers the exchanged messages • A script evaluates the content of the captured messages for specific content • The output is a text file: "Passed" or "Failed" (a sketch of such an evaluation step follows below)
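
A rough Python sketch of what such an evaluation step looks like; the real MESA scripts are written in Perl, and the expected values below are invented for the illustration.

    # Sketch of a MESA-style evaluation: inspect a captured HL7 v2 message for
    # specific content and write a "Passed"/"Failed" result file.
    EXPECTED_SENDING_APP = "ACME_ADT"      # invented expectation
    EXPECTED_MESSAGE_TYPE = "ADT^A01"      # invented expectation

    def evaluate(captured_message: str, result_path: str) -> None:
        segments = captured_message.strip().split("\r")
        msh = segments[0].split("|")                     # fields of the MSH segment
        errors = []
        if msh[2] != EXPECTED_SENDING_APP:               # MSH-3 Sending Application
            errors.append("unexpected sending application: " + msh[2])
        if not msh[8].startswith(EXPECTED_MESSAGE_TYPE): # MSH-9 Message Type
            errors.append("unexpected message type: " + msh[8])
        with open(result_path, "w") as result:
            result.write("Failed\n" + "\n".join(errors) + "\n" if errors else "Passed\n")

    if __name__ == "__main__":
        sample = "MSH|^~\\&|ACME_ADT|ACME_HOSP|MESA|MESA|200805220930||ADT^A01|0001|P|2.3.1"
        evaluate(sample, "example_result.txt")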

65 Mesa Tools limitations • The entire tool set must be installed even for testing a single integration profile • The scripts more or less require a clean SUT context • Not easy to use… not easy to maintain

66 Kudu

67 Connectathon management • Registration process • Pre-connectathon testing management • Pre-connectathon configuration exchange • Connectathon test management • Connectathon results management

68 Kudu • Used in North America, Europe, Japan, China, and Australia • Helps harmonize the testing process

69 Kudu drawbacks • Designed for the connectathon, not for usage by vendors • PHP scripts: scalability is a concern • Designed for "interoperability testing", not "conformance testing"

70 Gazelle

71 Gazelle = MESA + Kudu • Proposal to combine MESA and Kudu • 2nd generation of tools • Avoid the design errors of the 1st generation • Target more use cases • Allow scalability • More developers: better software, better coverage, improved support

72 Gazelle Requirements

73 Objectives • Improve the overall quality of testing: conformance and interoperability • Broaden the use of the application • Build a framework for healthcare interoperability testing

74 5 use cases • Connectathon • Virtual connectathon • Company internal testing tool • Healthcare enterprise testing tool • Governmental organizations

75 Requirements • Synchronous testing of multiple systems • Multilingual • Scalable

76 Gazelle Architecture

77 Gazelle architecture (diagram) • A Gazelle control system, backed by a database of test scenarios, drives a test engine, a proxy, Gazelle actors (simulators), and external validation services • The systems under test are reached over the network, through the proxy • Configuration information, control, and feedback flow between the control system and the other components

78 System under test • More than one system can be tested at the same time • One S.U.T. and many simulators (~Mesa) • Many S.U.T.s and no simulators (~Kudu) • S.U.T. management: a web application provides instructions

79 Database • Model of the TF concepts • Storage of test-related information and of the assertions to be tested (ideally provided by the IHE technical committees) • Test management

80 EVS External Validation Services

81 Part of the Gazelle architecture • Web services following a common definition • Perform "validation" of HL7, DICOM, and CDA content

82 DICOM EVS • 2 services available • A DVTK-based service (http://www.dvtk.org/), hosted by MIR: http://gazelle-blue.wustl.edu:8090/service1?wsdl • A dicom3tools-based service (http://www.dclunie.com/dicom3tools.html), hosted by MIR: http://gazelle-red.wustl.edu:8080/axis2/services/service1

83 CDA EVS • Service proposed by NIST • GUI: http://xreg2.nist.gov/cda-validation/validation.html • WSDL: http://xreg2.nist.gov:8080/ws/services/ValidationWebService?wsdl
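
A rough sketch of how a client could call such a SOAP validation service from Python with the zeep library; the WSDL URL is the one listed above, but the operation name and parameters are assumptions made for illustration, the actual contract being defined by the WSDL itself.

    # Sketch of a generic EVS SOAP client using the zeep library. The operation
    # name "validateDocument" and its argument are assumptions; read the WSDL
    # for the real signature.
    from zeep import Client

    WSDL_URL = "http://xreg2.nist.gov:8080/ws/services/ValidationWebService?wsdl"

    def validate_cda(document_path: str):
        client = Client(WSDL_URL)
        with open(document_path, "r", encoding="utf-8") as handle:
            cda_xml = handle.read()
        # zeep exposes every operation declared in the WSDL under client.service
        return client.service.validateDocument(document=cda_xml)

    if __name__ == "__main__":
        print(validate_cda("sample_cda.xml"))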

84 HL7 EVS • NIST EVS: http://xreg2.nist.gov:8080/HL7Web and http://xreg2.nist.gov:8080/HL7WS • INRIA EVS: http://sumo.irisa.fr:8080/InriaHL7EVS-ear-InriaHL7EVS-ejb/Hl7GazelleMessageValidation?wsdl

85 CDA: how does it work? • Use of Schematron • NIST wrote Schematron rules for the IHE integration profiles • The EVS performs Schematron validation of the submitted document
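
A minimal Python sketch of the Schematron mechanism, using lxml's ISO Schematron support; the rule file and document names are placeholders, not the actual NIST rule sets.

    # Schematron validation, the mechanism behind the CDA EVS.
    # "ihe_rules.sch" and "sample_cda.xml" are placeholder file names.
    from lxml import etree
    from lxml.isoschematron import Schematron

    def validate_with_schematron(rules_path: str, document_path: str) -> bool:
        rules = etree.parse(rules_path)          # the Schematron rule set
        document = etree.parse(document_path)    # the CDA document to check
        validator = Schematron(rules, store_report=True)
        is_valid = validator.validate(document)
        if not is_valid:
            # The SVRL report lists every failed assertion.
            print(etree.tostring(validator.validation_report, pretty_print=True).decode())
        return is_valid

    if __name__ == "__main__":
        print("valid:", validate_with_schematron("ihe_rules.sch", "sample_cda.xml"))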

86 DICOM: how does it work? • Checks conformance to the DICOM standard • No IHE specifics there • Uses MTOM for the transport of large DICOM objects
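
Purely as an illustration of the idea of checking an object against requirements of the standard, here is a tiny Python check with pydicom; the real services rely on DVTK and dicom3tools and verify far more than attribute presence.

    # Tiny illustration of a conformance-style check on a DICOM object.
    import pydicom

    REQUIRED_ATTRIBUTES = ["SOPClassUID", "SOPInstanceUID", "StudyInstanceUID", "PatientID"]

    def check_dicom_object(path: str) -> list:
        dataset = pydicom.dcmread(path)
        # report every required attribute that the object does not carry
        return ["missing attribute: " + name
                for name in REQUIRED_ATTRIBUTES if not hasattr(dataset, name)]

    if __name__ == "__main__":
        issues = check_dicom_object("sample_image.dcm")
        print("Passed" if not issues else "Failed: " + "; ".join(issues))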

87 HL7: how does it work? • HL7 message profiles • INRIA wrote message profiles for the IHE integration profiles • The EVS uses the HL7 message profile as a reference to validate a message • Validates the "syntax" of the message
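
As a hedged illustration of what "validating against a message profile" means, the sketch below checks an HL7 v2 message against a drastically simplified, hand-written profile; a real HL7 message profile is an XML document covering segment cardinalities, data types, lengths, and vocabulary, and the expected values here are invented.

    # Profile-driven validation in miniature: the "profile" below is a
    # simplified stand-in for a real HL7 message profile.
    PROFILE = {
        "message_type": "ADT^A01",
        "segments": {          # segment id -> (min occurrences, max occurrences)
            "MSH": (1, 1),
            "EVN": (1, 1),
            "PID": (1, 1),
            "PV1": (1, 1),
        },
    }

    def validate_against_profile(raw_message: str, profile: dict) -> list:
        segments = [s for s in raw_message.strip().split("\r") if s]
        counts = {}
        for segment in segments:
            counts[segment[:3]] = counts.get(segment[:3], 0) + 1
        problems = []
        msh_fields = segments[0].split("|")
        if not msh_fields[8].startswith(profile["message_type"]):   # MSH-9
            problems.append("message type does not match the profile: " + msh_fields[8])
        for seg_id, (minimum, maximum) in profile["segments"].items():
            occurrences = counts.get(seg_id, 0)
            if not minimum <= occurrences <= maximum:
                problems.append(f"{seg_id}: {occurrences} occurrence(s), expected {minimum}..{maximum}")
        return problems

    if __name__ == "__main__":
        sample = ("MSH|^~\\&|ADT1|HOSP|RIS|HOSP|200805220930||ADT^A01|0001|P|2.3.1\r"
                  "EVN|A01|200805220930\rPID|||12345^^^ACME||DOE^JOHN\rPV1||I")
        print(validate_against_profile(sample, PROFILE) or "Passed")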

88 HL7 message profiles • In order to check HL7 message conformity, a reference document is needed • Usage of the HL7 message profiling mechanism • Re-engineering of the TF and production of HL7 message profiles for the existing transactions • See http://sumo.irisa.fr/TF or get the sources from the forge • HL7 message profiles and HL7 message samples

89 EVS GUI tool • Tool for easy EVS interfacing • Java Web Start: http://sumo.irisa.fr:8080/EVSClientGUI/ • Allows the user to select a file and get it validated • For HL7, it can work as a proxy

90 Simulators

91 Actor simulators • IHE actors with a web service interface for control by Gazelle • We are currently working on the API: configuration, control, feedback • Re-use of existing software, which needs to be adapted to fit the API

92 Proxy

93 Used to capture the messages exchanged between systems so that the EVS can validate them • Use of Mirth during the connectathon • Need an infrastructure that supports a large number of systems • Use of Nule HL7 for single-SUT testing (a minimal proxy sketch follows below)
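
A very small sketch of the proxy idea in Python: HL7 MLLP traffic from a sender is relayed unchanged to the receiver, and a copy of each message is handed to a callback that could submit it to an EVS; Mirth and Nule HL7 of course do far more (channels, persistence, many simultaneous connections, reporting), and the hosts and ports below are examples.

    # Minimal single-connection HL7 MLLP proxy sketch.
    import socket

    VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"     # MLLP framing bytes

    def run_proxy(listen_port, target_host, target_port, on_message):
        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", listen_port))
        server.listen(1)
        sender, _ = server.accept()                       # the sending SUT connects here
        receiver = socket.create_connection((target_host, target_port))
        buffer = b""
        while True:
            chunk = sender.recv(4096)
            if not chunk:
                break
            buffer += chunk
            while VT in buffer and FS + CR in buffer:
                start = buffer.index(VT) + 1
                end = buffer.index(FS + CR)
                message, buffer = buffer[start:end], buffer[end + 2:]
                on_message(message.decode("utf-8", errors="replace"))  # e.g. hand to an EVS
                receiver.sendall(VT + message + FS + CR)               # forward unchanged
                sender.sendall(receiver.recv(4096))                    # relay the ACK back

    if __name__ == "__main__":
        run_proxy(2575, "receiver.example.org", 2575,
                  lambda msg: print("captured:", msg[:40]))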

94 Proxy experiment in Oxford • Successful use of Mirth during the Oxford C.A.T. • Used as a proxy for HL7 messages • Kudu was modified to generate the channels and report the validation results • Both the NIST and INRIA EVS were used by the proxy

95 Proxy environment overview (diagram) • SUT1 and SUT2 exchange HL7 messages through the proxy • A daemon picks up the captured messages (mirth_input) and submits them to the EVS for HL7 message validation

96 Test Engine

97 Controls the simulators • Controls the proxy • Uses BPEL for test definition and orchestration • The SUT cannot be controlled directly; the user is guided through a web GUI (a sketch of such an orchestration loop follows below)
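
Purely as an illustration of the orchestration idea (the real Gazelle test engine uses BPEL and web services), the Python sketch below walks through a scripted sequence of steps, driving a hypothetical simulator control interface and prompting the operator whenever the SUT itself has to act; every class, method, and step name is an assumption.

    # Hand-rolled orchestration loop standing in for a BPEL-driven test engine.
    class SimulatorClient:
        """Stand-in for the web-service control interface of a Gazelle simulator."""
        def __init__(self, name):
            self.name = name
        def trigger(self, transaction):
            print(f"[{self.name}] triggering: {transaction}")

    def run_test(steps, prompt=input):
        for kind, detail in steps:
            if kind == "simulator":
                simulator, transaction = detail
                simulator.trigger(transaction)
            elif kind == "sut":
                # The SUT cannot be driven directly: instruct the operator instead.
                prompt(f"Please perform on the system under test: {detail} (press Enter when done) ")
            elif kind == "verify":
                print(f"Ask the proxy/EVS to validate: {detail}")

    if __name__ == "__main__":
        order_placer = SimulatorClient("OrderPlacerSimulator")
        scenario = [
            ("simulator", (order_placer, "send a new order (ORM) message")),
            ("sut", "acquire the images for the scheduled order"),
            ("verify", "the HL7 and DICOM messages captured for this case"),
        ]
        run_test(scenario)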

98 Participants • 3 IHE regions: North America (MIR), Europe (INRIA), Japan (Shizuoka University) • DVTK • NIST • Tiani-Spirit • David Clunie • Offis

99 Roadmap • DB model redesign • EVS API definition • Finalize the license • EVS at the Chicago connectathon: DICOM • EVS at the Oxford connectathon: HL7, DICOM, CDA • Proxy for HL7 messages • Registration with Gazelle • API for the simulators • API for the test engine • Test cases in Gazelle: PIX, PDQ, SWF, LTW

100 Project management • Testing and Tool Committee: overview of IHE testing activities, choice of the licenses • Testing Management Group • Project management: Eric and Steve

101 Licensing • Agreement on an open-source license • The final choice of the license is still under discussion • Licensing does not concern tools developed by third parties (typically EVS, simulators)

102 Thanks • Contact: eric.poiseau@inria.fr, moores@mir.wustl.edu

