The use cases that drive XNAT development Dan Marcus June 24, 2012.


1 The use cases that drive XNAT development Dan Marcus June 24, 2012

2 4 Driving Use Cases – Institutional repositories – Multi-center studies – Data sharing – Clinical research (There are others – individual labs, small animal imaging, etc.)

3 Institutional repositories Organizational Characteristics: – Multiple studies (i.e. protocols, projects) – Multiple investigators – Multiple modalities – Multiple user types (PIs, RAs, students, techs, external collaborators, etc.) – Common imaging protocols – Common data elements

4 Institutional repositories Technical Characteristics: – Common computing resources (e.g. data storage, computing grid) – Common data resources (e.g. PACS, clinical database) – Common authentication/authorization resources (e.g. active directory, university login system)

5 Institutional repositories XNAT Capabilities – Project-based security and navigation – DICOM C-Store – Custom routing rules – Pipeline management – LDAP authentication

6 Institutional repositories XNAT Gaps – Pushing protocols to scanners.

7 Institutional repositories Example: – Central Neuroimaging Data Archive (CNDA) (“The Original XNAT”) 831 Projects, Subjects, Imaging Sessions, 240 PIs. Direct connectivity to all research scanners Direct access to department data storage Direct connectivity to department computing resources.

9 Institutional repositories CNDA Data Sources – Center for Clinical Imaging Research (CCIR) – East Building MRI Facility

10 Data import from CCIR 1. Betsy Thomas, head coordinator, manages creation of new protocols, assigns protocol #. 2. Betsy creates project in CNDA using protocol # as project ID. 3. Betsy notifies PI re: CNDA project. 4. Scanner tech creates project-specific protocol on scanner. CCIR # is written to DICOM header. 5. Scanner tech sends all scans to CNDA immediately after acquisition. 6. Automated scripts sent to scanner techs to “close the loop”.

11 Data import from CCIR [Table: recent CCIR uploads to the CNDA, one row per session – columns: source (archive/prearchive), project (CCIR-00286, CCIR-00358, CCIR-00477, CCIR-00491, CCIR-00437, CCIR-00176, …), subject, session label, scan date, upload date, file count; individual row values garbled in transcript.]

12 Data import from East Building 1. Investigators opt in. 2. Investigators create their own projects. 3. Whoever runs the scan manually enters the project ID in Study Comments. 4. Whoever runs the scan sends the scan to the CNDA destination. 5. Whoever screws up calls the helpdesk to locate the scan.
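Both import paths hinge on a project identifier carried in the DICOM header: the CCIR # written by the scanner tech, or a project ID typed into Study Comments. A minimal sketch of that routing decision (pure Python; the flattened header dict, the "Project:" convention, and the CCIR-# pattern are illustrative assumptions, not XNAT's actual DicomEdit routing rules):

```python
import re

# Hypothetical: a received DICOM header flattened into a dict of
# keyword -> string value, as a custom C-Store receiver might expose it.
CCIR_PATTERN = re.compile(r"CCIR-\d{5}")

def route_to_project(header):
    """Pick a CNDA project for an incoming study; None -> prearchive."""
    # CCIR path: protocol number written into the header at the scanner.
    for field in ("StudyComments", "StudyDescription"):
        match = CCIR_PATTERN.search(header.get(field, ""))
        if match:
            return match.group(0)
    # East Building path: investigators type their own project ID
    # into Study Comments, here assumed as "Project: <id>".
    comments = header.get("StudyComments", "")
    if comments.startswith("Project:"):
        return comments.split(":", 1)[1].strip() or None
    return None  # unroutable -> prearchive, and a helpdesk call ensues

print(route_to_project({"StudyComments": "CCIR-00286"}))    # CCIR-00286
print(route_to_project({"StudyComments": "Project: ADOL"})) # ADOL
print(route_to_project({}))                                 # None
```

The point of the sketch: the CCIR flow works because a coordinator controls the identifier end to end, while the East Building flow depends on whoever runs the scan typing the right thing.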

13 Funding models – Per project (or subject or scan) fee – Departmental support – Center funding – Large scale projects – Industry support

14 Spinning out mini repositories – When to do it? – How to do it?

15 Multicenter studies Organizational Characteristics: – One primary PI – Many site PIs – One data coordinating center – Many staff, many roles – Multiple data sources, one unified data set – Multiple data access policies

16 Multicenter trials Technical Characteristics: – Uploading is key – Central image database – Separate clinical database – Common protocol (with variations) – Common image analysis (with variations) – Many ways to screw up

17 Multicenter trials XNAT Capabilities – Between-project sharing – Image acquisition validation – Programmatic API – Protocol validation – Visualization – 21 CFR Part 11 compliance* * Requires additional commercial modules from Radiologics
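The programmatic API listed above is XNAT's REST interface. A stdlib-only sketch of building a request against it (the host, project, and subject labels are placeholders; the /data/projects/... path follows XNAT's documented REST hierarchy, but verify against your server's version before relying on it):

```python
from urllib.parse import urlencode, quote
from urllib.request import Request

def experiments_url(base, project, subject, fmt="json"):
    """Build the URL listing a subject's imaging sessions in an XNAT project."""
    path = "/data/projects/{}/subjects/{}/experiments".format(
        quote(project, safe=""), quote(subject, safe=""))
    return base.rstrip("/") + path + "?" + urlencode({"format": fmt})

# Placeholder host and labels for illustration only.
url = experiments_url("https://cnda.example.org", "DIAN_001", "SUBJ01")
req = Request(url, headers={"Accept": "application/json"})  # add auth in practice
print(url)
```

Scripted access like this is what makes the coordinating-center ETL flows on the following slides possible at all.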

18 Multicenter trials XNAT Gaps – Notification service – Rule engine – Site query service

19 Multicenter trials Example: – Dominantly Inherited Alzheimer Network (DIAN) Longitudinal study 12 sites 269 participants Extensive protocol (MRI, 2x PET, tissue, clinical battery, etc.) Clinical coordinating center (and clinical DB) at Alzheimer’s Disease Cooperative Study (ADCS), UCSD MRI QC at Mayo Clinic PET QC at U of Michigan Radiology reads by Wash U diagnostic group

20 DIAN Dataflow

21 Coordinating Data Images uploaded via upload applet. Psychometrics uploaded via custom form. PET QC completed through online forms (Easy breezy). Radiology reads completed through online viewer and forms (Easy breezy). Processed image data through automated pipelines (Tough but worthwhile).

22 Coordinating Data MR QC imported through ETL process – Data extracted from Mayo DB into spreadsheet. – Spreadsheet transformed to XNAT XML. – XML loaded to CNDA by NRG scripts. – Havoc ensues.
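The transform step of that ETL can be sketched with the stdlib alone. The column names and element names below are illustrative stand-ins, not the real Mayo spreadsheet layout or XNAT's xnat: XML schema, which this simplifies heavily:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical QC spreadsheet export, one row per imaging session.
SAMPLE = "session,qc_status,rating\nCNDA_E001,passed,4\nCNDA_E002,failed,1\n"

def rows_to_xml(csv_text):
    """Transform spreadsheet rows into one XML assessment document per session."""
    docs = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        root = ET.Element("qcAssessment", {"session": row["session"]})
        ET.SubElement(root, "status").text = row["qc_status"]
        ET.SubElement(root, "rating").text = row["rating"]
        docs.append(ET.tostring(root, encoding="unicode"))
    return docs

for doc in rows_to_xml(SAMPLE):
    print(doc)  # each doc would then be loaded to the CNDA via the REST API
```

The fragile part in practice is not this transform but the extract: the havoc on the next slides comes from identifiers in the spreadsheet not matching the database.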

23 Coordinating Data Clinical data imported through ETL process – Data extracted from EDC by NRG via programmatic interface. – Data transformed to XML by NRG scripts. – XML loaded to CNDA by NRG scripts. – Havoc DOESN’T ensue.

24 Coordinating Data What’s the difference? – Mayo uses the patient name field in DICOM, which might not match the database. – MRI QC values trigger actions (queries, uploads), so changes cause lots of confusion. – Wash U controls clinical data transfers, so if things get weird, we are aware and can resolve them.

25 Data sharing Organizational Characteristics: – Different sharing models (open access, access approval) – A few uploaders – Many downloaders – Link to publication – Documentation is needed – Trust and reliability are concerns – Usage tracking

26 Data sharing Technical Characteristics: – Downloading is key – Between-instance sharing (“Share this project to XNAT Central”) – Convenient data formats – Careful anonymization – Bandwidth bursts

27 Data sharing XNAT Capabilities – Project access options – DICOM review – DICOM to NIFTI pipeline – Data type extensibility – API

28 Data sharing XNAT Gaps – Exceptional download tool – Non-DICOM upload tool – Dynamic extensibility – Good project summaries – Standard QA pipelines – Data dictionaries – Customizable project pages

29 Data sharing Examples: – OASIS Cross-sectional – 430 subjects across lifespan Longitudinal – old subjects, multiple visits MR, demographics, some clinical, FreeSurfer Open access – lots of downloads, dozens of papers. – XNAT Central Public XNAT instance Open to contributions from anyone Unfunded proof of concept 250 projects – mostly garbage

30 Open access DUA 1. Indemnify: The quality and completeness of the data cannot be guaranteed. Users employ these data at their own risk. 2. Respect subject privacy: Users shall respect restrictions of access to sensitive data. Users will make no attempt to identify the individuals whose images are included in OASIS data sets. 3. Acknowledge: Users must acknowledge the use of OASIS data and data derived from OASIS data when publicly presenting any findings or algorithms that benefited from their use…

31 Open access DUA 4. Encourage redistribution: Redistribution of original OASIS data is permitted so long as the data are redistributed under the same terms and conditions as described in this DUA. 5. Respect your users: Data derived from original OASIS data may be distributed under terms and conditions established by the creators of the data. Users must comply with the terms and conditions of use set by the creators of the data.

32 Clinical Research Organizational Characteristics: – Patient images – Patient medical record – No standard protocols – Lower quality images – Prospective studies – Retrospective studies

33 Clinical Research Technical Characteristics: – Crossing the divide is key – Need to obtain images from archives – Need to obtain data directly from modalities – Careful handling of PHI – Need to send reports back to clinical information systems

34 Clinical Research XNAT Capabilities – DICOM C-Store – DICOM C-Find, C-Move (almost) – Automated pipelines (including NIFTI to DICOM) – Metadata summaries

35 Clinical Research XNAT Gaps – Better linking to clinical research databases (ClinPortal, I2B2) – Formatting data types as structured reports – Pushing reports & images back to clinical information systems – Metadata-based searches

36 Clinical Research Example: – Resting state fMRI for neurosurgical planning Run rs-fMRI scan preoperatively Execute “Perceptron” pipeline for localizing key networks Create DICOM version of spatially registered labeled image Post image as secondary capture with study Load study on navigation system for surgeon

37 Model Workflow PACS Stealth

38 Model Workflow – using Amazon

39 Discussion Do these use cases match what others are doing? Am I missing key use cases? Are there complementary technologies we need to be aware of?
