Software for systematic reviews

1 Software for systematic reviews
Technology to assist with guideline development: Software for systematic reviews. Skye Newton, Joanne Milverton, Sharon Kessels (no conflicts of interest). Adelaide Health Technology Assessment, School of Population Health, University of Adelaide. Over the last couple of days, we have heard about lots of different software packages which should assist in the development and dissemination of guidelines. I thought that today I would share with you our group's experience of comparing a few different software packages available for the early stages of developing a systematic review. We are still in the process of our investigation, so today I'll just be presenting some interim results, which can't yet fully evaluate all the different aspects of the packages. I should state that Jo, Sharon and I have no financial or other conflicts of interest in one software package or another, other than being very interested in reducing the amount of time we need to spend culling citations in Endnote.

2 Systematic reviews (SRs)
Potentially relevant articles identified in the bibliographic databases and screened for retrieval: n=19,349
Articles excluded because they did not meet inclusion criteria, determined by title/abstract: n=19,186
Articles retrieved for more detailed evaluation: n=163
Articles excluded, with reasons*: n=123 (did not meet inclusion criteria: 140; SR did not meet inclusion criteria: 14; not retrieved within time: 3; duplicates: 7)
Included studies: n=39
Systematic reviews performed as the basis of guidelines often start with very large volumes of citations to screen, and the process of culling these citations and keeping track of the includes, the excludes, the reasons for exclusion and so on is complicated enough with individual data entry, let alone when attempting double screening.
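
Keeping those counts straight is exactly the bookkeeping this kind of software aims to automate. As a rough illustration of the task, here is a minimal Python sketch that tallies screening decisions into PRISMA-style counts; the decision records and their field names are hypothetical, not the format of any particular package.

```python
from collections import Counter

# Hypothetical screening log: one entry per citation, recording the stage at
# which it was excluded (with a reason) or that it was included.
decisions = [
    {"id": 1, "stage": "title_abstract", "reason": "did not meet inclusion criteria"},
    {"id": 2, "stage": "full_text", "reason": "duplicate"},
    {"id": 3, "stage": "included", "reason": None},
    # ... one entry per screened citation
]

screened = len(decisions)
excluded_on_title_abstract = sum(d["stage"] == "title_abstract" for d in decisions)
retrieved = screened - excluded_on_title_abstract
included = sum(d["stage"] == "included" for d in decisions)
full_text_reasons = Counter(d["reason"] for d in decisions if d["stage"] == "full_text")

print(f"Screened for retrieval: n={screened}")
print(f"Excluded on title/abstract: n={excluded_on_title_abstract}")
print(f"Retrieved for detailed evaluation: n={retrieved}")
for reason, n in full_text_reasons.most_common():
    print(f"  Excluded at full text ({reason}): n={n}")
print(f"Included studies: n={included}")
```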

3 Current processes
Currently, our group uses a mixture of Endnote, Word and Stata to keep track of references, perform quality appraisals, extract data and run meta-analyses. Although I thought Endnote was wonderful when I first discovered it, it has a habit of crashing with large files, and although it is becoming more sophisticated, it still has no easy method of comparing the results of two reviewers, and it doesn't allow for easy creation of a QUOROM flowchart.
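
That reviewer-comparison gap is easy to illustrate. Here is a minimal sketch, assuming each reviewer's include/exclude votes have been exported and keyed by citation ID (a hypothetical format, not an Endnote feature): it lists the disagreements that need a consensus decision and computes Cohen's kappa as a measure of inter-rater agreement.

```python
from collections import Counter

# Hypothetical exported votes: citation ID -> "include" or "exclude".
reviewer_a = {101: "include", 102: "exclude", 103: "exclude", 104: "include"}
reviewer_b = {101: "include", 102: "include", 103: "exclude", 104: "include"}

common = sorted(set(reviewer_a) & set(reviewer_b))
pairs = [(reviewer_a[cid], reviewer_b[cid]) for cid in common]

# Citations where the two reviewers disagree and a consensus is needed.
disagreements = [cid for cid in common if reviewer_a[cid] != reviewer_b[cid]]
print("Needs consensus:", disagreements)

# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
n = len(pairs)
p_observed = sum(a == b for a, b in pairs) / n
count_a = Counter(a for a, _ in pairs)
count_b = Counter(b for _, b in pairs)
p_chance = sum(count_a[c] * count_b[c] for c in ("include", "exclude")) / n ** 2
kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"Cohen's kappa: {kappa:.2f}")
```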

4 Aim
We therefore have been wondering if there is a better way.

5 Identifying SR software
Identified via Cochrane Symposium: Covidence, DistillerSR, Abstrackr, RevMan / RevMan HAL, ExaCT
Identified via internet browser: EROS, TrialStat SRS 4.0, SUMARI, EPPI Reviewer 4
Potentially relevant software identified: n=9
Software excluded: n=6 (did not assist with culling: 1; in beta or demo mode: 2; English version not functioning: 1; too Cochrane-specific: 1; could not find software: 1)
Software chosen for trial: n=3
We found out about five different software packages through a Cochrane symposium, and identified four more through Google. From these nine, we restricted further investigation to those which assisted with the culling process, were flexible enough to suit the full range of review types we do, and were developed enough to no longer be in demonstration or beta mode.

6 Shortlist
Features compared (ticked per package on the slide): searching (PubMed only), upload from Endnote, duplicate removal, double screening, PRISMA flowchart generation, quality appraisal, auto-populated tables, meta-analysis, export to RevMan.
Cost (2 people, 4 months): Covidence US$160 (free), DistillerSR US$1500, EPPI Reviewer 4 £220.
Access once subscription finished: Covidence currently N/A (free), DistillerSR read-only for 1 year, EPPI Reviewer 4 read-only for 2 months.
On our shortlist were Covidence, DistillerSR and EPPI Reviewer 4. From initial investigations into what they could actually do, all three looked very impressive. We developed a protocol for assessing the packages: one other colleague and I trialled them on real data while performing our day-to-day work, comparing each package against our current methods using Endnote, Word and Stata.

7 Covidence
Very simple, clean layout.

8 Covidence
Very intuitive and easy to use.

9 Covidence
Immediately tells you the other reviewer's vote and requires a decision.

10 Covidence
Covidence: 3042 → 1902. Endnote: 3042 → 1524.

11 Covidence
It is annoying having to decide which single reason for excluding a study should be listed.

12 DistillerSR

13 DistillerSR

14 DistillerSR

15 EPPI Reviewer 4
Very thorough, but a painfully slow method of removing duplicates. You can set different thresholds for what is and is not considered a duplicate, but there is still a limit to how many you can remove at once. We ended up deleting the whole library, removing the duplicates in Endnote, and starting again in EPPI Reviewer.
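
For readers unfamiliar with threshold-based duplicate detection, here is a rough Python sketch of the general technique (not EPPI Reviewer 4's actual algorithm; the sample records and threshold are illustrative): normalise the titles, compare them pairwise, and flag pairs whose similarity reaches a tunable cut-off.

```python
from difflib import SequenceMatcher

# Illustrative records; a real library would come from an Endnote export.
records = [
    {"id": 1, "title": "Screening tools for systematic reviews: a comparison"},
    {"id": 2, "title": "Screening Tools for Systematic Reviews - A Comparison"},
    {"id": 3, "title": "Meta-analysis software in guideline development"},
]

def normalise(title: str) -> str:
    """Lower-case and strip everything but letters/digits, so punctuation
    and capitalisation differences are ignored."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def find_duplicates(records, threshold=0.9):
    """Return (id, id, similarity) for pairs at or above the threshold."""
    flagged = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            ratio = SequenceMatcher(None, normalise(a["title"]),
                                    normalise(b["title"])).ratio()
            if ratio >= threshold:
                flagged.append((a["id"], b["id"], round(ratio, 3)))
    return flagged

# Lowering the threshold catches looser matches but risks false positives.
print(find_duplicates(records, threshold=0.9))  # -> [(1, 2, 1.0)]
```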

16 EPPI Reviewer 4

17 EPPI Reviewer 4
Separate step for working out duplication.

18 Outcome measures
The way we assessed the packages was through two rating scales, one for ease of use and one for functionality, both relative to our current Endnote-based processes:
Ease of use (0 = could not get it to work, 5 = same as Endnote, 10 = really easy, much better than Endnote)
Functionality (0 = does not work, 5 = same as Endnote, 10 = great functionality, much better than Endnote)

19 Overall - Ease of use
0 = could not get it to work, 5 = same easiness as current processes (Endnote), 10 = very easy, much better than current processes.
Across the range of activities, Covidence was the clear winner on ease of use, from the moment you start using it. Both DistillerSR and EPPI Reviewer 4 have steep learning curves and are not easy to pick up. DistillerSR is slightly easier than EPPI Reviewer 4, but EPPI Reviewer 4 has fantastic YouTube videos that step you through what's required.

20 Overall - Functionality
0 = did not work, much worse than current processes, 5 = same as current processes, 10 = much more functional than current processes.
When it comes to functionality, the ratings look very different. Covidence is incredibly easy to use partly because it doesn't give you many options. EPPI Reviewer 4 and DistillerSR both offer the option of highlighting text, which I really like as an aid to culling.
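
The highlighting idea itself is simple. As a minimal sketch (the keyword list and markers are made up, not either package's configuration), here is how inclusion-relevant terms can be wrapped so they stand out while scanning an abstract:

```python
import re

# Hypothetical list of terms that suggest a citation may meet the inclusion criteria.
INCLUDE_TERMS = ["randomised", "randomized", "controlled trial", "placebo"]

def highlight(text: str, terms=INCLUDE_TERMS, marker="**") -> str:
    """Wrap each case-insensitive occurrence of a term in marker characters."""
    pattern = re.compile("|".join(re.escape(t) for t in terms), re.IGNORECASE)
    return pattern.sub(lambda m: f"{marker}{m.group(0)}{marker}", text)

abstract = "A randomised, placebo-controlled trial of screening software."
print(highlight(abstract))
# -> A **randomised**, **placebo**-**controlled trial** of screening software.
```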

21 Conclusions thus far?
Covidence is clearly the most user-friendly, but the least capable.
DistillerSR and EPPI Reviewer 4 have very similar functionality.
DistillerSR is slightly more intuitive, but EPPI Reviewer 4 has a better manual and fantastic YouTube tutorials.

22 Summary
Our interim results are limited to the early stages of a SR.
There is a balance to strike between ease of use and functionality.
Learning curves mean the first review (or first few) is unlikely to see efficiency gains.
Great hope for the future.

23 Thank you

