Technology Assisted Review: Trick or Treat?
Ralph Losey, Esq., Jackson Lewis
Ralph Losey, Esq.
Partner, National e-Discovery Counsel, Jackson Lewis
Adjunct Professor of Law, University of Florida
Active member, The Sedona Conference
Author of numerous books and law review articles on e-discovery
Founder, Electronic Discovery Best Practices (EDBP.com)
Lawyer, writer, predictive coding search designer, and trainer behind the e-Discovery Team blog (e-discoveryteam.com)
Co-founder, with son Adam Losey, of IT-Lex.org, a non-profit educational organization for law students and young lawyers
Discussion Overview
What is Technology Assisted Review (TAR), aka Computer Assisted Review (CAR)?
Document Evaluation
Putting TAR into Practice
Conclusion
What is Technology Assisted Review?
Why Discuss Alternative Document Review Solutions?
Document review is routinely the most expensive part of the discovery process. Saving time and reducing costs will result in satisfied clients.
Traditional/linear paper-based document review
Online review
Technology assisted review
Bobbing for Apples: Defining an effective search
Information retrieval effectiveness can be evaluated with metrics (formalized below):
Precision: fraction of relevant documents within retrieved results – a measure of exactness
Recall: fraction of retrieved relevant documents within the total relevant documents – a measure of completeness
F-Measure: harmonic mean of precision and recall
[Diagram: the set of all documents, split into "Hot" and "Not"]
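The same three metrics in standard information-retrieval notation (a minimal formalization; the set labels Rel for relevant and Ret for retrieved are mine, not the slide's):

```latex
\mathrm{Precision} = \frac{|\mathrm{Rel}\cap\mathrm{Ret}|}{|\mathrm{Ret}|},
\qquad
\mathrm{Recall} = \frac{|\mathrm{Rel}\cap\mathrm{Ret}|}{|\mathrm{Rel}|},
\qquad
F_1 = 2\cdot\frac{\mathrm{Precision}\cdot\mathrm{Recall}}{\mathrm{Precision}+\mathrm{Recall}}
```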
Bobbing for Apples: Defining an effective search
Scenario 1: Perfect recall, low precision – every hot document is retrieved, but many that are not hot come along with it.
[Diagram: retrieved set covering all of "Hot" plus much of "Not"]
Bobbing for Apples: Defining an effective search
Scenario 2: Low recall, perfect precision – everything retrieved is hot, but most of the hot documents are missed.
[Diagram: small retrieved set sitting entirely inside "Hot"]
Bobbing for Apples: Defining an effective search
Scenario 3: Arguably good recall and precision – most hot documents are retrieved, with relatively few non-hot documents mixed in.
[Diagram: retrieved set covering most of "Hot" and little of "Not"]
Keyword Search
Keyword searches are used throughout discovery. However, they are not particularly effective:
Blair and Maron: lawyers believed their search retrieved 75% of relevant documents, when only 20% were retrieved
It is very difficult to craft a keyword search that isn't under-inclusive or over-inclusive (see the sketch below)
Keyword search should be viewed as a component of a hybrid multimodal search strategy
Go fish!
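A toy illustration of that pitfall; the three documents and the query term are invented for this sketch, not from the presentation:

```python
# Invented mini-corpus showing how a lone keyword is simultaneously
# over-inclusive (false positive) and under-inclusive (missed document).
docs = {
    1: "the apple harvest contract was signed in May",  # relevant: hit
    2: "bobbing for apples at the company picnic",      # irrelevant: hit anyway
    3: "the orchard shipment of fruit was delayed",     # relevant: no hit
}

hits = {doc_id for doc_id, text in docs.items() if "apple" in text}
print(sorted(hits))  # [1, 2] -> doc 2 is a false positive, doc 3 is missed
```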
Where are we?
What Is Technology Assisted Review (TAR)?
Classification Effectiveness
Any binary classification can be summarized in a 2x2 table
Test on a sample of n documents for which we know the answer
»A + B + D + E = n
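The 2x2 table itself did not survive the slide-to-text conversion. A plausible reconstruction, with cell labels inferred from the Recall = A/(A+D) and Precision = A/(A+B) formulas on the next two slides:

```
                       Truly relevant    Truly irrelevant
Classified relevant          A                  B
Classified irrelevant        D                  E
```

Under this reading, A counts true positives, B false positives, D false negatives, and E true negatives.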
Classification Effectiveness
Recall = A / (A + D)
»Proportion of the interesting stuff that the classifier actually found
High recall is of interest to both the producing and the receiving party
Classification Effectiveness
Precision = A / (A + B)
High precision is of particular interest to the producing party: cost reduction!
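A minimal sketch of these formulas in Python, assuming the cell labels from the reconstructed table above (A = true positives, B = false positives, D = false negatives); the sample counts are hypothetical:

```python
def recall(a: int, d: int) -> float:
    """Recall = A / (A + D): share of truly relevant documents the classifier found."""
    return a / (a + d)

def precision(a: int, b: int) -> float:
    """Precision = A / (A + B): share of flagged documents that are truly relevant."""
    return a / (a + b)

def f1(p: float, r: float) -> float:
    """F-measure: harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# Hypothetical counts: 300 true positives, 70 false positives, 100 false negatives.
p, r = precision(300, 70), recall(300, 100)
print(f"precision={p:.2f} recall={r:.2f} f1={f1(p, r):.2f}")
```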
Sampling and Quality Control
We want to know effectiveness without manually reviewing everything. So:
»Randomly sample the documents
»Manually classify the sample
»Estimate effectiveness on the full set based on the sample
Sampling is well understood
»Common in expert testimony in a range of disciplines
Example: how precise were you in culling the hot documents from your bag of 10,000? Sample size = 370 (confidence interval: ±5%; confidence level: 95%). If 300 of the 370 sampled documents are confirmed hot, estimated precision is 81% (see the sketch below).
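The slide does not name the formula behind its sample size, but the standard sample-size calculation for estimating a proportion (95% confidence, ±5% margin, worst-case p = 0.5) with a finite population correction for a collection of 10,000 reproduces the 370; a sketch:

```python
import math

def proportion_sample_size(population: int, margin: float = 0.05,
                           z: float = 1.96, p: float = 0.5) -> int:
    """Sample size for estimating a proportion to +/- margin at the given
    confidence level, with finite population correction. This is the standard
    textbook formula; the slide does not say which one the presenter used."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite correction

print(proportion_sample_size(10_000))  # -> 370, matching the slide

# Point estimate from the slide's example: 300 of the 370 sampled docs were hot.
print(f"estimated precision ~= {300 / 370:.0%}")        # -> 81%
```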
TREC 2011
Annual event examining document review methods
"[T]he results show that the technology-assisted review efforts of several participants achieve recall scores that are about as high as might reasonably be measured using current evaluation methodologies. These efforts require human review of only a fraction of the entire collection, with the consequence that they are far more cost-effective than manual review."
– Overview of the TREC 2011 Legal Track
Putting TAR into Practice
TAR or CAR? A Multimodal Process
Must… have… humans!
The Judiciary's Stance
Da Silva Moore v. Publicis Groupe
»Court okayed parties' agreement to use TAR; parties disputed the implementation protocol (3.3 million documents)
Kleen Products v. Packaging Corp. of Am.
»Plaintiffs abandoned arguments in favor of TAR and moved forward with Boolean search
Global Aerospace Inc. v. Landow Aviation, L.P.
»Court blessed defendant's use of TAR over plaintiff's objections (2 million documents)
In re Actos (Pioglitazone) Products Liability Litigation
»Court affirmatively approved the use of TAR for review and production
EORHB, Inc., et al. v. HOA Holdings, LLC
»Court ordered the parties to use TAR and share a common e-discovery provider
TAR/CAR: Tricks & Treats
Treats:
»TAR can reduce time spent on review and administration
»TAR can reduce the number of documents reviewed, depending on the solution and strategy
»TAR can increase accuracy and consistency of category decisions (vs. unaided human review)
»TAR can identify the most important documents more quickly
Tricks:
»Must address risks associated with seed set disclosure
»Must have the nuanced expert judgment of experienced attorneys
»Must have validation and QC steps to ensure accuracy
TAR Accuracy
TAR must be as accurate as a traditional review
Studies show that computer-aided review is as effective as a manual review (if not more so)
Remember: the court standard is reasonableness, not perfection:
"[T]he idea is not to make it perfect, it's not going to be perfect. The idea is to make it significantly better than the alternative without as much cost."
– U.S. Magistrate Judge Andrew Peck in Da Silva Moore
Conclusion
Parting Thoughts
Automated review technology helps lawyers focus on resolution – not discovery – through available metrics
»Complements human review, but will not replace the need for skillful human analysis and advocacy
Search adequacy is defined in terms of reasonableness, not whether all relevant documents were found
TAR can be a treat, but only when implemented correctly
»Reconsider, but do not abandon, the role of: concept search, keyword search, and attorney review
Q & A