
1. A Social Validation of Collaborative Annotations on Digital Documents
Guillaume Cabanac†, Max Chevalier†,‡, Claude Chrisment†, Christine Julien†
International Workshop on Annotation for Collaboration, Paris, November 24–25, 2005
† Generalized Information Systems, Semi-Structured Data and Documents
‡ Laboratoire de Gestion et Cognition

2. Our context of work (Annotations for Collaboration — G. Cabanac, Nov. 2005)
- redactor 87%, reader 13%
- 1993–2005: more than 20 annotation systems (ComMentor, iMarkup, Yawas, Amaya, ...)
- Hardcopy annotations are unsharable, hence 'lost'; annotation servers (Ovsiannikov et al., 1999) connected to Web servers let readers share annotations and discussion threads.

3. Talk Roadmap — Social Validation of Collaborative Annotations
I. Collaborative annotations: weaknesses, utility and usability study
II. Our approach: definitions and validity computation
III. Implementation: the TafAnnote prototype
IV. Conclusion and perspectives of work

4. Collaborative annotations – Weaknesses (1/2)
Demonstration: the Annotea annotation server (W3C) and the Amaya annotation system (INRIA + W3C), working against Web servers.

5. Collaborative annotations – Weaknesses (2/2)
- no range information (starting point only)
- no reply count
- no annotator's expertise
- no annotator's opinion in a discussion thread
- painful annotation exploration
- no personal annotation space, as exists for bookmarks
- scalability issue

6. Collaborative annotations – Utility (1/4): 'Who needs them?' — Web context
- For redactors: publication improvement, e.g. correctness ('This is wrong because... You should...') and completeness ('You can also cite (Robert, 1999) who...').
- For annotators: debating different points of view, e.g. on poetry ('No, I think that the death of Keats' mother made him...' — 'I'm not sure. In his poem "To hope", he shows...').

7. Collaborative annotations – Utility (2/4): 'Who needs them?' — Decisional systems
- Annotations are formulated on schema elements or on the values themselves (an annotated multidimensional table).
- Collect them to build an 'expertise memory'; cf. (Cabanac, Chevalier, Teste & Ravat, 2006), to appear in EGC'06.

8. Collaborative annotations – Utility (3/4): 'Who needs them?' — Digital libraries
- The number of digitized documents keeps growing.
- Librarians' annotations can improve the indexing process.
- Information retrieval 'U' process: the query is analysed into a query model, documents are indexed into a documents model, and a matching process returns the relevant documents.

9. Collaborative annotations – Utility (4/4): 'Who needs them?' — Industrial context
Technical documentation improvement, e.g. for a plane: engineers draft the documentation, technicians test and annotate it, pilots use and annotate it; this feedback drives modifications.

10. Collaborative annotations – Usability: 'Are they convenient and practicable to use?'
- Scalability projection: dozens of annotations per document are valuable, but a few hundred become overwhelming, annoying and disturbing (cognitive overload).
- Our proposal: identify socially validated annotations.

11. Roadmap reminder — next: II. Our approach: definitions and validity computation

12. Definitions (1/2) — Collaborative annotation model
An annotation combines objective information and subjective information, including:
- the annotator's expertise: neophyte, intermediate, or expert
- annotation types

13. Definitions (2/2) — Model instantiation
Example on a document about Internet history stating 'Inventor: Tim Berners-Lee':
- John Doe, 12/21/2004 – 'Internet vs Web invention: It's false, Vint Cerf and Bob Kahn invented IP [1], and Tim Berners-Lee invented the Web [2].' [1] http://www.ietf.org/rfc/rfc791.txt [2] http://www.w3.org/Consortium/history
- Reply by Robert Langdon, 05/14/2005 – 'Tim Berners-Lee's point of view: Tim explains on his webpage [1] that he didn't invent the Internet, but rather the World Wide Web.' [1] http://www.w3.org/People/Berners-Lee
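The model on this slide can be sketched as a small data structure. The field names and their exact split below are assumptions, not the authors' actual schema; this is a minimal illustration of pairing objective with subjective information, instantiated on the slide's example thread.

```python
# Hypothetical sketch of the annotation model: objective information
# (who, when, what is annotated, cited references) paired with
# subjective information (comment, annotation type, declared expertise).
from dataclasses import dataclass, field

@dataclass
class Annotation:
    # objective information
    author: str
    date: str
    annotated_passage: str
    references: list = field(default_factory=list)
    # subjective information
    comment: str = ""
    annotation_type: str = "comment"   # e.g. correction, example, question
    expertise: str = "intermediate"    # neophyte | intermediate | expert
    replies: list = field(default_factory=list)

# Instantiating the slide's example thread:
reply = Annotation(
    author="Robert Langdon", date="05/14/2005",
    annotated_passage="Internet vs Web invention",
    references=["http://www.w3.org/People/Berners-Lee"],
    comment="Tim explains on his webpage that he didn't invent the "
            "Internet, but rather the World Wide Web.",
    expertise="expert",
)
ann = Annotation(
    author="John Doe", date="12/21/2004",
    annotated_passage="Inventor: Tim Berners-Lee",
    references=["http://www.ietf.org/rfc/rfc791.txt",
                "http://www.w3.org/Consortium/history"],
    comment="It's false, Vint Cerf and Bob Kahn invented IP, and "
            "Tim Berners-Lee invented the Web.",
    annotation_type="correction",
    replies=[reply],
)
```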

14. Agreement of an annotator (1/3) — A mathematical example
A math lesson ('Def 1.') receives a discussion thread of annotations:
- ann1 – 'You are mistaking, consider correcting with: ...'
  - ann11 – 'Ok, for example: ...'
  - ann12 – 'Wrong equation for negative values'
    - ann121 – 'In general: ...'
- ann2 – 'False formula, see this counterexample'
- ann3 – 'You should specify the domain of x'
The agreement of an annotator is inferred from the combination of annotation types.

15. Agreement of an annotator (2/3)
On the same math lesson ('Def 1.'), the agreement also considers the annotator's involvement:
- in commenting, e.g. ann1 – 'You are mistaking, consider correcting with: ...'
- in referencing, e.g. ann3 – 'You are mistaking, see [1], [2]' with [1] mathworld.com and [2] wikipedia.com/squareRoot

16. Agreement of an annotator (3/3) — Mixing up commenting and referencing
agreement(a) weights comments more than references: α = 0.6 > β = 0.4.
Concrete examples of agreement(a):
- 0.64 (1 comment, 2 references)
- 0.36 (1 comment, 0 references)
- 0.29 (0 comments, 3 references)
- -0.20 (1 comment, 1 reference)
- -0.22 (0 comments, 0 references)
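The slide gives the weights (α = 0.6 for comments, β = 0.4 for references) but not the per-component scores, so the saturating involvement function and the type-dependent baseline below are hypothetical stand-ins; this is only a sketch of how the two involvements might be mixed into a signed agreement, not the authors' formula.

```python
# Illustrative sketch: alpha and beta come from the slide; the
# `involvement` function and the `base` term (a nonzero agreement
# contributed by the annotation type alone, as suggested by the
# slide's -0.22 value with 0 comments and 0 references) are assumptions.
import math

ALPHA = 0.6  # weight of commenting involvement (from the slide)
BETA = 0.4   # weight of referencing involvement (from the slide)

def involvement(count: int) -> float:
    """Hypothetical saturating involvement score in [0, 1)."""
    return 1.0 - math.exp(-count)

def agreement(sign: int, n_comments: int, n_references: int,
              base: float = 0.2) -> float:
    """Signed agreement of one annotator: the sign comes from the
    annotation type (+1 approval, -1 refutation), the magnitude from a
    baseline plus the weighted mix of commenting and referencing."""
    mix = ALPHA * involvement(n_comments) + BETA * involvement(n_references)
    return sign * (base + (1.0 - base) * mix)
```

With these assumptions, a comment raises the agreement more than a reference does, matching the α > β rationale stated on the slide.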

17. Reliability of an annotated passage
Each annotation carries a validity; the social reliability of the annotated passage aggregates them on a [-1.0, 1.0] scale:
- -1.0: consensus that the passage is wrong (the remark is validated)
- +1.0: consensus that the passage is right
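The slide fixes the [-1, 1] scale without spelling out the aggregation; averaging the validities of the annotations attached to a passage is one minimal, hypothetical reading.

```python
# Hypothetical aggregation: the talk defines the scale, not the formula.
def passage_reliability(annotation_validities: list) -> float:
    """Social reliability of a passage in [-1, 1]:
    -1 = consensus the passage is wrong (the remarks are validated),
     0 = dubious, +1 = consensus the passage is right."""
    if not annotation_validities:
        return 0.0  # no annotation yet: nothing to validate either way
    return sum(annotation_validities) / len(annotation_validities)
```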

18. Validity of a collaborative annotation (1/2) — Towards a discussion thread opinion synthesis
An annotation's reliability is read on the same scale: 1 = reliable, 0 = dubious, -1 = not reliable at all (illustrated by cases 1 to 4 on the slide).

19. Validity of a collaborative annotation (2/2) — Discussion thread opinion synthesis
We consider:
- the agreement of the replies (types, comments, references)
- the expertise declared by the people who reply
- the context: the more replies an annotation receives, the more validated it is
For instance, across thread levels 0 to 3, annotation a1 is more validated than a2.
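The three criteria above can be sketched as follows. The expertise weights, the weighted averaging and the reply-count confidence factor are all assumptions: the talk names the ingredients without giving the formula.

```python
# Hypothetical opinion synthesis over a discussion thread.
import math
from dataclasses import dataclass, field

# Assumed mapping of declared expertise to a weight.
EXPERTISE_WEIGHT = {"neophyte": 0.5, "intermediate": 0.75, "expert": 1.0}

@dataclass
class Annotation:
    agreement: float  # signed agreement expressed by this annotation
    expertise: str    # declared expertise of its annotator
    replies: list = field(default_factory=list)

def validity(ann: Annotation) -> float:
    """Validity in [-1, 1] synthesised from the replies: each reply's
    signed agreement is weighted by its author's expertise, and a
    saturating factor makes more replies push the result further
    away from 'dubious' (0)."""
    if not ann.replies:
        return 0.0  # no opinion expressed yet: dubious
    weights = [EXPERTISE_WEIGHT[r.expertise] for r in ann.replies]
    consensus = sum(w * r.agreement
                    for w, r in zip(weights, ann.replies)) / sum(weights)
    confidence = 1.0 - math.exp(-len(ann.replies))  # more replies, more validated
    return max(-1.0, min(1.0, consensus * confidence))
```

Under these assumptions, an annotation approved by three experts comes out more validated than one approved by a single neophyte, reproducing the slide's a1-versus-a2 comparison.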

20. Roadmap reminder — next: III. Implementation: the TafAnnote prototype

21. The TafAnnote prototype — General description
Client/server architecture; the client is a Mozilla Firefox extension.

22. Main features (1/4) — Annotation creation, within a personal annotation space.

23. Main features (2/4) — Annotation consultation: adaptive notification of new information, discussion threads.

24. Main features (3/4) — Personal annotation space management: drag-and-drop reorganization, view by annotation type.

25. Main features (4/4) — Annotation search: boolean search, filtering by annotators and types.

26. Social validation at work
The validity computation is implemented in Oracle PL/SQL (server side), and the display of annotations is modified accordingly: validated annotations are emphasized, while dubious remarks are almost hidden.

27. Conclusion and perspectives
A social validation of collaborative annotations:
- why? the scalability issue compromises usability
- how? exploit the opinions people express in discussion threads
- aim? indicate the validated information expressed by annotations
Implementation, the TafAnnote prototype:
- client/server architecture: user interaction through a Mozilla Firefox extension, annotation storage in an Oracle RDBMS
- personal and collective annotation management
- validated annotations are emphasized, pre-sorting the right information
Perspectives of work:
- evaluation with real users
- NLP techniques for deducing annotators' opinions
- exploitation: indexing, summarizing...

28. Question time

