Benchmarking Textual Annotation Tools for the Semantic Web

Diana Maynard, University of Sheffield, UK

Motivation
Criteria for benchmarking should include not just performance but also scalability, usability, flexibility and interoperability, so that users can make an informed decision about the best product for their needs.

Requirements for different users
Each criterion (usability, flexibility, performance, scalability, interoperability) is rated on a ++ / + / - scale according to its importance for each type of user: manual annotator, annotation consumer, corpus developer, and system.

Tools evaluated
Magpie, KIM, OntoMat, GATE, MnM

Results
Two tradeoffs emerge:
1. Scalability vs response time
2. Performance vs coverage

Suitable tools for different users
Each tool (GATE, KIM, MnM, Magpie, OntoMat) is rated on the same ++ / + / - scale for each type of user.
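
The poster lists performance among the benchmarking criteria but does not spell out how it is measured. As a rough, hypothetical sketch (not taken from the poster), annotation performance is commonly reported as precision, recall and F-measure against a gold-standard corpus; the Annotation type and the strict-matching scorer below are assumptions for illustration only.

```python
# Minimal sketch: scoring a tool's annotations against a gold standard.
# The Annotation type and strict-matching rule are illustrative assumptions.

from typing import NamedTuple, Set


class Annotation(NamedTuple):
    start: int   # character offset where the annotation begins
    end: int     # character offset where the annotation ends
    label: str   # annotation type, e.g. "Person" or "Organization"


def score(gold: Set[Annotation], predicted: Set[Annotation]) -> dict:
    """Strict matching: an annotation counts as correct only if its
    offsets and label both match a gold annotation exactly."""
    true_positives = len(gold & predicted)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"precision": precision, "recall": recall, "f1": f1}


if __name__ == "__main__":
    gold = {Annotation(0, 13, "Person"), Annotation(28, 37, "Organization")}
    predicted = {Annotation(0, 13, "Person"), Annotation(40, 45, "Location")}
    print(score(gold, predicted))
    # {'precision': 0.5, 'recall': 0.5, 'f1': 0.5}
```

The other criteria would be benchmarked separately; for example, the scalability vs response time tradeoff could be probed by timing each tool over corpora of increasing size.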

