A review of ontology editor evaluation Presenter: Yujie Cao


Paper Sharing

Outline
01 Research Method
02 Main Findings
03 Recommendations
04 Questions

01 Research Method: literature review

02 Main Findings: an overview of ontology editor evaluation
01 No standard evaluation approach exists (based on an analysis of the evaluation content and process in each study)
02 The main evaluation parameters and methods (an integrative analysis across the reviewed studies)
03 An evaluation framework and benchmark

02 Main Findings: parameters and methods

Evaluation parameters:
- General: reliability, simplicity, learnability, inference, libraries
- Ontology Edit: knowledge representation, creating taxonomies, creating relations, content reviewing, zooming
- Cooperative: interoperability, discussion threads, tracking of changes, clipping of views

Evaluation methods:
- Testing: co-discovery learning
- Inspection: heuristic evaluation, cognitive walkthroughs
- Inquiry: questionnaire, field observation, interview

02 Main Findings: framework and benchmark
A standard should contain:
1) a set of common steps
2) questions / a checklist / heuristics
3) a standard scoring formula

03 Recommendations
1) A common evaluation process for ontology editors
2) The related information that needs to be collected
3) Possible comparative tables for ontology editors
4) Questions that need to be considered in an evaluation

Step 1: Identify evaluation objectives
Step 2: Introduce the tool to users
Step 3: Use the ontology editors
Step 4: Obtain feedback
Step 5: Compute the evaluation score

Example comparative table (illustrative scores):

                              Protégé        OntoSaurus    ……
User domain                   5 (medical)    1 (unrelated)  —
User experience               1              …
Specific parameters (ontologies, taxonomies, relations, inference, interoperability, tracking)
Logger (recorded mistakes)    -1             -2
Overall parameters:
  reliability                 4              3
  simplicity                  5              …
Total score                   8.5            7

Scoring formula:
01 Set specific levels for each parameter and assign a value to each level
02 Set a weight for each parameter based on the evaluation objectives
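The scoring scheme above (per-parameter level values combined with objective-dependent weights) can be sketched as follows. This is a minimal illustration, not the paper's actual formula: the parameter names, level values, and weights are assumptions chosen for the example.

```python
def weighted_score(values, weights):
    """Combine per-parameter level values into a single score,
    weighting each parameter by how important it is to the
    evaluation objectives. Weights are normalized so that
    scores stay on the same scale as the level values."""
    total_weight = sum(weights[p] for p in values)
    return sum(values[p] * weights[p] for p in values) / total_weight

# Illustrative level values (1-5 scale) for two editors.
protege = {"reliability": 4, "simplicity": 5, "interoperability": 3}
ontosaurus = {"reliability": 3, "simplicity": 4, "interoperability": 2}

# Hypothetical weights reflecting the evaluation objectives (step 02).
weights = {"reliability": 0.5, "simplicity": 0.3, "interoperability": 0.2}

print(weighted_score(protege, weights))     # 4.1
print(weighted_score(ontosaurus, weights))  # 3.1
```

Changing the weights changes the ranking, which is why the slide ties the weights to the evaluation objectives rather than fixing them once for all evaluations.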

03 Recommendations: questions
1) What are the key features that should be analyzed?
2) What are the appropriate number and levels of participants?
3) How can the parameters be assessed through tasks?
4) How can a standard be abstracted from evaluations with different objectives?
5) What is the relationship between the evaluation result of an ontology editor and the needs of its users?

04 Questions about this paper
1) Are there any quantitative parameters in the evaluation of ontology editors?

2) Are there any specialized evaluation technologies or methods for ontology editors?

Thank you! Yujie Cao, Joint PhD student