Web Search Results Visualization: Evaluation of Two Semantic Search Engines Kalliopi Kontiza, Antonis Bikakis


Web Search Results Visualization: Evaluation of Two Semantic Search Engines Kalliopi Kontiza, Antonis Bikakis, University College London, Department of Information Studies

Overview
● Semantic search engines improve the accuracy of search results:
- by understanding the meaning and context of terms as they appear in web documents,
- by using semantics to represent and process the user's queries and the web data.
● Other parameters that define the quality of a search engine:
- its performance,
- its usability,
- the presentation of the search results.

Overview
“Do semantic search engines improve the visualization of search results, and if so, how does this enhance the search experience?”

Structure of the Presentation
● Methodology
● Background information (InfoVis)
● Experiment
○ 1. Analytical Inspection
○ 2. The User Evaluation
● Results of the User Evaluation
● Discussion

Methodology
● An analytical inspection: a heuristic evaluation based on ‘the Visual Information-Seeking Mantra’ (Shneiderman, 1996)
● A user-oriented evaluation study
● Interactive Information Retrieval (IIR) systems evaluated: the semantic search engines Sig.ma and Kngine

1. Background Information: Information Visualization
- Works as an umbrella for all kinds of visualizations
- Best applied to exploratory tasks
- Ultimate purpose: amplify cognition
- Requires well-formed data

2. Analytical Inspection
Task-domain information actions, supported by an information visualization system, that users wish to perform:
1. Overview
2. Details on demand
3. Filter out/Highlight
4. Relate
5. History
6. Export
➔ Features of the SERP interface, layout of the SUIs:
- Control (i.e. “more”)
- Input (i.e. search box)
- Personalised (i.e. move content)
- Informational (i.e. result item)
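The analytical inspection above can be sketched as a simple checklist that scores an engine against Shneiderman's task-domain actions. This is a minimal, hypothetical sketch: the `inspect` helper and the example scores are illustrative assumptions, not the study's actual inspection data.

```python
# Hypothetical sketch of the analytical-inspection checklist: each engine is
# checked against the six task-domain actions from Shneiderman's mantra.
# The example scores below are placeholders, not the study's findings.
CRITERIA = ["Overview", "Details on demand", "Filter out/Highlight",
            "Relate", "History", "Export"]

def inspect(scores):
    """scores: dict mapping criterion -> bool (does the engine support it?).
    Returns the list of supported criteria and the coverage ratio."""
    supported = [c for c in CRITERIA if scores.get(c)]
    return supported, len(supported) / len(CRITERIA)

# Illustrative inspection of a hypothetical engine
example = {"Overview": True, "Details on demand": True,
           "Filter out/Highlight": True, "Relate": False,
           "History": False, "Export": True}
supported, coverage = inspect(example)
```

Walking both engines through the same checklist makes the comparison explicit: each criterion is either supported by the interface or not, and the coverage ratio gives a rough summary per engine.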

2. Analytical Inspection (Questionnaire videos)

3. Design & Set-up of the User Evaluation
A. Variables
a. Dependent:
i. Task-domain information actions:
1. Overview
2. Details on demand
3. Filtering out
4. Relate
5. History
6. Export
ii. User Satisfaction
b. Independent:
Predefined queries:
a) Web: Transactional, Navigational
b) Informational: Factual, Source

3. Design & Set-up of the User Evaluation
B. Questionnaire
- Online, closed-type questions, 5-point Likert scale
- Pre-assigned queries presented in a playlist of videos
- Sections: A. Introduction, B. Evaluation, C. About
C. Sample
- 83 participants
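A questionnaire of this shape is typically summarised per criterion by the mean rating and the share of "top-two-box" (4 or 5) answers, which is how figures like "73% graded it with 4 and 5" arise. The sketch below is a minimal, hypothetical illustration; the `summarise_likert` helper and the sample responses are assumptions, not the study's data.

```python
# Hypothetical sketch: summarising 5-point Likert responses from a
# closed-type questionnaire. The sample ratings are illustrative only.
from collections import Counter

def summarise_likert(ratings):
    """Return (mean rating, share of 4-or-5 'top-two-box' answers)."""
    counts = Counter(ratings)
    mean = sum(ratings) / len(ratings)
    top_two = (counts[4] + counts[5]) / len(ratings)
    return round(mean, 2), round(top_two, 2)

# Example: ten hypothetical responses for one criterion (e.g. "Overview")
sample = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]
mean, top_two = summarise_likert(sample)  # (3.9, 0.7)
```

Reporting the top-two-box share alongside the mean keeps the summary robust to the ordinal (non-interval) nature of Likert data.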

4. Results of the User Evaluation
● 55% male, 45% female
● 34% age group, 51% age group, 11% age group
● 67% had used more than one search engine
● 86% rated their search skills 4 or 5 on the 5-point Likert scale

Comparative presentation of user ratings for the visualization of the task-domain information action criteria

4. Results of the User Evaluation
● Informational tasks received 71% and 61%
● Visualization was ranked 4th; 73% graded it 4 or 5 on the 5-point Likert scale
● Perceived user satisfaction:
o good satisfaction with History and Export, but
o higher expectations for Overview and Details on demand

5. Discussion
Q1. The visualization of the search results in semantic search engines improves the understanding of data and supports the user in assessing search results.
- Offers a different perspective from which to view the data
- Non-linear and dynamic visualization was favoured

5. Discussion
Q2. Semantic search engines make more effective use of visualization in displaying search results, providing a better user experience.
- Visualization was ranked important in tasks such as Overview, Details on demand, Filter out and Relate
- Additional visual representations require careful consideration

5. Discussion
Q3. Semantics improve the visualization of search results.
- Users can filter out results because the data is semantically organised into properties and values
- The visualization of that task received high preference amongst users

5. Discussion
Q4. The visualization of search results in semantic search engines provides a better search experience and thus greater user satisfaction.
- Users were satisfied in general with the visualizations of the semantic search engines
- Visualization of search results plays a significant role in shifting users' searching behaviour

Conclusions
● While the visualization methods used by semantic search engines improve user understanding of the results, the extent to which visualization is used in such search engines can be improved further.
● The user experience was rated positively, but user satisfaction was not achieved in all cases.

Further questions to investigate
A more in-depth analysis needs to be performed on the collected data:
● Are there any differences in the results of the user evaluation for the different types of queries, considering the type of data that is searched or the complexity of the query?
● Is there any correlation, for example, between the user characteristics and the obtained data?
● Could a standardized cognitive and ability test help us further investigate the relationship between information visualization in semantic search engines and knowledge visualization?

Thank you for your attention