
1 Review Spotlight: A User Interface for Summarizing User-generated Reviews Using Adjective-Noun Word Pairs Koji Yatani, Michael Novati, Andrew Trusty, and Khai N. Truong CHI 2011, May 7-12, 2011, Vancouver, BC, Canada

2 Online Reviews Online reviews: a wealth of perspectives about products (and stores) The number of reviews for any given entity often extends well beyond what users can read quickly User-generated reviews also vary greatly in length, detail, and focus compared with reviews written by professional editors These issues make it difficult for users to quickly and easily glean useful details about an entity

3 Filtering? Filtering reviews based on user ratings (e.g., the reviewer's overall 1-5 rating) But users want to know why the reviewer gave that rating, which still requires reading the review Filtering out reviews also discards information that could be important to the user

4 Tag Cloud? Based simply on the frequency of words appearing in reviews (makes it possible to learn general information about a restaurant) Problem: specific details are hard to get an overview of from word frequencies alone

5 Review Spotlight Concept Design

6 Why Tag Cloud? Four user tasks for which a tag cloud could be useful (Rivadeneira et al. 2007) –Searching: finding a particular term, often as a means to navigate to more detailed info about the item –Browsing: casually exploring information without seeking any specific terms –Impression formation: building an impression about the entity that the tag cloud visualizes –Recognizing: providing additional information about the entity to support the user in identifying the entity she is seeking

7 Effects of Tag Cloud Visual Features Words in larger fonts and those located at the upper-left corner of the tag cloud were easier to recall (Rivadeneira et al.) Font size and font weight had a strong effect on word selection, but color did not (Bateman et al.) Alphabetical ordering contributed to faster user performance than random ordering in all layouts (Halvey et al.)

8 Evaluation: exploratory, predictive, formative, summative Robert Stake's soup analogy: –When the cook tastes other cooks' soups, that's exploratory. –When the cook predicts the quality of a soup from a recipe, that's predictive. –When the cook tastes his own soup while making it, that's formative. –When the guests (or food critics) taste the soup, that's summative. Evaluation Comes in Many Guises, Keith Andrews, BELIV’08 Workshop, CHI 2008

9 Evaluation: exploratory, predictive, formative, summative Exploratory: Exploratory evaluation provides evidence of how an interface is used and what it is used for. Predictive: Predictive evaluation produces an estimate of user performance based on an interface design. Formative: Formative evaluation provides design feedback, often in the form of a list of problems and recommended solutions. Summative: Summative evaluation provides an overall assessment of a single interface or a comparison of multiple interfaces, often in the form of numerical data which is statistically analyzed. Evaluation Comes in Many Guises, Keith Andrews, BELIV’08 Workshop, CHI 2008


11 Formative Study 8 participants who were regular computer users Given reviews for four different places (2 restaurants, 2 hotels), each with more than 30 reviews Asked to describe their impression of each place

12 Formative Study Key user behaviors found: –Formulating and adjusting an impression Overall rating Rating distribution Photograph –Verbalizing impression with short phrases Descriptive info about the venue (e.g., “Asian food”) Subjective opinion statement (e.g., “good steak”)

13 Formative Study Design implications –The system should help the user gain a quick overview of the comments –The system should allow the user to examine the context of the comments further –Showing short phrases could accelerate impression formation and decision making

14 Review Spotlight Prototype Review Spotlight: a tag-cloud-like interface High-level design

15 Review Spotlight Prototype 1. Extract adjective-noun word pairs 2. Count each word pair's occurrences 3. Perform a sentiment analysis of each word pair 4. Display the word pairs

16 Review Spotlight Prototype 1. Extract “adjective-noun” word pairs –Remove noise such as articles –Use a Part-of-speech (POS) Tagger –Pair a noun with the closest adjective (in a given sentence) –Example: “The food is great” => “great food”
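The slide does not say which tokenizer or POS tagger the prototype used; the following is a minimal Python sketch, assuming NLTK's default tools, that pairs each noun with the closest adjective in the same sentence. The function name and the exact pairing rule are illustrative assumptions, not the authors' implementation.

import nltk
# One-time setup (assumption: NLTK data not yet installed):
# nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')

def extract_pairs(sentence):
    """Return (adjective, noun) pairs, pairing each noun with its closest adjective."""
    tagged = nltk.pos_tag(nltk.word_tokenize(sentence))
    # e.g. [('The', 'DT'), ('food', 'NN'), ('is', 'VBZ'), ('great', 'JJ')]
    adjectives = [(i, w.lower()) for i, (w, t) in enumerate(tagged) if t.startswith('JJ')]
    nouns = [(i, w.lower()) for i, (w, t) in enumerate(tagged) if t.startswith('NN')]
    pairs = []
    for n_idx, noun in nouns:
        if adjectives:
            # pick the adjective whose position in the sentence is closest to the noun
            _, adj = min(adjectives, key=lambda a: abs(a[0] - n_idx))
            pairs.append((adj, noun))
    return pairs

print(extract_pairs("The food is great"))  # [('great', 'food')]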

17 Review Spotlight Prototype 2. Count each word pair's occurrences –Count the number of occurrences of each word pair and group word pairs by noun –Font size: a noun's size reflects its total number of occurrences; an adjective's size reflects the number of occurrences of the word pair; sizes range from 10 to 30 pixels (min to max)
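A minimal sketch of the counting step, assuming the (adjective, noun) pairs produced by the extraction sketch above. The slide gives only the 10-30 pixel range, so the linear scaling between the minimum and maximum counts is an assumption.

from collections import Counter, defaultdict

def count_pairs(pairs):
    """Count pair occurrences and group them by noun."""
    pair_counts = Counter(pairs)                       # occurrences of each (adjective, noun) pair
    noun_counts = Counter(noun for _, noun in pairs)   # occurrences of each noun
    by_noun = defaultdict(list)
    for (adj, noun), c in pair_counts.items():
        by_noun[noun].append((adj, c))
    return noun_counts, by_noun

def font_size(count, min_count, max_count, lo=10, hi=30):
    """Map an occurrence count onto the 10-30 pixel range (linear scaling assumed)."""
    if max_count == min_count:
        return (lo + hi) / 2
    return lo + (hi - lo) * (count - min_count) / (max_count - min_count)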

18 Review Spotlight Prototype 3. Perform a sentiment analysis of each word pair –Sentiment analysis Use the SentiWordNet dictionary –Each word has three sentiment scores (i.e., positivity, negativity, objectivity) Calculate the sentiment value for an adjective by averaging the sentiment values over all of its use contexts Calculate the sentiment value for a noun by averaging the sentiment values of all paired adjectives, weighted by the number of occurrences –Visualization Green, red, and blue shades represent positive, negative, and neutral meanings, respectively The darkness of the shade conveys the sentiment strength
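A minimal sketch of the sentiment step, assuming NLTK's SentiWordNet corpus reader. Averaging an adjective's score over all of its adjective senses stands in here for the "all use contexts" averaging on the slide, and may differ from the prototype's exact calculation.

from nltk.corpus import sentiwordnet as swn
# One-time setup: nltk.download('wordnet'); nltk.download('sentiwordnet')

def adjective_sentiment(adj):
    """Average (positivity - negativity) over all adjective senses of the word."""
    senses = list(swn.senti_synsets(adj, 'a'))
    if not senses:
        return 0.0  # treat unknown words as neutral
    return sum(s.pos_score() - s.neg_score() for s in senses) / len(senses)

def noun_sentiment(adj_counts):
    """Occurrence-weighted average over the adjectives paired with this noun.
    adj_counts: list of (adjective, count), as produced by count_pairs() above."""
    total = sum(c for _, c in adj_counts)
    if total == 0:
        return 0.0
    return sum(adjective_sentiment(a) * c for a, c in adj_counts) / total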

19 Review Spotlight Prototype 4. Display the word pairs –Place pairs randomly so that the user is not biased toward any specific terms based on their placement –Include padding around each word pair –Combine up to the four adjectives most frequently paired with each noun
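A minimal sketch of the display step: nouns are shuffled so that placement does not bias the reader, and at most the four adjectives most frequently paired with each noun are kept. Padding and the actual pixel layout (handled by the rendering layer, e.g. HTML/CSS) are omitted here.

import random

def layout_word_pairs(by_noun, max_adjectives=4):
    """Return (noun, [adjectives]) entries in random order, keeping the top adjectives."""
    nouns = list(by_noun.keys())
    random.shuffle(nouns)  # random placement order to avoid positional bias
    entries = []
    for noun in nouns:
        top = sorted(by_noun[noun], key=lambda ac: ac[1], reverse=True)[:max_adjectives]
        entries.append((noun, [adj for adj, _ in top]))
    return entries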

20 User Study How does Review Spotlight address user requirements for an interface summarizing user-generated reviews? How does Review Spotlight support impression formation?

21 User Study Participants –10 participants (5 males, 5 females) –Between the ages of 20 and 50 –With a variety of backgrounds –None had participated in the formative study –Each session lasted approximately 50 minutes –$20 cash compensation

22 Two restaurant reviews displayed side-by-side: a) Review pages; b) Review Spotlight.

23 User Study Procedure –PA1~PA4: 4 pairs of restaurants with similar overall review ratings –PB1~PB4: 4 pairs of restaurants with different overall review ratings (one high, one low) –PA1, PA2, PB1, PB2: use both Review Spotlight and the normal review pages –PA3, PB3: use only Review Spotlight –PA4, PB4: use only the normal review pages

24 User Study Apparatus –Stored all Review Spotlight interactions on the computer –Cached all the pages on the computer –Laptop (Windows Vista) –30.5 × 19.0 cm, 1280 × 800 pixel display –Mouse

25 User Study Results Performance Time –Review pages 157 seconds on average (SD=63 seconds) –Review Spotlight 122 seconds on average (SD=49 seconds)

26 User Study Results Forming detailed impressions using Review Spotlight –Same choices: 30 out of 40 cases –Different choices: 10 out of 40 cases –Review Spotlight helped users uncover specific information that is important for decision making –For decision making, people used the existing review pages for the rating and the number of reviews, and Review Spotlight for specific details

27 User Study Results Quantitative analysis of Review Spotlight usage –People typically scan a tag cloud rather than read it while performing a search task (Halvey 2007, WWW) –Users tended to read the word pairs: the average number of intentional hover events was 35.3 (SD=2.5)

28 User Study Results Quantitative analysis of Review Spotlight usage –Avg. number of click events: 10.0 –54.8% occurred on the adjective that appeared by default; the rest occurred on hidden adjectives. –28.4% of the intentional hover events resulted in a click –Review Spotlight encouraged the users to explore more details about how the words were used in the actual reviews

