
1 Effect of Shared-attention on Human-Robot Communication Written by Junyi Yamato, Kazuhiko Shinozawa, Futoshi Naya Presentation by Bert Gao

2 Introduction
- Agents and robots are being developed that can serve as human communication partners.
- Similarities and differences exist between agents and robots.
- Experimental condition: the robot shares the same space with users in order to act as a good persuader.
- Aim: measure the effect of shared-attention on human-robot communication through an experiment.

3 Introduction (cont.)

4 Experiment
- Color sample plates and color names: Subjects looked at a color sample plate and were asked to choose its name from two candidates. All colors in the task were ambiguous, and some names were unfamiliar to ordinary people (e.g., carmine or vermilion for a bright red plate), so the answer was not obvious and most subjects had no prior reference.
- Candidate color names were chosen so that the expected average matching ratio would be around 0.5 under the no-recommendation condition.
- The same candidates were recommended to all subjects.
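To make the matching-ratio measure concrete, here is a minimal Python sketch. It assumes the matching ratio is the fraction of trials in which the subject's chosen name equals the robot's recommended name; the trial data and function name are hypothetical, not taken from the paper.

```python
# Minimal sketch (assumption): matching ratio = fraction of trials where the
# subject's chosen color name equals the robot's recommended name.
# The trial tuples below are made up for illustration.

def matching_ratio(trials):
    """trials: list of (recommended_name, chosen_name) pairs for one subject."""
    matches = sum(1 for recommended, chosen in trials if recommended == chosen)
    return matches / len(trials)

# Toy example; each subject actually saw 30 plates in the experiment.
trials = [("carmine", "carmine"), ("vermilion", "carmine"), ("azure", "azure")]
print(matching_ratio(trials))  # ~0.67 for this toy example
```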

5 Experiment (cont.)
- Subjects: 28 people (14 male, 14 female), aged 21 to 29 (mean 24.0, standard deviation 1.83). Each subject saw 30 color plates in total, presented in the same order for all subjects.
- TEG (Tokyo University Egogram): Subjects were required to take the TEG personality-profiling test. The TEG consists of 60 questions and measures five personality factors: CP (critical parent), NP (nurturing parent), A (adult), FC (free child), and AC (adapted child). Each factor is scaled from 0 to 20.
- Achievement of shared-attention: The robot's gaze direction was labeled as subject, color plate, button box, or other; the subject's gaze direction was labeled as robot face, robot (other), color plate, button box, or other. When the robot and the subject looked in the same direction, shared-attention was considered to be achieved.
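The shared-attention measure above can be illustrated with a short Python sketch that accumulates the time during which the robot's and the subject's gaze labels coincide. The frame rate, the label strings, and the rule that only a common external target (color plate or button box) counts as shared attention are assumptions for illustration, not details from the paper.

```python
# Sketch: accumulate shared-attention (SA) time from per-frame gaze labels.
# Assumptions: 30 fps annotation, and SA only when both look at the same
# external target (color plate or button box).

FRAME_DT = 1.0 / 30.0
SHARED_TARGETS = {"color plate", "button box"}

def shared_attention_time(robot_gaze, subject_gaze):
    """robot_gaze / subject_gaze: equal-length lists of per-frame target labels."""
    sa_frames = sum(
        1
        for r, s in zip(robot_gaze, subject_gaze)
        if r == s and r in SHARED_TARGETS
    )
    return sa_frames * FRAME_DT  # seconds of shared attention

robot_gaze = ["subject", "color plate", "color plate", "button box"]
subject_gaze = ["robot face", "color plate", "color plate", "other"]
print(shared_attention_time(robot_gaze, subject_gaze))  # 2 frames -> ~0.07 s
```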

6 Experiment (cont.)
- Robot: a head robot with a human-face-tracking feature, named Kismet, built at the MIT AI Laboratory. It has two eyes, each with a video camera, eyelids, a mouth with expressive lips, two fan-like ears, and a movable neck.
- The vision system can extract and track the skin-color region so that eye contact with the subject can be established.
- Speech was generated by the "Fluet" Japanese speech synthesizer.
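For intuition only, here is a rough sketch of skin-color region extraction of the kind described above, written with OpenCV. This is not Kismet's actual vision code; the HSV threshold values and the centroid-based tracking are illustrative assumptions.

```python
# Illustrative sketch of skin-color region tracking (not Kismet's actual code).
# The HSV thresholds are rough guesses, not calibrated values.

import cv2
import numpy as np

def skin_region_centroid(frame_bgr):
    """Return the (x, y) centroid of the largest skin-colored region, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 180, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    # The centroid could then be used to steer the robot's gaze toward the face.
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```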

7 Result
- Matching ratio: 0.57, higher than in the no-recommendation condition, but the difference was not statistically significant.
- There was no correlation between the matching ratio and the amount of shared-attention time across all subjects.
- A high-AC (adapted child) group of subjects (AC factor greater than 11) showed a strong correlation between the matching ratio and shared-attention time (Spearman's r = 0.51, p = 0.051).
- Comparing the high-SA and low-SA groups among high-AC subjects, the matching ratio of the former was higher, and the difference was statistically significant by t-test (p < 0.05).
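To show the shape of the reported analyses, here is a short Python sketch that computes a Spearman rank correlation and a two-sample t-test with scipy on synthetic data. The numbers are invented and do not reproduce the study's values; the median split into high-SA and low-SA groups is an assumed grouping rule.

```python
# Sketch of the reported statistics on synthetic data (not the study's data).

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sa_time = rng.uniform(30, 70, size=14)                      # hypothetical SA time (s)
matching = 0.4 + 0.004 * sa_time + rng.normal(0, 0.05, 14)  # hypothetical ratios

# Spearman rank correlation between SA time and matching ratio.
rho, p_rho = stats.spearmanr(sa_time, matching)
print(f"Spearman r = {rho:.2f}, p = {p_rho:.3f}")

# Median split (assumed rule) into high-SA and low-SA groups, then a t-test.
median_sa = np.median(sa_time)
high_sa = matching[sa_time >= median_sa]
low_sa = matching[sa_time < median_sa]
t, p_t = stats.ttest_ind(high_sa, low_sa)
print(f"t = {t:.2f}, p = {p_t:.3f}")
```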

8 Discussion
- Shared-attention is important for human-robot communication.
- Shared-attention had not only positive effects but also negative effects.
- The robot can pretend to follow a user's gaze by using an information source other than the actual eye direction in real time.

Group     Matching Ratio   Shared-attention Time (sec)
High-AC   0.59             50.2
Low-AC    0.54             53.4

9 Conclusion
The amount of time for which shared-attention was achieved has a positive correlation with the strength of its effect on human decision-making.

10 Evaluation
Cons
- Results are limited to a specific type of subject (the high-AC group).
- The subject pool was limited: ages 21-29, average age 24.0.
- Shared-attention was measured from face direction rather than actual eye direction.
Pros
- Uses the TEG (Tokyo University Egogram) to categorize the subjects.
- Pseudo shared-attention, in which the robot pretends to follow a user's gaze by using an information source other than actual eye direction in real time, reduces image acquisition and processing.

11 Future work
- Combining eye-direction tracking with face-direction tracking would measure shared-attention more accurately.
- Apply the findings to the design of robot communication partners (instructor, assistant, consultant, etc.) to improve communication between humans and robots.

12 Kismet
Video: Vocal turn taking
Kismet homepage at MIT: http://www.ai.mit.edu/projects/humanoid-robotics-group/kismet/kismet.html

