eWatchdog: An Electronic Watchdog for Unobtrusive Emotion Detection based on Usage Analysis
Rayhan Shikder, Department of Computer Science and Engineering (CSE), BUET

Background
Emotion is perhaps the most distinctive attribute of living beings, and it is equally critical to detect and to generate artificially. Emotion detection is a classical, well-explored problem: there exist many approaches for determining human emotions based on facial expression analysis [1], thermal imaging of faces [2], gesture and pose tracking [3], voice intonation [4], etc.

Motivation and Contribution
Motivation: Existing emotion detection approaches
1. demand additional infrastructure such as webcams or body-mounted hardware [5], which is often intrusive, and
2. require specialized information such as voice, gestures, or facial expressions, which is not always available.
Our contribution: We propose a novel emotion detection system that detects emotion from widely-used electronic devices,
1. demanding no additional infrastructure, and
2. exploiting conventionally available usage data.

Working Procedure of Our Study
Figure 1: Our proposed approach for emotion detection (block diagram; components: User, Emotion Induction System, Survey Inputs, Usage Data, Feature Extraction, Features, Feature-Emotion Mapping, Emotion Learning System, Emotion Classification).

Building Blocks of Our Study
1. Emotion Classes: The target emotion classes in our study are happiness, sadness, fear, anger, disgust, and surprise.
2. Emotion Induction System: A survey system comprising videos, audios, images, and news texts that are used to induce different classes of emotion in a participant. The survey contents are presented to the participant, who is prompted to give feedback on them; providing feedback through a device, in turn, yields usage data.
3. Usage Data: Comprises different data pertinent to keyboard and mouse usage, such as typing speed, dwelling time, clicking speed, etc.
4. Feature Extraction: A module that runs in the background, captures the user's significant usage data (the features), and logs it every 5 seconds (a sketch of such a logger follows this list).
5. Feature-Emotion Mapping: A module that clusters features based on the corresponding emotions (an illustrative sketch appears after the references).
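The poster gives no implementation details for the Feature Extraction module beyond what is listed above, so the following is a minimal sketch of how such a background logger could look, assuming the pynput library for the global keyboard and mouse hooks; the feature names (typing speed, dwelling time, clicking speed) and the 5-second logging window come from the building blocks above, while all function and variable names are illustrative.

```python
# Minimal sketch of a background usage logger (not the authors' code).
# Assumes the pynput library for global keyboard/mouse hooks.
import time
import threading
from collections import deque
from pynput import keyboard, mouse

LOG_INTERVAL = 5.0        # the poster logs features every 5 seconds

press_times = {}          # key -> press timestamp, used for dwelling time
dwell_times = deque()     # completed press-to-release durations
key_count = 0             # keystrokes in the current window
click_count = 0           # mouse clicks in the current window
lock = threading.Lock()

def on_press(key):
    global key_count
    with lock:
        key_count += 1
        press_times[key] = time.time()

def on_release(key):
    with lock:
        started = press_times.pop(key, None)
        if started is not None:
            dwell_times.append(time.time() - started)

def on_click(x, y, button, pressed):
    global click_count
    if pressed:              # count button-down events only
        with lock:
            click_count += 1

def snapshot():
    """Return one feature row for the last window and reset the counters."""
    global key_count, click_count
    with lock:
        mean_dwell = sum(dwell_times) / len(dwell_times) if dwell_times else 0.0
        row = {
            "typing_speed": key_count / LOG_INTERVAL,      # keys per second
            "dwelling_time": mean_dwell,                   # mean seconds per key
            "clicking_speed": click_count / LOG_INTERVAL,  # clicks per second
        }
        key_count, click_count = 0, 0
        dwell_times.clear()
    return row

kb = keyboard.Listener(on_press=on_press, on_release=on_release)
ms = mouse.Listener(on_click=on_click)
kb.start()
ms.start()

while True:                   # runs in the background, as the poster describes
    time.sleep(LOG_INTERVAL)
    print(snapshot())         # the real module would append this row to a log file
```

In the actual study each logged row would presumably also carry the emotion class currently being induced by the survey system, since labeled rows are exactly what the Feature-Emotion Mapping module consumes.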
Figure 2: Data logged after feature extraction.

Current Progress of Our Study
We are developing the survey system for our study. We have collected related videos, audios, images, and news texts for the survey system. We have developed the Feature Extraction module.

Scope and Applications of Our Study
Our study can facilitate making electronic devices more human-like, which will lead towards
- making personal computers and cellphones emotional counterparts, and
- devising virtual psychologists that aid psychiatric patients through electronic devices.

Conclusion and Future Work
Detecting emotion is always of utmost interest, yet performing this task from widely available usage data has received little attention in the literature. We perform a study to detect emotion from the widely available usage data of electronic devices. We plan to develop the Feature-Emotion Mapping module and to conduct a real survey using our developed system. In the future, we plan to create emotion in personal computers based on the findings of our study.

References
[1] S. Ioannou, A. Raouzaiou, V. Tzouvaras, T. Mailis, K. Karpouzis, and S. Kollias, "Emotion recognition through facial expression analysis based on a neurofuzzy method," Neural Networks, vol. 18.
[2] S. G. Kong, J. Heo, B. Abidi, J. Paik, and M. Abidi, "Recent advances in visual and infrared face recognition: a review," Computer Vision and Image Understanding, vol. 97, no. 1.
[3] H. Gunes and M. Pantic, "Dimensional emotion prediction from spontaneous head gestures for interaction with sensitive artificial listeners," Proc. Int'l Conf. Intelligent Virtual Agents.
[4] O. W. Kwon, K. Chan, J. Hao, and T. W. Lee, "Emotion recognition by speech signals," Proc. EUROSPEECH.
[5] U. Dimberg, "Facial electromyography and emotional reactions," Psychophysiology, 1990.
[6] R. W. Picard, "What does it mean for a computer to 'have' emotions?," MIT Media Laboratory.
[7] A. F. M. Nazmul Haque Nahin, Jawad Mohammad Alam, Hasan Mahmud, and Kamrul Hasan, "Identifying emotion by keystroke dynamics and text pattern analysis," Taylor & Francis Online.
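The poster describes Feature-Emotion Mapping only as clustering features by the corresponding emotions, without naming a method. As one possible reading, the hypothetical sketch below groups the logged feature vectors by the emotion label under which they were recorded, summarizes each group by its centroid, and maps a new usage sample to the emotion with the nearest centroid; the feature layout and all numbers are invented for illustration.

```python
# Hypothetical sketch of the feature-emotion mapping step (assumed method,
# not the authors' module): per-emotion centroids over logged feature rows.
import numpy as np

EMOTIONS = ["happiness", "sadness", "fear", "anger", "disgust", "surprise"]

def fit_centroids(features, labels):
    """features: (n, d) array of usage vectors; labels: induced emotions."""
    labels = np.asarray(labels)
    return {e: features[labels == e].mean(axis=0)
            for e in EMOTIONS if (labels == e).any()}

def classify(centroids, x):
    """Map one feature vector to the emotion with the closest centroid."""
    return min(centroids, key=lambda e: np.linalg.norm(x - centroids[e]))

# Invented example rows of (typing_speed, dwelling_time, clicking_speed):
X = np.array([[4.2, 0.08, 0.6], [4.0, 0.09, 0.5],   # logged while inducing happiness
              [1.1, 0.20, 0.1], [1.3, 0.22, 0.2]])  # logged while inducing sadness
y = ["happiness", "happiness", "sadness", "sadness"]
centroids = fit_centroids(X, y)
print(classify(centroids, np.array([1.2, 0.21, 0.15])))  # -> "sadness"
```

A nearest-centroid rule is only the simplest stand-in here; the learned per-emotion clusters could equally feed any off-the-shelf classifier once the survey data has been collected.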