HCI in Ubiquitous Computing

HCI in Ubiquitous Computing
Hyun Seung Yang (양현승), AIM (AI & Media) Lab, Department of Electrical Engineering and Computer Science, KAIST
hsyang@cs.kaist.ac.kr http://mind.kaist.ac.kr

Contents
- HCI in U-C
- Embedding Interaction
- U-C HCI Research

Ubiquitous Computing
Ubiquitous computing is the method of enhancing computer use by making many computers available throughout the physical environment, but making them effectively invisible to the user (Mark Weiser, Xerox PARC).

Ubiquitous Computing
- We are surrounded by computing
- Computing and processing are embedded into everyday devices
- There are many computers/processors per person
- Information access and communication are possible virtually everywhere
- Dedicated computing devices – information appliances – are all around us
- Devices can be connected and networked
What gets us here?

Ubiquitous Computing
Mark Weiser: computers enter everyday life and help people with everyday tasks in the office and at home (at any time, any place).
"A good tool is an invisible tool. By invisible, I mean that the tool does not intrude on your consciousness; you focus on the task, not the tool." [Weiser 94]

HCI Themes with U-Life
Three past interaction themes:
- Natural Interfaces
- Context-Aware Interaction
- Automated Capture & Access to Life Experiences
New interaction theme proposed:
- Everyday Computing

Natural Interfaces
Forms of natural interfaces:
- Speech, gestures, handwriting (pen-based/free-form)
Issues encountered:
- Need a way to represent information with the new interface
- Error-prone: even humans can't perfectly read handwriting

Context-Aware Interaction
What is the appropriate context to use?
- Current: user and location
- Future: time, history, other users
How to represent this context?
- Incorporate hierarchical info and relations
Truly ubiquitous?
- Limitation of many technologies

Context-Aware Interaction
(Diagram: context dimensions – location, identity, objects)
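
To make the representation question above concrete, here is a minimal sketch of a hierarchical context record covering identity, location, time, and nearby objects. The class and field names are illustrative assumptions, not something defined in the original slides.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class Location:
    """Hierarchical location: building > room, so queries can match at either level."""
    building: str
    room: Optional[str] = None

    def contains(self, other: "Location") -> bool:
        # A building-level location "contains" any room inside that building.
        return self.building == other.building and (self.room is None or self.room == other.room)

@dataclass
class Context:
    """One observation of a user's situation: identity, location, time, nearby objects."""
    user_id: str
    location: Location
    timestamp: datetime = field(default_factory=datetime.now)
    nearby_objects: List[str] = field(default_factory=list)
    history: List["Context"] = field(default_factory=list)

# Example: the user is in room 314 of the CS building, next to a lamp and a TV.
ctx = Context("hsyang", Location("CS Building", "314"), nearby_objects=["lamp", "tv"])
assert Location("CS Building").contains(ctx.location)
```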

Everyday Computing: Things to Be Considered
- No clear beginning & end to activities
- Interruption is expected
- Multiple activities operate concurrently
- Time is an important discriminator
- Associative models of information

Embedding Interaction

U-Life?
(Diagram: web servers, electronic servers, mobile browsers, web browsers)

Change of UI Paradigm
From a single screen-based UI to interaction with a number of U-devices (distributed + interconnected).
Ubiquitous computing:
- Highly personal and mobile appliances
- Systems that are integrated into the everyday environment

User Interface in U-C
Requirements:
- Distribution of UI: all U-devices are distributed
- Implicit HCI:
  - To reduce the need for explicit HCI
  - To let explicit interfaces virtually disappear into the environment
  - Awareness of the situation, the environment, and the aims of the user
  - Being noticed only when needed

User Interface in U-C
Current interaction – explicit HCI:
- By command line
- By direct manipulation using a GUI, gesture, or speech input
Interaction in U-C – implicit HCI:
- Allows the computer to interpret the user's behavior and the surrounding situation and use this information as input

What is different between traditional 'HCI' and 'HCI in UbiComp'?
- Output modalities: not just an audio-visual channel – all senses!
- Input modalities: more than pressing buttons and moving an object in two dimensions
- Distribution – physical and conceptual
- Magic beyond the screen … it is a vivid physical relationship

Development Process? Research Approach?
- No longer just designing and programming a GUI
- Interdisciplinary teams – ethnography, design, CS
It is about creating an experience by:
- Understanding the interaction and process
- Designing and constructing a set of devices and an environment
- Implementing the human-information interface based on the created devices/environment
- Testing it yourself
- Testing it with users … then going back to refine the hardware and starting again

Prototypes
- Functional prototypes are essential to learn, understand, and experience how to interact with the ubiquitous computer
- From the idea to knowledge: prototyping has been central to hallmark research in the area (e.g., ParcTab, ActiveBadge)
- Learning occurs along the prototyping process as well as in use
Evaluation
- Functional prototypes are the means for evaluation
- "Confronting" real people – already with version 0.001
- Deployment in a living-lab environment: facilitating everyday environments with real users

The Ubi-Comp Environment Is Itself the Interface
- Everyday objects augmented with sensing: table, chairs, glasses, …
- Creating a digital shadow reflecting the interaction

Embedding Interaction
Basic technologies for embedding interaction:
- Sensing technologies:
  - Environmental conditions
  - Users' location
  - Co-location with others
  - Physiological and emotional state of the user
  - User goals
  - User schedules
  - …
- Agent technologies:
  - Combining a multitude of sometimes contradictory inputs to make sense at a higher level (see the sketch below)
  - Adapting the system's output to be appropriate to whatever situation might arise
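
A minimal sketch of the "combining contradictory inputs" idea: several location sensors report conflicting estimates with confidences, and a simple confidence-weighted vote produces a single higher-level context value. The sensor names and the voting rule are illustrative assumptions, not part of any system described in the slides.

```python
from collections import defaultdict
from typing import List, Tuple

# Each reading: (sensor_name, estimated_room, confidence in [0, 1]).
Reading = Tuple[str, str, float]

def fuse_location(readings: List[Reading]) -> str:
    """Combine contradictory sensor readings by confidence-weighted voting."""
    scores = defaultdict(float)
    for _, room, confidence in readings:
        scores[room] += confidence
    return max(scores, key=scores.get)

# Example: IR badge and Wi-Fi disagree with the camera; the weighted vote decides.
readings = [
    ("ir_badge", "living_room", 0.9),
    ("wifi", "living_room", 0.6),
    ("camera", "kitchen", 0.7),
]
print(fuse_location(readings))  # -> "living_room"
```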

Implicit Interaction (1/2)
Implicit Human-Computer Interaction (iHCI): iHCI is the interaction of a human with the environment and with artifacts, aimed at accomplishing a goal. Within this process the system acquires implicit input from the user and may present implicit output to the user.
Implicit input: actions and behaviour of humans that are performed to achieve a goal and are not primarily regarded as interaction with a computer, but are captured, recognized, and interpreted by a computer system as input.
Implicit output: output of a computer that is not directly related to an explicit input and which is seamlessly integrated with the environment and the task of the user.
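
To make these definitions concrete, here is a small, hedged sketch of an implicit-interaction loop: the user's ordinary behaviour (walking into a dark room in the evening) is captured as implicit input, and the system responds with implicit output (raising a lamp) without any explicit command. The device names and thresholds are invented for illustration.

```python
class Lamp:
    def __init__(self):
        self.level = 0

    def set_level(self, level: int) -> None:
        # Implicit output: woven into the environment, no dialog or confirmation.
        self.level = level
        print(f"lamp level -> {level}")

def on_motion_detected(room_lux: float, hour: int, lamp: Lamp) -> None:
    """Implicit input handler: the user just walked in; no command was issued."""
    if room_lux < 50 and 18 <= hour <= 23:   # dark evening (illustrative thresholds)
        lamp.set_level(70)                    # soft light, not full brightness

# Simulated sensor event: motion at 21:00 in a dark room.
on_motion_detected(room_lux=12.0, hour=21, lamp=Lamp())
```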

U-C HCI Research

OXYGEN Project (MIT)
Speech and vision technologies enable us to communicate with Oxygen as if we were interacting with another person, saving much time and effort.

Aware Home (Georgia Tech)
Designing the interactive experience:
- Digital Family Portrait: reconnects geographically distant extended family members by allowing them to remain aware of each other in a non-obtrusive, lightweight manner
- What Was I Cooking?: a context-aware system that captures the transient information of recent activities and passively displays it as visual cues
- Gesture Pendant: recognizes and then translates gestures into commands for home appliances
An Aware Home with human-like perception could improve quality of life for many, especially seniors.

EasyLiving (1) – Microsoft
EasyLiving is developing a prototype architecture and technologies for building intelligent environments.
Key features:
- Computer vision for person tracking and visual user interaction
- Multiple sensor modalities combined
- Use of a geometric model of the world to provide context
- Automatic or semi-automatic sensor calibration and model building
- Fine-grained events and adaptation of the user interface
- Device-independent communication and data protocols
- Ability to extend the system in many ways
(System architecture diagram: rules engine, person tracker/detector, seat sensors, PC logon, fingerprint authentication, room lights, A/V media systems, terminal server control UI, keyboard/mouse redirect, desktop manager, world model, agent lookup – grouped into person tracking, world model, room control, and authentication)
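
A hedged sketch of how a rules engine might sit on top of a world model like the one EasyLiving describes: the world model holds tracked people and device states, and simple rules map state changes to room-control actions. The rule form and device names are assumptions for illustration, not the actual EasyLiving API.

```python
from typing import Callable, Dict, List

class WorldModel:
    """World state: where each tracked person is, plus device states."""
    def __init__(self):
        self.person_location: Dict[str, str] = {}   # person id -> room
        self.devices: Dict[str, bool] = {"living_room_lights": False}

Rule = Callable[[WorldModel], None]

def lights_follow_person(world: WorldModel) -> None:
    # Room-control rule: keep the lights on while anyone is in the living room.
    occupied = "living_room" in world.person_location.values()
    world.devices["living_room_lights"] = occupied

class RulesEngine:
    def __init__(self, rules: List[Rule]):
        self.rules = rules

    def on_update(self, world: WorldModel) -> None:
        # Re-evaluate every rule whenever the person tracker updates the world model.
        for rule in self.rules:
            rule(world)

world = WorldModel()
engine = RulesEngine([lights_follow_person])
world.person_location["person_1"] = "living_room"   # report from the person tracker
engine.on_update(world)
print(world.devices["living_room_lights"])  # -> True
```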

EasyLiving (2) – Person Detection and Tracking
Person detection:
- Stereo processing with commercial software (color and depth images, patches, people)
- Background subtraction and person detection
- Reports sent to the central person tracker at about 7 Hz
Person tracking:
- Process each new report from a sensor against past locations and the predicted location
- New sensor measurements; "person creation zone"
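
A small sketch of the tracking step described above: each incoming detection report is matched to the nearest predicted track position, or used to start a new track if it falls in a "person creation zone". The distance gate, the prediction rule, and the zone geometry are illustrative assumptions.

```python
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]

class Track:
    def __init__(self, position: Point):
        self.past: List[Point] = [position]

    def predicted(self) -> Point:
        # Trivial prediction: assume the person stays where last seen.
        return self.past[-1]

    def update(self, measurement: Point) -> None:
        self.past.append(measurement)

def in_creation_zone(p: Point) -> bool:
    # "Person creation zone": a strip near the doorway (illustrative).
    return p[0] < 1.0

def process_report(tracks: List[Track], measurement: Point, gate: float = 0.8) -> None:
    """Associate one detection report with an existing track or create a new one."""
    best: Optional[Track] = None
    best_d = gate
    for t in tracks:
        d = math.dist(t.predicted(), measurement)
        if d < best_d:
            best, best_d = t, d
    if best is not None:
        best.update(measurement)
    elif in_creation_zone(measurement):
        tracks.append(Track(measurement))

tracks: List[Track] = []
for report in [(0.5, 2.0), (0.9, 2.3), (1.4, 2.6)]:   # successive sensor reports
    process_report(tracks, report)
print(len(tracks), tracks[0].past)  # one track, three positions
```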

HomeLab (Philips Research)
- Appearance: looks and feels like a regular home, for testing new home-technology prototypes in the most realistic way possible
- Projects: WWICE, PHENOM, EASY ACCESS
- POGO: an interactive game for children – a virtual story world interfaced by active tools
- Intelligent Personal-Care Environment: based on measurements from an activity monitor and a heart-rate sensor

KAIST AIM Lab Research

Role of the Wearable Computer in a Ubiquitous Computing Environment
- It easily acquires personal data (personalization)
- It guarantees the safety of personal data (privacy)
- It enhances the user's interaction with many devices
- It reduces network traffic from transmitting personal data
- It assists us in our work (agent)
The wearable computer plays an important role in a ubiquitous computing environment. It is relatively easy for a wearable computer to acquire personal data because it is worn on the body, and it keeps personal data safe since only the user can log in to the system. It also reduces the interference we experience from many devices. In addition, because personal data is managed by the wearable computer and transmitted only to the devices that require it, network traffic is reduced. Finally, it assists us in our work as an intelligent agent.

Background
- Various electronic media will be scattered around us in the near future (ubiquitous computing environments)
- We will frequently interact with those media (and this interaction can become a burden)
- A system that assists us in interacting with those media in our daily life is required
We already live in a complicated environment where various electronic devices are used very frequently, and in the future it will become even more complicated; interacting with all of those devices will become a burden. A system that assists us in interacting with many devices will therefore be indispensable. This system should understand a user's intention or preference and communicate with different electronic media. We have chosen the wearable computer as this system.

Research Objective
♦ To establish two concepts: IEM and IWAS
♦ To propose an IWAS prototype and IEM prototypes
♦ To demonstrate the interaction of IWAS and IEM
Our research objective is to establish two concepts, IEM and IWAS, to be used in a ubiquitous computing environment, and to propose an IWAS prototype and IEM prototypes. By demonstrating the interaction process of IWAS and IEM, we present the possibility of unifying wearable computing technology and ubiquitous computing technology.

IEM – Interactive Electronic Media
Electronic media in a ubiquitous computing environment that are not only controlled by a user's commands but that also respond to context or the user's emotional state.
IEM are controllable electronic media in a ubiquitous computing environment that are not only controlled by a user's commands but also respond to context or the user's emotional state. For example, in a home networking environment, we can control a TV, a PC, a lamp, and so on with one control system if they are wirelessly controllable.
(Diagram: wireless control of IEM)

IEM
IEM examples:
- IEM-encapsulated electronic appliances such as a TV, a video player, a radio, a computer, etc.
- Responsive digital media: interactive media artworks
- All objects with embedded computer chips or sensors: an automatic curtain that rises or falls according to a user's intention or preference; a lamp that intelligently controls the intensity of light according to a user's emotional state
IEM features (see the sketch below):
- Wireless control => ultimately, automation (agent system)
- Unique ID
- Interaction capability
As IEM examples, we can consider electronic appliances such as a TV, a radio, or a computer. But an electronic appliance by itself is not an IEM, because it cannot interact with people on its own; we must attach some devices to it. Responsive digital media are typical IEM, and IEM can include all objects with embedded computer chips or sensors if they can respond to a user's wireless commands. An automatic curtain that rises or falls according to the user's intention or preference and a lamp that intelligently controls the light intensity according to the user's emotional state are also IEM. IEM requires a wireless control method and, ultimately, an automatic control method; an IEM can turn power on or off or adjust volume according to the user's command, and it must have its own ID.
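
A minimal sketch of what the IEM features above could look like as an interface: a unique ID, wireless command handling, and a hook for responding to sensed context. The class and method names are invented for illustration; the slides do not specify the actual protocol.

```python
from abc import ABC, abstractmethod

class IEM(ABC):
    """Interactive Electronic Media: uniquely identified, wirelessly controllable."""

    def __init__(self, device_id: str):
        self.device_id = device_id   # unique ID feature

    @abstractmethod
    def handle_command(self, command: str) -> None:
        """Explicit wireless control (e.g., from an IR remote or LAN message)."""

    @abstractmethod
    def on_context(self, context: dict) -> None:
        """Interaction capability: respond to sensed context or emotional state."""

class SmartLamp(IEM):
    def __init__(self, device_id: str):
        super().__init__(device_id)
        self.intensity = 0

    def handle_command(self, command: str) -> None:
        if command == "power_on":
            self.intensity = 100
        elif command == "power_off":
            self.intensity = 0

    def on_context(self, context: dict) -> None:
        # Dim the light when the user seems tired (illustrative rule).
        if context.get("emotion") == "tired":
            self.intensity = 30

lamp = SmartLamp("lamp-01")
lamp.handle_command("power_on")
lamp.on_context({"emotion": "tired"})
print(lamp.device_id, lamp.intensity)  # lamp-01 30
```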

IWAS – Intelligent Wearable Assistance System
IWAS is a kind of wearable computer that can sense, control, and communicate with different IEM. Its main objective is to provide intuitive and convenient communication between a user and IEM; through the IWAS interface, a user can easily control and interact with various IEM. Even if electronic media become more intelligent than before, a system that controls electronic media will still be necessary, and we expect the wearable computer to play that role in a ubiquitous computing environment.

IWAS H/W Design

IWAS H/W Design
- Self-contained system to wear: integrating all components of the wearable computer with a suit
- User-friendly interface:
  - Input: speech recognition, keypad, mouse, etc.
  - Output: see-through HMD, small speakers
- Various sensors: FSR and postural sensing unit, infrared tag reading unit
- Wireless networking: wireless LAN, IEEE 802.11b

Functions of IWAS
Intelligent user assistance:
- Local identification using an IR sensor device
- Direct control of IEM using an IR remote controller
- Communication via wireless LAN or Bluetooth
- Information services such as schedule alerts and email checking
- Interacting with media

Functions of IWAS
(Diagram: UbiComp environment – home network (HomeRF, IEEE 802.11, ···) with home gateway, TV, PC, audio player, phone, lamp; IWAS provides sensing, control, and an intelligent agent)
- Sensing: IEM identification using an IR sensor
- Control: wireless control using an IR remote controller
- Intelligent agent: providing a personalized information service
IWAS must provide the following functions: first, for sensing, IEM identification is required; second, for control, wireless control is required; and last, IWAS must provide a personalized information service like an intelligent agent. A sketch of this sense-then-control flow follows.
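
A hedged sketch of the sensing-then-control flow just described: the IR tag reader identifies which IEM the user is facing, and IWAS then sends that device a command over a mocked wireless link. The registry, tag format, and command names are illustrative assumptions, not the actual IWAS implementation.

```python
from typing import Dict, Optional

class WirelessLink:
    """Stand-in for the IR remote / wireless LAN channel to the home network."""
    def send(self, device_id: str, command: str) -> None:
        print(f"-> {device_id}: {command}")

class IWAS:
    def __init__(self, link: WirelessLink, registry: Dict[str, str]):
        self.link = link
        self.registry = registry          # IR tag id -> IEM device id
        self.current_device: Optional[str] = None

    def on_ir_tag(self, tag_id: str) -> None:
        # Sensing: the IR tag reader identifies the IEM the user is near/facing.
        self.current_device = self.registry.get(tag_id)

    def command(self, verb: str) -> None:
        # Control: forward the (speech- or keypad-issued) command to that IEM.
        if self.current_device is not None:
            self.link.send(self.current_device, verb)

# CASE 2 from the slides, sketched: the user faces the TV and says "power on".
iwas = IWAS(WirelessLink(), registry={"tag-42": "tv-livingroom"})
iwas.on_ir_tag("tag-42")
iwas.command("power_on")   # -> tv-livingroom: power_on
```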

IWAS H/W Prototype
(Photos: see-through HMD with speech headset, IR tag reader & FSR sensor, IR tag reader & IR remote controller, 3-axis postural sensor, IWAS suit)
This slide shows the developed IWAS H/W prototype and a few of its sensors and devices.

Interaction with IWAS and IEM
CASE 1: Operating a laptop computer
CASE 2: Turning on a TV

Interaction with IWAS and IEM
CASE 3: Controlling a virtual system