HCI in Ubiquitous Computing


1 HCI in Ubiquitous Computing
Hyun-Seung Yang (양현승), AIM (AI & Media) Lab, Dept. of Electrical Engineering & Computer Science, KAIST

2 Contents
HCI in U-C
Embedding Interaction
U-C HCI Research

3 Ubiquitous Computing Ubiquitous computing is the method of enhancing computer use by making many computers available throughout the physical environment, but making them effectively invisible to the user (Mark Weiser, Xerox PARC) KAIST, AIM Lab

4 Ubiquitous Computing
We are surrounded by computing:
Computing and processing are embedded into everyday devices
There are many computers/processors per person
Information access and communication are possible virtually everywhere
Dedicated computing devices (information appliances) are all around us
Devices can be connected and networked
What gets us here?

5 Ubiquitous Computing
Mark Weiser: computers enter everyday life
Help people with everyday tasks in the office and at home (at any time, any place)
"A good tool is an invisible tool. By invisible, I mean that the tool does not intrude on your consciousness; you focus on the task, not the tool." [Weiser 94]

6 HCI themes with U-Life
Three past interaction themes:
Natural Interfaces
Context-Aware Interaction
Automated Capture & Access to Life Experiences
New interaction theme proposed: Everyday Computing

7 Natural Interfaces
Forms of natural interfaces: speech, gestures, handwriting (pen-based/free-form)
Issues encountered:
Need a way to represent information with the new interface
Error-prone; even humans can't perfectly read handwriting

8 Context-Aware Interaction
What is appropriate context to use?
Current: user and location
Future: time, history, other users
How to represent this context? Incorporate hierarchical info and relations
Truly ubiquitous? A limitation of many technologies

9 Context-Aware Interaction
[Diagram: context dimensions include location, identity, and objects]

10 Everyday Computing: Things to be Considered
No clear beginning & end to activities
Interruption is expected
Multiple activities operate concurrently
Time is an important discriminator
Associative models of information

11 Embedding Interaction

12 U-Life?
[Diagram: Web servers and electronic servers connected to mobile browsers and Web browsers]

13 Change of UI Paradigm
From a single screen-based UI to interacting with a number of U-devices (distributed + interconnected)
Ubiquitous computing:
Highly personal and mobile appliances
Systems that are integrated into the everyday environment

14 User interface in U-C
Requirements:
Distribution of UI: all U-devices are distributed
Implicit HCI:
To reduce the need for explicit HCI
To let explicit interfaces virtually disappear into the environment
Awareness of the situation, the environment, and the aims of the user
Being noticed only when needed

15 User interface in U-C
Current interaction: explicit HCI
By command line
By direct manipulation using a GUI, gesture, or speech input
Interaction in U-C: implicit HCI
Implicit HCI allows the computer to interpret the user's behavior and the surrounding situation and use this information as input
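The contrast between the two interaction styles can be sketched in a few lines of Python (a minimal illustration; all function names, commands, and the lux threshold are hypothetical):

```python
# Explicit HCI: the user issues a command; the system obeys it directly.
# Implicit HCI: the system interprets observed behavior plus context as input.

def explicit_hci(command: str) -> str:
    """The user states exactly what should happen."""
    actions = {"lights on": "lights -> on", "lights off": "lights -> off"}
    return actions.get(command, "unknown command")

def implicit_hci(behavior: str, context: dict) -> str:
    """The system infers intent from behavior and situation; no command given."""
    if behavior == "enters room" and context.get("ambient_lux", 1000) < 50:
        return "lights -> on"      # dark room + presence implies "needs light"
    if behavior == "leaves room":
        return "lights -> off"     # absence implies the light is no longer needed
    return "no action"

print(explicit_hci("lights on"))                          # the user asked
print(implicit_hci("enters room", {"ambient_lux": 10}))   # the system inferred
```

The implicit branch never sees a command string; the "input" is the captured behavior and context, which matches the slide's definition.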

16 What differs between traditional 'HCI' and 'HCI in UbiComp'?
Output modalities: not just an audio-visual channel, but all the senses!
Input modalities: more than pressing buttons and moving an object in two dimensions
Distribution: physical and conceptual
Magic beyond the screen … it is a vivid physical relationship

17 Development Process? Research Approach?
No longer just designing and programming a GUI
Interdisciplinary teams: ethnography, design, CS
It is about creating an experience by:
Understanding the interaction and process
Designing and constructing a set of devices and an environment
Implementing the human-information interface based on the created devices/environment
Testing it yourself
Testing it with users … then going back to refine the hardware and starting again

18 Prototypes
Functional prototypes are essential to learn, understand, and experience how to interact with the ubiquitous computer
From the idea to knowledge:
Prototyping has been central to hallmark research in the area (e.g., ParcTab, ActiveBadge)
Learning occurs along the prototyping process as well as in use
Evaluation:
Functional prototypes are the means for evaluation
"Confronting" real people, already with version 0.001
Deployment in a living-lab environment: facilitating everyday environments with real users

19 Ubi-Comp Environment is itself the Interface
Everyday objects augmented with sensing: tables, chairs, glasses
Creating a digital shadow reflecting the interaction

20 Embedding Interaction
Basic technologies for embedding interaction:
Sensing technologies:
Environmental conditions
Users' location
Co-location with others
Physiological and emotional state of the user
User goals
User schedules
Agent technologies:
Combining a multitude of sometimes contradictory inputs to make sense of them at a higher level
Adapting a system's output to be appropriate to whatever situation might arise
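The agent-side task of combining contradictory sensor inputs can be sketched as confidence-weighted voting (a toy sketch; the sensors, states, and confidence values are invented for illustration, not taken from any real system):

```python
# Several sensors each report a claimed user state with a confidence in [0, 1].
# The reports sometimes contradict each other; the agent sums the confidence
# supporting each candidate state and picks the best-supported one.

from collections import defaultdict

def fuse(reports: list[tuple[str, float]]) -> str:
    """Each report is (claimed_state, confidence). Returns the winning state."""
    support: dict[str, float] = defaultdict(float)
    for state, confidence in reports:
        support[state] += confidence
    return max(support, key=support.get)

# The calendar and location sensor disagree with the microphone:
reports = [
    ("in_meeting", 0.8),   # calendar: a meeting is scheduled now
    ("in_meeting", 0.6),   # location: user is in the meeting room
    ("on_phone",   0.5),   # microphone: speech detected nearby
]
print(fuse(reports))  # -> in_meeting (support 1.4 vs 0.5)
```

Real systems use richer models (Bayesian fusion, temporal smoothing), but the principle of resolving contradictions at a higher level is the same.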

21 Implicit Interaction (1/2)
Implicit Human-Computer Interaction (iHCI): the interaction of a human with the environment and with artifacts, aimed at accomplishing a goal. Within this process the system acquires implicit input from the user and may present implicit output to the user.
Implicit input: actions and behaviour of humans that are performed to achieve a goal and are not primarily regarded as interaction with a computer, but are captured, recognized, and interpreted by a computer system as input.
Implicit output: output of a computer that is not directly related to an explicit input and that is seamlessly integrated with the environment and the task of the user.

22 U-C HCI Researches

23 OXYGEN Project (MIT)
Speech and vision technologies enable us to communicate with Oxygen as if we were interacting with another person, saving much time and effort

24 AwareHome (Georgia Tech): Designing the Interactive Experience
Digital Family Portrait: reconnects geographically distant extended family members by allowing them to remain aware of each other in a non-obtrusive, lightweight manner
What Was I Cooking?: a context-aware system that captures the transient information of recent activities and passively displays it as visual cues
Gesture Pendant: recognizes and then translates gestures into commands for your home appliances
An AwareHome with human-like perception could improve quality of life for many, especially seniors

25 Easy Living (1) (Microsoft)
EasyLiving is developing a prototype architecture and technologies for building intelligent environments
Key features:
Computer vision for person tracking and visual user interaction
Multiple sensor modalities combined
Use of a geometric model of the world to provide context
Automatic or semi-automatic sensor calibration and model building
Fine-grained events and adaptation of the user interface
Device-independent communication and data protocols
Ability to extend the system in many ways
[System architecture diagram: rules engine, person tracker and detector, seat sensors, fingerprint PC logon, room lights, A/V media systems, terminal-server control UI, KB/mouse redirect, desktop manager, world model, agent lookup]

26 Easy Living (2) (Microsoft)
Person detection:
Stereo processing with commercial software
Background subtraction and person detection
Reports sent to the central person tracker at about 7 Hz
Person tracking, on each new report from a sensor, uses:
Past locations
Predicted location
The new sensor measurement
A "person creation zone"
[Diagram: color and depth patches segmented into people]
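The tracking loop on this slide can be sketched as nearest-neighbor data association with a creation zone (a simplified illustration under assumed parameters; the gate distance, zone rectangle, and data layout are hypothetical, not Microsoft's implementation):

```python
# Each new sensor report is matched to the nearest predicted person location.
# A report that matches no existing track creates a new person, but only if
# it falls inside a "person creation zone" such as a doorway; otherwise it
# is discarded as noise.

import math

GATE = 1.0                         # max distance (m) for a report to match a track
CREATION_ZONE = ((0, 0), (1, 1))   # doorway rectangle: (min corner, max corner)

def in_creation_zone(pos):
    (x0, y0), (x1, y1) = CREATION_ZONE
    return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1

def process_report(tracks: dict, report_pos: tuple) -> dict:
    """tracks maps person id -> predicted position; returns updated tracks."""
    if tracks:
        pid, pos = min(tracks.items(),
                       key=lambda kv: math.dist(kv[1], report_pos))
        if math.dist(pos, report_pos) <= GATE:
            tracks[pid] = report_pos            # update the matched track
            return tracks
    if in_creation_zone(report_pos):            # unmatched: maybe a new person
        tracks[f"person{len(tracks) + 1}"] = report_pos
    return tracks                               # else: discard as noise

tracks = process_report({}, (0.5, 0.5))     # new person appears in the doorway
tracks = process_report(tracks, (0.9, 0.8)) # same person moved; track updated
print(tracks)
```

The creation zone is what keeps spurious background-subtraction blobs in the middle of the room from spawning phantom people.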

27 HomeLab (Philips Research)
Appearance: looking and feeling like a regular home, for testing Philips' new home-technology prototypes in the most realistic possible way
Projects: WWICE, PHENOM, EASY ACCESS
POGO: an interactive game for children; a virtual story world interfaced by active tools
Intelligent Personal-Care Environment: based on measurements from the activity monitor and heart-rate sensor

28 KAIST AIM Lab Research

29 Role of Wearable Computer in Ubiquitous Computing Environment
It easily acquires personal data (personalization)
It guarantees safety of personal data (privacy)
It enhances the user's interaction with many devices
It reduces network traffic from transmitting personal data
It assists us in our work (agent)
The wearable computer plays an important role in a ubiquitous computing environment. It is relatively easy for the wearable computer to acquire personal data because it is worn on the human body, and it guarantees the safety of personal data since only the user can log in to the system. It also reduces how much we are interrupted by many devices. In addition, because personal data is managed by the wearable computer and transmitted only to the devices that require it, network traffic can be reduced. Finally, it assists us in our work as an intelligent agent.

30 Background
Various electronic media will be scattered around us in the near future (ubiquitous computing environments)
We will frequently interact with those media (and may feel much annoyed by this interaction)
A system that assists us in interacting with those media in our daily life is required
This system should understand a user's intention or preference
This system should communicate with various electronic media
We are now living in a complicated environment where various electronic devices are used very frequently. In the future, the environment will become even more complicated, and we will feel much annoyed by interacting with those devices. A system that assists us in interacting with many devices will be indispensable. It should be able to understand a user's intention or preference and to communicate with different electronic media. We have chosen a wearable computer as this system.

31 Research Objective
♦ To establish two concepts: IEM and IWAS
♦ To propose an IWAS prototype and IEM prototypes
♦ To demonstrate interaction of IWAS and IEM
Our research objective is to establish two concepts, IEM and IWAS, for use in a ubiquitous computing environment, and to propose an IWAS prototype and IEM prototypes. By demonstrating the interaction process of IWAS and IEM, we will present the possibility of unifying wearable computing technology and ubiquitous computing technology.

32 IEM: Interactive Electronic Media
Electronic media in a ubiquitous computing environment that are not only controlled by a user's command but that also respond to context or the user's emotional state
For example, in a home-networking environment, we can control a TV, a PC, a lamp, and so on through one control system if they are wirelessly controlled.
[Diagram: wireless control of IEM]

33 IEM
IEM examples:
Encapsulated electronic appliances, such as a TV, a video player, a radio, a computer, etc.
Responsive digital media: interactive media artworks
All objects with embedded computer chips or sensors: an automatic curtain that rises or falls according to a user's intention or preference; a lamp that intelligently controls the intensity of light according to a user's emotional state
IEM features:
Wireless control => ultimately, automation (agent system)
Unique ID
Interaction capability
As IEM examples, we can think of electronic appliances such as a TV, a radio, or a computer. The electronic appliance itself, however, is not IEM: appliances cannot interact with people on their own, so we must attach devices to them. Responsive digital media are typical IEM, and IEM can include any object with embedded computer chips or sensors if it can respond to a user's wireless command. An automatic curtain that rises or falls according to the user's intention or preference and a lamp that intelligently controls the light intensity according to the user's emotional state are also IEM. IEM requires a wireless control method and, ultimately, an automatic control method. IEM can turn power on or off and volume up or down according to the user's command, and each IEM must have its own ID.
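The IEM features listed above (unique ID, wireless control, power/volume commands) imply a small addressed-command protocol. The sketch below illustrates one possible message format; the field names, separator, and device IDs are hypothetical, not the protocol actually used in the project:

```python
# Every IEM carries a unique ID; a wireless command addresses that ID with an
# action ("power" or "volume") and a value ("on"/"off" or "up"/"down").

from dataclasses import dataclass

@dataclass
class IEMCommand:
    device_id: str   # unique ID of the target IEM (e.g. its IR tag value)
    action: str      # "power" or "volume"
    value: str       # "on"/"off" or "up"/"down"

    def encode(self) -> str:
        """Serialize the command for the wireless link."""
        return f"{self.device_id}|{self.action}|{self.value}"

def decode(frame: str) -> IEMCommand:
    """Parse a received frame back into a command."""
    device_id, action, value = frame.split("|")
    return IEMCommand(device_id, action, value)

cmd = IEMCommand("TV-001", "power", "on")
print(decode(cmd.encode()))   # round-trips through the wire format
```

Because each command names a device ID, the same control system can drive a TV, a PC, and a lamp over one wireless link, which is exactly the home-networking scenario described above.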

34 IWAS: Intelligent Wearable Assistance System
IWAS is a kind of wearable computer that can sense, control, and communicate with different IEM. Its main objective is to provide intuitive and convenient communication between a user and IEM. Through the IWAS interface, a user can easily control and interact with various IEM.
Although electronic media will be more intelligent than before, a system that controls electronic media will still be necessary. We expect the wearable computer to play that role in a ubiquitous computing environment.

35 IWAS H/W Design

36 IWAS H/W Design
Self-contained system to wear: integrating all components of the wearable computer with a suit
User-friendly interface:
input: speech recognition, key-pad, mouse, etc.
output: see-through HMD, small speakers
Various sensors:
FSR and postural sensing unit
Infra-red tag reading unit
Wireless networking: wireless LAN, IEEE 802.11b

37 Functions of IWAS: Intelligent User Assistance
Local identification using an IR sensor device
Direct control of IEM using an IR remote controller
Communication via wireless LAN or Bluetooth
Information services, such as schedule alerts and checks on interacting media

38 Functions of IWAS
[Diagram: UbiComp environment with a home network (HomeRF, IEEE 802.11, ···) linking a home G/W, TV, PC, audio player, phone, and lamp]
Sensing: IEM identification using an IR sensor
Control: wireless control using an IR remote controller
Intelligent agent: providing a personalized information service
IWAS must be able to perform the following functions. First, for sensing, IEM identification is required. Second, for control, wireless control is required. Finally, IWAS must provide a personalized information service, like an intelligent agent.
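The three functions on this slide chain together into one loop: sense the nearby IEM from its IR tag, control it wirelessly, and let the agent add personalized information. A toy sketch (the tag table, preference table, and all names are invented for illustration, not taken from the actual prototype):

```python
# IWAS loop: sensing -> control -> intelligent agent.

IR_TAG_TABLE = {0x21: "TV", 0x22: "Lamp", 0x23: "AudioPlayer"}  # assumed IDs

def sense(ir_tag: int) -> str:
    """Sensing: identify the nearby IEM from an IR tag reading."""
    return IR_TAG_TABLE.get(ir_tag, "unknown")

def control(device: str, command: str) -> str:
    """Control: send a wireless command to the identified IEM (stubbed)."""
    return f"{device} <- {command}"

def agent(user: str, device: str) -> str:
    """Intelligent agent: personalized suggestion for this user and device."""
    preferences = {("alice", "TV"): "switch to the news channel"}  # assumed
    return preferences.get((user, device), "no suggestion")

device = sense(0x21)                  # the IR tag reader saw tag 0x21
print(control(device, "power on"))    # -> TV <- power on
print(agent("alice", device))         # -> switch to the news channel
```

The point of the sketch is the ordering: identification must precede control, and the agent layer personalizes on top of both.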

39 IWAS H/W Prototype
See-through HMD with speech headset
IR tag reader & IR remote controller
FSR sensor
3-axis postural sensor
IWAS suit
This slide shows the developed IWAS H/W prototype and a few of its sensors and devices.

40 Interaction with IWAS and IEM
CASE 1: Operating a laptop computer
CASE 2: Turning on a TV

41 Interaction with IWAS and IEM
CASE 3: Controlling a virtual system
