Chapter 5 Human Computer Interaction


1 Chapter 5 Human Computer Interaction
UbiCom Book Slides Chapter 5 Human Computer Interaction Stefan Poslad Ubiquitous computing: smart devices, environments and interaction

2 Ubiquitous computing: smart devices, environments and interaction
HCI: Overview This part (a) first discusses: What is Human Computer Interaction or Interfaces (HCI) and why do we need good HCI for human interactive systems? What is a sub-type of HCI, implicit HCI (iHCI), how is it differentiated from conventional explicit HCI (eHCI) and why do we need this to enhance pervasive computing? How to use eHCI in some common types of device? How to use iHCI in (mobile and static) devices that are not permanently attached to humans? How to use iHCI in (mobile and static) devices that accompany humans through being surface-mounted (wearable) or embedded (implants)? Ubiquitous computing: smart devices, environments and interaction

3 Ubiquitous computing: smart devices, environments and interaction
Chapter 5 Related Links iHCI is a type of context-awareness for the human environment (Chapter 7) Human behaviour models of intelligence (Chapter 8) Social & other consequences of making devices more human and more intelligent (Chapter 12) Ubiquitous computing: smart devices, environments and interaction

4 Ubiquitous computing: smart devices, environments and interaction
HCI: Overview The slides for this chapter are also expanded and split into several parts in the full pack Part A: eHCI use in some common smart device types Part B: iHCI for accompanied smart devices Part C: iHCI for wearable & implanted smart devices Part D: Human Centred Design Part E: User Models and iHCI Design Ubiquitous computing: smart devices, environments and interaction

5 Ubiquitous computing: smart devices, environments and interaction
HCI: Overview HCI, eHCI & iHCI  eHCI use in 4 Widely Used Devices iHCI use in accompanied smart devices iHCI use in wearable and implanted smart devices Human Centred Design (HCD) User Models: Acquisition & Representation iHCI Design Ubiquitous computing: smart devices, environments and interaction

6 Ubiquitous computing: smart devices, environments and interaction

7 Ubiquitous computing: smart devices, environments and interaction
HCI: Introduction The term HCI has been widely used since the onset of the Personal Computing era in the 1980s. However, the groundwork for the field of HCI started earlier, during the onset of the industrial revolution: tasks became automated and power-assisted -> triggered an interest in studying human-machine interaction Some tasks require little human interaction during operation, e.g., clothes- and dish-washing etc Other tasks are very interactive, e.g., face washing, playing the violin, etc Ubiquitous computing: smart devices, environments and interaction

8 Ubiquitous computing: smart devices, environments and interaction
H,C & I Basic concepts of HCI are: Humans Computers / devices Interaction Ubiquitous computing: smart devices, environments and interaction

9 Ubiquitous computing: smart devices, environments and interaction
HCI: Motivation Machines (systems) aid human performance, but systems that interact poorly with humans will be a poor human aid. Need design models & processes that are (user) interactive The motivation for HCI is clear: to support more effective use (Dix, 2004a) in three ways Useful: Usable: Be used: Ubiquitous computing: smart devices, environments and interaction

10 HCI: Usability vs. Usefulness
The success of a product depends largely on ? Summarised as Heckel's law and Heckel's inverse law: Heckel’s law: Heckel’s inverse law: What does this law express? Ubiquitous computing: smart devices, environments and interaction

11 Ubiquitous computing: smart devices, environments and interaction
Explicit HCI (eHCI) eHCI design: explicit interaction during a device’s normal operation. What are the Dominant eHCI UIs Pure eHCI Context-free Focus on H2C (Human-to-Computer) Interaction Ubiquitous computing: smart devices, environments and interaction

12 eHCI versus Natural Interaction
Natural interaction and familiarity and expertise Familiarity with use of tool is cultural and subjective Note also Natural Interaction linked to use of iHCI Ubiquitous computing: smart devices, environments and interaction

13 Ubiquitous computing: smart devices, environments and interaction
iHCI Concept of implicit HCI (iHCI) Proposed by Schmidt (2000) Defined as “an action, performed by the user that is not primarily aimed to interact with a computerized system but which such a system understands as input”. Our definition of iHCI is a bit different: inputs with an implicit or implied context. Ubiquitous computing: smart devices, environments and interaction

14 Ubiquitous computing: smart devices, environments and interaction
iHCI iHCI is more about C2H (Computer to Human) Interaction iHCI assumes C has a certain model of the H (user) Model of H used as additional input Need to share implicit context between human and system Implicit interaction naturally supports hidden device design. Ubiquitous computing: smart devices, environments and interaction
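To make the iHCI idea concrete, here is a minimal Python sketch (not from the book) of a system that combines an explicit event with an implicit user model and sensed context to adapt its behaviour; the UserModel fields and the incoming-call example are illustrative assumptions.

```python
# Minimal sketch (assumed example): an explicit event (incoming call) is
# handled using an implicit user model plus sensed context (iHCI).

from dataclasses import dataclass

@dataclass
class UserModel:
    prefers_quiet: bool = False   # learned implicitly, e.g. from past behaviour
    locale: str = "en"

def handle_incoming_call(user: UserModel, in_meeting: bool) -> str:
    """Explicit event + implicit context (user model, calendar) -> adapted output."""
    if in_meeting or user.prefers_quiet:
        return "vibrate"          # adapt the output without asking the user
    return "ring"

if __name__ == "__main__":
    print(handle_incoming_call(UserModel(prefers_quiet=True), in_meeting=False))
```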

15 eHCI + iHCI or iHCI vs eHCI
E.g.?? eHCI, usability design? Alternative iHCI design? A shift from eHCI design to also include iHCI design will be a key enabler for effective UbiCom systems Ubiquitous computing: smart devices, environments and interaction

16 Ubiquitous computing: smart devices, environments and interaction
iHCI: Challenges Complex to accurately and reliably determine user context. Why? Ubiquitous computing: smart devices, environments and interaction

17 Ubiquitous computing: smart devices, environments and interaction
Overview HCI, eHCI & iHCI eHCI use in 4 Widely Used Devices  iHCI use in accompanied smart devices iHCI use in wearable and implanted smart devices Human Centred Design (HCD) User Models: Acquisition & Representation iHCI Design Ubiquitous computing: smart devices, environments and interaction

18 How Device Interfaces & Interaction Varies
Devices can be characterized according to?: Ubiquitous computing: smart devices, environments and interaction

19 UI and HCI Designs for 4 Common Devices
PC Mobile Phone Games Console but many sub-types TV / Projectors How does the UI and HCI design differ between these? Ubiquitous computing: smart devices, environments and interaction

20 UI Type: Personal Computer Interface
??? Ubiquitous computing: smart devices, environments and interaction

21 Ubiquitous computing: smart devices, environments and interaction
PC UI use in Mobiles Using a conventional PC UI approach won’t be optimum for mobile computing & ubiquitous computing - need a different approach, Why? Ubiquitous computing: smart devices, environments and interaction

22 UI Type: Mobile Device Interfaces
PC / WIMP models are not so suitable for mobile (one-handed) devices. Why not? Ubiquitous computing: smart devices, environments and interaction

23 Mobile Device Interface: Limited I/P
How to support mobile user and small size of input? Ubiquitous computing: smart devices, environments and interaction

24 Mobile Device Interface: Limited O/P
How to overcome limited output? Haptic interface use, e.g., vibration to signal an incoming call Maximising use of small screen: scrolling, switching screens Peephole displays Foldable displays Filter information so that less information is received and displayed, e.g., using Personalisation (Chapter 7) Personal Agents (Chapter 8) Ubiquitous computing: smart devices, environments and interaction

25 UI Type: Games Console Interfaces
Games consoles: an important driver and can contribute to UbiCom in a number of ways. Computer games have often acted as an incubator for many innovations driving computing. How? Many different types of Games Console Interface Ubiquitous computing: smart devices, environments and interaction

26 Games Console Interfaces: D-pad
How does the D-pad controller work? Ubiquitous computing: smart devices, environments and interaction

27 Games Console Interfaces: 3D Gesture-Based
How does the 3D Gesture-Based controller work? Use of MEMS/ Sensors (Chapter 7) Use of gesture recognition (see later) Ubiquitous computing: smart devices, environments and interaction
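As a rough illustration of how MEMS accelerometer data can be turned into a gesture event, the following Python sketch thresholds the acceleration magnitude to spot a crude "shake"; the sample values and threshold are invented, and real gesture recognisers are far more sophisticated.

```python
# Illustrative sketch only: detecting a crude "shake" gesture from a stream of
# accelerometer samples (x, y, z in m/s^2). Real controllers fuse several MEMS
# sensors and use much more robust recognition.

import math

def is_shake(samples, threshold=15.0, min_peaks=3):
    """Count how often the acceleration magnitude jumps above a threshold."""
    peaks = 0
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude > threshold:
            peaks += 1
    return peaks >= min_peaks

# Hypothetical sample stream: mostly gravity (~9.8), with a few sharp spikes.
stream = [(0, 0, 9.8), (12, 5, 9.8), (0, 0, 9.8), (-14, 8, 9.8), (13, -6, 9.8)]
print(is_shake(stream))  # True with these made-up values
```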

28 UI Type: Control (Panel) Interfaces
Different Types of remote controllers depending on how remote the controller is: User approx. co-located with device being controlled User not co-located with device being Controlled Ubiquitous computing: smart devices, environments and interaction

29 UI Type: Localised Remote Control Interfaces
Characteristics Input controller and device separation Input device interfaces Wireless link between input control device and device Ubiquitous computing: smart devices, environments and interaction

30 UI Type: Localised Remote Control Interfaces
But there is a profusion of remote control devices with overlapping features Is it necessary to have a specialised controller per consumer device? Problems? How to solve this? Ubiquitous computing: smart devices, environments and interaction

31 Ubiquitous computing: smart devices, environments and interaction
UIs often introduce modes to minimise the number of UI controls (e.g., toggle buttons vs. radio buttons) AM and PM are modes! Up and down buttons introduce modes UIs that contain more controls & fewer modes appear more complex but are often easier to operate Changing the gestures that trigger actions (e.g., adaptive menus) is more useful if they are inclusive rather than exclusive Ubiquitous computing: smart devices, environments and interaction

32 Localised Remote Control Interface Design
Instructors can add more detail about the discussion and design of universal controller here or delete this slide. (Section 5.2.5) Ubiquitous computing: smart devices, environments and interaction

33 Ubiquitous computing: smart devices, environments and interaction
Overview HCI, eHCI & iHCI eHCI use in 4 Widely Used Devices iHCI use in accompanied smart devices  iHCI use in wearable and implanted smart devices Human Centred Design (HCD) User Models: Acquisition & Representation iHCI Design Ubiquitous computing: smart devices, environments and interaction

34 iHCI use in Accompanied Smart Devices: Topics
Single vs. Multi-Modal Visual Interfaces Gesture Interfaces Reflective versus Active Displays Combining Input and Output User Interfaces ??? Auditory Interfaces Natural Language Interfaces Ubiquitous computing: smart devices, environments and interaction

35 Single vs. Multi-Modal Visual Interfaces
Modes of human interaction use the human senses. Which? Interactive ICT systems have modalities that mimic human senses. What are they? Ubiquitous computing: smart devices, environments and interaction

36 Computer input & output modalities
Ubiquitous computing: smart devices, environments and interaction

37 Single vs. Multi-Modal Visual Interfaces
Many interactive ICT systems use single visual mode of output interaction. Problems? Solutions? Ubiquitous computing: smart devices, environments and interaction

38 Multi-Modal Interaction Design: challenges
Integrating multiple modes is complex. Why? Ubiquitous computing: smart devices, environments and interaction

39 Multi-Modal Interaction: Design
Two main approaches Data for each modality can be processed separately, then combined at the end. Data for each modality can be processed & combined concurrently Ubiquitous computing: smart devices, environments and interaction
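A toy Python sketch of the two approaches, using invented speech and gesture scores: the first function processes each modality separately and combines the results at the end (late fusion); the second combines the raw features first and processes them together (early fusion).

```python
# Sketch (assumed example, not from the book): two ways of combining speech
# and gesture evidence for the same command, mirroring the two approaches above.

def late_fusion(speech_scores, gesture_scores):
    """Process each modality separately, then combine the per-command scores."""
    return {cmd: 0.5 * speech_scores[cmd] + 0.5 * gesture_scores[cmd]
            for cmd in speech_scores}

def early_fusion(speech_features, gesture_features):
    """Concatenate raw features and process them with a single (toy) classifier."""
    combined = speech_features + gesture_features
    return "zoom" if sum(combined) > 1.0 else "select"

speech = {"zoom": 0.7, "select": 0.3}
gesture = {"zoom": 0.4, "select": 0.6}
print(late_fusion(speech, gesture))            # {'zoom': 0.55, 'select': 0.45}
print(early_fusion([0.7, 0.3], [0.4, 0.6]))    # 'zoom' (sum = 2.0 > 1.0)
```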

40 Ubiquitous computing: smart devices, environments and interaction
Gesture Interfaces What are Gestures? Expressive, meaningful body motions Involving physical movements. Which? With the intent of conveying meaningful information about interacting with the environment. Ubiquitous computing: smart devices, environments and interaction

41 Ubiquitous computing: smart devices, environments and interaction
Gesture Interfaces What are the main types of human gestures? How can gestures be sensed? Ubiquitous computing: smart devices, environments and interaction

42 Gesture Interfaces: Classification
Gestures can also be classified into 2D versus 3D Contactful versus Contactless Directly sensed versus indirectly sensed Ubiquitous computing: smart devices, environments and interaction

43 Gesture Interfaces: Applications
1st basic contact-based gesture interfaces? From the mid 2000s, contactless gestures have been used in several types of games consoles, mobile phones, cameras, etc. Ubiquitous computing: smart devices, environments and interaction

44 Gesture Interfaces: Applications
????? Ubiquitous computing: smart devices, environments and interaction

45 Gesture Interfaces: Applications
Gesture: Rotate or flip hand Action: Rotate or flip image Ubiquitous computing: smart devices, environments and interaction

46 Gesture Interfaces: Applications
Gesture: tilt display away Action: Menu selection moves up Ubiquitous computing: smart devices, environments and interaction

47 Gesture Interfaces: Applications
Gesture: Two finger stretch Action: Stretch image Ubiquitous computing: smart devices, environments and interaction
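The geometry behind the two-finger stretch is simple: the scale factor is the ratio of the current finger separation to the separation when the gesture began. A small Python sketch with made-up touch coordinates:

```python
# Minimal sketch of the geometry behind a two-finger "stretch" gesture: the
# zoom factor is the ratio of the current finger separation to the separation
# when the gesture started. Coordinates here are invented.

import math

def distance(p1, p2):
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def stretch_factor(start_touches, current_touches):
    """Each argument is a pair of (x, y) points, one per finger."""
    return distance(*current_touches) / distance(*start_touches)

start = ((100, 100), (200, 100))      # fingers 100 px apart
now = ((80, 100), (230, 100))         # fingers 150 px apart
print(stretch_factor(start, now))     # 1.5 -> scale the image by 150%
```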

48 Gesture Interfaces: HCI->HPI->HHI->HCI
Ubiquitous computing: smart devices, environments and interaction

49 Gesture Design: Challenges
???. Ubiquitous computing: smart devices, environments and interaction

50 Reflective vs Active Displays
Which is more pervasive today and which will be more pervasive in the future: paper or active display devices? What are the inherent characteristics of paper versus active displays and how do these affect their ability to become truly pervasive? Ubiquitous computing: smart devices, environments and interaction

51 Reflective versus Active Displays
Can we produce ICT displays that support more of the properties of physical paper? Display design mimics paper E-paper display design differs from actual paper Ubiquitous computing: smart devices, environments and interaction

52 ElectroPhoretic Displays or EPDs
Ubiquitous computing: smart devices, environments and interaction

53 Combining Input and Output User Interfaces
In the UIs discussed so far, input devices are separated from the output devices; the state of the input is available as a visual cue only. How can we combine / link input and output better? Ubiquitous computing: smart devices, environments and interaction

54 Ubiquitous computing: smart devices, environments and interaction
Touchscreen What are touchscreens? Displays where position of contact with screen is detected Via pointed physical objects such as pens, fingers, etc Events can then be generated for an associated visual object at that position and Associated actions can then be triggered. Ubiquitous computing: smart devices, environments and interaction
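A minimal Python sketch of the dispatch step just described: hit-test the detected contact position against on-screen widget rectangles and trigger the action associated with the widget that was touched. The widget names, layout and actions are hypothetical.

```python
# Sketch of touchscreen event dispatch: hit-test the touch position against
# on-screen widget rectangles and trigger the matching action (all invented).

def hit_test(widgets, x, y):
    """Return the first widget whose bounding box (x, y, w, h) contains the touch."""
    for name, (wx, wy, w, h), action in widgets:
        if wx <= x <= wx + w and wy <= y <= wy + h:
            return name, action
    return None, None

widgets = [
    ("play_button", (10, 10, 80, 40), lambda: print("play pressed")),
    ("stop_button", (10, 60, 80, 40), lambda: print("stop pressed")),
]

name, action = hit_test(widgets, 30, 75)   # touch lands on the stop button
if action:
    action()                               # -> "stop pressed"
```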

55 Ubiquitous computing: smart devices, environments and interaction
Touchscreen A touchscreen behaves as a 2D, planar smart skin. Wherever it is touched, a virtual object can be activated. Types of touchscreens? Resistive Capacitive Surface acoustic wave etc. A touchscreen can behave as a soft control panel and user interface that is reprogrammable and can be customised to suit a range of applications and users Ubiquitous computing: smart devices, environments and interaction

56 Touchscreen: Benefits
What are the benefits? These characteristics make them ideal for many workplaces and public spaces. Ubiquitous computing: smart devices, environments and interaction

57 Touchscreen: Applications
Touchscreens are used routinely in many applications & devices ?? To ease use of pointing To ease use of gestures Single versus multiple finger gestures Ubiquitous computing: smart devices, environments and interaction

58 Tangible User Interface (TUI)
A Tangible User Interface (TUI) is a UI that augments the real physical world by coupling digital information to everyday physical objects and environments. Tangible user interfaces are also referred to as passive real-world props, graspable user interfaces, manipulative user interfaces and embodied user interfaces Ubiquitous computing: smart devices, environments and interaction

59 Tangible User Interface (TUI)
How do Tangible Interfaces work? Attach micro sensors and actuators (Section 6.4) to physical objects Used as input devices to allow their manipulation to generate data streams in an output device or virtual view in a related virtual environment, (Section 6.2). Ubiquitous computing: smart devices, environments and interaction

60 Tangible User Interface (TUI)
Taxonomy of TUIs based upon embodiment and metaphors Four types of embodiment can be differentiated Full embodiment e.g.,?? Nearby embodiment e.g. ?? Environmental embodiment e.g., ??? Distant embodiment, e.g., ??? Ubiquitous computing: smart devices, environments and interaction

61 Ubiquitous computing: smart devices, environments and interaction
Tangible Bits Project Instructors can explain in more detail how this works or delete this slide Ubiquitous computing: smart devices, environments and interaction

62 Ubiquitous computing: smart devices, environments and interaction
DataTiles Project Allows users to manipulate data in form of tangible “tiles” Combinations of data streams and functions make it possible to create new applications Ubiquitous computing: smart devices, environments and interaction

63 Ubiquitous computing: smart devices, environments and interaction
DataTiles Project Instructors can explain in more detail how this works or delete this slide Ubiquitous computing: smart devices, environments and interaction

64 Ubiquitous computing: smart devices, environments and interaction
Organic Interfaces Similar to Tangible Interfaces There are 3 characteristics which characterize organic UIs. Typically use Organic Light-Emitting Diode (OLED) type materials Ubiquitous computing: smart devices, environments and interaction

65 Ubiquitous computing: smart devices, environments and interaction
Organic Interfaces Instructors can add more detail about this or delete this slide Ubiquitous computing: smart devices, environments and interaction

66 Ubiquitous computing: smart devices, environments and interaction
Auditory Interfaces What are the Benefits? Design challenges? Ubiquitous computing: smart devices, environments and interaction

67 Auditory Interfaces: Non-Speech Based
2 basic auditory interfaces: Speech based Non-speech based Non-speech auditory interfaces: ????? Ubiquitous computing: smart devices, environments and interaction

68 Auditory Interfaces: Speech Based
????. Ubiquitous computing: smart devices, environments and interaction

69 Natural Language Interfaces
Natural language interaction with machines can occur in a variety of forms. Which? Ubiquitous computing: smart devices, environments and interaction

70 Natural Language Interfaces
Generally, interaction can be more easily processed and understood if it is defined using an expressive language that has a well-defined syntax (or grammar) and semantics, but this requires that users already know the syntax. Benefits in using NL in HCI? Ubiquitous computing: smart devices, environments and interaction

71 Natural Language Interfaces: Challenges
What are the challenges in using NL Interfaces (NLI)? Ubiquitous computing: smart devices, environments and interaction

72 Ubiquitous computing: smart devices, environments and interaction
Overview HCI, eHCI & iHCI eHCI use in 4 Widely Used Devices iHCI use in accompanied smart devices iHCI use in wearable and implanted smart devices  Human Centred Design (HCD) User Models: Acquisition & Representation iHCI Design Ubiquitous computing: smart devices, environments and interaction

73 Hidden UI via Wearable and Implanted Devices
In the Posthuman model, technology can be used to extend a person's normal conscious experience and sense of presence, across space and time. There are 3 types of post-human technology: Accompanied e.g. ??? Wearable e.g., ??? Implants E.g., ??? Ubiquitous computing: smart devices, environments and interaction

74 Ubiquitous computing: smart devices, environments and interaction
Wearable computers Wearable interfaces include a combination of ICT devices & modalities Wearable computers are especially useful when? Focus is on multi-modal interaction which includes visual interaction. Ubiquitous computing: smart devices, environments and interaction

75 Ubiquitous computing: smart devices, environments and interaction
Wearable computers Visual modal systems are divided according to how humans interact with the system: ?? Visual interaction can be classified into command and non-command interfaces. Non-command vision-based (human motion) analysis systems generally have four stages: motion segmentation, object classification, tracking and interpretation. Ubiquitous computing: smart devices, environments and interaction
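The four stages can be pictured as a pipeline. The Python skeleton below (an assumed illustration, with stub stage functions rather than real computer-vision code) shows how frames flow through motion segmentation, object classification, tracking and interpretation.

```python
# Skeleton of the four-stage vision pipeline listed above. The stage bodies are
# placeholders; a real wearable system would use computer-vision algorithms here.

def segment_motion(frame):
    return [{"bbox": (40, 40, 20, 50)}]          # regions that moved (stub)

def classify_objects(regions):
    return [dict(r, label="hand") for r in regions]

def track(objects, history):
    history.append(objects)                      # associate detections over time
    return history

def interpret(history):
    return "wave" if len(history) > 2 else None  # map motion history to a gesture

history = []
for frame in range(5):                           # stand-in for a camera stream
    objs = classify_objects(segment_motion(frame))
    history = track(objs, history)
print(interpret(history))                        # 'wave' (toy output)
```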

76 Wearable Computer: WearComp and WearCam
Many researchers contributed to the advancement of wearable computing Perhaps the most important pioneer of wearable computing is Steve Mann His first main application focussed on recording personal visual memories that could be shared with others via the Internet. Ubiquitous computing: smart devices, environments and interaction

77 Wearable Computer: WearComp and WearCam
Photo courtesy of Wikimedia Commons, Ubiquitous computing: smart devices, environments and interaction

78 Wearable computing: Mann’s definition
Mann (1997): 3 criteria to define wearable computing. Eudaemonic criterion Existential criterion Ephemeral criterion Ubiquitous computing: smart devices, environments and interaction

79 Wearable computing: Types
Some different types of wearable computers?? N.B. Not all of these meet Mann’s criteria Ubiquitous computing: smart devices, environments and interaction

80 Head(s)-Up Display or HUD:
A HUD presents data without blocking the user's view Pioneered for military aviation - now used in commercial aviation and cars. 2 types of HUD: Fixed HUD Head-mounted HUD Ubiquitous computing: smart devices, environments and interaction

81 EyeTap & Virtual Retinal Display
Instructors can add more detail about these here or delete this slide. Ubiquitous computing: smart devices, environments and interaction

82 Brain Computer Interface (BCI) or Brain Machine Interfaces (BMI)
HCI focuses on indirect interfaces from the human brain via human actuators BCIs are direct functional interfaces between brains and machines A BCI represents the ultimate natural interface Would you choose to make use of one when they become available in the future? Ubiquitous computing: smart devices, environments and interaction

83 Brain Computer Interface (BCI) or Brain Machine Interfaces (BMI)
Direct vs. Indirect coupling design choices ?? See also BANs in Chapter 11 Brain versus nerve direct coupling design choices?? Ubiquitous computing: smart devices, environments and interaction

84 Ubiquitous computing: smart devices, environments and interaction
Computer Implants The opposite of wearing computers outside the body is to have them more directly interfaced to the body. Many people routinely use implants ???? Of specific interest is developing devices that can adapt to signals in the human nervous system. By connecting electronic circuitry directly to the human nervous system, ??? Ubiquitous computing: smart devices, environments and interaction

85 Ubiquitous computing: smart devices, environments and interaction
Cyborg 2 An electrode array, surgically implanted into Warwick’s left arm and interlinked with the median nerve fibres, is being monitored. Ubiquitous computing: smart devices, environments and interaction

86 Ubiquitous computing: smart devices, environments and interaction
BCI Instructors can add more detail about experiments here or delete this slide Ubiquitous computing: smart devices, environments and interaction

87 Ubiquitous computing: smart devices, environments and interaction
PostHuman Model Use of alternative technology-mediated realities A feeling of presence in the experience provides feedback to a person about the status of his or her activity. The subject perceives any variation in the feeling of presence and tunes his or her activity accordingly. Ubiquitous computing: smart devices, environments and interaction

88 PostHuman Model and Reality
People can experience alternative realities depending on: the type of environment people are situated in and on their perception of the environment. Reality can be: Technology mediated, e.g., ??? Chemically mediated, e.g., ??? Psychologically mediated, e.g., ??? Ubiquitous computing: smart devices, environments and interaction

89 Ubiquitous computing: smart devices, environments and interaction
Realities: VR, AR and MR (Revision of Section ) Ubiquitous computing: smart devices, environments and interaction

90 Ubiquitous computing: smart devices, environments and interaction
Virtual Reality (VR) VR seeks to immerse a physical user in a virtual 3D world VR uses a computer simulation of a subset of the world and immerses the user in it using UIs based upon: ?? VR seeks to enable humans to interact using the more natural forms of interaction that humans use in the real world Ubiquitous computing: smart devices, environments and interaction

91 Augmented Reality (AR)
Electronic images are projected over the real world so that images of the real and virtual world are combined. VR considered as a subset of AR? Early E.g. head-mounted display by Sutherland (1968). Similar systems are in use today in types of military aircraft. Ubiquitous computing: smart devices, environments and interaction

92 Telepresence & Telecontrol
Telepresence allows a person in 1 local environment to: ?? Telecontrol refers to the ability of a person in 1 place to ??? Ubiquitous computing: smart devices, environments and interaction

93 Ubiquitous computing: smart devices, environments and interaction
Overview HCI, eHCI & iHCI eHCI use in 4 Widely Used Devices iHCI use in accompanied smart devices iHCI use in wearable and implanted smart devices Human Centred Design (HCD)  User Models: Acquisition and Representation iHCI Design Ubiquitous computing: smart devices, environments and interaction

94 Conventional Design versus HCD
Ubiquitous computing: smart devices, environments and interaction

95 Conventional Functional System Design
Ubiquitous computing: smart devices, environments and interaction

96 Human Centred Design (HCD)
Focus on types of UbiCom System & environments: Need to make the type of user explicit: human users In contrast, automatic / autonomous systems Ubiquitous computing: smart devices, environments and interaction

97 Human Centred Design (HCD)
ISO standard human centred design life-cycle involves 4 main sets of activities: Define context of use Specify stake-holder and organisational requirements Multiple alternative (UI) designs need to be built. Designs need to be validated against user requirements. Ubiquitous computing: smart devices, environments and interaction

98 Human Centred Design (HCD)
Ubiquitous computing: smart devices, environments and interaction

99 A Fuller Range of System & User Requirements / Use Contexts
HCD System & User requirements -> Wider requirements than back-end functional requirements HCD Methodologies are a powerful way to get the wide range of environment requirements & use contexts for UbiCom systems What is the fuller range of UbiCom / HCD requirements? Ubiquitous computing: smart devices, environments and interaction

100 A Fuller Range of System & User Requirements / Use Contexts
??? Physical Environment User Types Tasks & goals User interface Social Usability & User experience: Usability: User experiences Ubiquitous computing: smart devices, environments and interaction

101 HCD: Use Context / Requirements
Ubiquitous computing: smart devices, environments and interaction

102 HCD: Usability as a User Requirement
Usability is defined as ?? Usability is not a single, one-dimensional property of a user interface. Usability is a combination of factors. ISO explicitly mentions a no. of factors ??? These usability factors can often be expanded into further sub-properties. Ubiquitous computing: smart devices, environments and interaction

103 Ubiquitous computing: smart devices, environments and interaction
HCD: Stake-Holders End-user is obvious stake-holder in HCD Design Who are the other stake-holders in the personal memory scenario? Are there additional stake-holder requirements? Ubiquitous computing: smart devices, environments and interaction

104 HCD: Acquiring User Context/User Requirements
Several dimensions for getting user requirements during the HCD life-cycle In controlled conditions (lab) vs. in the field Direct user involvement (e.g., interview, questionnaire) vs. indirect (e.g., observations) Individual users vs. user groups, HCI / domain experts vs. predictive user models (no users) Ubiquitous computing: smart devices, environments and interaction

105 HCD: Methods to Acquire User Requirements
Which Methods? ????? Analysis of data gathered depends on: Amount of time, level of detail, uncertainty etc Knowledge the analysis requires Ubiquitous computing: smart devices, environments and interaction

106 Usability Requirements & Use Contexts Examples
For each of the scenarios in chapter 1, e.g., the personal video memories, define the use context and usability requirements. Ubiquitous computing: smart devices, environments and interaction

107 HCI / HCD versus User Context Awareness
Are these the same or similar concepts? See User context awareness (Chapter 7) See HCI / HCD (Chapter 5) Ubiquitous computing: smart devices, environments and interaction

108 HCD: System Model for Users vs. Users’ Model of System
What model of the system does it project to the user? What model does the user have of the system? What if these models differ? Ubiquitous computing: smart devices, environments and interaction

109 HCD Design: Conceptual Models & Mental Models
An amazing number of everyday things & objects ???? Very challenging for people to learn to operate and understand many devices of varying degrees of complexity if the interaction with each of them is unique. The complexity of interacting with new machines can be reduced. How? Ubiquitous computing: smart devices, environments and interaction

110 HCD Design: Conceptual Models
Discuss some example conceptual models Ubiquitous computing: smart devices, environments and interaction

111 HCD Design: Affordances
Complexity of interacting with new systems is reduced if they have parts that provide strong clues on how to operate them. These are referred to as affordances. What are examples of physical UI affordances? Ubiquitous computing: smart devices, environments and interaction

112 HCD Design: Virtual Affordances
Many analogue physical objects are being replaced by virtual computer UIs Virtual UI affordances are becoming increasingly important. How to design virtual UI affordances? Can link virtual objects or widgets in the UI to related & familiar physical world objects Challenges in linking widgets to familiar physical objects? Ubiquitous computing: smart devices, environments and interaction

113 HCD: Multiple Prototype Designs
Example: Consider the PVM Scenario (Chapter 1) What type of design? Is there only 1 type of design for recording / playing / transmitting multimedia? Consider the requirements: Ubiquitous computing: smart devices, environments and interaction

114 Ubiquitous computing: smart devices, environments and interaction
HCD: Evaluation Summative versus Formative Evaluation Summative Conventional To verify Design Formative HCD Ubiquitous computing: smart devices, environments and interaction

115 HCD: System Evaluation Methods
Can use similar techniques to gathering user requirements. Ubiquitous computing: smart devices, environments and interaction

116 Ubiquitous computing: smart devices, environments and interaction
Overview HCI, eHCI & iHCI eHCI use in 4 Widely Used Devices iHCI use in accompanied smart devices iHCI use in wearable and implanted smart devices Human Centred Design (HCD) User Models: Acquisition & Representation  iHCI Design Ubiquitous computing: smart devices, environments and interaction

117 User Modelling: Design Choices
Implicit vs. explicit models User instance (Individual) modelling versus user (stereo)type modelling Static versus dynamic user models Generic versus application specific models Content-based versus collaborative user models Ubiquitous computing: smart devices, environments and interaction

118 User Modelling Design: Implicit vs. Explicit models
Systems can either use Explicit feedback Implicit feedback Often these can be combined. How? Some specific techniques for acquiring a user model are described in more detail elsewhere (Section 5). Hybrid user models may also be used. Ubiquitous computing: smart devices, environments and interaction

119 Indirect User Input and Modelling
Benefits? Methods? See Previous Slides Accuracy & precision? Handling inaccuracy & imprecision Ubiquitous computing: smart devices, environments and interaction

120 Direct User Input and Modelling
Benefits versus Challenges? User requirements & user model built using: Single-shot versus Multi-shot user input Static versus Dynamic input Also need to consider user model maintenance Ubiquitous computing: smart devices, environments and interaction

121 Ubiquitous computing: smart devices, environments and interaction
User Stereotypes The challenge in bootstrapping a user model / behaviour leads to the use of group behaviour Stereotype: infers a user model from a small number of facts using a larger set of facts from a group user model. Used by collaborative-type user models, e.g., recommender systems Challenges? Ubiquitous computing: smart devices, environments and interaction
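A small Python sketch of the stereotype idea: a new user's model is bootstrapped from the average preferences of a matching group and then overridden by the few facts known about the individual. The stereotype groups and preference values are invented for illustration.

```python
# Sketch of stereotype-based bootstrapping: start from a group model, then
# overlay the few facts known about the individual. All values are made up.

STEREOTYPES = {
    "commuter": {"news": 0.8, "podcasts": 0.9, "long_films": 0.2},
    "home_viewer": {"news": 0.4, "podcasts": 0.3, "long_films": 0.9},
}

def bootstrap_user_model(stereotype, known_facts):
    model = dict(STEREOTYPES[stereotype])   # start from the group model
    model.update(known_facts)               # individual facts override the group
    return model

print(bootstrap_user_model("commuter", {"long_films": 0.7}))
# {'news': 0.8, 'podcasts': 0.9, 'long_films': 0.7}
```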

122 Modelling Users’ Planned Tasks and Goals
Users often interact purposely with a system in a task-driven way, to achieve a particular goal. Several ways to analyse and model user tasks: Hierarchical Task Analysis or HTA Etc Consider each scenario in Chapter 1, e.g., PVM scenario, give a user task / goal model (next slide) Ubiquitous computing: smart devices, environments and interaction
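As an illustration only, a Hierarchical Task Analysis for the PVM scenario could be expressed as a nested structure like the Python sketch below; the task decomposition shown is hypothetical and not the book's own model.

```python
# Illustrative only: an HTA for the personal video memories (PVM) scenario
# expressed as a nested structure. The decomposition is hypothetical.

hta_pvm = {
    "goal": "capture and share a personal video memory",
    "subtasks": [
        {"task": "record clip",
         "subtasks": [{"task": "start camera"}, {"task": "stop and save"}]},
        {"task": "annotate clip",
         "subtasks": [{"task": "add time and place"}, {"task": "tag people"}]},
        {"task": "share clip",
         "subtasks": [{"task": "select recipients"}, {"task": "upload"}]},
    ],
}

def print_plan(node, depth=0):
    print("  " * depth + node.get("task", node.get("goal", "")))
    for child in node.get("subtasks", []):
        print_plan(child, depth + 1)

print_plan(hta_pvm)
```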

123 HCD: Functional Requirements
Ubiquitous computing: smart devices, environments and interaction

124 Multiple User Tasks and Activity Based Computing
Use tasks as part of activities that require access to services across multiple devices, Devices can be used by different types of people Users are engaged in multiple concurrent activities Users are engaged in activities which may occur across multiple physical environments, Activities may be shared between participants Activities on occasion need to be suspended and resumed. (See Chapter 12) Ubiquitous computing: smart devices, environments and interaction

125 Situation Action versus Planned Action Models
2 basic approaches to task design Planned actions: ???? Situated action: ??? Ubiquitous computing: smart devices, environments and interaction

126 Models of Human Users: HCI vs. AI
Field of HCI proposes models of humans that focus on supporting high-level usability criteria and heuristics Focus is less on explicit computation models of how humans think and act. Field of AI proposes models of humans that make explicit computation models to simulate how humans think, act and interact (Chapters 8 and 9) Ubiquitous computing: smart devices, environments and interaction

127 Ubiquitous computing: smart devices, environments and interaction
Overview HCI, eHCI & iHCI eHCI use in 4 Widely Used Devices iHCI use in accompanied smart devices iHCI use in wearable and implanted smart devices Human Centred Design (HCD) User Models: Acquisition & Representation iHCI Design  Ubiquitous computing: smart devices, environments and interaction

128 Ubiquitous computing: smart devices, environments and interaction
iHCI iHCI Model Characteristics User Context Awareness Intuitive and Customised Interaction Personalisation Affective Computing iHCI Design Heuristics and Patterns Ubiquitous computing: smart devices, environments and interaction

129 Ubiquitous computing: smart devices, environments and interaction
Types of User Model Several related terms & kinds of user model are differentiated User Models Personal Profiles User contexts Application / User requirements System Models Mental Models Conceptual Models Ubiquitous computing: smart devices, environments and interaction

130 User Context Awareness
User context awareness can be exploited to beneficially lessen the degree of explicit HCI needed. User context-awareness is a sub-type of general context-awareness (Chapter 7) User context-awareness can include: Social environment context Users’ physical characteristics and capabilities for HCI User presence in a locality or detected activity User identity (Section ). User planned tasks and goals (Section 5.6.4). Users’ situated tasks (Sections 5.6.5, 5.6.6). User emotional state (Section 5.7.5) Ubiquitous computing: smart devices, environments and interaction

131 Intuitive and Customised Interaction
Are current computer systems, dominated by MTOS-based devices & use of the desktop UI metaphor, intuitive? E.g., ?? E.g., ?? E.g., etc Ubiquitous computing: smart devices, environments and interaction

132 Intuitive and Customised Interaction
Moran & Zhai propose 7 principles to evolve desktop model into more intuitive model for UbiCom From Office Container to Personal Information Cloud From desktop to a diverse set of visual representations From Interaction with 1 device to interaction with many From Mouse & Keyboard to  Interactions & modalities Functions may move from Applications to Services From Personal to Interpersonal to Group to Social From low-level tasks to higher level activities Ubiquitous computing: smart devices, environments and interaction

133 Ubiquitous computing: smart devices, environments and interaction
Personalisation Personalisation: tailoring applications & services specifically to an individual’s needs, interests, preferences Adaptation of a consumer product, electronic or written medium, based on a personal profile Applications of personalisation: targeted marketing; product & service customisation, including information filtering; CRM Ubiquitous computing: smart devices, environments and interaction

134 Personalisation: Benefits
??? Ubiquitous computing: smart devices, environments and interaction

135 Personalisation: Challenges (Cons)
??? Ubiquitous computing: smart devices, environments and interaction

136 Ubiquitous computing: smart devices, environments and interaction
Personalisation Personalisation: a more complete model of user-context that is more reusable and persists: ???? 2 key issues: design of model so that it can be distributed and shared dynamic vs. static task-driven user preference contexts Ubiquitous computing: smart devices, environments and interaction

137 Personalisation: Mechanisms
Instructors can add more slides about how personalisation mechanisms, e.g., recommender systems, work here or delete this slide Ubiquitous computing: smart devices, environments and interaction
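One possible mechanism instructors might show is simple content-based filtering: items are scored against the user's interest profile and ranked. The Python sketch below uses invented feature names and weights.

```python
# A minimal content-based filtering sketch (one common personalisation
# mechanism). Feature names and weights are invented for illustration.

def score(user_profile, item_features):
    """Dot product of user interest weights and item feature weights."""
    return sum(user_profile.get(f, 0.0) * w for f, w in item_features.items())

user = {"sport": 0.9, "politics": 0.1, "travel": 0.5}
items = {
    "article_a": {"sport": 1.0},
    "article_b": {"politics": 0.8, "travel": 0.4},
}
ranked = sorted(items, key=lambda i: score(user, items[i]), reverse=True)
print(ranked)   # ['article_a', 'article_b'] -> article_a scores 0.9 vs 0.28
```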

138 Affective Computing: Interactions using Users’ Emotional Context
Affective computing: computing that relates to, arises from, or influences emotions. Applications include: ??? Design challenges for affective computing overlap with those for: determining the user context developing more complex human-like intelligence models Ubiquitous computing: smart devices, environments and interaction

139 Ubiquitous computing: smart devices, environments and interaction
Affective Computing Picard (2003) identified six design challenges: The range & modalities of emotion expression are broad People’s expression of emotion is idiosyncratic & variable Cognitive models for human emotions are incomplete The sine qua non of emotion expression is the physical body, but computers are not embodied in the same way Emotions are ultimately personal and private No need to contaminate purely logical computers with emotional reactiveness Ubiquitous computing: smart devices, environments and interaction

140 iHCI: Design Heuristics and Patterns
Many different higher-level HCI design usability / user experience criteria have been proposed by different HCI designers to promote good design of HCI interaction. Many different HCI heuristics (rules of thumb derived from experience) have been proposed to support HCI criteria Specific guidance is needed to engineer UIs to comply with these usability & user experience HCI principles. UI design patterns can support HCI usability principles and then be mapped into lower-level, more concrete design patterns Ubiquitous computing: smart devices, environments and interaction

141 iHCI: Design Heuristics and Patterns
Example iHCI patterns include: Ubiquitous computing: smart devices, environments and interaction

142 iHCI: Design Patterns & Heuristics
Instructors can propose many more examples here or delete this slide. Ubiquitous computing: smart devices, environments and interaction

143 iHCI: Engineering iHCI Design Patterns
Can propose simplified design models along 2 dimensions that are interlinked Organisation / structural models versus time-driven interaction models Front-end / Presentation (UI) interaction versus back-end system actions that support this interaction Need to organise UI widgets or objects at the UI Need to organise and link presentation to actions Need to design interaction with these widgets (see next slide as an example) Ubiquitous computing: smart devices, environments and interaction
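A minimal Python sketch of linking front-end presentation to back-end actions in the spirit of this pattern: a hypothetical SearchBox widget exposes a callback hook, and an image-search action (a stub) is bound to it, anticipating the image search example on the next slide.

```python
# Sketch of binding a presentation widget to a back-end action via callbacks.
# The SearchBox widget and the image-search action are hypothetical stubs.

class SearchBox:
    def __init__(self):
        self._listeners = []

    def on_submit(self, callback):
        self._listeners.append(callback)     # presentation -> action binding

    def submit(self, text):
        for callback in self._listeners:
            callback(text)

def image_search_action(query):
    print(f"searching images for: {query}")  # back-end action stub

box = SearchBox()
box.on_submit(image_search_action)
box.submit("sunset")                         # -> "searching images for: sunset"
```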

144 iHCI: Engineering iHCI Design Patterns
Image Search Ubiquitous computing: smart devices, environments and interaction

145 Ubiquitous computing: smart devices, environments and interaction
Overview HCI, eHCI & iHCI  eHCI use in 4 Widely Used Devices  iHCI use in accompanied smart devices  iHCI use in wearable and implanted smart devices  Human Centred Design (HCD)  User Models: Acquisition & Representation  iHCI Design  Ubiquitous computing: smart devices, environments and interaction

146 Ubiquitous computing: smart devices, environments and interaction
Summary A human centred design process for interactive systems specifies four principles of design: the active involvement of users and a clear understanding of user and task requirements; an appropriate allocation of function between users and technology based upon the relative competence of the technology and humans; iteration is inevitable because designers hardly ever get it right the first time; a multi-disciplinary approach to the design. The human centred design life-cycle involves user participation throughout four main sets of activities: defining user tasks and the (physical, ICT) environment context; defining user and organisational requirements; producing iterative design prototypes; and validating the designs against the requirements. Ubiquitous computing: smart devices, environments and interaction

147 Ubiquitous computing: smart devices, environments and interaction
Summary To enable humans to effectively interact with devices to perform tasks and to support human activities, systems need to be designed to support good models of user interfaces and processes of human computer interaction. Users can be modelled directly and indirectly. User task models can be modelled as task plans or as situated actions. iHCI design addresses three additional concerns: support for natural (human computer) interaction; user models, including models of emotions, which can be used to anticipate user behaviour; and user context awareness, including personalisation. Some design patterns and heuristics oriented towards iHCI are described. Ubiquitous computing: smart devices, environments and interaction

148 Ubiquitous computing: smart devices, environments and interaction
Summary & Revision For each chapter See book web-site for chapter summaries, references, resources etc. Identify new terms & concepts Apply new terms and concepts: define, use in old and new situations & problems Debate problems, challenges and solutions See Chapter exercises on web-site Ubiquitous computing: smart devices, environments and interaction

149 Exercises: Define New Concepts
Touchscreen, etc Ubiquitous computing: smart devices, environments and interaction

150 Exercise: Applying New Concepts
What is the difference between touchscreen and a normal display? Ubiquitous computing: smart devices, environments and interaction

