Chapter 5 Human Computer Interaction


Chapter 5 Human Computer Interaction
UbiCom Book Slides
Stefan Poslad
http://www.eecs.qmul.ac.uk/people/stefan/ubicom
Ubiquitous computing: smart devices, environments and interaction

HCI: Overview
This part (A) first discusses:
What is Human Computer Interaction (HCI) and why do we need good HCI for human interactive systems?
What is implicit HCI (iHCI), a sub-type of HCI? How is it differentiated from conventional explicit HCI (eHCI), and why do we need it to enhance pervasive computing?
How to use eHCI in some common types of device
How to use iHCI in (mobile and static) devices that are not permanently attached to humans
How to use iHCI in devices that accompany humans by being surface-mounted (wearable) or embedded (implants)

Chapter 5 Related Links
iHCI is a type of context-awareness for the human environment (Chapter 7)
Human behaviour models of intelligence (Chapter 8)
Social & other consequences of making devices more human and more intelligent (Chapter 12)

HCI: Overview
The slides for this chapter are also expanded and split into several parts in the full pack:
Part A: eHCI use in some common smart device types
Part B: iHCI for accompanied smart devices
Part C: iHCI for wearable & implanted smart devices
Part D: Human Centred Design
Part E: User Models and iHCI Design

HCI: Overview
HCI, eHCI & iHCI ←
eHCI use in 4 Widely Used Devices
iHCI use in accompanied smart devices
iHCI use in wearable and implanted smart devices
Human Centred Design (HCD)
User Models: Acquisition & Representation
iHCI Design

HCI: Introduction
The term HCI has been widely used since the onset of the Personal Computing era in the 1980s.
However, the groundwork for the field of HCI started earlier, at the onset of the industrial revolution:
Tasks became automated and power-assisted -> this triggered an interest in studying human-machine interaction
Some tasks require little human interaction during operation, e.g., clothes- or dish-washing
Other tasks are very interactive, e.g., face washing, playing the violin

H, C & I
The basic concepts of HCI are:
Humans
Computers / devices
Interaction

HCI: Motivation
Machines (systems) aid human performance, but systems that interact poorly with humans will be poor aids.
We need design models & processes that are (user) interactive.
The motivation for HCI is clear: to support more effective use (Dix, 2004a) in three ways:
Useful:
Usable:
Be used:

HCI: Usability vs. Usefulness
The success of a product depends largely on ?
Summarised as Heckel's law and Heckel's inverse law:
Heckel's law:
Heckel's inverse law:
What does this law express?

Explicit HCI (eHCI)
eHCI design: explicit interaction during a device's normal operation.
What are the dominant eHCI UIs?
Pure eHCI:
Context-free
Focus on H2C (Human-to-Computer) Interaction

eHCI versus Natural Interaction
Natural interaction relates to familiarity and expertise:
Familiarity with the use of a tool is cultural and subjective
Note also that Natural Interaction is linked to the use of iHCI

iHCI
The concept of implicit HCI (iHCI) was proposed by Schmidt (2000).
Defined as "an action, performed by the user that is not primarily aimed to interact with a computerized system but which such a system understands as input".
Our definition of iHCI is slightly different: inputs with an implicit or implied context.

iHCI
iHCI is more about C2H (Computer-to-Human) Interaction
iHCI assumes C has a certain model of the H user
The model of H is used as additional input
Need to share implicit context between human and system
Implicit interaction naturally supports hidden device design.

eHCI + iHCI or iHCI vs eHCI
E.g.??
eHCI, usability design?
Alternative iHCI design?
The shift from eHCI design to also include iHCI design will be a key enabler for effective UbiCom systems

iHCI: Challenges
Complex to accurately and reliably determine user context. Why?

Overview
HCI, eHCI & iHCI
eHCI use in 4 Widely Used Devices ←
iHCI use in accompanied smart devices
iHCI use in wearable and implanted smart devices
Human Centred Design (HCD)
User Models: Acquisition & Representation
iHCI Design

How Device Interfaces & Interaction Vary
Devices can be characterised according to ?

UI and HCI Designs for 4 Common Devices
PC
Mobile Phone
Games Console (but many sub-types)
TV / Projectors
How does the UI and HCI design differ between these?

UI Type: Personal Computer Interface
???

PC UI use in Mobiles
Using a conventional PC UI approach won't be optimal for mobile computing & ubiquitous computing; a different approach is needed. Why?

UI Type: Mobile Device Interfaces
PC / WIMP models are not so suitable for mobile (one-handed) devices. Why not?

Mobile Device Interface: Limited I/P
How to support the mobile user and the small size of input?

Mobile Device Interface: Limited O/P
How to overcome limited output?
Haptic interface use, e.g., vibration to signal an incoming call
Maximising use of a small screen: scrolling, switching screens
Peephole displays
Foldable displays
Filter information so that less information is received and displayed, e.g., using:
Personalisation (Chapter 7)
Personal Agents (Chapter 8)
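The filtering bullet above can be sketched in code. This is a hedged illustration only, not from the book: the profile contents, message fields and function name are invented to show how a simple interest profile lets a small screen display less, but more relevant, information.

```python
# Illustrative sketch (invented example): filter notifications against a
# user-interest profile so a small screen shows only relevant items.
def filter_for_display(messages, interests, limit=2):
    """Keep only messages whose topic matches the user's interests."""
    relevant = [m for m in messages if m["topic"] in interests]
    return relevant[:limit]              # small screen: cap what is shown

inbox = [{"topic": "news", "text": "headline"},
         {"topic": "spam", "text": "offer"},
         {"topic": "family", "text": "call me"}]

# Only the news and family items survive the filter.
print(filter_for_display(inbox, interests={"news", "family"}))
```

A real personalisation system (Chapter 7) would learn the interest profile rather than hard-code it.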

UI Type: Games Console Interfaces
Games consoles are an important driver and can contribute to UbiCom in a number of ways.
Computer games have often acted as an incubator for many innovations driving computing. How?
There are many different types of Games Console Interface

Games Console Interfaces: D-pad
How does the D-pad controller work?
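One way to answer the slide's question in code: a D-pad is typically four directional switches read as on/off states. The sketch below is an invented illustration (not from the book) of mapping those switch states to a movement vector.

```python
# Minimal sketch (invented example): a D-pad exposes four directional
# switches; reading them as booleans yields a 2D movement vector.
def dpad_to_vector(up, down, left, right):
    """Map four D-pad switch states to a (dx, dy) movement vector."""
    dx = (1 if right else 0) - (1 if left else 0)
    dy = (1 if up else 0) - (1 if down else 0)
    return dx, dy

# Pressing up and right together gives a diagonal movement.
print(dpad_to_vector(up=True, down=False, left=False, right=True))  # (1, 1)
```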

Games Console Interfaces: 3D Gesture-Based
How does the 3D gesture-based controller work?
Use of MEMS / sensors (Chapter 7)
Use of gesture recognition (see later)
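As a hedged sketch of the MEMS-sensor bullet: when a hand-held controller is roughly static, gravity dominates its accelerometer reading, so pitch and roll can be estimated from the per-axis components. The formula is the standard accelerometer tilt estimate; the function name and values are illustrative.

```python
import math

# Illustrative sketch: estimate controller tilt from a 3-axis MEMS
# accelerometer reading (in units of g), assuming the device is static
# so that gravity dominates the measurement.
def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (degrees) from accelerometer axes."""
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    return pitch, roll

# Device lying flat: gravity entirely on the z-axis, so no tilt.
print(tilt_angles(0.0, 0.0, 1.0))  # (0.0, 0.0)
```

A gesture recogniser would then classify sequences of such readings into discrete gestures.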

UI Type: Control (Panel) Interfaces
Different types of remote controllers, depending on how remote the controller is:
User approx. co-located with the device being controlled
User not co-located with the device being controlled

UI Type: Localised Remote Control Interfaces
Characteristics:
Input controller and device separation
Input device interfaces
Wireless link between the input control device and the device

UI Type: Localised Remote Control Interfaces
But there is a profusion of remote control devices with overlapping features.
Is it necessary to have a specialised controller per consumer device?
Problems? How to solve this?

UIs often introduce modes to minimise the number of UI controls (e.g., toggle buttons vs. radio buttons)
AM and PM are modes!
Up and down buttons introduce modes
UIs that contain more controls & fewer modes appear more complex but are often easier to operate
Changing the gesture needed to trigger an action (e.g., adaptive menus) is more useful if it is inclusive rather than exclusive
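The trade-off above can be made concrete with a toy moded design. This is an invented illustration (class and behaviour are assumptions, not from the book): one up/down button pair plus a mode button keeps the control count low, but the same gesture then means different things in different modes, inviting mode errors.

```python
# Illustrative sketch: a digital clock with one "up" button and a "mode"
# button that switches what "up" adjusts (hours vs. minutes). Fewer
# controls, but the same gesture has mode-dependent meaning.
class ModedClock:
    def __init__(self):
        self.hours, self.minutes = 12, 0
        self.mode = "hours"          # current mode: "hours" or "minutes"

    def press_mode(self):
        self.mode = "minutes" if self.mode == "hours" else "hours"

    def press_up(self):
        if self.mode == "hours":
            self.hours = (self.hours + 1) % 24
        else:
            self.minutes = (self.minutes + 1) % 60

clock = ModedClock()
clock.press_up()                     # adjusts hours
clock.press_mode()
clock.press_up()                     # the same button now adjusts minutes
print(clock.hours, clock.minutes)    # 13 1
```

A modeless alternative would give hours and minutes their own buttons: more controls on the panel, but each gesture always means the same thing.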

Localised Remote Control Interface Design
Instructors can add more detail about the discussion and design of a universal controller here, or delete this slide. (Section 5.2.5)

Overview
HCI, eHCI & iHCI
eHCI use in 4 Widely Used Devices
iHCI use in accompanied smart devices ←
iHCI use in wearable and implanted smart devices
Human Centred Design (HCD)
User Models: Acquisition & Representation
iHCI Design

iHCI use in Accompanied Smart Devices: Topics
Single vs. Multi-Modal Visual Interfaces
Gesture Interfaces
Reflective versus Active Displays
Combining Input and Output User Interfaces ???
Auditory Interfaces
Natural Language Interfaces

Single vs. Multi-Modal Visual Interfaces
A mode of human interaction uses human senses. Which?
Interactive ICT systems have modalities that mimic human senses. What are they?

Computer input & output modalities

Single vs. Multi-Modal Visual Interfaces
Many interactive ICT systems use a single visual mode of output interaction.
Problems? Solutions?

Multi-Modal Interaction Design: Challenges
Integrating multiple modes is complex. Why?

Multi-Modal Interaction: Design
Two main approaches:
Data for each modality can be processed separately, then combined at the end
Data for each modality can be processed & combined concurrently
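The first approach (often called late fusion) can be sketched as follows. This is a hedged, invented example: the recognisers, hypotheses and scores are made up, and the combination rule (summing evidence) is just one simple choice.

```python
# Sketch of "late fusion": each modality is processed separately into
# per-hypothesis confidence scores, which are combined only at the end.
def late_fusion(*modality_scores):
    """Combine per-modality score dicts by summing evidence per hypothesis."""
    combined = {}
    for scores in modality_scores:
        for hypothesis, score in scores.items():
            combined[hypothesis] = combined.get(hypothesis, 0.0) + score
    return max(combined, key=combined.get)

speech = {"delete": 0.6, "select": 0.4}    # speech recogniser output (invented)
gesture = {"select": 0.7, "delete": 0.2}   # pointing recogniser output (invented)
print(late_fusion(speech, gesture))        # "select"
```

The second (early/concurrent) approach would instead feed raw or low-level features from both modalities into a single joint recogniser, which can exploit cross-modal timing but is harder to build.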

Gesture Interfaces
What are Gestures?
Expressive, meaningful body motions
Involving physical movements. Which?
With the intent of conveying meaningful information about interacting with the environment.

Gesture Interfaces
What are the main types of human gestures?
How can gestures be sensed?

Gesture Interfaces: Classification
Gestures can also be classified into:
2D versus 3D
Contactful versus contactless
Directly sensed versus indirectly sensed

Gesture Interfaces: Applications
First basic contact-based gesture interfaces?
From the mid 2000s, contactless gestures have been used in several types of games consoles, mobile phones, cameras, etc.

Gesture Interfaces: Applications
?????

Gesture Interfaces: Applications
Gesture: Rotate or flip hand -> Action: Rotate or flip image

Gesture Interfaces: Applications
Gesture: Tilt display away -> Action: Menu selection moves up

Gesture Interfaces: Applications
Gesture: Two-finger stretch -> Action: Stretch image

Gesture Interfaces: HCI -> HPI -> HHI -> HCI

Gesture Design: Challenges
???

Reflective vs Active Displays
Which is more pervasive today, and which will be more pervasive in the future: paper or active display devices?
What are the inherent characteristics of paper versus active displays, and how do these affect their ability to become truly pervasive?

Reflective versus Active Displays
Can we produce ICT displays that support more of the properties of physical paper?
Display design that mimics paper
E-paper display design differs from actual paper

ElectroPhoretic Displays or EPDs

Combining Input and Output User Interfaces
In the UIs discussed so far, input devices are separate from the output devices
The state of the input is available as a visual cue only
How can we combine / link input and output better?

Touchscreen
What are touchscreens?
Displays where the position of contact with the screen is detected
Via pointed physical objects such as pens, fingers, etc.
Events can then be generated for an associated visual object at that position, and
Associated actions can then be triggered.
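The event flow just described (contact position -> hit-test for the visual object at that position -> trigger its action) can be sketched minimally. The widget class, layout and actions here are invented for illustration.

```python
# Illustrative sketch of touchscreen event dispatch: the screen reports a
# contact position, a hit-test finds the visual object under it, and the
# object's associated action is triggered.
class Button:
    def __init__(self, name, x, y, w, h, action):
        self.name, self.action = name, action
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def on_touch(widgets, px, py):
    for widget in widgets:               # hit-test: find object under contact
        if widget.contains(px, py):
            return widget.action()       # trigger the associated action
    return None                          # touch landed on empty screen

ui = [Button("play", 0, 0, 100, 40, lambda: "playing"),
      Button("stop", 0, 50, 100, 40, lambda: "stopped")]
print(on_touch(ui, 20, 60))              # "stopped"
```

Because the widget list is just data, the same hardware can present a completely different "soft" control panel per application, as the next slide notes.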

Touchscreen
A touchscreen behaves as a 2D, planar smart skin: wherever it is touched, a virtual object can be activated.
Types of touchscreens?
Resistive
Capacitive
Surface acoustic wave
etc.
A touchscreen can behave as a soft control panel and user interface that is reprogrammable and can be customised to suit a range of applications and users

Touchscreen: Benefits
What are the benefits?
These characteristics make them ideal for many workplaces and public spaces.

Touchscreen: Applications
Touchscreens are used routinely in many applications & devices ??
To ease the use of pointing
To ease the use of gestures
Single versus multiple finger gestures

Tangible User Interface (TUI)
A TUI is a UI that augments the real physical world by coupling digital information to everyday physical objects and environments.
Tangible user interfaces are also referred to as:
Passive real-world props
Graspable user interfaces
Manipulative user interfaces
Embodied user interfaces

Tangible User Interface (TUI)
How do Tangible Interfaces work?
Attach micro sensors and actuators (Section 6.4) to physical objects
These are used as input devices so that their manipulation generates data streams in an output device, or a virtual view in a related virtual environment (Section 6.2).

Tangible User Interface (TUI)
A taxonomy of TUIs is based upon embodiment and metaphors.
Four types of embodiment can be differentiated:
Full embodiment, e.g., ??
Nearby embodiment, e.g., ??
Environmental embodiment, e.g., ???
Distant embodiment, e.g., ???

Tangible Bits Project
Instructors can explain in more detail how this works or delete this slide

DataTiles Project
Allows users to manipulate data in the form of tangible "tiles"
Combinations of data streams and functions make it possible to create new applications

DataTiles Project
Instructors can explain in more detail how this works or delete this slide

Organic Interfaces
Similar to Tangible Interfaces
3 characteristics which characterise organic UIs
Typically use Organic Light-Emitting Diode (OLED) type materials

Organic Interfaces
Instructors can add more detail about this or delete this slide

Auditory Interfaces
What are the benefits?
Design challenges?

Auditory Interfaces: Non-Speech Based
2 basic auditory interfaces:
Speech based
Non-speech based
Non-speech auditory interfaces: ?????

Auditory Interfaces: Speech Based
????

Natural Language Interfaces
Natural language interaction with machines can occur in a variety of forms. Which?

Natural Language Interfaces
Generally, interaction can be more easily processed and understood if it is defined using an expressive language that has a well-defined syntax (or grammar) and semantics
This requires that users already know the syntax
Benefits of using NL in HCI?
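The contrast above can be illustrated with a toy restricted-syntax interface. The grammar and vocabulary below are invented for illustration: commands matching a fixed "verb the noun" pattern parse deterministically, while free natural language, which users produce without knowing any syntax, falls outside it.

```python
import re

# Illustrative sketch: a well-defined command syntax parses reliably,
# but only for users who already know it. Vocabulary is invented.
GRAMMAR = re.compile(r"^(open|close|dim) the (door|window|lights)$")

def parse_command(utterance):
    """Return (verb, noun) if the utterance fits the grammar, else None."""
    match = GRAMMAR.match(utterance.lower().strip())
    return match.groups() if match else None

print(parse_command("Open the door"))          # ('open', 'door')
print(parse_command("could you get the door"))  # None: outside the grammar
```

A genuine NLI would need to handle the second utterance too, which is exactly where the challenges on the next slide arise.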

Natural Language Interfaces: Challenges
What are the challenges in using NL Interfaces (NLI)?

Overview
HCI, eHCI & iHCI
eHCI use in 4 Widely Used Devices
iHCI use in accompanied smart devices
iHCI use in wearable and implanted smart devices ←
Human Centred Design (HCD)
User Models: Acquisition & Representation
iHCI Design

Hidden UI via Wearable and Implanted Devices
In the Posthuman model, technology can be used to extend a person's normal conscious experience and sense of presence, across space and time.
There are 3 types of post-human technology:
Accompanied, e.g., ???
Wearable, e.g., ???
Implants, e.g., ???

Wearable Computers
Wearable interfaces include a combination of ICT devices & modalities
Wearable computers are especially useful when?
The focus is on multi-modal interaction which includes visual interaction.

Wearable Computers
Visual modal systems are divided according to how humans interact with the system: ??
Visual interaction can be classified into command and non-command interfaces.
Non-command vision-based (human motion) analysis systems generally have four stages:
Motion segmentation
Object classification
Tracking
Interpretation
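The four stages above can be sketched as a skeleton pipeline. Every stage here is a deliberately trivial stub standing in for real computer-vision code, and the frame representation is invented; the point is only the staged data flow.

```python
# Skeleton of the four-stage non-command vision pipeline. Each stage is a
# stub: real systems would use image processing, classifiers and trackers.
def motion_segmentation(frame):
    """Stage 1: keep only the regions of the frame that are moving."""
    return [region for region in frame if region.get("moving")]

def object_classification(regions):
    """Stage 2: label each moving region (toy size-based rule)."""
    for region in regions:
        region["label"] = "person" if region["size"] > 50 else "other"
    return regions

def tracking(regions, tracks):
    """Stage 3: associate labelled regions with tracks over time."""
    for region in regions:
        tracks.setdefault(region["label"], []).append(region)
    return tracks

def interpretation(tracks):
    """Stage 4: interpret the tracks as a higher-level situation."""
    return "user present" if tracks.get("person") else "no user"

frame = [{"moving": True, "size": 80}, {"moving": False, "size": 10}]
tracks = tracking(object_classification(motion_segmentation(frame)), {})
print(interpretation(tracks))            # "user present"
```

In a non-command interface, the interpretation stage, not an explicit user command, drives the system's response.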

Wearable Computer: WearComp and WearCam
Many researchers have contributed to the advancement of wearable computing
Perhaps the most important pioneer of wearable computing is Steve Mann
His first main application focussed on recording personal visual memories that could be shared with others via the Internet.

Wearable Computer: WearComp and WearCam
Photo courtesy of Wikimedia Commons, http://en.wikipedia.org/wiki/Wearable_computing

Wearable Computing: Mann's Definition
Mann (1997): 3 criteria to define wearable computing:
Eudaemonic criterion
Existential criterion
Ephemeral criterion

Wearable Computing: Types
Some different types of wearable computers??
N.B. Not all of these meet Mann's criteria

Head-Up Display (HUD)
Presents data without blocking the user's view
Pioneered for military aviation; now used in commercial aviation and cars
2 types of HUD:
Fixed HUD
Head-mounted HUD

EyeTap & Virtual Retinal Display
Instructors can add more detail about these here or delete this slide.

Brain Computer Interface (BCI) or Brain Machine Interface (BMI)
HCI focuses on indirect interfaces from the human brain via human actuators
BCIs are direct functional interfaces between brains and machines
The BCI represents the ultimate natural interface
Would you choose to make use of one when they become available in the future?

Brain Computer Interface (BCI) or Brain Machine Interface (BMI)
Direct vs. indirect coupling design choices ??
See also BANs in Chapter 11
Brain versus nerve direct coupling design choices??

Computer Implants
The opposite of wearing computers outside the body is to have them more directly interfaced to the body.
Many people routinely use implants ????
Of specific interest is developing devices that can adapt to signals in the human nervous system.
By connecting electronic circuitry directly to the human nervous system, ???

Cyborg 2
An electrode array surgically implanted into Warwick's left arm and interlinked into the median nerve fibres is being monitored.

BCI
Instructors can add more detail about experiments here or delete this slide

PostHuman Model
Use of alternative technology-mediated realities
A feeling of presence in the experience provides feedback to a person about the status of his or her activity.
The subject perceives any variation in the feeling of presence and tunes his or her activity accordingly.

PostHuman Model and Reality
People can experience alternative realities depending on:
The type of environment people are situated in
Their perception of the environment
Reality can be:
Technology mediated, e.g., ???
Chemically mediated, e.g., ???
Psychologically mediated, e.g., ???

Realities: VR, AR and MR
(Revision of Section 1.2.3.3)

Virtual Reality (VR)
VR seeks to immerse a physical user in a virtual 3D world
VR uses a computer simulation of a subset of the world and immerses the user in it using UIs based upon: ??
VR seeks to enable humans to interact using the more natural interaction style that humans use in the real world

Augmented Reality (AR)
Electronic images are projected over the real world so that images of the real and virtual worlds are combined.
Is VR considered a subset of AR?
Early example: head-mounted display by Sutherland (1968). Similar systems are in use today in some types of military aircraft.

Telepresence & Telecontrol
Telepresence allows a person in one local environment to: ??
Telecontrol refers to the ability of a person in one place to ???

Overview
HCI, eHCI & iHCI
eHCI use in 4 Widely Used Devices
iHCI use in accompanied smart devices
iHCI use in wearable and implanted smart devices
Human Centred Design (HCD) 
User Models: Acquisition and Representation
iHCI Design

Conventional Design versus HCD

Conventional Functional System Design

Human Centred Design (HCD)
- Focus on types of UbiCom system & environments.
- Need to make the type of user explicit: human users.
- In contrast: automatic / autonomous systems.

Human Centred Design (HCD)
The ISO standard human-centred design life-cycle involves 4 main sets of activities:
- Define the context of use.
- Specify stakeholder and organisational requirements.
- Produce multiple alternative (UI) designs.
- Validate the designs against user requirements.

Human Centred Design (HCD)

A Fuller Range of System & User Requirements / Use Contexts
- HCD system & user requirements -> wider requirements than back-end functional requirements.
- HCD methodologies are a powerful way to capture the wide range of environment requirements & use contexts for UbiCom systems.
- What is the fuller range of UbiCom / HCD requirements?

A Fuller Range of System & User Requirements / Use Contexts ???
- Physical environment
- User types
- Tasks & goals
- User interface
- Social
- Usability & user experience:
  - Usability
  - User experience

HCD: Use Context / Requirements

HCD: Usability as a User Requirement
- Usability is defined as ??
- Usability is not a single, one-dimensional property of a user interface; it is a combination of factors.
- ISO 9241-11 explicitly mentions a number of factors: ???
- These usability factors can often be expanded into further sub-properties.

HCD: Stakeholders
- The end-user is the obvious stakeholder in HCD design.
- Who are the other stakeholders in the personal memory scenario?
- Are there additional stakeholder requirements?

HCD: Acquiring User Context / User Requirements
Several dimensions for acquiring user requirements during the HCD life-cycle:
- Controlled (lab) conditions vs. in the field
- Direct user involvement (e.g., interview, questionnaire) vs. indirect (e.g., observations)
- Individual users vs. user groups
- HCI / domain experts vs. predictive user models (no users)

HCD: Methods to Acquire User Requirements
- Which methods? ?????
- Analysis of the data gathered depends on:
  - amount of time, level of detail, uncertainty, etc.
  - the knowledge the analysis requires

Usability Requirements & Use Contexts Examples
For each of the scenarios in Chapter 1, e.g., the personal video memories, define the use context and usability requirements.

HCI / HCD versus User Context Awareness
Are these the same or similar concepts?
- See user context awareness (Chapter 7)
- See HCI / HCD (Chapter 5)

HCD: System Model for Users vs. Users’ Model of System
- What model of the system does it project to the user?
- What model does the user have of the system?
- What if these models differ?

HCD Design: Conceptual Models & Mental Models
- Amazing number of everyday things & objects ????
- It is very challenging for people to learn to operate and understand many devices of varying degrees of complexity if the interaction with each of them is unique.
- The complexity of interacting with new machines can be reduced. How?

HCD Design: Conceptual Models
Discuss some example conceptual models.

HCD Design: Affordances
- The complexity of interacting with new systems is reduced if they have parts that provide strong clues about how to operate them. These are referred to as affordances.
- What are examples of physical UI affordances?

HCD Design: Virtual Affordances
- Many analogue physical objects are being replaced by virtual computer UIs.
- Virtual UI affordances are becoming increasingly important.
- How to design virtual UI affordances? Virtual objects or widgets can be linked to related, familiar physical-world objects.
- What are the challenges in linking widgets to familiar physical objects?

HCD: Multiple Prototype Designs
- Example: consider the PVM scenario (Chapter 1). What type of design?
- Is there only one type of design for recording / playing / transmitting multimedia?
- Consider the requirements:

HCD: Evaluation
Summative versus formative evaluation:
- Summative: conventional design; verifies the finished design.
- Formative: HCD; shapes the design as it develops.

HCD: System Evaluation Methods
Can use similar techniques to those used for gathering user requirements.

Overview
HCI, eHCI & iHCI
eHCI use in 4 Widely Used Devices
iHCI use in accompanied smart devices
iHCI use in wearable and implanted smart devices
Human Centred Design (HCD)
User Models: Acquisition & Representation 
iHCI Design

User Modelling: Design Choices
- Implicit vs. explicit models
- User instance (individual) modelling versus user (stereo)type modelling
- Static versus dynamic user models
- Generic versus application-specific models
- Content-based versus collaborative user models

User Modelling Design: Implicit vs. Explicit Models
- Systems can use either explicit feedback or implicit feedback. Often these can be combined. How?
- Some specific techniques for acquiring a user model are described in more detail elsewhere (Section 5).
- Hybrid user models may also be used.
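One way explicit and implicit feedback can be combined is a weighted blend. The sketch below is a hypothetical illustration only: the signal names (`views`, `skips`), the mapping onto a 1-5 scale, and the 0.7 weighting are all assumptions, not taken from the chapter.

```python
def preference_score(explicit_rating, implicit_signals, w_explicit=0.7):
    """Blend an explicit 1-5 rating (may be None) with implicit evidence.

    implicit_signals: counts of passive observations, e.g.
    {"views": 12, "skips": 3}. Each view is weak positive evidence,
    each skip weak negative evidence (weights are illustrative).
    """
    views = implicit_signals.get("views", 0)
    skips = implicit_signals.get("skips", 0)
    # Map implicit evidence onto the same 1-5 scale, clamped.
    implicit = max(1.0, min(5.0, 3.0 + 0.2 * views - 0.5 * skips))
    if explicit_rating is None:
        return implicit  # hybrid model degrades to implicit-only
    return w_explicit * explicit_rating + (1 - w_explicit) * implicit
```

When no explicit rating exists (the common case for unobtrusive UbiCom systems), the model falls back to the implicit estimate alone; otherwise the explicit rating dominates but is softened by observed behaviour.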

Indirect User Input and Modelling
- Benefits?
- Methods? See previous slides.
- Accuracy & precision?
- Handling inaccuracy & imprecision

Direct User Input and Modelling
- Benefits versus challenges?
- User requirements & user model built using:
  - single-shot versus multi-shot user input
  - static versus dynamic input
- Also need to consider user model maintenance.

User Stereotypes
- The challenge of bootstrapping a user model / behaviour leads to the use of group behaviour.
- Stereotype: infers a user model from a small number of facts about the individual, using a larger set of facts from a group user model.
- Used by collaborative-type user models, e.g., recommender systems.
- Challenges?
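A minimal sketch of stereotype-based bootstrapping: a new user is matched to a group by a few known facts, and the group's richer preference model seeds the individual model until personal data accumulates. The stereotype names, the classification rule, and the preference values are invented for illustration.

```python
# Group models: a larger set of facts held per stereotype (illustrative).
STEREOTYPES = {
    "commuter":  {"podcasts": 0.9, "long_films": 0.1, "news": 0.8},
    "home_user": {"podcasts": 0.3, "long_films": 0.9, "news": 0.5},
}

def classify(user_facts):
    """Infer a stereotype from a small number of facts about the user."""
    if user_facts.get("travels_daily"):
        return "commuter"
    return "home_user"

def bootstrap_model(user_facts):
    """Copy the matched group's model as the user's initial model."""
    return dict(STEREOTYPES[classify(user_facts)])
```

One fact (`travels_daily`) is enough to inherit a whole preference profile, which illustrates both the benefit (no cold start) and the main challenge (the inherited preferences may simply be wrong for this individual).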

Modelling Users’ Planned Tasks and Goals
- Users often interact purposefully with a system in a task-driven way, to achieve a particular goal.
- Several ways to analyse and model user tasks: Hierarchical Task Analysis (HTA), etc.
- Consider each scenario in Chapter 1, e.g., the PVM scenario, and give a user task / goal model (next slide).
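An HTA decomposes a goal into subtasks down to bottom-level actions. The sketch below encodes a hypothetical decomposition for the PVM scenario; the particular tasks chosen are an assumption for illustration, not the book's model.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """One node in a Hierarchical Task Analysis tree."""
    name: str
    subtasks: list = field(default_factory=list)

    def leaves(self):
        """Return the bottom-level actions of the hierarchy, in order."""
        if not self.subtasks:
            return [self.name]
        actions = []
        for t in self.subtasks:
            actions.extend(t.leaves())
        return actions

# Illustrative HTA for the personal video memories (PVM) scenario.
pvm = Task("capture personal video memory", [
    Task("record clip", [Task("frame scene"), Task("start recording"),
                         Task("stop recording")]),
    Task("annotate clip", [Task("tag location"), Task("tag people")]),
    Task("share clip"),
])
```

Walking the tree with `leaves()` yields the sequence of concrete user actions, which is the level at which UI interactions are then designed.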

HCD: Functional Requirements

Multiple User Tasks and Activity-Based Computing
- Users’ tasks form part of activities that require access to services across multiple devices.
- Devices can be used by different types of people.
- Users are engaged in multiple concurrent activities.
- Users are engaged in activities which may occur across multiple physical environments.
- Activities may be shared between participants.
- Activities on occasion need to be suspended and resumed. (See Chapter 12)

Situated Action versus Planned Action Models
Two basic approaches to task design:
- Planned actions: ????
- Situated actions: ???

Models of Human Users: HCI vs. AI
- The field of HCI proposes models of humans that focus on supporting high-level usability criteria and heuristics; the focus is less on explicit computational models of how humans think and act.
- The field of AI proposes models of humans that make explicit computational models to simulate how humans think, act and interact (Chapters 8 and 9).

Overview
HCI, eHCI & iHCI
eHCI use in 4 Widely Used Devices
iHCI use in accompanied smart devices
iHCI use in wearable and implanted smart devices
Human Centred Design (HCD)
User Models: Acquisition & Representation
iHCI Design 

iHCI
- iHCI Model Characteristics
- User Context Awareness
- Intuitive and Customised Interaction
- Personalisation
- Affective Computing
- iHCI Design Heuristics and Patterns

Types of User Model
Several related terms & kinds of user model are differentiated:
- User models: personal profiles, user contexts, application / user requirements
- System models: mental models, conceptual models

User Context Awareness
- User context awareness can be exploited to beneficially lessen the degree of explicit HCI needed.
- User context-awareness is a sub-type of general context-awareness (Chapter 7).
- User context-awareness can include:
  - social environment context
  - users’ physical characteristics and capabilities for HCI
  - user presence in a locality or detected activity
  - user identity (Section 12.3.4)
  - user planned tasks and goals (Section 5.6.4)
  - users’ situated tasks (Sections 5.6.5, 5.6.6)
  - user emotional state (Section 5.7.5)
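How user context can lessen explicit HCI is easiest to see in a concrete rule: below, detected presence and activity drive a media player's behaviour so the user never has to press pause themselves. The context fields and the rules are hypothetical, for illustration only.

```python
def adapt_playback(context):
    """Choose a playback action from the current user context.

    context: a dict of sensed user context, e.g.
    {"user_present": True, "activity": "phone_call"}.
    """
    if not context.get("user_present", True):
        return "pause"   # user left the room: pause without explicit input
    if context.get("activity") == "phone_call":
        return "mute"    # situated task (a call) overrides media sound
    return "play"        # default when no overriding context is sensed
```

Each sensed context change replaces an explicit interaction (pressing pause or mute), which is exactly the iHCI trade-off: more sensing and inference, less direct manipulation.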

Intuitive and Customised Interaction
Are current computer systems, dominated by MTOS-based devices & the desktop UI metaphor, intuitive?
- E.g., ??
- E.g., ??
- E.g., etc.

Intuitive and Customised Interaction
Moran & Zhai propose 7 principles to evolve the desktop model into a more intuitive model for UbiCom:
- From office container to personal information cloud
- From the desktop to a diverse set of visual representations
- From interaction with 1 device to interaction with many
- From mouse & keyboard to multiple interaction techniques & modalities
- Functions may move from applications to services
- From personal to interpersonal to group to social
- From low-level tasks to higher-level activities

Personalisation
- Personalisation: tailoring applications & services specifically to an individual’s needs, interests and preferences.
- Adaptation of a consumer product, electronic or written medium, based on a personal profile.
- Applications of personalisation:
  - targeted marketing
  - product & service customisation, including information filtering
  - CRM
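The information filtering application mentioned above can be sketched as content-based filtering: items are scored against an individual's profile of keyword weights and only high-scoring items are shown. The profile values, keywords, and the threshold are invented for illustration; this is one possible mechanism, not the chapter's definition.

```python
def score(item_keywords, profile):
    """Sum the profile weights of the keywords an item carries."""
    return sum(profile.get(k, 0.0) for k in item_keywords)

def filter_items(items, profile, threshold=1.0):
    """Keep only items whose personalised score reaches the threshold.

    items: list of (name, keyword_set) pairs.
    profile: per-user keyword -> interest weight mapping.
    """
    return [name for name, kws in items if score(kws, profile) >= threshold]

# Illustrative per-user profile and candidate items.
profile = {"hiking": 0.8, "travel": 0.6, "finance": 0.1}
items = [("alpine trails", {"hiking", "travel"}),
         ("tax tips", {"finance"})]
```

Here `filter_items(items, profile)` keeps only the hiking article: the same item stream yields different results for different profiles, which is the essence of personalised information filtering.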

Personalisation: Benefits ???

Personalisation: Challenges (Cons) ???

Personalisation
Personalisation requires a more complete model of user context that is more reusable and persists: ????
Two key issues:
- design of the model so that it can be distributed and shared
- dynamic vs. static task-driven user preference contexts

Personalisation: Mechanisms
Instructors can add more slides about how personalisation mechanisms, e.g., recommender systems, work here or delete this slide.

Affective Computing: Interactions using Users’ Emotional Context
- Affective computing: computing that relates to, arises from, or influences emotions.
- Applications include: ???
- Design challenges for affective computing overlap with those for:
  - determining the user context
  - developing more complex human-like intelligence models

Affective Computing
Picard (2003) identified six design challenges:
- The range & modalities of emotion expression are broad.
- People’s expression of emotion is idiosyncratic & variable.
- Cognitive models of human emotions are incomplete.
- The sine qua non of emotion expression is the physical body, but computers are not embodied in the same way.
- Emotions are ultimately personal and private.
- There is no need to contaminate purely logical computers with emotional reactiveness.

iHCI: Design Heuristics and Patterns
- Many different higher-level HCI design usability / user-experience criteria have been proposed by different HCI designers to promote good HCI design.
- Many different HCI heuristics (rules of thumb derived from experience) have been proposed to support HCI criteria.
- Specific guidance is needed to engineer UIs to comply with these usability & user-experience HCI principles.
- UI design patterns can support HCI usability principles and then be mapped into lower-level, more concrete design patterns.

iHCI: Design Heuristics and Patterns
Example iHCI patterns include:

iHCI: Design Patterns & Heuristics
Instructors can propose many more examples here or delete this slide.

iHCI: Engineering iHCI Design Patterns
Design models can be simplified along two interlinked dimensions:
- organisation / structural models versus time-driven interaction models
- front-end / presentation (UI) interaction versus back-end system actions that support this interaction
Need to:
- organise UI widgets or objects at the UI
- organise and link presentation to actions
- design interaction with these widgets (see next slide as an example)
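Linking presentation to back-end actions can be sketched with an event/observer structure: a widget publishes UI events, and action callbacks are registered against them, keeping the two dimensions above separate. The `Widget` class and the image-search wiring are hypothetical illustrations, not a pattern defined in the chapter.

```python
class Widget:
    """A minimal UI widget that decouples presentation from actions."""

    def __init__(self, name):
        self.name = name
        self._handlers = {}  # event name -> list of back-end actions

    def on(self, event, action):
        """Register a back-end action for a presentation-level event."""
        self._handlers.setdefault(event, []).append(action)

    def fire(self, event, payload=None):
        """Deliver a UI event to all registered actions."""
        return [action(payload) for action in self._handlers.get(event, [])]

# Illustrative wiring for an image-search UI (see next slide's example).
search_box = Widget("image-search")
search_box.on("submit", lambda q: f"searching images for {q!r}")
```

The widget knows nothing about the search back-end and the back-end knows nothing about the widget; either side can be redesigned (e.g. swapping keyboard entry for speech input) without touching the other.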

iHCI: Engineering iHCI Design Patterns
Image Search

Overview
HCI, eHCI & iHCI 
eHCI use in 4 Widely Used Devices 
iHCI use in accompanied smart devices 
iHCI use in wearable and implanted smart devices 
Human Centred Design (HCD) 
User Models: Acquisition & Representation 
iHCI Design 

Summary
A human-centred design process for interactive systems specifies four principles of design:
- the active involvement of users and a clear understanding of user and task requirements;
- an appropriate allocation of function between users and technology, based upon the relative competence of the technology and humans;
- iteration, which is inevitable because designers hardly ever get it right the first time;
- a multi-disciplinary approach to the design.
The human-centred design life-cycle involves user participation throughout four main sets of activities:
- defining user tasks and the (physical, ICT) environment context;
- defining user and organisational requirements;
- producing iterative design prototypes;
- validating the designs against the requirements.

Summary
- To enable humans to interact effectively with devices to perform tasks and to support human activities, systems need to be designed to support good models of user interfaces and processes of human computer interaction.
- Users can be modelled directly and indirectly. User task models can be modelled as task plans or as situated actions.
- iHCI design addresses three additional concerns: support for natural (human computer) interaction; user models, including models of emotions, which can be used to anticipate user behaviour; and user context awareness, including personalisation.
- Some design patterns and heuristics oriented towards iHCI are described.

Summary & Revision
For each chapter:
- See the book web-site for chapter summaries, references, resources etc.
- Identify new terms & concepts.
- Apply new terms and concepts: define them, use them in old and new situations & problems.
- Debate problems, challenges and solutions.
- See the chapter exercises on the web-site.

Exercises: Define New Concepts
Touchscreen, etc.

Exercise: Applying New Concepts
What is the difference between a touchscreen and a normal display?