Cognitive, Social, and Emotional Interaction

Dr Charles Martin

Plan for the class

Universal Usability

Creating technologies that are accessible and effective for a broad range of users. (Shneiderman, 2000)

(Shneiderman Chapter 2)

Activity: Question

Does everybody in Australia have equal access to computer systems?

Does everybody at ANU have equal access to computer systems?

What about the interface limits access?

Variations in Physical Abilities and Workplaces

Diverse Cognitive and Perceptual Abilities

Personality Differences

Cultural and International Diversity

Users with Disabilities

Older Adult Users

Children

Accommodating Diversity

  • Shneiderman 2.9

Cognitive Aspects

COGNITION (Photo by Stefano Bucciarelli on Unsplash)

Cognition

A definition from Oxford English Dictionary:

cognition, n.

“The action or faculty of knowing taken in its widest sense, including sensation, perception, conception, etc., as distinguished from feeling and volition…”

Source: “cognition, n.” OED Online, Oxford University Press, June 2022, https://www.oed.com/view/Entry/35876. Accessed 31 July 2022.

(Photo by Mathilda Khoo on Unsplash)

Cognitive processes

Cognitive processes can change depending on the problem:

  • Experiential Cognition: 2 + 2 =
  • Reflective Cognition: 21 × 19 =

What are cognitive processes?

  1. Attention
  2. Perception
  3. Memory
  4. Learning
  5. Reading, speaking, listening
  6. Problem solving, planning, reasoning, decision making

(Eysenck & Brysbaert, 2023)

(Photo by Luke Jones on Unsplash)

1. Attention

Selecting things to concentrate on, relevant to our needs, from the range of possibilities available.

  • clear goals (directed searching vs browsing)
  • information presentation (structure and layout in the interface)
  • multitasking and attention
    • depends on individuals and context
    • relevance of distractions
    • effort to task switch
    • designing to support effective multitasking
(Image: Rogers et al. (2023) p.104)
(Rogers et al. (2023) p.107)

Design Implications for Attention

  • consider context to make information salient when required
  • techniques: animation, colour, ordering, spacing
  • avoid clutter
  • support switching and returning
(Rogers et al. (2023) p.108)

2. Perception

  • “[…] how information is acquired from the environment via the five sense organs (vision, hearing, taste, smell, and touch) and transformed into experiences of objects, events, senses, and tastes” (Roth, 1986 in Rogers et al. (2023) p. 109)
  • Proprioception: Awareness of position and movement of body through muscles and joints
  • Vision is the dominant sense, followed by hearing and then touch (for sighted individuals)
(Image: Rogers et al. (2023) p.109)

Design Implications for Perception

  • design icons and graphics so they can be easily distinguished
  • use white space and separators to group information
  • sounds (earcons!) can help distinguish information
  • colour contrast is important for perception (and accessibility)
  • haptic feedback: use carefully, perhaps in response to user-initiated actions
(Rogers et al. (2023) p.109)
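The colour-contrast point above has a precise, checkable definition in the WCAG 2.x guidelines: each colour is converted to a relative luminance, and the contrast ratio between two luminances ranges from 1:1 (identical) to 21:1 (black on white). A minimal sketch, using the WCAG 2.x formulas (the function names here are illustrative, not from the lecture or textbook):

```python
def _linearise(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light (WCAG 2.x piecewise formula)."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """Relative luminance of an sRGB colour, per WCAG 2.x."""
    r, g, b = (_linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """Contrast ratio between two colours: (L1 + 0.05) / (L2 + 0.05), L1 >= L2."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background gives the maximum ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
```

WCAG 2.1 level AA asks for at least 4.5:1 for normal body text, which a check like this can verify automatically during design.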

3. Memory

  • brain filters what to remember and what to forget to avoid overload – but not always in the way we want to!
  • filtering into memory – depends on encoding process (e.g., active vs passive learning) and context (e.g., seeing someone in a different context)
  • people are better at recognition than recall
  • relying on technology rather than memory (e.g., car navigation systems, “let’s ask Claude…”)
  • personal information management
  • remembering passwords and multifactor authentication
Photo by Piotr Miazga on Unsplash

Design Implications for Memory

  • avoid long and complex procedures for carrying out tasks
  • design interfaces for recognition rather than recall (familiar patterns and consistency)
  • provide ways to label digital information for identification

(Rogers et al. (2023) p.119)

4. Learning

“accumulation of skills and knowledge that would be impossible to achieve without memory” (Rogers et al., 2023, p. 119)

  • incidental learning vs intentional learning
  • learning by reading vs learning by doing
  • learning through collaboration
  • micro-learning
  • multimodal learning through new and emerging technologies e.g., augmented reality and virtual reality

Design Implications for Learning

  • design to encourage exploration
  • use design constraints to guide users to appropriate actions
(Rogers et al. (2023) p.119)

5. Reading, Speaking, Listening

  • communication skills
  • the same meaning can be expressed across different modes
  • writing is permanent; speaking is transient
  • reading is usually quicker than listening
  • listening requires less cognitive effort than reading
  • written language tends to be more grammatical than spoken language
  • interactive books, speech technologies, natural language processing, tactile interfaces, assistive technologies
Source: Design Boom
Photo by Nicolas J Leclercq

Design Implications for Communication

  • keep speech menus to a minimum (three to four options at most)
  • add extra intonation to artificial speech
  • provide options for enlarging text
(Rogers et al. (2023) p.121)

6. Problem-Solving, Planning, Reasoning, Decision-Making

  • Involve “reflective cognition” in relation to actions, choices, consequences
  • How do you make purchasing decisions? What role does technology play?
(Rogers et al. (2023) p.122)

Activity

TODO.

Cognitive Frameworks

  • Mental models
  • Gulfs of Execution and Evaluation
  • Information Processing
  • External Cognition
  • Distributed Cognition
  • Embodied Interaction

Gulfs of Execution and Evaluation (Don Norman)

  • Gulf of execution: the gap between what users want to do and the actions the interface allows them to take
  • Gulf of evaluation: the gap between what the system presents and what users expect or understand

External Cognition (Scaife and Rogers 1996, in Rogers et al. (2023))

  • Internal representations (in the head) and external representations (artefacts in the world) and how they interact
  • Combined with tools (e.g., pens, calculators, spreadsheets) to support cognitive activities
  • External cognition: “the cognitive processes involved when we interact with different external representations” (Rogers et al. (2023) p.129)
  • Reducing memory load, computational offloading, annotating, cognitive tracing
Reducing Memory Load and Annotating
Computational Offloading
Cognitive Tracing

Distributed Cognition

  • interactions between multiple people and artefacts
  • event-driven, systems as the unit of analysis, collective behaviours
  • information flows and transformations through the system
  • different levels of granularity
  • analysis addresses problem solving, communication (verbal and non-verbal), coordination mechanisms, accessing and sharing knowledge
  • analysis can inform design implications and decision-making in designing to support distributed cognition

(Rogers et al. (2023) p.120; Rogers (2012) pp.38-39)

Image: Rogers et al. (2023) p.128

Social Interaction

Activity

  • What are the kinds of situations in which you would phone someone?
  • What are the kinds of situations in which you would send someone a text message?
  • What leads to the difference?

Introduce yourself to someone nearby, talk for 2-3 minutes, and we’ll hear some answers.

Face-to-Face Conversations

  • Conversational rules from Conversational Analysis (Sacks 1987):
  • Adjacency pairs – setting up an expectation of a response (Schegloff and Sacks 1973)
  • Breakdowns and repairs: Breaking rules or missing cues
  • Designing conversational user interfaces and tools to support face-to-face communication
(Rogers et al. (2023) pp.139-142)
  • Collaborative activity involving social skills, rules, norms, and conventions
  • Often tacit – people are not actively aware of, or thinking about, following them
  • People have different communication styles, backgrounds, and abilities
  • Different ways of opening, maintaining, and closing a conversation – implicit and explicit rules

Remote Conversations

  • phone and video conferencing
  • telepresence: “The perception of being there while physically remote” (Rogers et al. (2023) p.144)
  • telepresence rooms, robots, virtual reality
  • features of interaction design can establish a sense of presence and facilitate remote conversation
  • social presence: the feeling of being there with a “real” person when communicating remotely

TODO

Co-Presence

“supporting people in their activities when they are interacting in the same physical space” (Rogers et al. (2023) p.150)

  • supporting effective collaboration
  • hand gestures, body language, use of objects
  • awareness: knowing what is going on around you, functioning as “close-knit teams”
  • shareable interfaces: whiteboards, touch screens
  • social translucence: enabling participants and activities to be visible to one another
The Reflect Table: Pierre Dillenbourg in Rogers et al. (2023) p.55
GatherTown for translucent interactions

Social Engagement

  • “Participation in the activities of a social group (Anderson and Binstock, 2012). Often involves some form of social exchange where people give and receive something from others […] voluntary and unpaid” (Rogers et al. (2023) p.158)
  • Connecting people with a common interest – e.g., Twitter battles, viral posts. Digital volunteering – disaster information sharing, citizen science
Twitter (in Rogers et al. (2023) p.158)

Emotional Interaction

Pepper Robot (Photo by Alex Knight)

A Model of Emotional Design

  • visceral: look, feel, sound
  • behavioural: use (usability)
  • reflective: meaning, personal value, culture
(in Rogers et al. (2023) p.171)

Expressive Interfaces and Emotional Design

  • design features that seek to create an emotional connection with users or elicit emotional responses in users
  • denoting the system state
  • expressivity through animated icons, sonifications (sound effects), and vibrotactile feedback (e.g., a mobile phone or watch buzzing)
  • nice-looking design affects people’s perceptions of usability, and such designs are pleasurable to use
  • annoying interfaces elicit negative emotional responses (e.g., preventing users from completing a task, making users feel patronised, being unhelpful, time-consuming to use, intrusive, or passive-aggressive)
Duolingo Reminders
Windows Blue Screen of Death

Affective Computing

Rosalind Picard: Affective Computing, Engineering Emotion

Emotional AI

  • automating the measurement of feelings and behaviours by using AI technologies
  • various sensors and measures
  • six fundamental emotions classified by Affdex: anger, contempt, disgust, fear, joy, sadness
  • applications such as improving driver safety (e.g., improving mood and concentration, detecting drowsiness)
  • eye-tracking, words and phrases, biometric data (e.g., heart rate)
(Sharp et al., 2019, p.180)

Contextual Factors

Think about the setting and context in which interaction takes place:

  • Who, what, when, where, why, how of your activity
  • How technology extends spatial and temporal dimensions
  • What the explicit and unwritten rules, norms, conventions, practices are
  • How people interact with tools, technologies, and their environment
  • How the setting shapes the activity and its outputs – “situated actions and practices” (Suchman, 1987)
Image: Photo by Yassine Khalfalli

Conceptual Design

  • Developing a Conceptual Model
  • An outline of what people can do and what concepts are needed to understand how to interact
  • Need to understand the problem space and users
  • Generate ideas based on understanding

Conceptual Model

  • Metaphor, analogies, concepts, relationships, mappings
  • How to choose interface metaphors that will help users understand the product?
  • Which interaction types would best support the users’ activities?
  • Do different interface types suggest alternative design insights or options?

Interface Metaphors

Choosing metaphors (Erickson, 1995)

  1. Identify functional requirements (what it will do)
  2. Which parts are likely to cause users problems?
  3. Generate metaphors

Evaluate metaphors: Structure, Relevance, Representation, Understandable, Extensible (see Sharp et al. pp. 440-441 for an example)

Interaction Types

Instructing, conversing, manipulating, exploring, responding

  • Which is best depends on the design
  • Most models will include a combination
  • Different parts will have different types

Interface Types

  • Prompt and support different user experiences / behaviour
  • Prototyping will require an interface type or candidates
  • Depends on product constraints from requirements
  • Input and output modes follow from user and context requirements

Questions

Who has a question?

  • I can take catchbox questions up until 2:55
  • For after class questions: meet me outside the classroom at the bar (for 30 minutes)
  • Feel free to ask about any aspect of the course
  • Also feel free to ask about any aspect of computing at ANU! I may not be able to help, but I can listen.
Meet you at the bar for questions. 🍸🥤🫖☕️ Unfortunately no drinks served! 🙃

References

Cheng, A., Yang, L., & Andersen, E. (2017). Teaching language and culture with a virtual reality game. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 541–549. https://doi.org/10.1145/3025453.3025857
Erickson, T. D. (1995). Working with interface metaphors. In R. M. Baecker, J. Grudin, W. A. S. Buxton, & S. Greenberg (Eds.), Readings in human-computer interaction (pp. 147–151). Morgan Kaufmann. https://doi.org/10.1016/B978-0-08-051574-8.50018-2
Eysenck, M. W., & Brysbaert, M. (2023). Fundamentals of cognition (4th ed.). Routledge. https://doi.org/10.4324/9781003384694
Hooper, C. J., Preston, A., Balaam, M., Seedhouse, P., Jackson, D., Pham, C., Ladha, C., Ladha, K., Plötz, T., & Olivier, P. (2012). The french kitchen: Task-based learning in an instrumented kitchen. Proceedings of the 2012 ACM Conference on Ubiquitous Computing, 193–202. https://doi.org/10.1145/2370216.2370246
Rogers, Y. (2012). HCI theory: Classical, modern, and contemporary. Morgan & Claypool Publishers. https://doi.org/10.2200/S00418ED1V01Y201205HCI014
Rogers, Y., Sharp, H., & Preece, J. (2023). Interaction design: Beyond human-computer interaction, 6th edition. John Wiley & Sons, Inc. https://quicklink.anu.edu.au/kv9b
Shneiderman, B. (2000). Universal usability. Commun. ACM, 43(5), 84–91. https://doi.org/10.1145/332833.332843
Suchman, L. A. (1987). Plans and situated actions: The problem of human-machine communication. Cambridge University Press.