HUMAN-COMPUTER
INTERACTION
Christopher Tan
PSY 340 | christopher.tan@help.edu.my
OVERVIEW
• Human-computer interaction
o HCI principles in UI design
• Website & application design
• HCI in other contexts
o Wearable technology
o Driving
HUMAN-COMPUTER INTERACTION
• The study of how people use complex technological artifacts (May, 2001)
o And how these artifacts can be designed to facilitate this use
• HCI activity → Computer mediating between user and task (Card, 2014)
o A task can be done without a computer but with other tools
o Similar in principle, but not HCI
HUMAN-COMPUTER INTERACTION
User
• Goals
• Cognitive & perceptual abilities
• Mental states (i.e., fatigue, workload, stress, anxiety)
• Behavioural tendencies
Computer
• Software interface (i.e., info provided, controls, input mechanisms, layout)
• Hardware (i.e., physical design, computer performance, speed, responsiveness, capacities)
Task
• Tasks performed with use of computer technology
HUMAN-COMPUTER INTERACTION
• HCI goes beyond specifying displays & controls
• Asks the question: How can technology help people achieve their goals?
o How does interface design support goal-directed behaviour?
• Examples:
o Personal change – health & fitness, time management, financial monitoring
o Consumers – shopping, banking, gaming
o Work – productivity, collaboration
o Social relationships – social media, communication
HUMAN-COMPUTER INTERACTION
User experience & interface
• User experience (UX) → Overall experience when using a product/system
o How do users perceive and feel about the interface/technology?
• Usability
o Learnability; ease of understanding system
o Ease and efficiency of achieving goals
• User interface (UI) design
o How is information organized?
o Interface layout
HUMAN-COMPUTER INTERACTION
Principles & guidelines
• Interface & HCI design involves displays & controls
o Hence, HCI design requires consideration of display & control design
• Similar to/overlap with principles & guidelines covered in previous chapters
o Basic design principles (Norman)
o Display design principles (Wickens)
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Anticipate user needs
• Design should be proactive – anticipate users’ needs, wishes, expectations, etc.
• Avoid requiring users to search for info/remember from one page to next
• Provide necessary info and tools for task and context
o E.g. Price of shopping cart; “?” buttons beside data fields
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Limit interruptions & distractions
• Attention = finite cognitive resource
• Attention-switching diminishes task performance
o E.g., Alerts, notifications – some less important than others; should not interrupt primary task
• Give people control over when to respond to interruption
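To make this concrete, below is a minimal sketch (illustrative code, not from the slides) of a notification manager that lets critical alerts interrupt immediately while everything else waits until the user chooses to review it; the priority values and messages are made up.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Notification:
    message: str
    priority: int  # 1 = critical, 2 = normal, 3 = low

@dataclass
class NotificationManager:
    """Queues non-critical alerts so they do not interrupt the primary task."""
    deferred: List[Notification] = field(default_factory=list)

    def push(self, note: Notification) -> None:
        if note.priority == 1:
            # Critical alerts may interrupt immediately.
            print(f"[ALERT] {note.message}")
        else:
            # Everything else waits until the user chooses to look.
            self.deferred.append(note)

    def review(self) -> None:
        """Called when the user decides to check pending notifications."""
        for note in sorted(self.deferred, key=lambda n: n.priority):
            print(f"[{note.priority}] {note.message}")
        self.deferred.clear()

mgr = NotificationManager()
mgr.push(Notification("Battery critically low", priority=1))    # interrupts now
mgr.push(Notification("New newsletter available", priority=3))  # deferred
mgr.review()  # user pulls pending items on their own schedule
```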
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Minimize info access cost
• Recall: Display design
• Physical & cognitive effort to retrieve info – sometimes excessive
o E.g. No. of clicks/steps to reach desired page
• Interfere with concurrent tasks – e.g. retaining info in WM
Example: Online banking fund transfer (steps from homepage)
Maybank
1. Click “pay & transfer”
2. Click “transfer”
3. Click “other bank normal transfer IBG”
4. Click “open transfer”
RHB
1. Hover over pull-out menu
2. Click “fund transfer”
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Make system structure & affordances visible
• Recall: Affordances & signifiers (Norman)
• Affordances in real world are present everywhere → makes things intuitive
• BUT not a natural feature in computers
o Requires intentional design to guide users
o What actions are possible within the system?
• Graphical user interfaces signify possible actions
o Using icons, buttons, colour, design
o E.g. 3D-shaped buttons, greyed out links, etc.
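As an illustration of signifying possible actions (a hypothetical sketch, not from the slides), the snippet below uses Python’s standard tkinter toolkit: an enabled button signals that clicking will do something, while a disabled button is greyed out to signal that the action is currently unavailable.

```python
import tkinter as tk

root = tk.Tk()
root.title("Signifying possible actions")

# Enabled button: normal appearance signals that clicking will do something.
submit = tk.Button(root, text="Submit", command=lambda: print("Submitted"))
submit.pack(padx=20, pady=5)

# Disabled button: tkinter greys it out, signalling the action is unavailable.
delete = tk.Button(root, text="Delete account", state=tk.DISABLED)
delete.pack(padx=20, pady=5)

root.mainloop()
```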
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Be consistent
• Internal consistency (within system)
o Design elements repeated in consistent manner
• Similar info should be located in same locations
o Or info that serves similar functions → grouped together
• Same actions should accomplish same tasks
• Also: External consistency
o Similar to mental models
o Consistency with other systems used in same context
o E.g. Different apps have similar icons, groupings, colours, etc. within Windows platform – mental
model of how Windows functions
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Be credible & trustable
• People trust and rely on technology/interfaces when they are credible
o E.g., More likely to share credit card details to purchase from a reputable online store
• Features that degrade trust & credibility:
o Typos
o Broken links
o Poor design
o Dodgy aesthetics
• Features that enhance trust:
o Connection with real people
o Ability to contact assistance
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Consider aesthetics/simplicity
• Eliminate irrelevant elements that compete with the relevant
• Limit amount of typefaces/fonts, colours, etc. used
o Colour should be used as “redundant coding” – i.e., can be understood without it
• Also: Overall look & feel
o UX is about satisfaction as well
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Support flexibility, efficiency, & personalization
• Software flexibility to match needs of user
o Shorten tasks that users perform frequently/routinely
• Users often benefit from interfaces tailored to their preferences
o Features such as background colour, theme, wallpaper, font, etc.
o Customized shortcuts & commands – e.g., swipe direction to scroll
• E.g., Workers performing same task may configure their computers differently
HUMAN-COMPUTER INTERACTION
Principles & guidelines
Make system robust to errors
• Make commission of severe errors difficult – i.e., avoid accidental activation
o Confirmation buttons, warnings, multiple steps
• Make warnings clear; not vague
o E.g., Lengthy warnings, error codes → take effort to interpret; leave users unsure of what went wrong
• Errors are sometimes unavoidable, so minimize negative consequences
o Or allow for easy recovery from errors
o E.g., Ctrl + Z, undo action, unsend email
• User must know that error occurred, what the error is, and how to undo it
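A minimal sketch of these safeguards (illustrative code, not from the slides): a destructive action that requires explicit confirmation, plus an undo stack so the last action can easily be reversed.

```python
class Document:
    """Toy document supporting confirmed deletion and undo."""
    def __init__(self, lines):
        self.lines = list(lines)
        self._history = []  # stack of previous states for undo

    def delete_all(self, confirm: bool) -> None:
        if not confirm:
            print("Deletion cancelled – confirmation required.")
            return
        self._history.append(list(self.lines))  # save state before the change
        self.lines.clear()
        print("All lines deleted (undo available).")

    def undo(self) -> None:
        if not self._history:
            print("Nothing to undo.")
            return
        self.lines = self._history.pop()
        print("Last action undone.")

doc = Document(["draft paragraph 1", "draft paragraph 2"])
doc.delete_all(confirm=False)  # severe error made hard: needs explicit confirmation
doc.delete_all(confirm=True)
doc.undo()                     # easy recovery, like Ctrl + Z
print(doc.lines)
```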
WEBSITE & APPLICATION DESIGN
Guidelines & practices
• HCI on typical computers focuses on supporting info search & retrieval
o How easily & efficiently can the user navigate the system?
• Legibility
o Font choice – Arial, Helvetica, Verdana, etc.
o Size
• Navigation
o Good UI design guides users through task; makes navigation easy
o Simplify steps needed to achieve goals
o “3 click rule” → find important info within 3 clicks (see the sketch below)
• Always doable?
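One way to audit the “3 click rule” is to treat the site map as a graph and compute the minimum number of clicks from the homepage to each page with a breadth-first search. The sketch below uses a made-up site structure purely for illustration.

```python
from collections import deque

# Hypothetical site map: page -> pages reachable in one click
site_map = {
    "home": ["accounts", "pay & transfer", "help"],
    "pay & transfer": ["transfer", "pay bills"],
    "transfer": ["other bank transfer"],
    "other bank transfer": ["open transfer"],
    "accounts": [], "help": [], "pay bills": [], "open transfer": [],
}

def click_depths(start="home"):
    """Breadth-first search: minimum clicks from the homepage to each page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in site_map.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

for page, clicks in click_depths().items():
    flag = "  <-- violates the 3-click rule" if clicks > 3 else ""
    print(f"{clicks} click(s): {page}{flag}")
```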
WEBSITE & APPLICATION DESIGN
Guidelines & practices
• Readability – The “F-Shaped Visual Scan”
o Users scan computer screens in an F-shaped pattern (Nielsen, 2006)
• Content area, not page itself
o 2 horizontal lines followed by 1 vertical line
o Not a perfect F-shape; gaze time varies from
left to right
o First lines of text & first few words on each
line receive more gaze-time
o Consider placement of important info
• Subheadings, bullet points
WEBSITE & APPLICATION DESIGN
Guidelines & practices
• Vertical attention – “The Fold”
o Above the fold
• What you see without scrolling
• Info immediately present on screen without further action
o Below the fold
• Info visible only after scrolling downwards
o Users will scroll for info
• BUT will pay less attention as they scroll further down
WEBSITE & APPLICATION DESIGN
Guidelines & practices
• Vertical attention – “The Fold” (NN Group, 2018)
o Reserve top of pages for high-priority content → key user goals?
• E.g., What do users use online banking websites for?
o Beware of “false floors” – illusion of completeness
• Illusion of completeness can interfere/prevent scrolling behaviour
• Include signifiers to inform user that content is present below the fold
• E.g., Arrows, blurred out texts, “read more” buttons
o Attract attention to important content
• Users’ scanning behaviour is influenced by their search for important info
• Visually distinct and consistently styled
• E.g., Use headers, bolded & italicized text, etc. – help identify when info is important
WEBSITE & APPLICATION DESIGN
Guidelines & practices
• Banner blindness (NN Group, 2007)
o Desensitization to banner-ad-like elements
o Users almost never look at anything that resembles ads
• Heatmaps show close to zero fixations on ads
• Whether or not it is actually an ad
o The more an ad looks like a native site component, the more fixation time it gets
• For web designers → ensure content on page does not resemble banner ads; make all
components in line with other design elements
• For advertisers → masquerade ads to look like part of the webpage they are placed on
o People have learned to ignore the right rail of webpages
• Putting content in right-hand corners may be a bad idea
HCI IN OTHER CONTEXTS
• HCI design must consider context of use
o What are the tasks/goals of user?
o What will the user be doing while achieving those goals?
Wearable technology
• HCI integrated into clothing/accessories → practical functions/features
o E.g., Fitbit, Apple Watch, Android Wear, Google Glass
• HCI in wearable tech should consider:
o Comfort – extended period of use
o Prioritization of info – constrained display space; show only the most critical info
o Accessibility – simplify info to ease access (i.e., single-glance accessibility)
o Non-visual cues – does not limit user to visual scanning; ensures attentional capture
o Simplify tasks – collapse lengthy processes into simple commands/gestures
HCI IN OTHER CONTEXTS
Computers in cars
• Cars now provide drivers with a lot of info and added functionality/features
o E.g., entertainment, vehicle control, connectivity, automated activities
• HCI in cars → concerned with integrating added functionality without compromising
drivers’ performance
HCI IN OTHER CONTEXTS
Computers in cars
HCI design considerations
• Limit distractions & amount of visual info on
displays
o Distracts drivers from primary task of driving
o E.g., Lengthy written texts, info clutter
• Simplify interactions
o Reduce no. of options, steps, & screens
o Systems demanding glances longer than 2 s → increased risk
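A short worked calculation (an illustration, not from the slides) shows why the 2-second guideline matters: it converts vehicle speed into the distance travelled while the driver’s eyes are on the display rather than the road.

```python
def distance_during_glance(speed_kmh: float, glance_s: float) -> float:
    """Distance travelled (in metres) while the driver glances at a display."""
    speed_ms = speed_kmh * 1000 / 3600  # convert km/h to m/s
    return speed_ms * glance_s

for glance in (0.5, 1.0, 2.0):
    metres = distance_during_glance(100, glance)
    print(f"{glance:.1f} s glance at 100 km/h ≈ {metres:.0f} m driven without eyes on the road")
# A 2 s glance at 100 km/h covers roughly 56 m.
```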
AUTOMATION
Christopher Tan
PSY 340 | christopher.tan@help.edu.my
OVERVIEW
• Benefits of automation
• Problems with automation
AUTOMATION
• A machine that performs tasks otherwise performed by humans
o Also: Tasks that humans are incapable of performing – i.e., beyond perceptual & physical
capabilities
• Contrast with HCI
o HCI → human in control; role of computer is relatively small
o Automation → shifts human from direct control to supervisory control
• Supervisory control – human’s role is to manage automation
AUTOMATION
Examples:
• Manufacturing, lifting
• Kneading bread
• Floor sweeping
• Heating/cooling systems
• Driving → automated parking, adaptive cruise control
• Autopilot function
• Hazard detectors
• Predictive displays
AUTOMATION
Why automate?
• When tasks are impossible/hazardous
o Robotic handling of hazardous materials (or in hazardous environments)
o Heavy-lifting beyond human capacities
o Complex mathematical processes (statistical analysis)
o Automatic readers for visually impaired
AUTOMATION
Why automate?
• When tasks are difficult
o Operators can carry out, but effortful & error-prone
• E.g., ‘Simple’ calculations, assembly, autopilot systems, medical diagnosis & decision-making
o Routine tasks – automate repetitive & fatiguing human operations
• E.g., Assembly line work, long-haul flights/drives
o Vigilant monitoring (low signal base rate)
• E.g., Warning & alert systems
AUTOMATION
Why automate?
• Extend human capability
o Human WM vulnerable to many factors
o Aid humans in doing things in difficult circumstances
o Automated aids → supplement WM; relieve operator of cognitive load
• E.g., Predictive displays, decision aids
o Extends multitasking capabilities
• E.g., Autopilot function relieves from aircraft control when other task demands temporarily spike
AUTOMATION
Why automate?
• Automation is technically possible & inexpensive
o Economics; productivity
o Inexpensive; reduce labour costs (although not necessarily the best use of automation)
• E.g., Automated phone menus, restaurant kiosks
o Simply because technology is available
PROBLEMS WITH AUTOMATION
• Automation reliability
• Trust: calibration & distrust
• Overtrust, complacency, & out-of-the-loop behaviour
• Workload & situation awareness
PROBLEMS WITH AUTOMATION
Automation reliability
• Reliability → automation (computers in general) does what operators expect it to do
• In human-automation interaction, the issue is not actual reliability, but perceived reliability
• Not “unreliable”, but “imperfect”
o Often asked to do tasks that are impossible to do perfectly (dynamic environment)
o Imperfect automation still provides value
PROBLEMS WITH AUTOMATION
Automation reliability
Possible reasons for perceived unreliability:
• Actual unreliability
o Automated systems → more complex; more components than manual
o More can go wrong
• Automation does not operate well in certain situations
o All automation → limited operating range (designers’ assumption of its intended use)
o E.g., cruise control on downward slope
o Appears to be erroneous; in reality, lack understanding of how system was designed
• Incorrect “set up” of automation
o Keypress errors, slips, etc. → configuration errors not uncommon
o E.g., periodic painkiller administration
o Automation appears “dumb & dutiful”; blind obedience to operator
PROBLEMS WITH AUTOMATION
Trust calibration & distrust
• Trust in automation
o Degree to which human believes automation will operate as intended
• Trust is linked to perceived automation reliability
o Trust (and automation dependence) increases with perceived reliability
• Trust should be calibrated – in proportion to its reliability
o I.e., Trust should be high when automation is reliable, and low when it is not
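A minimal sketch of the calibration idea (illustrative values, not from the slides): compare subjective trust with the automation’s actual reliability, both on a 0–1 scale, and label the mismatch.

```python
def calibration(subjective_trust: float, reliability: float, tol: float = 0.05) -> str:
    """Both values on a 0-1 scale; well-calibrated trust tracks reliability."""
    gap = subjective_trust - reliability
    if gap > tol:
        return "over-trust (risk of complacency)"
    if gap < -tol:
        return "under-trust (risk of disuse)"
    return "well calibrated"

print(calibration(0.95, 0.70))  # trusts more than warranted -> over-trust
print(calibration(0.40, 0.90))  # trusts less than warranted -> under-trust
print(calibration(0.85, 0.85))  # well calibrated
```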
PROBLEMS WITH AUTOMATION
Trust calibration & distrust
[Figure: Subjective trust plotted against automation reliability – trust above the calibration line indicates over-trust, trust below it indicates under-trust (Wickens et al., 2013)]
PROBLEMS WITH AUTOMATION
Trust calibration & distrust
• Poor calibration of trust
o Distrust – fail to trust automation as much as is appropriate; leads to disuse
• E.g., Preference for manual control; alarms; Excel formulas; perception-enhancing automation
• Common causes of distrust:
o “Cry wolf effect” – i.e., under-trust of systems with high false-alarm (FA) rates (see the worked example below)
o Failure to understand how automated algorithms produce outputs
• E.g., Perceive FAs as failures; actually a deliberately low response criterion set to ensure safety
• Consequences of distrust → not necessarily severe (except with alarms)
o Leads people to reject good assistance – inefficiency (e.g., Excel)
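A short worked example (illustrative numbers, not from the slides) of why a deliberately low response criterion produces a “cry wolf” system: when the true signal is rare, most alarms are false even though the alarm rarely misses a real hazard.

```python
# Illustrative numbers: hazard occurs on 1% of monitored intervals.
base_rate = 0.01  # P(hazard)
hit_rate  = 0.99  # P(alarm | hazard)    – low criterion catches nearly all hazards
fa_rate   = 0.10  # P(alarm | no hazard) – but also triggers on harmless events

p_alarm = hit_rate * base_rate + fa_rate * (1 - base_rate)
p_hazard_given_alarm = hit_rate * base_rate / p_alarm  # Bayes' rule

print(f"P(alarm sounds)        = {p_alarm:.3f}")
print(f"P(real hazard | alarm) = {p_hazard_given_alarm:.3f}")
# ≈ 0.09: about 9 in 10 alarms are false, so operators learn to ignore them
```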
PROBLEMS WITH AUTOMATION
Trust calibration & distrust
• Distrust of alarms is especially concerning – valid warnings often go unheeded
• Recall: Issues with high FA systems → ignore when actual signal is detected
Examples:
Sorkin (1989)
• Train engineers taping over alert speakers due to typically false alarms
Seagull & Sanderson (2001)
• 42% of alarms heard by anaesthesiology nurses ignored (no action taken)
Wickens et al. (2009)
• 45% of conflict alerts received by air traffic controllers required no action (and no action taken)
PROBLEMS WITH AUTOMATION
Overtrust, complacency, OOTLUF
• Overtrust in automation
o Trust in automation more than is warranted
o A.k.a. complacency/automation bias
o Operator expecting automation to be functioning well → less likely to monitor its job; trust that
machine has done the necessary analysis/calculations
• Causes of complacency:
o Top-down processing – human tendency to let experience guide expectations; top-down dominates bottom-up processing
o Path of least cognitive effort
o Perceived authority or reliability
• E.g., Pilot following advice of flight planning automated system although wrong
• E.g., Flight simulation experiment (Mosier et al., 1992):
o 75% of pilots wrongly shut down engine due to wrong diagnosis & recommendation of automation
o Only 25% of pilots committed same error when using traditional checklist (i.e., checking raw data)
PROBLEMS WITH AUTOMATION
Overtrust, complacency, OOTLUF
• Automation overtrust & overdependence → lead to deskilling
o Ability to manually perform automated task declines over time
o E.g., Skill loss among pilots of highly automated aircraft; mitigated by occasionally hand flying
(Wiener, 1988)
o E.g., Calculators
• OOTLUF – Out-of-the-loop unfamiliarity
o Degraded failure detection (complacency), degraded awareness/diagnosis, and manual skill loss
o Overall system unfamiliarity; become unaware of system states (i.e., out-of-the-loop)
• Dangerous when automation fails and operator has to retake control
• Automation vs. OOTLUF → opposing concerns
o Too much workload vs. too little workload
PROBLEMS WITH AUTOMATION
Workload & situation awareness
• Automation intended to reduce operator workload
o Free mental resources to focus on primary task
o E.g., Automated lane-keeping, blind spot monitors, aircraft alert automation
• In reality, reduces arousal & situation awareness
o Increased automation correlated with decreased SA & workload
PROBLEMS WITH AUTOMATION
Workload & situation awareness
• Automation can undermine situation awareness
o Undermines SA → operator not actively involved in choosing & executing actions
• Similar to “Generation Effect”
• In practice, sometimes causes “clumsy automation”
o Recall: Work overload & underload (i.e., loss of arousal)
o Reduce workload in low-workload periods; increase workload in high-workload periods
• I.e., makes easy tasks easier and hard tasks harder
o Automation failure often occurs in complex situations/problems
• Magnifies the problem; operator has to “re-enter the loop” in more challenging situations
RESOURCES
• Designing for People: An Introduction to Human Factors Engineering (Lee, Wickens, Liu,
& Boyle, 2017)
• Engineering Psychology and Human Performance (4th Ed.) (Wickens et al., 2013)