Human Perception
Pratheeba Jeyananthan
Faculty of Engineering
University of Jaffna
Lecture plan – Chapter 2
Lecture hours: 3 hours
Teaching method: PowerPoint slides and conventional teaching method
Assignments: No assignment
Practical: No practical
References: Human–Computer Interaction: Fundamentals and
Practice by Gerard Jounghyun Kim
Human Information Processing
An effective human-computer interaction interface requires two basic
elements:
Computer factors (software/hardware)
Human behavior
This chapter takes a brief look at some basic human factors that affect this interaction.
To design an interaction model that satisfies the user and all the principles (Chapter 1), the requirements must be investigated, solicited, derived and understood directly from the end user, often through interviews and surveys.
In addition, knowledge of human factors can also be applied in this phase.
Human Information Processing . . .
Based on the underlying theory of HCI, human factors can be largely
divided into:
Cognitive science: explains human capabilities and models of conscious, high-level information processing.
Ergonomics: interprets how external signals are received by the human senses, processed up to the preattentive level, and later acted upon via the motor organs.
Knowledge of human factors helps in designing the HCI in the following ways:
Task/interaction modeling:
Here the steps of an interaction model are formulated based on an idea of how humans might interact to solve and carry out a given task/problem.
The designer's knowledge of cognitive science helps greatly in developing this model.
Prediction, assessment and evaluation of interactive behavior:
Understand and predict how humans might react mentally to various
information-presentation and input-solicitation methods as a basis for
interface selection.
Task modeling and Human problem-solving model
The whole process of interaction can be viewed as a human attempting to solve a problem by applying certain actions to objects to arrive at the final solution.
In general, human problem-solving or information-processing efforts
consist of these important parts:
Sensation and perception: sensation picks up external information such as visual, aural or haptic stimuli; perception interprets this information and extracts its basic meaning.
Memory: This stores temporary and short-term information or
long-term knowledge. This knowledge includes information about the
external world, procedures, rules, relations, schemas, candidates of
actions to apply, current objective, the plan of action, etc.
Decision maker/executor: formulates and revises a plan, decides what to do based on the knowledge held in memory, and finally acts it out by commanding the motor system (e.g., clicking the left mouse button).
Flow of the whole process
The figure above shows the overall process as a flowchart.
The top goal is the defined problem that needs to be solved.
In the next step, the main goal is broken into subgoals to formulate a hierarchical plan.
Defining subgoals
Generally a number of subtasks are identified in the hope of solving
the individual subgoals considering the external situation.
By enacting the series of these subtasks to solve the subgoals, the top
goal is eventually accomplished.
There might be failures in the subtasks.
Hence the whole process is repeated after observing the resulting situation.
Hierarchical task plan
This is equivalent to the hierarchical goal plan.
The figure above is an example showing the simple task of changing the font.
Generally in this task model, certain subtasks must be applied in series, while a few may be applied concurrently.
The interaction model must match the user's mind as much as possible.
The fulfillment of the overall task depends on this match with the user's mental model.
Ergonomics, user preference and other requirements determine the
interface selection.
Finally, the subtask structure can lend itself to the menu structure.
Actions and objects to which the actions apply can serve as the basis
for an object-oriented diagram.
This diagram will help in implementing object-oriented interactive software.
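As an illustration, a hierarchical task plan can be captured directly in a small data structure. The sketch below (with illustrative task names, not taken from the lecture figure) represents the font-changing task as a tree of subtasks and flattens it into a menu-like outline.

```python
# A minimal sketch (illustrative names): represent the "change font" task
# plan as a tree of subtasks, then flatten it into a menu-style outline.

class Task:
    def __init__(self, name, subtasks=None):
        self.name = name
        self.subtasks = subtasks or []   # ordered list of child Tasks

    def as_menu(self, depth=0):
        """Print the task tree as an indented menu outline."""
        print("  " * depth + self.name)
        for sub in self.subtasks:
            sub.as_menu(depth + 1)

change_font = Task("Change font", [
    Task("Select text"),
    Task("Open font dialog", [
        Task("Choose typeface"),
        Task("Choose size"),
    ]),
    Task("Apply and confirm"),
])

change_font.as_menu()
```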
Human reaction and prediction of cognitive performance
To some extent we can predict how humans will react and perform for
a particular design.
Two aspects of human performance can be considered: cognitive and ergonomic.
Gulf of execution/evaluation: This explains how users can be left
confused by an interactive system.
If an interactive system does not offer the expected actions or results in an unexpected state, then the interface is based on an ill-modeled interaction.
Before using an interactive system, a user first forms a mental model equivalent to the hierarchical action plan of the task.
The gap between this mental model and the actual system creates the "gulf".
If the two match, the interaction will be fluid.
Human reaction and prediction of cognitive performance
...
Another factor influencing interactive performance is memory capacity.
There are two types of memory in the human cognitive system: short
term (working memory) and long term.
Humans are known to hold only about eight chunks in short-term memory, and only for a very short amount of time.
Hence an interface cannot rely on the human’s short-term memory
beyond this capacity for fast operation.
Imagine an interface with a large number of options or menu items.
This makes the decision very hard – the user has to scan through the items many times to remember them.
Proper design approaches are needed to handle such issues.
Retrieving information from long-term memory is difficult and
relatively time-consuming.
Predictive Performance Assessment
Many cognitive activities have been analyzed in terms of their typical approximate processing times – encoding information into long-term memory, responding to a visual stimulus and interpreting its content, etc.
Based on the task-sequence model and such figures, one can quantitatively estimate the time taken to complete a given task and check it against the original performance requirements.
GOMS (Goals, Operators, Methods and Selection rules) is an evaluation method used for this purpose.
Time estimation, example
GOMS
This starts with hierarchical task modeling; see the example on the previous slide.
Once the subtasks are derived, you can map a specific operator (from the figure) to each of the subtasks.
Original GOMS model was developed mainly for the desktop
computing environment, with performance figures for mouse clicks,
keyboard input, hand movement and mental operators.
The model is almost 30 years old yet still applies despite technological advancements – human capabilities have mostly remained the same.
There are proposed GOMS models for other computing environments
as well.
GOMS is quite simple in that it evaluates only task-completion performance.
It also estimates mental operations, which means there may be inaccuracies in the calculation.
Moreover, there are other criteria by which performance can be evaluated.
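As a rough illustration of a GOMS-style estimate, the sketch below sums keystroke-level operator times for a short task sequence. The operator values are commonly cited approximate figures from the keystroke-level model, not the numbers on the lecture slide, so treat them as assumptions.

```python
# A minimal keystroke-level (KLM/GOMS-style) sketch. The operator times are
# commonly cited approximate values, not the figures from the lecture slide.
OPERATOR_TIME_S = {
    "K": 0.20,   # press a key or button
    "P": 1.10,   # point with the mouse to a target
    "H": 0.40,   # move hand between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def estimate(sequence):
    """Sum operator times for a task expressed as a sequence of operators."""
    return sum(OPERATOR_TIME_S[op] for op in sequence)

# "Change font": think, move hand to mouse, point to menu, click,
# point to font name, click.
change_font = ["M", "H", "P", "K", "P", "K"]
print(f"Estimated time: {estimate(change_font):.2f} s")
```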
Sensation and Perception of Information
Now we focus on raw information processing.
First, the input side: the human sensory system (with at least five senses).
Modalities of visual, aural, haptic (force feedback) and tactile are
known to be relevant to HCI.
Taking in external stimulation or raw sensory information (possibly computer generated) and processing it for perception is the first part of any HCI.
Naturally the information must be supplied within the bounds of human perceptual capabilities.
Another aspect here is attention – how to make the user selectively tune in to a particular part of the information or stimulation.
Attention-grabbing stimuli can be used for alerts, reminders, highlighting of prioritized/structured information, guidance, etc.
Visual
The process of sensation and perception, and the associated human capabilities, can be examined for four major cases: visual, aural, tactile/haptic and multimodal interaction.
Visual modality is by far the most important information medium.
Over 40% of the human brain is said to be involved with the
processing of visual information.
The parameters of the visual interface design and display system have to conform to the capacity and characteristics of the human visual system.
Below are a few important properties of the human visual system and their implications for interface design.
1. Visual and display parameters
Field of view (FOV): the angle subtended by the area visible to the human user in the horizontal or vertical direction. Human FOV is nearly 180°.
Viewing distance: the perpendicular distance to the surface of the display. Viewing distance may change with user movements, but one can usually define a nominal, typical viewing distance for a given operating environment.
Display field of view: This is the angle subtended by the display
area from a particular viewing distance.
Pixel: a display is composed of an array of small rectangular areas called pixels.
Display resolution: Number of pixels in the horizontal and vertical
directions for a fixed area.
Visual acuity: the resolution perceivable by the human eye from a fixed distance. This is closely related to the power of sight, which differs between people and age groups.
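These parameters can be related numerically. The sketch below computes the display FOV and the pixel density in pixels per degree from an assumed display size, resolution and viewing distance; the comparison point of roughly 60 pixels per degree for 20/20 acuity (about one arcminute per pixel) is a commonly cited figure, not taken from the lecture.

```python
import math

# A minimal sketch: relate display size, resolution and viewing distance to
# the display field of view and pixel density. Example numbers (a 24-inch-class
# monitor viewed at 60 cm) are assumptions.
display_width_cm = 53.0
horizontal_pixels = 1920
viewing_distance_cm = 60.0

# Display FOV subtended at the eye (degrees).
fov_deg = 2 * math.degrees(math.atan(display_width_cm / (2 * viewing_distance_cm)))

# Pixels per degree of visual angle, compared with the ~60 px/deg often
# associated with 20/20 visual acuity (about one arcminute per pixel).
ppd = horizontal_pixels / fov_deg
print(f"Display FOV ~ {fov_deg:.1f} deg, ~{ppd:.0f} pixels per degree")
```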
Horizontal field of view
Display parameters and display
Human visual parameters and display parameters need to be matched to each other as much as possible for a comfortable and effective visual display environment.
E.g.:
display FOV to human FOV.
Display resolution/object size to visual acuity
The display FOV is more important than the absolute size of the
display.
It is desirable to choose the most economical display – not necessarily
biggest or the one with the highest resolution, with respect to the
requirement of the task and the typical user characteristics.
2. Detail and peripheral vision
The human eye contains two types of cells, cones and rods, that react to light intensities in different ways.
Cones, responsible for color and detail recognition, are distributed heavily in the center of the retina; this region subtends only about 5° of the human FOV and roughly establishes the area of focus.
Details perceived through these cells fall within the oval region of the figure.
Rods are responsible for motion detection and less detailed peripheral vision.
Rods contribute to our awareness of the surrounding environment.
Displays differ from human perception in that they have uniform resolution.
Detail and peripheral vision . . .
If the object details can be adjusted depending on where the user is
looking or based on what the user may be interested in, the overall
rendering of the image can be made more economical.
This can be used to doubly emphasize certain objects relative to
others in their neighborhood.
From a nominal viewing distance, only a small portion of the large
display will correspond to the foveal area (area related to cones).
Effectiveness of a high-resolution display:
While viewing one portion, the resolution of the other major parts of a large display is wasted.
Hence, it can be argued that a smaller high-resolution display placed at a close distance is more economical than a large tiled high-resolution display.
Example: IllumiRoom, a display system from Microsoft Research in which a high-resolution display is used in the middle and a wide, low-resolution peripheral projection provides high immersion.
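A small calculation makes the argument concrete: the sketch below estimates how wide the roughly 5° foveal region is on the screen surface at an assumed nominal viewing distance.

```python
import math

# A small sketch: physical span of the ~5 degree foveal region on a display
# surface at a given viewing distance (the distance value is an assumption).
viewing_distance_cm = 60.0
foveal_angle_deg = 5.0

foveal_span_cm = 2 * viewing_distance_cm * math.tan(math.radians(foveal_angle_deg / 2))
print(f"Foveal region spans ~{foveal_span_cm:.1f} cm on the screen")
# At 60 cm this is only ~5 cm wide, so most of a large display lies in
# peripheral vision at any instant.
```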
3. Color, Brightness and Contrast
Brightness: Amount of light energy emitted by the object.
Color: Human response to different wavelengths of light, namely for
those corresponding to red, green, blue and their mixtures. A color
can be specified by the three fundamental colors and also by hue
(wavelength), saturation (relative difference in the major wavelength
and the rest in the light) and brightness value (total amount of light
energy).
Contrast:
Contrast is the relative difference in brightness or color between two
visual objects.
The difference or ratio of the amounts of light energy between objects defines the brightness contrast.
The recommended foreground-to-background brightness ratio is at least 3:1 (a way to check this is sketched below).
Differences or ratios in the dimensions of hue and saturation determine the color contrast.
Brightness contrast seems to be more effective for detail perception than color contrast.
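One common way to check the 3:1 brightness-contrast guideline programmatically is the WCAG relative-luminance and contrast-ratio formulas; the lecture does not prescribe this particular formula, so the sketch below is only one possible realization.

```python
# A sketch of checking the 3:1 brightness-contrast guideline using the WCAG
# relative-luminance and contrast-ratio formulas (one common way to quantify
# brightness contrast).

def relative_luminance(rgb):
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

ratio = contrast_ratio((80, 80, 80), (255, 255, 255))   # grey text on white
print(f"Contrast ratio {ratio:.1f}:1 ->", "OK" if ratio >= 3 else "too low")
```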
4. Pre-Attentive features and High-level diagrammatic
semantics
All the above mentioned raw visual properties such as detail, color,
brightness and contrast are very low level.
All of them are finally consolidated for conscious recognition.
Pre-attentive features might be used to attract our attention.
They are composite, primitive and intermediate visual elements that
are automatically recognized before entering our consciousness –
typically 10ms after entering the sensory system.
These features may rely on the relative differences in color, size,
shape, orientation, depth, texture, motion, etc.
Humans also universally recognize certain high-level complex geometric shapes and properties as a whole and understand the underlying concepts they convey.
Pre-Attentive features and High-level diagrammatic
semantics . . .
The diagram above shows a few universally accepted diagrammatic concepts: connection/relation, dependency, causality, inclusion, hierarchy/structure, flow/process, etc.
Aural
This is the next important mode for information feedback.
It can be roughly divided into three types:
Simple beep-like sounds.
Short symbolic sound bites known as earcons, e.g., a paper-crunching sound when putting a file into the trash.
Relatively longer ”as is” sound feedback.
Aural display parameters:
Intensity (amplitude): refers to the amount of sound energy, which corresponds to volume. It is measured in decibels (dB), where 0 dB is roughly the lowest audible level and about 130 dB the highest.
Frequencies and amplitudes: sound can be viewed as a number of sinusoidal waves with different frequencies and corresponding amplitudes. The dominant frequency components determine various characteristics of sound such as pitch, timbre and direction.
The human audible range is roughly 20 to 20,000 Hz.
Phase: the time difference among sound waves from the same source. Phase differences contribute to the perception of spatialized sound such as stereo.
1. Aural display parameters
When using aural feedback, these parameters should be set properly.
General recommendation:
The sound signal should be between 50 and 5000 Hz.
It should be composed of at least four prominent harmonic frequency components, each within the range 1000-4000 Hz (see the sketch below).
A common use of audible feedback is intermittent alarms.
Overly loud alarms are known to startle the user and lower usability.
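The recommendation above can be turned into an audible example. The sketch below generates a short alert tone from four harmonic components inside the 1000-4000 Hz band using only the Python standard library; the duration, amplitudes and file name are illustrative choices.

```python
import math
import struct
import wave

# A short alert tone built from four harmonic components inside the
# 1000-4000 Hz band recommended above. Duration, amplitudes and the output
# file name are illustrative assumptions.
SAMPLE_RATE = 44100
DURATION_S = 0.3
FUNDAMENTAL_HZ = 1000                       # harmonics at 1, 2, 3 and 4 kHz
HARMONIC_WEIGHTS = [1.0, 0.5, 0.3, 0.2]

samples = []
for n in range(int(SAMPLE_RATE * DURATION_S)):
    t = n / SAMPLE_RATE
    s = sum(w * math.sin(2 * math.pi * FUNDAMENTAL_HZ * (k + 1) * t)
            for k, w in enumerate(HARMONIC_WEIGHTS))
    samples.append(s / sum(HARMONIC_WEIGHTS))   # normalize to [-1, 1]

with wave.open("alert.wav", "wb") as f:
    f.setnchannels(1)                           # mono
    f.setsampwidth(2)                           # 16-bit PCM
    f.setframerate(SAMPLE_RATE)
    f.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```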
2. Other characteristics of sound as interaction feedback
A few differences between aural and visual feedback:
Sound is effectively omnidirectional.
For this reason, sound is most often used to attract and direct user
attention.
Sometimes it can be a task interrupter because of the startle effect.
Making use of contrast is also possible with sound – for example, a 15-30 dB difference in intensity, or a difference in the frequency components, can be used to convey certain information.
Continuous sound is more subject to habituation than stimulation in other modalities.
Only one aural aspect can be interpreted at a time.
3. Aural modality as input method
So far the aural modality has been explained only in the context of
passive feedback.
To make it an active input channel, two major methods are:
Keyword recognition
Natural language understanding
Isolated-word-recognition technology has become robust lately – still, it needs speaker-specific training or a relatively quiet background.
Another related difficulty here is the "segmentation" problem: how to segment individual words or commands out of a continuous stream.
Switching to voice mode is still inconvenient for the ordinary user.
This method is most effective where the hands are totally occupied, there is very little background noise, or voice commands are not mixed with conversation.
Machine understanding of long sentences and natural-language commands is still computationally difficult and demanding.
However, the field is advancing fast, as shown by Apple Siri and IBM Watson.
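At its simplest, keyword recognition as an input method reduces to matching a recognized transcript against a small command vocabulary. The sketch below shows only that matching step; the transcript string stands in for the output of a speech recognizer, and the command names are illustrative.

```python
# A minimal sketch of the keyword-matching step behind voice commands.
# The transcript string stands in for the output of a speech recognizer,
# which is outside the scope of this example.
COMMANDS = {
    "hang up": "end_call",
    "volume up": "increase_volume",
    "volume down": "decrease_volume",
    "call": "start_call",
}

def match_command(transcript):
    """Return the action for the first command keyword found, if any."""
    text = transcript.lower()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None   # no keyword found; ask the user to repeat

print(match_command("please turn the volume up"))   # -> increase_volume
```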
Tactile and haptic
Haptics is defined as the modality that takes advantage of touch by applying forces, vibrations or motions to the user.
In this broad sense, haptics covers both force feedback and touch (tactile).
Here, haptic refers to the modality for sensing force and kinesthetic feedback through our joints and muscles.
Tactile refers to sensing different types of touch (texture, light pressure/contact, pain, vibration and even temperature) through our skin.
1. Tactile display parameters
Tactile resolution: skin sensitivity to physical objects differs across the human body; the fingertip is the site generally used in HCI.
Vibration frequency: rapid movement such as vibration is mostly sensed by the Pacinian corpuscles (endings of sensory nerves), whose response range is roughly 100-300 Hz. For comfortable perception, 250 Hz is known to be optimal.
Pressure threshold: the lightest pressure a human can sense is about 1000 N/m²; for the fingertip this is about 0.02 N (see the sketch below).
The maximum threshold is difficult to measure as it depends on the user.
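A quick check relates the two figures quoted above: multiplying the 1000 N/m² pressure threshold by an assumed fingertip contact area of about 20 mm² gives a force on the order of 0.02 N.

```python
# A short check relating the quoted 1000 N/m^2 pressure threshold to the
# ~0.02 N fingertip force figure. The contact area is an assumed value.
pressure_threshold_pa = 1000.0          # N/m^2
fingertip_contact_area_m2 = 20e-6       # ~20 mm^2, assumed contact patch

min_force_n = pressure_threshold_pa * fingertip_contact_area_m2
print(f"Minimum perceivable force ~ {min_force_n:.3f} N")   # ~0.02 N
```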
2. Haptic display parameters
Important haptic parameters are:
Degrees of freedom: the number of directions in which force or torque can be displayed.
Force range: should be greater than about 0.5 mN.
Operating/interaction range: the movement allowed by the device.
Stability: stability of the supplied force.
The activation force for the joints is between 0.5 and 2.5 mN.
This range would vary according to the age, gender, strength, size,
weight, etc. of the user.
Haptic devices are both input and output devices at the same time.
In its simplest form, a haptic device is a simple electromagnetic latch as used in game controllers; a more complicated one is a robotic kinematic chain, either fixed to the ground or worn on the body.
Haptic display parameters . . .
Stability is one of the important haptic parameters.
This is a by-product of the proper sampling period, which refers to
the time taken to sense the current amount of force at the interaction
point and then determine whether the target value has been reached
and reinforce it.
The ideal sampling rate is about 1000 Hz (i.e., a 1 ms period).
When the sampling rate falls below a certain value, the robotic mechanism exhibits instability and thus lower usability.
Providing a high sampling rate requires a heavy computation load,
not only in updating the output force but also in physical simulation.
Hence a careful satisficing solution is needed to balance the level of
the haptic device performance and the user experience.
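To make the trade-off concrete, the sketch below shows the shape of a roughly 1 kHz haptic update loop: sense the current force, compare it with the target, command a corrected output, and sleep away the rest of the 1 ms period. The device functions are placeholders, not a real device API.

```python
import time

# A minimal sketch of a ~1 kHz haptic servo loop. read_force() and
# output_force() are placeholder functions standing in for a real device API.
TARGET_FORCE_N = 1.5
PERIOD_S = 0.001                      # 1 ms period = 1000 Hz update rate
GAIN = 0.5                            # simple proportional correction

def read_force():
    return 1.2                        # placeholder sensor reading (N)

def output_force(value):
    pass                              # placeholder actuator command

command = 0.0
for _ in range(1000):                 # run the loop for ~1 second
    start = time.perf_counter()
    error = TARGET_FORCE_N - read_force()
    command += GAIN * error           # nudge output toward the target force
    output_force(command)
    # sleep away the remainder of the 1 ms period to hold the update rate
    time.sleep(max(0.0, PERIOD_S - (time.perf_counter() - start)))
```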
Multimodal interaction
Visually oriented interfaces are the most conventional.
However, for various reasons multimodal interfaces are gaining
popularity with the ubiquity of multimedia devices.
By employing more than one modality, interfaces can become more effective in a number of ways:
Complementary: different modalities can assume different roles and act in a complementary fashion to achieve a specific interaction objective. E.g., the arrival of a phone call is announced aurally while the caller's name is shown visually.
Redundant: different modality input methods or feedback can be used to ensure reliable achievement of the interaction objective. For example, a phone-call alert is simultaneously aural and tactile to increase the pick-up probability.
Alternative: providing alternate ways to interact gives people more choices. A phone call can be made either by touching a button or by speaking the callee's name.
For the interface to be effective, feedback must be properly synchronized – the sound and visual feedback for a button touch must occur within a short time of each other.
HCI guidelines
While principles are very general and applicable to wide areas and
aspects of HCI design, guidelines tend to be more specific.
Examples of HCI guidelines
1. Visual Display Layout (General HCI Design):
This problem concerns organizing and allotting relevant information in
one visible screen or scrollable page.
Generally the display layout is organized according to the information content (importance, sequence or functionality), sized manageably (divided into proper sections), attention-grabbing and visually pleasing.
2. Information structuring and navigation (General HCI
design)
A single display is usually not sufficient to encompass all the required information of a UI for a given application.
Structuring the information and making it easy to move among items
is a very important issue for high usability.
Structuring information content and controlling the interface is
closely related to the principle of understanding the task.
By understanding the task, we identify the sequence of subtasks and
actions.
Aside from such internal structure, it is also important to provide
external means and the right UI for fast and easy navigation.
Navigation
Navigation refers to the method used to find information within a
website.
A navigation page is primarily used to help users locate and link to destination pages.
Generally, the navigation scheme and features should allow users to find and access information effectively and efficiently.
Two design patterns (a sketch of the first follows below):
What: put two side-by-side panels on the interface; in the first, show a set of items the user can select from, and in the other show the content of the selected item.
When: presenting a list of objects, categories or events where the user wants to see the overall structure of the list.
What: show each of the application's pages within a single window; as the user drills down through a menu option, replace the content with a completely new page.
When: the application consists of many pages or panels of content for the user to navigate through.
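The first pattern (often called a two-panel selector) can be sketched with Python's built-in tkinter as below; the item names and contents are illustrative.

```python
import tkinter as tk

# A minimal sketch of the first pattern above (often called a "two-panel
# selector") using Python's built-in tkinter. Items and texts are illustrative.
ITEMS = {
    "Inbox": "12 new messages",
    "Drafts": "2 unsent drafts",
    "Archive": "Older conversations",
}

root = tk.Tk()
root.title("Two-panel selector")

listbox = tk.Listbox(root, exportselection=False)
for name in ITEMS:
    listbox.insert(tk.END, name)
listbox.pack(side=tk.LEFT, fill=tk.Y)

content = tk.Label(root, text="Select an item", width=30, anchor="nw")
content.pack(side=tk.RIGHT, fill=tk.BOTH, expand=True)

def show_selection(event):
    # Replace the right-hand panel with the content of the selected item.
    selection = listbox.curselection()
    if selection:
        name = listbox.get(selection[0])
        content.config(text=f"{name}\n\n{ITEMS[name]}")

listbox.bind("<<ListboxSelect>>", show_selection)
root.mainloop()
```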
3. Taking user input
Clever designs for retrieving user inputs will improve the overall
performance – both time and accuracy.
Modern interfaces employ GUI elements (text box, button, menu, etc.), support techniques (auto-completion, deactivating irrelevant options, etc.) and devices to obtain user input in different ways.
It is up to the designer to find the optimal combination of these methods for the best performance with respect to the design constraints.
Guidelines to be used in applying these input methods:
Consistency of data-entry transactions: similar sequences of actions should be used under all conditions.
Minimal input actions by the user: fewer input actions mean greater productivity. Single-key commands, mouse selection, auto-completion or automatic cursor placement can be used rather than typing (an auto-completion sketch follows after this list).
Minimal memory load on users: menus and button choices are more helpful to users than lengthy lists of codes and command strings.
Taking user input . . .
Compatibility of data entry with data display: Both data-entry and
display format of the information should be linked closely.
Clear and effective labeling of buttons and data-entry fields: Use
consistent labeling. Required and optional data entry should be
distinguished. Place labels close to the data-entry fields.
Match and place the sequence of data-entry and selection fields in a
natural scanning and hand-movement direction: Top to bottom or left
to right.
Do not place semantically opposing entry/selection options close together: save and undo are good examples; if they are placed close to each other, the chance of error increases.
Design of form and dialog boxes: Most visual-display layout guidelines
also apply to the design of form and dialog boxes.
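The auto-completion technique mentioned under "minimal input actions" can be as simple as prefix matching against known entries, as in the sketch below (the word list is illustrative).

```python
# A minimal auto-completion sketch for the "minimal input actions" guideline:
# propose completions by prefix match against known entries (illustrative list).
KNOWN_ENTRIES = ["Colombo", "Jaffna", "Kandy", "Kilinochchi", "Kurunegala"]

def complete(prefix, entries=KNOWN_ENTRIES, limit=5):
    """Return up to `limit` entries starting with the typed prefix."""
    p = prefix.casefold()
    return [e for e in entries if e.casefold().startswith(p)][:limit]

print(complete("K"))    # ['Kandy', 'Kilinochchi', 'Kurunegala']
print(complete("ja"))   # ['Jaffna']
```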
4. User with disability
The W3C (World Wide Web Consortium) has led the web accessibility initiative and published the Web Content Accessibility Guidelines (WCAG) 2.0.
These explain how to make web content more accessible to people with disabilities.
Web content generally refers to the information in a web page or web application, including text, images, forms, sounds, etc.
Summary of the guidelines:
Perceivable:
Provide text alternatives for nontext content (a small check for this is sketched after this list).
Provide captions and other alternatives for multimedia.
Make it easier for users to see and hear content.
Create content that can be presented in different ways including by
assistive technologies without losing meaning.
User with disability . . .
Operable:
Make all functionality available from a keyboard.
Give users enough time to read and use content.
Do not use content that causes seizures.
Help users navigate and find content.
Understandable:
Make text readable and understandable.
Make content appear and operate in predictable ways.
Help users avoid and correct mistakes.
Robust:
Maximize compatibility with current and future user tools.
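The first "perceivable" item can be checked mechanically to some extent. The sketch below uses Python's standard html.parser to flag img tags without a text alternative; the HTML snippet is illustrative.

```python
from html.parser import HTMLParser

# A small sketch of checking the first "perceivable" item: flag <img> tags
# that lack a text alternative (alt attribute). The HTML snippet is illustrative.
class AltTextChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                print(f"Missing alt text: {attrs.get('src', '<unknown>')}")

page = '<img src="logo.png" alt="University logo"><img src="chart.png">'
AltTextChecker().feed(page)   # -> Missing alt text: chart.png
```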
5. Mobile devices (Platform type)
Recently, with the spread of smartphones, the usability and user experience of mobile devices and applications have become even more important.
Even though we can apply many conventional principles here, the
following are more specific and important:
Speed of information access – especially with regard to network services.
Minimize typing and leverage the input hardware – buttons, touch, voice, etc.
Fierce task focus – for less confusion in a highly dense information
space.
Large hit targets – for easy and correct selection.
Effective use of screen space – with condensed information.
A similar set of guidelines is available from the Nokia developer's home page:
Shortcuts for frequently used functions.
Keep the user informed of his or her actions.
Follow the device’s interface patterns – positioning of menus and
buttons.
6. Icons for Apple iOS and fonts for Windows XP (Vendor)
Apple's guide on designing and stylizing icons:
Try to balance eye appeal and clarity of meaning in your icon so that it is rich and beautiful and clearly conveys the essence of your app's purpose.
Investigate how your choice of image and color might be interpreted by people from different cultures.
Provide different sizes of your app icon for different devices.
Choice of fonts/sizes for Windows XP or applications based on it:
Franklin Gothic is used only for text over 14-point size. It is used for
headers and should never be used for body text.
Tahoma is used as the system’s default font. Tahoma should be used
at 8-, 9-, or 11-point sizes.
Verdana (bold, 8 point) is used only for title bars of tear-off/floating
palettes.
Trebuchet MS (bold, 10 point) is used only for the title bars of windows.
7. ”Earcon” design for Aural interface (Modality)
Similar to visual icons, an earcon must capture the underlying meaning and draw attention for easy recognition.
Earcons should be designed to be intuitive.
There are three types of earcons – symbolic, nomic and metaphoric.
Symbolic earcons rely on social conventions, e.g., applause for approval.
Nomic earcons are based on physical sounds, such as a door slam.
Metaphoric earcons capture similarities, such as a falling pitch for a falling object.
8. Cell phones in Automobiles (Tasks)
9. E-Commerce (Application)
Kalsbeek has collected very extensive, detailed and structured HCI
guidelines for e-commerce applications.
In total there are 404 guidelines under four groups – general, input/output, UI elements and the checkout process.
Summary
This chapter covers human perception and a few aspects of ergonomics.
Human information processing
Task modeling and human problem solving model
Human reaction and prediction of cognitive performance
Performance assessment
GOMS
Sensation and perception of information
Visual
Aural
Tactile and haptic
Multimodal interaction
HCI guidelines