
VIVA-Tech International Journal for Research and Innovation Volume 1, Issue 5 (2022)

ISSN(Online): 2581-7280
VIVA Institute of Technology
10th National Conference on Role of Engineers in Nation Building – 2022 (NCRENB-2022)

Role of Human-Computer Interaction


Mr. Aditya S Patel¹, Ms. Komal P Gautam²
¹MCA, VIVA Institute of Technology, India
²MCA, VIVA Institute of Technology, India

Abstract: Human-Computer Interaction (HCI) is the way in which humans interact with technologies in business, managerial, and organizational contexts. This article describes the nature and importance of human-computer interaction and directions for future research. HCI has been, and is expected to remain, the subject of a strong research stream in Management Information Systems. The topics covered include the psychology of human-computer interaction, psychologically based design methods and tools, user interface media and tools, and an introduction to user interface architecture.
Keywords - Business, Human-Computer Interaction, Managerial, Organizations, User Interface Architecture

1. INTRODUCTION
The use of computers has always raised the question of the interface. The methods by which humans interact with computers have travelled a long way, and the journey continues: new technologies and systems are being designed in growing numbers, and research in this area has grown rapidly over the last few decades.
The growth of Human-Computer Interaction (HCI) has not only improved the quality of interaction; it has also branched into different research directions over its history. Beyond the design of common interfaces, these branches focus on multimodal rather than unimodal interfaces, intelligent adaptive interfaces rather than command-based ones, and active rather than passive interfaces. The idea of HCI arose naturally with the rapid growth of computing, and that enormous growth has made effective human-computer interaction essential. It matters for the vast number of computer users whose schedules do not allow the lengthy training and experience that was once necessary to take advantage of computing. Increased attention to usability is also driven by competitive pressure for high productivity and reduced training costs.

2. HUMAN-COMPUTER INTERACTION: OVERVIEW

2.1 OVERVIEW OF HUMAN-COMPUTER INTERACTION (HCI)

The advances made over the last few years in Human-Computer Interaction have almost made it impossible to distinguish fiction from reality. The push in research and the constant shifts in marketing make new technology available to everyone. However, not all existing technologies are accessible or affordable to the public.

2.2 DESIGNING FOR HUMAN-COMPUTER INTERACTION (HCI)

Designing a Human-Computer Interaction model is more convoluted than design in many other fields of engineering. It is essentially interdisciplinary, drawing on and influencing diverse areas such as computer graphics, software engineering, human factors, and psychology [15]. The task for developers, making a complex system appear simple and sensible, is itself a difficult and complex one.
Human-Computer Interaction design should consider many aspects of human behavior [1]. The difficulty of involving a human in interaction with machines is sometimes hidden by the apparent simplicity of the interaction method. The development of usable systems draws on technology from user interface media, software architecture, process and data modelling, standards, and tools for modelling, building, and testing user interfaces.

F-238
www.viva-technology.org/New/IJRI
2.3 THE COMPUTER SCIENCE OF HUMAN-COMPUTER INTERACTION (HCI)
As progress in HCI makes user interfaces easier to learn and use, they are becoming more difficult to build. Command-line interfaces were difficult to use but easy to program. Modern direct-manipulation and virtual-environment interfaces are easier to understand and use, but harder to program, largely because they have more execution paths.


The computer science of Human-Computer Interaction studies and develops the abstractions, techniques, languages, and tools to address this problem [2].
An important concept in user interface software is to separate the design of an interactive system into distinct levels [3]. Another distinctive concept is the user interface management system (UIMS): a separate software component that performs all interaction with the user, distinct from the application program that performs the underlying task [4]. It is analogous to a database management system in that it takes a function used by many applications and moves it to a shared subsystem. This approach separates the problem of programming the user interface from each individual application and permits some of the effort of designing tools for human-computer interaction to be amortized over, and shared by, many applications.
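The UIMS separation can be pictured with a minimal sketch (the names and the choice of Python are illustrative assumptions, not from the paper): the application object performs the underlying task and knows nothing about presentation, while a small interaction layer mediates every exchange with the user and could be swapped for a GUI without touching the application.

```python
class CounterApp:
    """Application logic only -- no knowledge of how it is presented."""
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

    def reset(self):
        self.value = 0


class TextUI:
    """A swappable interaction layer; a GUI could replace it unchanged."""
    def __init__(self, app):
        self.app = app
        # Map user commands to application operations.
        self.commands = {"inc": app.increment, "reset": app.reset}

    def handle(self, command):
        if command in self.commands:
            self.commands[command]()
        return f"value = {self.app.value}"


app = CounterApp()
ui = TextUI(app)
print(ui.handle("inc"))    # value = 1
print(ui.handle("inc"))    # value = 2
print(ui.handle("reset"))  # value = 0
```

Note that `CounterApp` could be reused under any number of interaction layers, which is exactly the amortization the UIMS idea describes.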
Since user testing is an important part of good interface design, techniques for rapidly prototyping and modifying user interfaces are needed. For this purpose, one requires methods for specifying user interfaces precisely, so that the interface designer can describe and study a variety of possible user interfaces, together with automatic generation of prototypes for user testing.
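One way to picture such a precise specification is a small declarative description from which a throwaway prototype can be generated automatically. The format below is purely illustrative, not a tool from the literature:

```python
# A declarative, precise description of an interface...
SPEC = {
    "title": "Login",
    "widgets": [
        {"type": "text", "label": "Username"},
        {"type": "password", "label": "Password"},
        {"type": "button", "label": "Sign in"},
    ],
}

# ...from which a throwaway prototype can be generated automatically.
def render_prototype(spec):
    """Generate a plain-text mock-up of the specified interface."""
    lines = [f"== {spec['title']} =="]
    for w in spec["widgets"]:
        if w["type"] == "button":
            lines.append(f"[ {w['label']} ]")
        else:
            lines.append(f"{w['label']}: ________")
    return "\n".join(lines)

print(render_prototype(SPEC))
```

Because the specification is data rather than code, a designer can edit it and regenerate the prototype immediately, which is the rapid-iteration loop user testing requires.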
In a graphical direct-manipulation style of user interface, a set of objects is presented on a screen, and the user has a repertoire of manipulations that can be performed on them. This means the user has no commands to remember beyond the standard set of manipulations, and a reminder of the available objects and their states is shown continuously on the display. Examples are spreadsheets, the Xerox Star desktop and its descendants such as the Apple Macintosh, and many video games.
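A toy model of this style, under illustrative assumptions (the class names are invented for this sketch), might look like the following: a fixed repertoire of manipulations acts on on-screen objects whose state is continuously available for display.

```python
class Icon:
    """An on-screen object with always-inspectable state."""
    def __init__(self, name, x, y):
        self.name, self.x, self.y = name, x, y
        self.selected = False


class Desktop:
    """The user's entire vocabulary is the fixed set of manipulations."""
    def __init__(self, icons):
        self.icons = {icon.name: icon for icon in icons}

    def select(self, name):
        self.icons[name].selected = True

    def drag(self, name, dx, dy):
        icon = self.icons[name]
        icon.x += dx
        icon.y += dy

    def render(self):
        # The display continuously reflects every object's state.
        return [(i.name, i.x, i.y, i.selected) for i in self.icons.values()]


desk = Desktop([Icon("trash", 0, 0), Icon("doc", 5, 5)])
desk.select("doc")
desk.drag("doc", 1, -2)
print(desk.render())  # [('trash', 0, 0, False), ('doc', 6, 3, True)]
```

There is no command language: `select` and `drag` are the whole vocabulary, and `render` is the continuous reminder of the available objects and their states.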

3. METHODOLOGY

3.1 UNIMODAL HUMAN-COMPUTER INTERACTION (HCI) SYSTEMS

As explained above, an interface mainly relies on a number of inputs and outputs, the communication channels that enable users to interact with the computer through that interface. Each independent single channel is called a modality [19]. Any system that depends on only one modality is known as unimodal. Based on the nature of the modality, such systems can be divided into three categories:
1. Audio-Based
2. Visual-Based
3. Sensor-Based
3.1.1 Audio-Based HCI

Audio-based interaction between a computer and a human is another essential part of a Human-Computer Interaction system. This area deals with information gathered through different audio media. While audio signals may not be as variable in behavior as visual signals, the information gathered from them can be more helpful and, in some cases, a unique source of information. Research in this section can be divided into the following parts:
1. Speech Recognition
2. Speaker Recognition
3. Auditory Emotion Analysis
4. Human-Made Noise/Sign Detection
5. Musical Interaction
Generally, speech [14] and speaker recognition [17] have been the main focus of research. Recent attempts to integrate human emotions into human-computer interaction initiated the analysis of emotions in audio signals. Apart from the tone and pitch of speech, human auditory signs such as a sigh have helped emotion analysis in the design of more intelligent Human-Computer Interaction systems [18].
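As a minimal illustration of audio-based processing (a crude energy threshold, far simpler than the recognizers cited above; thresholds and frame sizes are illustrative assumptions), the following sketch marks frames of a signal as speech or silence:

```python
import math

def frame_energy(samples, frame_len=100):
    """Mean squared amplitude per frame -- a crude loudness measure."""
    return [sum(s * s for s in samples[i:i + frame_len]) / frame_len
            for i in range(0, len(samples) - frame_len + 1, frame_len)]

def detect_speech(samples, threshold=0.01, frame_len=100):
    """Mark each frame as speech (True) or silence (False)."""
    return [e > threshold for e in frame_energy(samples, frame_len)]

# Synthetic signal: 200 near-silent samples, then a 200-sample 440 Hz tone.
silence = [0.001] * 200
tone = [0.5 * math.sin(2 * math.pi * 440 * t / 8000) for t in range(200)]
print(detect_speech(silence + tone))  # [False, False, True, True]
```

Real speech or speaker recognition adds feature extraction and statistical models on top of this kind of segmentation, but the pipeline starts from the same raw channel.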

3.1.2 Visual-Based HCI

Visual-based human-computer interaction is probably the broadest area of Human-Computer Interaction research. Considering the extent of applications and the variety of open problems and approaches, researchers have tried to handle the different aspects of human responses that can be recognized as visual signs. Some of the research in this section is as follows:
1. Facial Expression Analysis
2. Body Movement Tracking
3. Gesture Recognition
4. Gaze Detection
Gaze detection is generally an indirect form of interaction between user and machine, used mainly to better understand the user's attention and focus in sensitive situations. The exception is eye-tracking systems for assisting people with disabilities, in which eye tracking plays the main role, e.g., pointer movement and blinking for clicking


[11][16]. For example, lip reading or lip-movement tracking is known to serve as an influential aid for speech-recognition error correction [7].
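The pointer-movement and blink-to-click idea above can be sketched as a simple event interpreter; the event format and function names here are hypothetical, not from an eye-tracking API:

```python
def interpret_eye_events(events):
    """Translate eye events into pointer actions.

    events: a list of ("gaze", x, y) or ("blink",) tuples.
    """
    pointer = (0, 0)
    actions = []
    for event in events:
        if event[0] == "gaze":
            pointer = (event[1], event[2])      # pointer follows the gaze
            actions.append(("move", pointer))
        elif event[0] == "blink":
            actions.append(("click", pointer))  # a blink selects
    return actions

stream = [("gaze", 10, 20), ("blink",), ("gaze", 30, 5)]
print(interpret_eye_events(stream))
# [('move', (10, 20)), ('click', (10, 20)), ('move', (30, 5))]
```

A production system would also have to debounce involuntary blinks and smooth noisy gaze coordinates, but the mapping from eye events to pointer actions is the core of the assistive use case.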

3.1.3 Sensor-Based HCI

Sensor-based Human-Computer Interaction spans a combination of different applications. What they have in common is that at least one physical sensor is used between user and machine to provide the interaction. These sensors include:
1. Pen-Based Interaction
2. Mouse and Keyboard
3. Motion Tracking Sensor and Digitizer
4. Taste/Smell Sensor
5. Haptic Sensor
6. Pressure Sensor

Pen-based interaction is used in mobile devices and is related to pen gestures [20] and handwriting recognition. Motion tracking sensors have revolutionized the movie, animation, art, and video-game industries. They come in the form of wearable clothing, which has made computers much more able to interact with reality and enabled humans to create their worlds virtually. Figure 1 depicts this sort of device. Haptic and pressure sensors are utilized in robotics and virtual reality [21]. Humanoid robots include many haptic sensors that make them sensitive and responsive to touch [22]. These kinds of sensors are also utilized in the medical field [23].

Fig. 1. Wearable motion capture material for creation of video games (Taken from Operation Sport)
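A minimal sketch of how raw pressure readings might be turned into the discrete touch events such a robot could act on (the thresholds and units are illustrative assumptions, not from [22]):

```python
def classify_touch(reading, light=0.2, firm=1.0):
    """Map a pressure reading (hypothetical newtons) to a touch class."""
    if reading < light:
        return "none"
    if reading < firm:
        return "light-touch"
    return "firm-press"

# A short stream of sensor readings and their interpretation.
readings = [0.05, 0.5, 2.3]
print([classify_touch(r) for r in readings])
# ['none', 'light-touch', 'firm-press']
```

Layering many such sensors over a robot's surface, each feeding a classifier like this, is what makes the skin-like touch awareness described above possible.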

3.2 APPLICATION

3.2.1 EMOTION RECOGNITION MULTIMODAL SYSTEMS


Computers are now expected to be more than simple machines: they should perceive and interpret behavioral cues. A natural human-computer interaction cannot be based on stated commands alone. Computers will therefore be programmed to detect various behavioral signals, a unique piece of the puzzle that one has to put together to accurately guess a person's intentions and future behavior.

People can make predictions about a person's emotional state by observing their face, body, and voice. Studies suggest that if one had access to only one of these modalities, the face modality would give the best predictions; however, accuracy can be improved by 35% when human judges are given access to both the face and body modalities together [9]. This suggests that affect recognition, which has focused largely on facial expressions, can benefit greatly from multimodal fusion techniques.
One of the few works that attempted to integrate more than one modality for recognition is [9], in which facial features and body-posture features are combined to produce an indicator of a person's frustration. Another work that integrated the face and body modalities is [9], in which the authors confirmed that, as with human judges, machine
classification of emotion is better when based on combined face and body information rather than on either modality alone. The authors of [24] fused facial and vocal information for more powerful recognition. Consistent with human judges, machine classification of emotion as neutral, sad, angry, or happy was most accurate when the facial and vocal data were combined.

They recorded four emotions, i.e., sadness, anger, happiness, and the neutral state. Detailed facial motions were captured together with simultaneous speech recordings. The experiments showed that the performance of the facial-expression-based system exceeded that of a system based on aural information alone.
The results show that an emotion recognition system based on aural information alone gives an overall performance of 70.9 percent, compared with an overall performance of 85 percent for a recognition system based on facial expressions. In particular, the cheek areas were found to provide important information for emotion classification.
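The combination of facial and vocal information described above is often realized as late fusion of per-modality scores. The following sketch, with illustrative weights and class probabilities (not the actual method or numbers of [24]), shows the idea:

```python
# Illustrative late fusion: take a weighted average of per-class scores
# from a face model and a voice model, then pick the most likely emotion.

EMOTIONS = ["neutral", "sad", "angry", "happy"]

def fuse(face_scores, voice_scores, face_weight=0.6):
    """Weighted average of two per-class score dicts."""
    return {e: face_weight * face_scores[e] + (1 - face_weight) * voice_scores[e]
            for e in EMOTIONS}

def classify(face_scores, voice_scores):
    fused = fuse(face_scores, voice_scores)
    return max(fused, key=fused.get)

# Hypothetical per-modality outputs for one observation.
face = {"neutral": 0.1, "sad": 0.2, "angry": 0.6, "happy": 0.1}
voice = {"neutral": 0.2, "sad": 0.5, "angry": 0.2, "happy": 0.1}
print(classify(face, voice))  # angry
```

Here the face evidence for "angry" outweighs the voice evidence for "sad"; fusing before deciding is what lets one modality compensate for ambiguity in the other.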

4. CONCLUSION

Human-Computer Interaction is an essential part of systems design. The quality of a system depends on how it is presented to users and on their experience of it. New directions of research have replaced common methods of interaction with intelligent, adaptive, multimodal, and natural methods. Ambient intelligence seeks to embed technology into the environment so as to make it more natural and, at the same time, invisible. Virtual reality is also an advancing field of Human-Computer Interaction and may become the common interface of the future. This paper has attempted to give an overview of these issues and to provide a survey of existing research through an extensive reference list.

Acknowledgments

We would like to express our sincere thanks to our guide for providing us with a great deal of help and support and for encouraging us to work diligently on every aspect of our research paper. Her views have always provided a perfect balance between encouragement and constructive criticism, and her tips and suggestions helped us decide on the correct approach to the research paper. We also sought help from a variety of individuals at various stages of the work.

REFERENCES
[1] Wiklund, M. E. (1994) Usability in Practice: How Companies Develop User-Friendly Products, Cambridge, MA: Academic Press.
[2] Myers, B. A. (1989) “User-interface Tools: Introduction and Survey,” IEEE Software, vol. 6(1), pp. 15-23.
[3] Foley, J.D., van Dam, A., Feiner, S.K., and Hughes, J.F. (1990) Computer Graphics: Principles and Practice, Reading, MA: Addison-Wesley.
[4] Olsen, D.R. (1998) Introduction to User Interface Software, San Mateo, CA: Morgan Kaufmann.
[5] I. Cohen, N. Sebe, A. Garg, L. Chen and T.S. Huang, “Facial expression recognition from video sequences: temporal and static modeling”, Computer Vision and Image Understanding, 91(1-2), pp 160-187 (2003).
[6] B. Fasel and J. Luettin, “Automatic facial expression analysis: a survey”, Pattern Recognition, 36, pp 259-275 (2003).
[7] P. Rubin, E. Vatikiotis-Bateson and C. Benoit (eds.), “Special issue on audio-visual speech processing”, Speech
Communication, 26, pp 1-2 (1998).
[8] M. Pantic, A. Pentland, A. Nijholt and T. Huang, “Human computing and machine understanding of human behavior: a
survey” Proceedings of the 8th International Conference on Multimodal Interfaces, Banff, Alberta, Canada, pp 239-248
(2006).
[9] A. Kapoor, W. Burleson and R.W. Picard, “Automatic prediction of frustration”, International Journal of Human-Computer
Studies, 65, pp 724-736 (2007).
[10] Boff, K. R. and Lincoln, J. E. (1988). Engineering Data Compendium: Human Perception and Performance vols l-3. Harry
G. Armstrong Aerospace Medical Research Laboratory, Wright-Patterson Air Force Base, Ohio.
[11] J.K. Aggarwal and Q. Cai, “Human motion analysis: a review”, Computer Vision and Image Understanding, 73(3), pp 428-
440 (1999).
[12] D. Te’eni, J. Carey and P. Zhang, Human Computer Interaction: Developing Effective Organizational Information Systems,
John Wiley & Sons, Hoboken (2007).
[13] D. Norman, “Cognitive Engineering”, in D. Norman and S. Draper (eds), User Centered Design: New Perspective on Human-
Computer Interaction, Lawrence Erlbaum, Hillsdale (1986).
[14] L.R. Rabiner, Fundamentals of Speech Recognition, Prentice Hall, Englewood Cliffs (1993).
[15] Butler, K. A. (1995) Usability Engineering. In: A. Kent & J. Williams (eds.) Encyclopedia of Computer Science & Technology, v. 33. New York: Marcel Dekker.
[16] A.T. Duchowski, “A breadth-first survey of eye tracking applications”, Behavior Research Methods, Instruments, and
Computers, 34(4), pp 455-470 (2002).
[17] A.T. Duchowski, “A breadth-first survey of eye tracking applications”, Behavior Research Methods, Instruments, and
Computers, 34(4), pp 455-470 (2002).
[18] M. Schröder, D. Heylen and I. Poggi, “Perception of non-verbal emotional listener feedback”, Proceedings of Speech Prosody
2006, Dresden, Germany, pp 43-46 (2006).
[19] A. Jaimes and N. Sebe, “Multimodal human computer interaction: a survey”, Computer Vision and Image Understanding,
108(1-2), pp 116-134 (2007).
[20] S.L. Oviatt, P. Cohen, L. Wu, J. Vergo, L. Duncan, B. Suhm, J. Bers, T. Holzman, T. Winograd, J. Landay, J. Larson and D.
Ferro, “Designing the user interface for multimodal speech and pen-based gesture applications: state-of-the-art systems and
future research directions”, Human-Computer Interaction, 15, pp 263-322 (2000).
[21] G. Robles-De-La-Torre, “The importance of the sense of touch in virtual and real environments”, IEEE Multimedia 13(3), Special issue on Haptic User Interfaces for Multimedia Systems, pp 24-30 (2006).
[22] D. Göger, K. Weiss, C. Burghart and H. Wörn, “Sensitive skin for a humanoid robot”, Human-Centered Robotic Systems
(HCRS’06), Munich, (2006).
[23] C. Burghart, O. Schorr, S. Yigit, N. Hata, K. Chinzei, A. Timofeev, R. Kikinis, H. Wörn and U. Rembold, “A multi-agent system
architecture for man-machine interaction in computer aided surgery”, Proceedings of the 16th IAR Annual Meeting, Strasburg,
pp 117-123 (2001).
[24] C. Busso, Z. Deng, S. Yildirim, M. Bulut, C.M. Lee, A. Kazemzadeh, S. Lee, U. Neumann and S. Narayanan, “Analysis of
emotion recognition using facial expressions, speech and multimodal information”, Proceedings of the 6th International
Conference on Multimodal Interfaces, State College, PA, USA, pp 205-211 (2004).
