
NLP Assignment 1

Natural Language Processing (NLP) focuses on enabling computers to intelligently process human language, evolving from early rule-based systems to modern machine learning and deep learning techniques. Key areas of NLP include Natural Language Understanding (NLU) and Natural Language Generation (NLG), with applications such as chatbots and virtual assistants. The development of NLP leverages linguistic principles and statistical methods to improve language comprehension and generation capabilities.


Introduction

Natural Language Processing (NLP) is a field of research concerned with the development of computer programs that can process natural language intelligently [1]. Language is the primary means of interaction among humans, and enabling computers to understand and recognize human language is highly beneficial. Because computers play an important role in the preparation, storage, analysis, and transmission of information, giving them the ability to comprehend and process information presented in natural language is of paramount importance. The purpose of NLP is therefore to design and build computer systems capable of analyzing natural languages (such as English, French, etc.) and of producing output in natural language [1]. However, developing tools that can understand, analyze, and generate human language the way humans do is far more complex than it may initially appear, and NLP has accordingly undergone continuous development over the years.

The history of NLP begins in the early days of Artificial Intelligence (AI) research. In the 1960s, Professor Joseph Weizenbaum of the Massachusetts Institute of Technology developed ELIZA (Weizenbaum, 1966), the first chatbot in the history of computer science [2]. ELIZA used a simple form of pattern matching to respond to certain keywords and phrases, and although its capabilities were quite limited by today's standards, its impact was unmistakable. ELIZA was a rule-based system. Rule-based models use a set of predefined rules to analyze and understand text. These rules are typically created by linguistic experts and can be used to identify parts of speech, extract entities, and perform other language-processing tasks [3].
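The idea of ELIZA-style pattern matching can be sketched in a few lines. The rules below are illustrative inventions for this sketch, not Weizenbaum's original script:

```python
import re

# A few ELIZA-style rules: each pattern is tried against the user's input,
# and any captured group is substituted into the canned response template.
RULES = [
    (re.compile(r"i am (.*)", re.IGNORECASE), "Why do you say you are {0}?"),
    (re.compile(r"i need (.*)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r".*\bmother\b.*", re.IGNORECASE), "Tell me more about your family."),
]
DEFAULT = "Please go on."

def respond(text: str) -> str:
    """Return the response for the first rule whose pattern matches."""
    for pattern, template in RULES:
        match = pattern.match(text)
        if match:
            return template.format(*match.groups())
    return DEFAULT

print(respond("I am sad"))          # Why do you say you are sad?
print(respond("I need a holiday"))  # Why do you need a holiday?
print(respond("Nice weather"))      # Please go on.
```

The sketch makes the limitation visible: the system has no understanding of the text, only a surface-level match on keywords, which is exactly why rule-based approaches struggle beyond narrow domains.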
In many cases, it is not possible to capture the complexity, subjectivity, and nuances of an entire grammar using a set of hard-coded rules. Analyzing patterns in text using statistics and probability theory is an alternative approach that provides more flexibility and can thus better describe natural language [4]. Statistics as simple as word counts or Term Frequency-Inverse Document Frequency (TF-IDF) can be used to represent the relevancy of words in a corpus (i.e., the set of documents being considered). The biggest problem with statistical approaches is that they do not take the semantic similarity between words into account, because each word is represented solely by an importance metric [4].

To address this issue, machine learning was introduced. Recently, machine learning (ML) has become very widespread in research and has been incorporated into a variety of applications, including text mining, spam detection, video recommendation, image classification, and multimedia concept retrieval [5]. ML improves on traditional statistical methods in NLP by enabling models to learn richer, more contextual representations of language rather than relying on surface-level statistics such as word frequency or co-occurrence. Among the different ML algorithms, deep learning (DL) is very commonly employed in these applications [5]. DL has significantly improved upon traditional ML algorithms, especially in handling complex, high-dimensional, and unstructured data such as text, images, audio, and video.

Today, NLP continues to see many improvements. Some of its prominent application areas are information extraction, machine translation, text summarization, automatic question answering, and document analysis [7]. NLP tasks can be divided into two key areas: Natural Language Understanding (NLU) and Natural Language Generation (NLG). Speech recognition is an additional sub-task of NLP used in specific applications such as virtual assistants like Siri, Alexa, and Google Assistant [6].
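The statistical representation mentioned above can be made concrete with a toy TF-IDF computation. The corpus and the exact weighting variant here are illustrative assumptions; production code would typically use a library implementation such as scikit-learn's:

```python
import math

# A minimal TF-IDF sketch over a toy corpus.
corpus = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "dogs and cats are pets",
]
docs = [doc.split() for doc in corpus]

def tf(term: str, doc: list) -> float:
    # Term frequency: relative count of the term in one document.
    return doc.count(term) / len(doc)

def idf(term: str, docs: list) -> float:
    # Inverse document frequency: terms appearing in fewer documents score higher.
    n_containing = sum(1 for d in docs if term in d)
    return math.log(len(docs) / n_containing)

def tf_idf(term: str, doc: list, docs: list) -> float:
    return tf(term, doc) * idf(term, docs)

# "the" occurs in two of the three documents, while "mat" occurs in only
# one, so "mat" is weighted higher in the first document despite "the"
# having a larger raw count there.
print(tf_idf("the", docs[0], docs))
print(tf_idf("mat", docs[0], docs))
```

The example also shows the drawback the text describes: "dog" and "dogs" are unrelated strings under this representation, so their semantic similarity is invisible to the model.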

Natural Language Understanding (NLU)


NLU is the first phase of an NLP application where the computer takes the input directly or
indirectly from the user and attempts to understand the meaning, context, and validity of the text
[6]. The foundation of NLU development is rooted in the theoretical principles of linguistics.
NLU aims to equip machines with the ability to understand human language by leveraging
principles from various linguistic fields. Here are the major branches of linguistics: Phonology,
Morphology, Syntax, Pragmatics, and Semantics.

Phonology: Phonology [6] is the study of how sounds are arranged systematically for a particular
language or its dialect. The organization of sounds, their purpose, and their behavior are analyzed
in this discipline of NLU. Phonology is strongly connected to phonetics, which focuses on the
physical aspects of speech sounds and how people generate and interpret those sounds.

Morphology: Morphology [6] is the study of word formation and of the relations of words to one another in a language. A morpheme is the smallest unit of language that carries meaning. The lexeme refers to the core or root word, while the variations created from it, such as by adding prefixes or suffixes, are known as word-forms. In morphology, words are analyzed by breaking them down into their individual morphemes, examining each part, from affixes (prefixes and suffixes) to the root or stem, to understand how meaning is constructed.
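A toy sketch of this morpheme segmentation follows. The affix lists are small illustrative assumptions; real morphological analyzers rely on dictionaries and far richer rules:

```python
# Naive morpheme segmentation: strip one known prefix and one known
# suffix from a word to expose the stem. Purely illustrative.
PREFIXES = ["un", "re", "dis"]
SUFFIXES = ["ness", "ing", "ed", "ly", "s"]

def segment(word: str) -> list:
    """Split a word into [prefix?, stem, suffix?] morphemes where found."""
    parts = []
    for prefix in PREFIXES:
        if word.startswith(prefix) and len(word) > len(prefix) + 2:
            parts.append(prefix)
            word = word[len(prefix):]
            break
    suffix = None
    for suf in SUFFIXES:
        if word.endswith(suf) and len(word) > len(suf) + 2:
            suffix = suf
            word = word[: -len(suf)]
            break
    parts.append(word)
    if suffix:
        parts.append(suffix)
    return parts

print(segment("walking"))      # ['walk', 'ing']
print(segment("unhappiness"))  # ['un', 'happi', 'ness']
```

Note that the second example leaves the stem as "happi" rather than "happy"; handling such spelling alternations is exactly what makes real morphological analysis harder than affix stripping.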

Syntax: In this phase, the patterns in which sentences are formed are analyzed against the grammar rules of the language [6]. Syntax helps to understand the dependency and order of words in a language. It requires a grammar and a parser to determine the syntactic validity of the text.
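The grammar-plus-parser idea can be sketched minimally, assuming a tiny invented grammar (S → NP VP, NP → Det N, VP → V NP) and lexicon; real parsers handle vastly richer grammars:

```python
# Toy syntactic-validity check: tag each word with the lexicon, then
# test whether the tag sequence matches the single sentence pattern
# licensed by the grammar S -> NP VP, NP -> Det N, VP -> V NP.
LEXICON = {
    "the": "Det", "a": "Det",
    "dog": "N", "cat": "N",
    "chased": "V", "saw": "V",
}

def is_valid_sentence(sentence: str) -> bool:
    tags = [LEXICON.get(w) for w in sentence.lower().split()]
    # Det N V Det N is the flattened form of S -> NP VP here.
    return tags == ["Det", "N", "V", "Det", "N"]

print(is_valid_sentence("the dog chased a cat"))  # True
print(is_valid_sentence("dog the chased cat a"))  # False
```

Both examples use exactly the same words, so only the word order (the syntax) distinguishes the valid sentence from the invalid one.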

Pragmatics: In linguistics, pragmatics is the study of how the meaning of a word is related to its
context [6]. It refers to the study and handling of how context influences the interpretation of
language. Pragmatics focuses on understanding the meaning behind spoken or written language
by examining its implications and context. It looks at how people use language in real-life
conversations, interpreting both direct (literal) and indirect (non-literal) meanings based on the
situation, tone, and social setting.

Semantics: Semantics [6] is the study of the meanings of words and sentences. It differs from pragmatics in that semantics analyzes the conventional meanings of the words themselves, not the context in which the speaker uttered them.

Natural Language Generation (NLG)

NLG is the counterpart of NLU: it is the phase in which the computer produces meaningful phrases and sentences in a human language from an internal representation of information. Where NLU maps language to meaning, NLG maps meaning back into language, for example when a chatbot composes its reply to a user.

Applications of NLP
Chatbots and Virtual Assistants:
Since NLP can understand and generate human language, it can be used to communicate with humans. NLP is widely used in developing chatbots and virtual assistants, as it enables them to understand, process, and respond to human language effectively. Chatbots help customers solve problems, answer commonly asked questions, and provide basic information (Pajila, 2023).

Bibliography

Pajila, D. D. (2023). A Survey on Natural Language Processing and its Applications. Proceedings of the Fourth International Conference on Electronics and Sustainable Communication Systems (ICESC-2023). IEEE.
