NLP Module 5 Notes

Module 5 focuses on pragmatics in NLP, covering key concepts such as discourse and reference resolution, syntactic and semantic constraints, contextual inference, information retrieval, question answering systems, text summarization, and dialogue systems. It emphasizes the importance of understanding meaning in context and the challenges associated with ambiguity and indirect references. The module also discusses practical applications like chatbots and QA systems, highlighting techniques for effective communication and information processing.


Module 5: Pragmatics

- Introduction to pragmatics analysis: Key aspects and challenges

- Discourse and reference resolution

- Coreference resolution and anaphora resolution

- Constraints in meaning interpretation:

- Syntactic constraints

- Semantic constraints

- Contextual inference

- Information retrieval and question answering systems:

- Open-domain QA

- Closed-domain QA

- Text summarization and generation:

- Extractive summarization

- Abstractive summarization

- Dialogue systems and conversational AI:

- Dialogue systems

- Chatbots

- Task-oriented assistants


Define discourse reference resolution. (2 marks)

- Discourse reference resolution identifies what words or expressions refer to in a larger context or
conversation.
- It connects pronouns or noun phrases (like “he” or “this”) to the correct entity.
- It ensures the meaning of a sentence is coherent in the entire text.
- Example: In "Ravi went to the shop. He bought bread", “he” refers to Ravi.

What is a reference phenomenon? (2 marks)

- Reference phenomenon involves how words like “he,” “she,” “it,” or “this” point to actual people or
things in context.
- It helps identify what the speaker or writer is talking about.
- It includes coreference, anaphora, and deixis.
- Example: In "Ravi bought a phone. It was expensive," "it" refers to the phone. (In "It is raining," "it" is a pleonastic (dummy) pronoun that refers to nothing specific, a case resolvers must detect.)

Explain syntactic constraints. (2 marks)

- Syntactic constraints are grammar rules that affect sentence structure and meaning.
- They determine which sentence patterns are acceptable.
- For example, "The cat chased the mouse" follows the subject-verb-object order English requires.
- They help ensure clarity and avoid ambiguity in sentence formation.

What are semantic constraints? (2 marks)

- Semantic constraints are limits based on the meaning of words and their combinations.
- They prevent illogical sentences, like “The table sang a song.”
- They ensure that words used together make sense.
- These constraints guide correct word usage based on meaning.

What is information retrieval? (2 marks)

- Information retrieval (IR) is the process of finding relevant documents or data based on a user query.
- Examples include search engines like Google retrieving web pages.
- It involves indexing, ranking, and matching content to user intent.
- IR helps in locating information quickly from large datasets (a minimal retrieval sketch follows).
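
A minimal sketch of this retrieval loop, assuming scikit-learn is available; the documents and query are made-up examples, and real engines add inverted indexes, stemming, and learned ranking:

# Index a tiny collection with TF-IDF vectors and rank documents by
# cosine similarity to the query (toy example collection).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Pragmatics studies meaning in context.",
    "Information retrieval finds relevant documents for a user query.",
    "Text summarization shortens long documents.",
]

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(documents)          # indexing

query = "find relevant documents"
query_vector = vectorizer.transform([query])               # matching

scores = cosine_similarity(query_vector, doc_vectors)[0]
for idx in scores.argsort()[::-1]:                         # ranking
    print(f"{scores[idx]:.2f}  {documents[idx]}")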

Define a question answering system. (2 marks)

- A Question Answering (QA) system answers natural language questions automatically.


- It finds accurate responses using documents, databases, or the internet.
- QA systems are of two types: open-domain (any topic) and closed-domain (specific field).
- Example: Siri, Alexa, and Google Assistant use QA systems.

What is text summarization? (2 marks)

- Text summarization reduces a long text into a concise version without losing key information.
- It can be extractive (pulling sentences from the original) or abstractive (rephrasing content).
- It helps users understand large content quickly.
- Common in news, legal, and academic documents.

What is the difference between syntactic and semantic constraints? (5 marks)

- Syntactic constraints deal with sentence structure and grammar rules; they define how words must be
ordered for a sentence to be grammatically correct.
- Semantic constraints focus on meaning; they ensure that the word combinations make logical sense.
- Example of syntactic: “He go store” is wrong due to structure.
- Example of semantic: “The rock sang” is grammatically fine but semantically incorrect.
- Syntactic constraints are tied to the grammar of a particular language, while semantic constraints are tied to word meaning and world knowledge.
- Syntax ensures the sentence is formally correct; semantics ensures it conveys real-world logic.
- Both work together to ensure effective communication.
- Violating either may cause confusion or misinterpretation in NLP tasks (a toy check is sketched below).
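
A toy illustration of the two kinds of checks; the word lists and rules below are hand-written assumptions for illustration only, not a real grammar or ontology:

# Syntactic check: a crude word-order test. Semantic check: a selectional
# restriction saying certain verbs need an animate subject.
ANIMATE_NOUNS = {"cat", "boy", "singer"}                   # assumed vocabulary
VERBS_NEEDING_ANIMATE_SUBJECT = {"sang", "slept", "ran"}   # assumed restrictions

def syntactically_ok(tokens):
    # Extremely simplified: expect "the NOUN VERB ..." word order.
    return len(tokens) >= 3 and tokens[0].lower() == "the"

def semantically_ok(tokens):
    subject, verb = tokens[1].lower(), tokens[2].lower()
    if verb in VERBS_NEEDING_ANIMATE_SUBJECT:
        return subject in ANIMATE_NOUNS
    return True

for sentence in ["The singer sang a song", "The rock sang a song", "Sang the song"]:
    tokens = sentence.split()
    if not syntactically_ok(tokens):
        print(sentence, "-> rejected by syntactic constraint")
    elif not semantically_ok(tokens):
        print(sentence, "-> rejected by semantic (selectional) constraint")
    else:
        print(sentence, "-> acceptable")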

Explain the significance of discourse reference resolution. (5 marks)

- It connects pronouns and phrases to their actual referents in text (e.g., “he,” “she,” “this”).
- Maintains coherence across multiple sentences or conversation turns.
- Helps machines understand who or what is being discussed.
- Supports accurate summarization, translation, and chatbot understanding.
- Essential for anaphora resolution, coreference chains, and pronoun linking.
- Example: In “Sara called Emma. She was crying,” resolving “she” is critical.
- Makes dialogue systems and QA systems more human-like.
- Enhances user experience by avoiding ambiguity in responses.

Describe how text summarization is used in NLP. (5 marks)

- Text summarization shortens long documents into brief, meaningful summaries.


- Used in applications like news summarization, academic abstracts, and legal case briefs.
- Two main types: Extractive (selects key sentences) and Abstractive (generates new concise sentences).
- Helps users grasp key points without reading entire texts.
- Improves efficiency in search engines, QA systems, and chatbots.
- Summarization requires understanding context, importance, and redundancy.
- NLP models use techniques like TF-IDF, BERT, and Transformers for summarization.
- Useful in real-time scenarios like news apps, legal tools, and research assistants (an abstractive sketch follows).
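
A minimal abstractive sketch using the HuggingFace Transformers summarization pipeline; the model name is one common pretrained choice (downloaded on first run) and the article text is a made-up example:

from transformers import pipeline

# Abstractive summarization: the model generates new sentences.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

article = (
    "Pragmatics studies how context contributes to meaning. Reference resolution "
    "links pronouns such as 'he' or 'it' to the entities they stand for, which keeps "
    "multi-sentence text coherent and helps QA systems and chatbots follow a conversation."
)

summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])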

Implement a simple question answering system. (5 marks)

- QA systems take user queries and return accurate, relevant answers.


- They are of two types: Closed-domain (e.g., medical QA) and Open-domain (e.g., Google search).
- Use techniques like keyword matching, named entity recognition, and semantic similarity.
- A simple system can use a predefined document and extract answers from it.
- Python libraries like spaCy, NLTK, or HuggingFace Transformers can be used.
- Implementation steps: preprocess text → identify question type → extract answer span.
- Models like BERT can be fine-tuned for answer prediction.
- QA systems are used in virtual assistants, chatbots, and customer service bots (see the sketch below).
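
A minimal sketch of an extractive QA system along the lines of these steps, using the HuggingFace Transformers question-answering pipeline; the model name is one common pretrained choice (downloaded on first run), and the context passage and question are made-up examples:

from transformers import pipeline

# The model predicts an answer span inside the given context passage.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

context = (
    "Ravi went to the shop on Monday. He bought bread and milk, "
    "then returned home before noon."
)
result = qa(question="What did Ravi buy?", context=context)
print(result["answer"], f"(score: {result['score']:.2f})")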

Apply text summarization techniques to a document. (5 marks)

- First, decide between extractive and abstractive summarization.


- Extractive method selects important sentences using scores (e.g., TF-IDF, TextRank).
- Abstractive summarization rewrites content using neural models like T5 or GPT.
- Use libraries like Sumy, spaCy, HuggingFace, or BART.
- Start with preprocessing: remove stopwords, tokenize, and segment sentences.
- Identify the main topic or focus sentence(s).
- Apply summarization and evaluate by comparing with a human-written summary.
- Ensure the final summary is coherent, non-redundant, and contextually accurate (an extractive sketch follows this list).
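
A minimal extractive sketch of these steps, assuming scikit-learn is available: naive sentence splitting, TF-IDF scoring, and selection of the top-scoring sentences in their original order. The document text is a made-up example; real pipelines use proper sentence segmenters (NLTK, spaCy) and TextRank-style scoring.

import re
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

document = (
    "Pragmatics studies meaning in context. Reference resolution links pronouns "
    "to entities. Summarization shortens long documents while keeping key ideas. "
    "It is widely used for news, legal, and academic texts."
)

sentences = re.split(r"(?<=[.!?])\s+", document.strip())   # naive segmentation
tfidf = TfidfVectorizer(stop_words="english").fit_transform(sentences)
scores = np.asarray(tfidf.sum(axis=1)).ravel()             # one score per sentence

top_k = 2
keep = sorted(sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)[:top_k])
print(" ".join(sentences[i] for i in keep))                # summary in original order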

Use discourse reference resolution in a given text. (5 marks)

- Identify pronouns, named entities, and noun phrases that may refer to the same entity.
- Determine coreference chains (e.g., “Ravi” → “he” → “the boy”).
- Tools like spaCy, NeuralCoref, or AllenNLP help resolve references.
- Context matters: sentence position and narrative flow influence resolution.
- Use entity recognition + context windows to link references accurately.
- Helps in summarization, QA, and machine translation tasks.
- Example: “The dog chased the cat. It escaped.” → “It” = “the cat”.
- Accurate resolution improves coherence and reader understanding in NLP outputs (a toy heuristic is sketched below).
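
The snippet below is only a naive illustration of the idea, not a real coreference resolver: it links each pronoun to the nearest preceding noun chunk using spaCy's small English model (assumed installed via python -m spacy download en_core_web_sm). Dedicated tools such as NeuralCoref or AllenNLP handle agreement, salience, and cataphora properly.

import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The dog chased the cat. It escaped.")

# Candidate antecedents: noun chunks that are not themselves pronouns.
chunks = [c for c in doc.noun_chunks if c.root.pos_ != "PRON"]

for token in doc:
    if token.pos_ == "PRON":
        preceding = [c for c in chunks if c.end <= token.i]
        antecedent = preceding[-1].text if preceding else "(unresolved)"
        print(f"{token.text!r} -> {antecedent!r}")          # here: 'It' -> 'the cat'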

Develop a basic dialogue system or chatbot. (5 marks)

- A dialogue system or chatbot interacts with users via natural language.


- Basic components: input understanding, dialogue management, and response generation.
- Use rule-based, retrieval-based, or generative models.
- Python tools: Dialogflow, ChatterBot, Rasa, or transformer-based models.
- Add intent recognition (e.g., “Book ticket”) and entity extraction (e.g., “Mumbai”).
- Maintain context to handle follow-up queries.
- Train using sample dialogues, FAQs, or conversation datasets.
- Chatbots are widely used in customer support, booking, and educational tools (a minimal rule-based sketch follows).
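
A minimal rule-based sketch of these components; the intents, keywords, and city list are made-up examples, and frameworks like Rasa or Dialogflow replace these hand-written rules with trained models and proper dialogue management.

# Keyword-based intent recognition, a tiny entity lexicon, and canned responses.
INTENT_KEYWORDS = {
    "book_ticket": {"book", "ticket", "reserve"},
    "greeting": {"hello", "hi", "hey"},
    "goodbye": {"bye", "goodbye"},
}
KNOWN_CITIES = {"mumbai", "delhi", "pune"}                 # toy entity lexicon

def detect_intent(text):
    words = set(text.lower().split())
    for intent, keywords in INTENT_KEYWORDS.items():
        if words & keywords:
            return intent
    return "fallback"

def extract_city(text):
    words = text.lower().replace("?", "").split()
    return next((w.title() for w in words if w in KNOWN_CITIES), None)

def respond(text):
    intent = detect_intent(text)
    if intent == "greeting":
        return "Hello! How can I help you?"
    if intent == "book_ticket":
        city = extract_city(text)
        return f"Booking a ticket to {city}." if city else "Where would you like to go?"
    if intent == "goodbye":
        return "Goodbye!"
    return "Sorry, I did not understand that."

print(respond("Hi"))
print(respond("Please book a ticket to Mumbai"))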

Last-minute revision cheat sheet for Module 5: Pragmatics in NLP

NLP Module 5 – Pragmatics Cheat Sheet

Pragmatics Overview

• Pragmatics: Study of meaning in context—how speakers and listeners interpret language beyond literal
words.
• Challenges: Ambiguity, context sensitivity, indirect references, and speaker intention.

Discourse & Reference Resolution

• Discourse: Continuous stretch of language (like paragraphs or dialogue).

• Reference Resolution: Linking pronouns/phrases to correct entities.

• Coreference: Identifying multiple expressions referring to the same entity.


Example: “Ravi went home. He was tired.” → He = Ravi

• Anaphora Resolution: Resolving backward references (pronouns → nouns).

• Tools: spaCy, NeuralCoref, AllenNLP.

Constraints in Meaning

• Syntactic Constraints: Based on sentence structure (grammar).


Example: “She go market” → syntactically invalid.

• Semantic Constraints: Based on logical meaning.


Example: “The rock sang a song” → semantically invalid.

Contextual Inference

• Understanding meaning requires context: previous sentences, speaker intent, or world knowledge.

• Important in dialogue systems, QA, and chatbots.

Question Answering (QA) Systems

• Open-domain QA: Any topic (e.g., Google, ChatGPT).

• Closed-domain QA: Specific field (e.g., medical QA).

• Conversational QA: Context-aware, multi-turn QA.

Text Summarization

• Extractive: Selects key sentences from original text.

• Abstractive: Generates new sentences with key ideas.

• Tools: BART, T5, HuggingFace Transformers.

Dialogue Systems & Conversational AI

• Dialogue Systems: Manage multi-turn conversations.

• Chatbots: Rule-based or generative AI (e.g., Rasa, GPT).

• Task-Oriented Assistants: Solve specific problems (e.g., book tickets).


Quick Tips:

• Coreference ≠ Anaphora (anaphora points backward to an earlier mention; coreference links mentions in either direction).

• Use context to disambiguate meaning.

• Syntactic = grammar; Semantic = logic.

• Extractive = original text; Abstractive = reworded.

• QA = Retrieval + Answer Extraction.

• Chatbot = Input → Intent → Response.
