NLP Course Details for B.E. CSE
Course Objectives:
1. To know the basic concepts of natural language processing and language modeling
2. To learn word-level and syntax analysis techniques
3. To understand the working of semantic analysis and discourse processing
4. To become familiar with natural language generation and machine translation
5. To study information retrieval techniques and lexical resources
Course Outcomes:
[CO–PO/PSO mapping table: each CO is mapped against PO 1–12 and PSO 1–3, with its highest cognitive level, delivery mode, mode of assessment components (AC), and weightage for AC; the individual table entries are not legible in this extract.]
Concept Map:
COURSE DESCRIPTION: (As per Mepco Regulation-2015 Syllabus)
REFERENCES:
1. Daniel Jurafsky and James H. Martin, “Speech and Language Processing: An Introduction to
Natural Language Processing, Computational Linguistics and Speech Recognition”, 2nd Edition,
Prentice Hall, 2008.
2. James Allen, “Natural Language Understanding”, 2nd Edition, Benjamin/Cummings Publishing
Company, 1995.
WEB REFERENCES
1. http://nptel.ac.in/courses/106101007/
2. http://nlp.stanford.edu/
3. http://ocw.mit.edu/courses/electrical-engineering-and-computer-science
Course Schedule

S.No | Topic | CO No. | No. of Periods | Reference | Mode of Delivery | Date of coverage
(topic continued from the preceding row) Realization – Discourse Planning
31 | Metaphor | 4 | 1 | Ch 21.2 of R1 | 1 | –
32 | Machine Translation – Syntactic Transformations – Lexical Transfers | 4 | 1 | Ch 21.2 of R1 | 1 | –
33 | Direct/Statistical Translation | 4 | 2 | Ch 21.4 of R1 | 1 | –
Total: 10
Unit V NATURAL LANGUAGE GENERATION AND MACHINE TRANSLATION
34 | Natural Language Generation: Architecture of NLG Systems | 5 | 1 | Ch 7.2 of T | 1 | –
35 | Generation Tasks and Representations | 5 | 2 | Ch 7.3 of T | 1 | –
36 | Applications of NLG | 5 | 1 | Ch 7.4 of T | 1 | –
37 | Machine Translation: Problems in Machine Translation | 5 | 1 | Ch 8.2 of T | 1 | –
38 | Characteristics of Indian Languages | 5 | 1 | Ch 8.3 of T | 1 | –
39 | Machine Translation Approaches | 5 | 1 | Ch 8.4–8.8 of T | 1 | –
40 | Translation involving Indian Languages | 5 | 1 | Ch 8.9 of T | 1 | –
41 | Lexical Resources: WordNet – FrameNet | 5 | 1 | Ch 12.2, 12.3 of T | 1 | –
42 | Stemmers – POS Tagger | 5 | 1 | Ch 12.4, 12.5 of T | 1 | –
43 | Research Corpora | 5 | 1 | Ch 12.6 of T | 1 | –
Total: 11
Total Periods: 52
This module explains the origin and challenges of natural language processing and how Indian
languages are processed. It details the applications of natural language processing and
information retrieval. It introduces language modeling and applies grammar-based and statistical
language models.
LU -6 Language Modeling: Introduction Period: 1
LU Outcomes Level: U CO Number : 1
Analyze different language modeling approaches for natural languages
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Define language modeling. R
2. Write short notes on language modeling approaches. U
3. Differentiate grammar based and statistical language modeling. U
4. How is the structure of a sentence different from the structure of a U
phrase?
LU -7 Various Grammar-based Language Models Periods: 2
LU Outcomes Level: A CO Number : 1
1. Analyze the sentences based on grammar based language models
2. Represent the transformational grammar for the sentences
3. Apply different theories for grammar-based language models
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What is generative grammar and hierarchical grammar? R
2. What are the different levels of representation in government and U
binding?
3. Construct the transformational grammar (TG) representation for the A
sentence “Mukesh was killed”. Then construct the surface and deep
structures for the same sentence.
4. Explain in detail about components and organization of government and U
binding.
5. Explain in detail about X-bar theory. U
6. Construct the NP, VP, AP and PP structure for the sentence “the food in a A
dhaba” using X-bar theory.
7. Discuss in detail theta role and theta criterion. U
8. Write short notes on binding and case theory. U
9. What is meant by lexical functional grammar (LFG)? Give the C-structure U
and f-structure of LFG.
10. What are three conditions of f-structure in LFG? What inputs to an U
algorithm create an f-structure?
11. What is Paninian Grammar? Explain in detail about layered U
representation in Paninian Grammar.
12. What are issues in Paninian Grammar? U
13. Discuss empty-category principle and give two examples of the creation
of empty categories.
14. Find f-structures for the various phrases and the complete sentence, ‘I A
saw Aparna in the market at night’.
15. What are Karaka relations? Explain the difference between Karta and U
Agent.
16. What are lexical rules? Give the complete entry for a verb in the lexicon, U
to be used in LFG.
17. Compare GB and PG. Why is PG called syntactico-semantic theory? U
LU -8 Statistical Language Model Periods: 2
LU Outcomes Level: A CO Number : 1
1. Construct the n-gram model for the sentence
2. Apply smoothing technique for the language model
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What is meant by n-gram model? Give the conditional probabilities and U
chain rule of n-gram model.
2. Construct the bi-gram model for the training set: “The Arabian Knights, A
These are the fairy tales of the east, The stories of the Arabian Knights
are translated in many languages”
3. What is the need for smoothing technique in n-gram language model? U
4. What is Add-one smoothing and Good-Turing smoothing technique? U
5. What is caching technique? Compare caching technique with other U
techniques.
6. What are the problems associated with n-gram model? How are these U
problems handled?
7. How does caching improve the n-gram model? U
8. Create a collection of 10 documents. Write a program to find the A
frequency of bi-grams in this collection. Find the number of bi-grams that
occur more than three times in this collection.
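The bigram constructions asked for in questions 2 and 8 can be sketched in pure Python. The tokenization (lowercasing, `<s>`/`</s>` padding) and the add-one smoothing formula are standard conventions assumed here, not prescribed by the syllabus:

```python
from collections import Counter

def bigram_counts(sentences):
    """Count unigrams and bigrams, padding each sentence with <s> and </s>."""
    uni, bi = Counter(), Counter()
    for s in sentences:
        tokens = ["<s>"] + s.lower().split() + ["</s>"]
        uni.update(tokens)
        bi.update(zip(tokens, tokens[1:]))
    return uni, bi

def bigram_prob(w1, w2, uni, bi, vocab_size, smooth=True):
    """P(w2 | w1); with add-one smoothing: (c(w1,w2) + 1) / (c(w1) + V)."""
    if smooth:
        return (bi[(w1, w2)] + 1) / (uni[w1] + vocab_size)
    return bi[(w1, w2)] / uni[w1] if uni[w1] else 0.0

# Training set from question 2 above
corpus = [
    "The Arabian Knights",
    "These are the fairy tales of the east",
    "The stories of the Arabian Knights are translated in many languages",
]
uni, bi = bigram_counts(corpus)
V = len(uni)
print(bigram_prob("the", "arabian", uni, bi, V, smooth=False))  # MLE: 2/5 = 0.4
```

Counting bigrams that occur more than three times (question 8) is then `sum(1 for c in bi.values() if c > 3)` over a larger document collection.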
This module explains word-level analysis using regular expressions and finite state automata. It
deals with morphological parsing of words and sentences, and performs spelling error detection
and correction using the minimum edit distance algorithm. It gives the details of word classes
and part-of-speech tagging, and deals with syntactic analysis using context-free grammar and
probabilistic parsing.
4. Explain in detail about stemming algorithms for morphological U
processing.
5. Construct the Finite state transducer that accepts two input strings hot A
and cat, and maps them onto cot and bat.
6. Explain in detail about transducer mapping for stem and morphological U
features.
LU-12 Spelling Error Detection and Correction Period: 1
LU Outcome Level: A CO Number:2
Apply minimum edit distance algorithm for spelling error detection and correction
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What typing errors were identified in single error misspellings? U
2. What is meant by optical character recognition (OCR)? R
3. Distinguish between non-word and real-word error. U
4. Compute the minimum edit distance between paecflu and peaceful. A
5. Write a program to find minimum edit distance between two input A
strings.
6. What are the types of error detection and correction in particular U
strings?
7. What is real word error? R
8. What are neural nets? Write the drawbacks of neural nets. U
9. Write the algorithm for minimum edit distance in error detection and A
correction.
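The minimum edit distance program asked for in questions 4, 5 and 9 is a standard dynamic program. This sketch uses unit costs for insertion, deletion and substitution; note that some textbook variants charge 2 for substitution, which changes the numeric answer:

```python
def min_edit_distance(source, target):
    """Levenshtein distance with unit costs for insertion, deletion, substitution."""
    m, n = len(source), len(target)
    # d[i][j] = distance between source[:i] and target[:j]
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i          # delete all of source[:i]
    for j in range(n + 1):
        d[0][j] = j          # insert all of target[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if source[i - 1] == target[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution (or match)
    return d[m][n]

print(min_edit_distance("paecflu", "peaceful"))  # question 4 above: 5 with unit costs
```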
LU-13 Words and Word classes - Part-of Speech Tagging Periods: 2
LU Outcome Level: A CO Number:2
1. Classify the words into part of speech
2. Use different part of speech techniques for sentences
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What is meant by open and closed word classes? R
2. What is a tag set? Write the different tags of the Penn Treebank tag set. U
3. Explain in detail about methods for part of speech tagging. U
4. Apply the rule based tagger for the sentence “The show must go on”. A
5. Consider the sentence “The bird can fly”, using bi-gram model find the A
probability for the sentence.
6. What is transformation based learning (TBL)? Draw the diagram for U
TBL learner.
7. Write an algorithm for TBL tagging in part of speech tagging. A
8. Assume that in a corpus fish is most likely to be a noun A
P(NN/fish) = 0.91
P(VB/fish) = 0.09
Consider the following two sentences and their initial tags and apply
TBL method for solving these sentences.
I/PRP like/VB to/TO eat/VB fish/NNP.
I/PRP like/VB to/TO fish/NNP.
9. Comment on the validity of the following statements: U
a. Rule-based taggers are non-deterministic
b. Stochastic taggers are language independent
c. Brill’s tagger is a rule based tagger.
10. How can unknown words be handled in the tagging process? U
11. Use any tagger available in your lab to tag a text file. Now write a A
program to find the most likely tag in the tagged text.
12. Write a program to find the probability of a tag given previous two tags
i.e. P(t3/t2t1).
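Question 12 asks for a program computing P(t3/t2t1). A minimal sketch, assuming the input is already tagged as (word, tag) pairs; the two-sentence sample below is illustrative only, not drawn from any treebank:

```python
from collections import Counter

def tag_trigram_prob(tagged_sents, t1, t2, t3):
    """Estimate P(t3 | t1, t2) = count(t1,t2,t3) / count(t1,t2).
    Each sentence is a list of (word, tag) pairs."""
    bi, tri = Counter(), Counter()
    for sent in tagged_sents:
        tags = [tag for _, tag in sent]
        bi.update(zip(tags, tags[1:]))
        tri.update(zip(tags, tags[1:], tags[2:]))
    denom = bi[(t1, t2)]
    return tri[(t1, t2, t3)] / denom if denom else 0.0

# Tiny hand-tagged sample (illustrative only)
sents = [
    [("I", "PRP"), ("like", "VB"), ("to", "TO"), ("eat", "VB"), ("fish", "NN")],
    [("I", "PRP"), ("like", "VB"), ("to", "TO"), ("fish", "VB")],
]
print(tag_trigram_prob(sents, "VB", "TO", "VB"))  # 1.0: every (VB, TO) is followed by VB here
```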
LU-14 Syntactic Analysis: Context-free Grammar Period: 1
LU Outcome Level: A CO Number:2
1 Perform syntactic analysis using grammatical arrangements of words in a sentence
2 Construct parse trees using Context Free Grammar
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What is meant by syntactic parsing? R
2. Define context-free grammar. What are the components of a CFG? U
3. Construct the parse tree for the following sentence “Ram reads a book” A
and use the following CFG:
S->NP VP, NP->N, NP->Det N, VP->V NP, VP->V, N->Ram|He,
V->reads|sings|sleeps
4. Consider the sentence “Jatin brought the glass”, construct the tree A
format and perform the syntactic analysis.
5. Construct the two possible parse trees for the sentence, Stolen painting A
found by tree.
LU-15 Constituency Period: 1
LU Outcome Level: A CO Number:2
1. Build phrase structure rules
2. Apply phrase and sentence level constructions for constituency
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What are the phrase structure rules for the noun phrase? R
2. Give the example for following Noun Phrases: A
NP->(Det) (Adj) Noun, NP->(Det) (AP) Noun (PP)
3. Give the example for following Verb and Prepositional Phrases: A
VP->Verb (NP)(NP)(PP)*, PP->Prep (NP)
4. Build the sentence level constructions for the following sentences A
a. Do you have a red pen?
b. Is there a vacant quarter?
c. Is the game over?
d. Can you show me your album?
5. Define feature structures. Give the attribute value matrix (AVM) for U
feature structures.
6. Identify the noun and verb phrases in the sentence, My soul answers in A
music.
LU-16 Parsing Periods: 2
LU Outcome Level: A CO Number:2
1. Apply top down and bottom up parsing for syntactic analysis
2. Solve the ambiguity problem using parsing
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What are the two analyses performed in parsing? U
2. Write short notes on process of natural language understanding. U
3. Explain in detail about parsing algorithms. U
4. Write the top down parses for the sentence “Ram ate the apple” and A
construct the top down parsing tree for the same.
5. Write the bottom up parses for the sentence “Ram ate the apple” and A
construct the bottom up parsing tree for the same.
6. Compare top down parsing and bottom up parsing. U
7. Develop a parse tree structure for the sentence “I heard the story A
listening to the radio”.
8. How to implement the parser? U
9. What are the different levels of syntax identified for the parsing U
technique?
10. Explain in detail about various types of ambiguity. U
11. What is an Earley parser? Write an algorithm for Earley parsing. U
12. What is CYK parser? Write an algorithm for CYK parsing. U
13. Write the CYK algorithm and use it to parse the sentence “Ram wrote an A
essay”.
14. Discuss the disadvantages of the basic top-down parser with the help U
of an appropriate example.
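For question 13, a CYK recognizer needs a grammar in Chomsky Normal Form; the rule set below is an assumption chosen to cover the sentence, not one given in the syllabus:

```python
from itertools import product

# Small CNF grammar (assumed for illustration)
BINARY = {("NP", "VP"): {"S"}, ("V", "NP"): {"VP"}, ("Det", "N"): {"NP"}}
LEXICAL = {"Ram": {"NP"}, "wrote": {"V"}, "an": {"Det"}, "essay": {"N"}}

def cyk(words):
    """CYK recognizer: table[i][j] holds the non-terminals deriving words[i..j]."""
    n = len(words)
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][i] = set(LEXICAL.get(w, ()))
    for span in range(2, n + 1):                 # span length
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):                # split point
                for b, c in product(table[i][k], table[k + 1][j]):
                    table[i][j] |= BINARY.get((b, c), set())
    return "S" in table[0][n - 1]

print(cyk("Ram wrote an essay".split()))  # True: the sentence is derivable
```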
LU-17 Probabilistic Parsing Period: 1
LU Outcome Level: A CO Number:2
1. Perform probabilistic parsing in the grammar
2. Use PCFG for developing algorithms in probabilistic methods
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What is meant by chunking and grammar induction? U
2. What are the three ways of performing parse disambiguation? U
3. What is meant by treebank? How is a Penn Treebank tree constructed? U
4. Differentiate the parsing models and language models. U
5. Explain in detail about weakening the independence assumptions of U
PCFGs.
6. Explain in detail about LC stack parser. U
7. Write brief notes on phrase structure grammars and dependency U
grammars.
8. How to evaluate probabilistic parsing? U
9. Can one combine a leftmost derivation of a CFG with an n-gram model U
to produce a probabilistically sound language model that uses phrase
structure? If so, what kinds of independence assumptions does one
have to make? (If the approach you work out seems interesting, you
could try implementing it!)
10. The second sentence in the Wall Street Journal article referred to at A
the start of the chapter is: The agency sees widespread use of the
codes as a way of handling the rapidly growing mail volume and
controlling labor costs. Find at least five well-formed syntactic
structures for this sentence.
11. Define PCFG. What is the need for that? U
12. What are the terms and notations used in the PCFG? U
13. Write short notes on features of PCFG. U
14. What is CNF? How to apply CNF in PCFG? U
15. Draw the diagrammatic representation for PCFG. U
16. Explain in detail about inside and outside probabilities in PCFG. U
17. How to train the PCFG? U
18. What types of problems are identified in the PCFG? U
19. Construct the lexicalized tree for the following sentences A
a. The boy jumped over the fence.
b. Ram wrote an essay.
20. What does lexicalized grammar mean? How can lexicalization be U
achieved? Explain with the help of suitable examples.
21. List the characteristics of a garden path sentence. Give an example of a U
garden path sentence and show its correct parse.
22. Use the following grammar: A
S->NP VP, S->VP, NP->Det Noun, NP->Noun, NP->NP PP, VP->VP
NP, VP->Verb, VP->VP PP, PP->Preposition NP
Give two possible parses of the sentence: ‘Pluck the flower with the
stick’. Introduce lexicon rules for words appearing in the sentence.
Using these parse trees, obtain maximum likelihood estimates for the
grammar rules used in the tree. Calculate the probability of any one
parse tree using these estimates.
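The last step of question 22, computing a parse-tree probability from rule probabilities, is just the product of the probabilities of every rule used in the tree. A sketch with illustrative rule probabilities (assumed values; in practice they are maximum likelihood estimates count(A->beta)/count(A) over a treebank):

```python
# Illustrative PCFG rule probabilities (assumed, not estimated from data)
RULE_PROB = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("Det", "Noun")): 0.5,
    ("NP", ("Noun",)): 0.5,
    ("VP", ("Verb", "NP")): 1.0,
    ("Noun", ("Ram",)): 0.5,
    ("Noun", ("essay",)): 0.5,
    ("Verb", ("wrote",)): 1.0,
    ("Det", ("an",)): 1.0,
}

def tree_prob(tree):
    """P(tree) = product of P(rule) over every rule used in the tree.
    A tree is (label, [children]); a leaf is a plain string."""
    label, children = tree
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    p = RULE_PROB[(label, rhs)]
    for child in children:
        if not isinstance(child, str):
            p *= tree_prob(child)      # multiply in the subtree's rules
    return p

tree = ("S", [("NP", [("Noun", ["Ram"])]),
              ("VP", [("Verb", ["wrote"]),
                      ("NP", [("Det", ["an"]), ("Noun", ["essay"])])])])
print(tree_prob(tree))  # 1.0 * 0.5 * 0.5 * 1.0 * 1.0 * 0.5 * 1.0 * 0.5 = 0.0625
```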
MODULE III - SEMANTIC ANALYSIS AND DISCOURSE PROCESSING
This module describes the meaning representation of languages and its characteristics. It deals
with lexical semantic relationships and the internal structure of words, and explains various types
of ambiguity with examples. It also explains different methods of word sense disambiguation,
such as supervised, unsupervised and dictionary-based disambiguation, and covers discourse
processing and discourse structures.
5. Evaluate the one sense per discourse constraint on a corpus. Find A
sections with multiple uses of an ambiguous word and determine how
many of them refer to a different meaning.
6. Find out the number of senses of ‘still’, ’bat’, and ‘cricket’ in WordNet. A
7. Create a small corpus from a newspaper. Consider articles from A
specific domains (sports, weather, finance, politics, etc.). Using
WordNet, determine how many senses there are for each of the
open-class words in each sentence.
LU-21 Word Sense Disambiguation Periods: 2
LU Outcome Level: A CO Number:3
1. Use different methods for eliminating ambiguity
2. Compare supervised and unsupervised learning process
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What is meant by ambiguity and disambiguation? Give an example. U
2. What are the different artificial methods applied for disambiguation? U
3. Compare supervised and unsupervised learning. U
4. Define pseudo words. R
5. What is the use of lower bound and upper bound baseline? U
6. Describe Bayes decision rule. R
7. Explain in detail about Bayesian disambiguation algorithm U
8. Explain in detail about information theoretic approach. U
9. The lower bound of disambiguation accuracy depends on how much A
information is available. Describe a situation in which the lower bound
could be lower than the performance that results from classifying all
occurrences of a word as instances of its most frequent sense. (Hint:
What knowledge is needed to calculate that lower bound?)
10. Create an artificial training and test set using pseudo words. Evaluate A
one of the supervised algorithms on it.
11. Explain in detail about Disambiguation based on sense definitions. U
12. Explain in detail about thesaurus-based disambiguation. U
13. How to perform disambiguation based on translations in a second- U
language corpus?
14. Explain in detail about One sense per discourse, one sense per U
collocation.
15. How to find the log likelihood value using EM algorithm in unsupervised U
disambiguation?
16. Download a version of Roget’s thesaurus from the web (see the A
website), and implement and evaluate a thesaurus-based algorithm.
17. Discuss the validity of the “one sense per discourse” constraint for A
different types of ambiguity (types of usages, homonyms etc.).
Construct examples where the constraint is expected to do well and
examples where it is expected to do poorly.
18. Explain in detail Lesk’s and Walker’s algorithms for sense U
disambiguation.
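Question 18 mentions Lesk’s algorithm; its simplified form picks the sense whose dictionary gloss shares the most words with the surrounding context. A toy sketch with made-up glosses (not actual WordNet entries) and an assumed stop-word list:

```python
def simplified_lesk(word, context, glosses):
    """Pick the sense whose gloss shares the most words with the context
    (stop words ignored). `glosses` maps sense name -> gloss string."""
    stop = {"a", "an", "the", "of", "in", "on", "to", "is", "and", "for"}
    ctx = {w.lower() for w in context.split()} - stop
    best, best_overlap = None, -1
    for sense, gloss in glosses.items():
        overlap = len(ctx & ({w.lower() for w in gloss.split()} - stop))
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

# Toy glosses for 'bank' (illustrative only)
glosses = {
    "bank_finance": "an institution that accepts deposits and lends money",
    "bank_river": "sloping land beside a body of water such as a river",
}
context = "she sat on the bank of the river watching the water"
print(simplified_lesk("bank", context, glosses))  # bank_river
```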
LU-22 Discourse Processing: Cohesion Period: 1
LU Outcome Level: A CO Number:3
1. Bind the text together using cohesion
2. Find the various types of reference in sentences
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What is meant by cohesion in discourse processing? U
2. Define reference? Give an example. U
3. Find the different types of reference for the following sentences: A
a. Suha bought a printer. It cost her Rs.20,000
b. I bought a printer today.
c. I bought a printer today. The printer didn’t work properly
d. Zuha forgets her pen drive in lab.
4. What do you mean by ellipsis? Give an example. U
5. What is lexical cohesion in discourse processing? Give example. U
6. Which aspects of discourse are language problems and which are U
general AI/knowledge representation problems?
7. Differentiate between ‘cohesion’ and ‘coherence’. Do you think that a U
non-cohesive text can be coherent? Is it possible for a text to be
cohesive but not coherent? Give appropriate examples.
LU-23 Reference Resolution Periods: 1
LU Outcome Level: A CO Number:3
1. Apply reference resolution algorithm for discourse processing
2. Solve various types of referents using constraints and preferences
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Explain in detail about different types of agreement. U
2. What do you mean by selectional restrictions? Give an example. U
3. Consider the following text: A
Suha saw a laptop in the shop. She enquired about it.
She bought it.
We begin with the first sentence and assign weights to references using
anaphora resolution procedure.
4. Explain in detail about centering and Mitkov’s Pronoun resolution U
algorithms.
5. Draw the architecture of a sequenced model. Explain each step in detail. U
6. Given the sentence, The shopkeeper showed a Dell laptop to Ram and A
he liked it very much, find possible referents for the pronouns ‘he’ and
‘it’, and give the score of these referents for the following antecedent
indicators: definiteness, referential distance, indicating verbs, term
preference, section heading, non-prepositional noun phrase,
collocation.
LU-24 Discourse Coherence and Structure Periods: 2
LU Outcome Level: A CO Number:3
1. Find coherence relations using discourse structure
2. Identify conjunctions for coherence relation
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Identify the different coherence relations for the following sentences: A
a. At 9.00 am, the train arrived at Allahabad. The conference was
inaugurated at 10 a.m.
b. Suha ate all the rice in the bowl. She was very hungry.
c. Suha likes reading novels. Zuha enjoys reading science fiction.
d. Ram does not like cricket. But he likes cricket more than any
other game.
e. Ravi bought a printer today. It is a laser printer.
2. Explain in detail about discourse interpretation. U
3. Consider the following text: A
The local administration stopped the trade union from meeting.
They feared violence.
Construct the interpretation structure for the above sentence.
4. Write short notes on discourse structure and discourse segmentation. U
5. Differentiate between elaboration and explanation relations with the U
help of suitable examples.
6. Give an abductive interpretation of the following text: A
The farmer was worried. There was little rain this year. List all the
assumptions and world knowledge used in this process.
MODULE IV - PRAGMATICS
This module deals with reference resolution and reference phenomena, including pronoun
interpretation and pronoun resolution algorithms. It examines text coherence, discourse
structure and discourse planning, explains dialogue acts and metaphors, and deals with
machine translation and its techniques.
LU-25 Reference Resolution Period: 1
LU Outcome Level: A CO Number:4
1. Apply reference resolution algorithm for discourse processing
2. Solve various types of referents using constraints and preferences
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Explain in detail about different types of agreement. U
2. What do you mean by selectional restrictions? Give an example. U
3. Consider the following text: A
Suha saw a laptop in the shop. She enquired about it.
She bought it.
We begin with the first sentence and assign weights to references using
anaphora resolution procedure.
4. Explain in detail about centering and Mitkov’s Pronoun resolution U
algorithms.
5. Draw the architecture of a sequenced model. Explain each step in detail. U
6. Given the sentence, The shopkeeper showed a Dell laptop to Ram and A
he liked it very much, find possible referents for the pronouns ‘he’ and
‘it’, and give the score of these referents for the following antecedent
indicators: definiteness, referential distance, indicating verbs, term
preference, section heading, non-prepositional noun phrase,
collocation.
7. What is referring expression and referent? R
8. Explain in detail about coreference resolution and pronominal anaphora U
resolution.
9. Implement the Hobbs algorithm. Test it on a sample of the Penn A
TreeBank. You will need to modify the algorithm to deal with
differences between the Hobbs and TreeBank grammars.
LU-26 Reference Phenomena Period: 1
LU Outcome Level: A CO Number:4
Identify five types of referring expressions
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What do you mean by metaphor? U
2. Why is metaphor pervasive? Give an example. U
3. What is meant by conventional metaphor? U
4. Consider the following example: A
The stock exchange wouldn’t talk publicly, but a spokesman said a
news conference is set for today to introduce a new technology product.
Assuming that stock exchanges are not the kinds that can literally talk,
give a sensible account for this phrase in terms of a metaphor or
metonymy.
LU-32 Machine Translation: Syntactic Transformations – Lexical Transfers Period: 1
LU Outcomes Level: A CO Number : 4
1. Perform machine translation for converting text or speech from one language to another
language.
2. Perform syntactic transformations from one language order into another language order.
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Why is machine translation hard? U
2. What is typology? Give example. U
3. Explain in detail about lexical divergences in machine translation. U
4. Construct the Vauquois triangle for machine translation. U
5. Differentiate between syntactic transfer and lexical transfer. U
6. Construct the syntactic transformations from English Order to Japanese A
Order for the sentence “He adores listening to music”
7. Give some informal description of syntactic transformations. U
LU-33 Direct/Statistical Translation Periods: 2
LU Outcomes Level: A CO Number : 4
1. Perform direct translation in bilingual dictionary.
2. Apply noisy channel model in statistical machine translation
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Translate the following sentence from English into Spanish using direct A
translation: “Mary didn’t slap the green witch”.
2. Write the procedure for direct translation with word sense U
disambiguation.
3. Using the formula P (f|e) compute P(Jean aime Marie| John loves Mary). A
4. Explain in detail the problems of the statistical machine translation model. U
5. Explain in detail about noisy channel model of statistical alignment with U
neat diagram.
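Question 3 asks to compute P(Jean aime Marie | John loves Mary) using P(f|e). Under IBM Model 1, P(f|e) = ε/(l+1)^m · Π_j Σ_i t(f_j|e_i), where l and m are the sentence lengths and e is padded with a NULL word. A sketch with assumed (untrained) translation probabilities:

```python
def ibm_model1_prob(f_sent, e_sent, t, epsilon=1.0):
    """IBM Model 1: P(f | e) = epsilon / (l+1)^m * prod_j sum_i t(f_j | e_i).
    `t` maps (foreign word, english word) -> translation probability."""
    e_words = ["NULL"] + e_sent.split()   # l + 1 positions including NULL
    f_words = f_sent.split()              # m foreign words
    prob = epsilon / len(e_words) ** len(f_words)
    for f in f_words:
        prob *= sum(t.get((f, e), 0.0) for e in e_words)
    return prob

# Illustrative translation probabilities (assumed, not trained values)
t = {
    ("Jean", "John"): 0.9,
    ("aime", "loves"): 0.9,
    ("Marie", "Mary"): 0.9,
}
print(ibm_model1_prob("Jean aime Marie", "John loves Mary", t))  # 0.9**3 / 64, about 0.0114
```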
This module deals with the architecture and applications of natural language generation
systems. It also covers the different translation methods, such as direct and statistical
translation, and explains translation involving Indian languages. It also describes lexical
resources, stemmers and POS taggers.
LU-34 Natural Language Generation: Architecture of NLG Systems Period: 1
LU Outcomes Level: U CO Number : 5
1. Use different types of architecture in NLG system
2. Enumerate isolated NLP systems
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Give some examples of interaction with ELIZA. U
2. What are the two actions performed in the ELIZA system? U
3. What are the modules of NLG systems? R
4. Explain in detail about different architecture of NLG systems. U
5. Compare Pipelined, Interleaved and Integrated architecture for NLG U
systems.
6. In what ways does NLG differ from the earlier applications? U
7. How is the interleaved architecture different from integrated U
architecture?
LU-35 Generation Tasks and Representations Periods: 2
LU Outcomes Level: A CO Number : 5
1. Process three NLG tasks and their representations
2. Construct discourse plan of sentences
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Explain in detail about generation task of NLG systems. U
2. Construct the discourse planner for the following sentences: A
Savitha sang a song. The song was good although some people do
not like it.
3. Write short notes on augmented transition network. U
4. Define micro planning and sentence aggregation. R
5. Generate the referring expression for the sentence “Savitha sang a A
song”.
6. Discuss in detail approaches based on systemic grammar and functional U
unification grammar.
7. What is the role of micro-planner in NLG? U
8. Extend the systemic grammar to handle the following sentence: “Savitha A
danced at the party”
9. Extend the FUF grammar to handle the following sentences: A
a. Savitha danced in the party
b. Will Savitha sing a song?
c. Sing a song.
LU-36 Applications of NLG Period: 1
LU Outcomes Level: U CO Number : 5
Use NLG systems for natural language interfaces
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What are the applications of NLG systems? U
2. How is the NLG system used in a weather reporter system? U
3. What are the major tasks performed in NLP applications? U
4. Which NLP version is used to generate the web applications? U
LU-37 Machine Translation: Problems in Machine Translation Period: 1
LU Outcomes Level: A CO Number : 5
1. Perform machine translation for converting text or speech from one language to another
language.
2. Identify the problems in machine translation
LU-38 Characteristics of Indian Languages Period: 1
LU Outcomes Level: U CO Number : 5
1. Categorize Indian languages into different families
2. Identify common characteristics for Indian languages
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. What are the four broad families of Indian languages? U
2. How are Indian languages related to the morphological variants? U
3. What do you mean by language divergence problem? Explain with the U
help of appropriate examples.
4. List characteristics that are common among Indian languages. U
LU-39 Machine Translation Approaches Period: 1
LU Outcomes Level: A CO Number : 5
1. Use different machine translation techniques
2. Perform direct translation in bilingual dictionary.
3. Apply noisy channel model in statistical machine translation
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Translate the following sentence from English into Spanish using direct A
translation: “Mary didn’t slap the green witch”.
2. Write the procedure for direct translation with word sense U
disambiguation.
3. Using the formula P (f|e) compute P(Jean aime Marie| John loves Mary). A
4. Explain in detail about problems of statistical machine model. U
5. Explain in detail about noisy channel model of statistical alignment with U
neat diagram.
6. Consider the following English sentence: A
Ram slept in the garden
Translate this sentence into Hindi using direct translation.
7. Draw and explain schematic diagram for the transfer based model. U
8. Discuss in detail interlingua based machine translation. U
9. Apply the example based machine translation for the following A
sentences:
a. Rohit sings a song
b. Sheela is playing
c. Sheela is singing a song
d. Sheela sings a Hindi song
10. Explain in detail semantic or knowledge based machine translation U
systems.
LU-40 Translation involving Indian Languages Period: 1
LU Outcomes Level: A CO Number : 5
Work with translation of English to Hindi language pair
Possible Assessment Questions: (Rating the level of questions – R, U, A, L, E, C)
Sl.No Test Questions Level
1. Discuss in detail ANGLABHARTI and SHAKTI translation of Indian U
languages.
2. What are the stages performed in SHAKTI implementation? U
3. Write short notes on MaTra language translation. U
4. Explain in detail about MANTRA and Anusaarak language translation. U
5. Write a program that takes an English sentence and reorders it to match A
word order in Hindi.
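Question 5 asks for a program that reorders an English sentence to match Hindi word order. Hindi is SOV, so a naive heuristic is to move the verb group to the end of a POS-tagged sequence; this flat-sequence sketch (with hand-assigned tags) is only illustrative, since a real system would reorder the parse tree:

```python
def reorder_for_hindi(tagged):
    """Naive SVO -> SOV reordering: move verb-tagged words to the end.
    `tagged` is a list of (word, tag) pairs using Penn-style VB* verb tags."""
    verbs = [(w, t) for w, t in tagged if t.startswith("VB")]
    rest = [(w, t) for w, t in tagged if not t.startswith("VB")]
    return [w for w, _ in rest] + [w for w, _ in verbs]

# Hand-tagged input (tags assumed)
sentence = [("Ram", "NNP"), ("ate", "VBD"), ("the", "DT"), ("apple", "NN")]
print(" ".join(reorder_for_hindi(sentence)))  # Ram the apple ate
```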
LU-41 Lexical Resources: WordNet – FrameNet Period: 1
LU Outcomes Level: A CO Number : 5
1. Use lexical resources such as WordNet and FrameNet in NLP
2. Find different relations in WordNet and FrameNet