Applied Learning Assignment 2

The document demonstrates text cleaning techniques using Python, specifically focusing on stemming and lemmatization. It utilizes the NLTK library for stemming with the PorterStemmer and the spaCy library for lemmatization. The code processes a list of words to produce their stems and lemmas, printing the results.


# text cleaning: stemming (NLTK) and lemmatization (spaCy)

from nltk.stem import PorterStemmer
import spacy

words = ["running", "flies", "studies", "easily", "studying", "better"]

# Stemming: strip suffixes with the Porter algorithm
ps = PorterStemmer()
stems = [ps.stem(w) for w in words]
print("Stems:", stems)

# Lemmatization (spaCy): map each token to its dictionary form
nlp = spacy.load("en_core_web_sm")
lemmas = [token.lemma_ for token in nlp(" ".join(words))]
print("Lemmas:", lemmas)
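As a rough intuition for what a stemmer does, a toy suffix-stripping stemmer can be sketched in plain Python. This is not the Porter algorithm — the rule list and the `toy_stem` helper below are invented for illustration only:

```python
# A toy suffix-stripping stemmer -- illustrative only, not the Porter algorithm.
# It removes the first matching suffix from a fixed rule list, leaving at
# least three characters of the word intact.
SUFFIXES = ["ies", "ing", "ly", "es", "s"]

def toy_stem(word: str) -> str:
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

words = ["running", "flies", "studies", "easily", "studying", "better"]
print([toy_stem(w) for w in words])
# → ['runn', 'fli', 'stud', 'easi', 'study', 'better']
```

Note how crude rules produce non-words like "runn"; the real Porter stemmer applies staged rules with measure conditions to avoid many such artifacts, which is why the NLTK version above is preferred in practice.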

# tokenization (spaCy): split text into individual tokens

import spacy

nlp = spacy.load("en_core_web_sm")
text = "Stemming and lemmatization are common text cleaning steps."
tokens = [token.text for token in nlp(text)]
print("Tokens:", tokens)
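For comparison, a basic word tokenizer can also be sketched without spaCy, using only the standard library. The `simple_tokenize` helper below is illustrative, not part of any library:

```python
import re

def simple_tokenize(text: str) -> list[str]:
    # Grab runs of word characters; punctuation is dropped.
    return re.findall(r"\w+", text)

print(simple_tokenize("Text cleaning, stemming, and lemmatization!"))
# → ['Text', 'cleaning', 'stemming', 'and', 'lemmatization']
```

A regex tokenizer like this discards punctuation entirely, whereas spaCy keeps punctuation as separate tokens and handles cases such as contractions, which is why a trained tokenizer is preferred for real text.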

