Tokenizing


Sentence and word tokenization of a user-supplied paragraph

import nltk
from nltk.tokenize import sent_tokenize, word_tokenize

# Download the Punkt tokenizer models; sent_tokenize and word_tokenize
# raise a LookupError if they are missing.
nltk.download('punkt')

example_text = input("Enter the text:  ")

print("Sentence Tokens:")
print(sent_tokenize(example_text))

print("Word Tokens:")
print(word_tokenize(example_text))
