In text mining, tokenizing is the process of

A) categorizing a block of text in a sentence.
B) reducing multiple words to their base or root.
C) transforming the term-by-document matrix to a manageable size.
D) creating new branches or stems of recorded paragraphs.

A

Business
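
As a minimal illustration of the concept behind the answer (not part of the original question), tokenizing splits raw text into individual tokens before later text-mining steps such as building a term-by-document matrix. The sketch below assumes a simple regex-based approach; real text-mining toolkits offer more sophisticated tokenizers.

```python
import re

def tokenize(text):
    # Lowercase the text and split on runs of non-alphanumeric
    # characters, discarding empty strings.
    return [t for t in re.split(r"[^a-z0-9]+", text.lower()) if t]

# Each token is one categorized unit of the sentence.
tokens = tokenize("Text mining turns raw text into structured data.")
print(tokens)
```

Running this prints the list of word tokens extracted from the sentence.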
