Systematic Comparison of Vectorization Methods in Classification Context

Bibliographic Details
Published in: Applied Sciences, Vol. 12, No. 10, p. 5119
Main Authors: Krzeszewska, Urszula; Poniszewska-Marańda, Aneta; Ochelska-Mierzejewska, Joanna
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.05.2022
ISSN: 2076-3417
DOI: 10.3390/app12105119

Summary: Natural language processing has been the subject of numerous studies in the last decade, covering the various stages of text processing, from text preparation through vectorization to final text comprehension. The goal of vector space modeling is to project the words of a language corpus into a vector space in such a way that words similar in meaning lie close to each other. Two approaches to vectorization are currently in common use: the first creates word vectors that take the entire linguistic context into account, while the second creates document vectors in the context of the linguistic corpus of the analyzed texts. The paper presents a comparison of existing text vectorization methods in natural language processing, especially in Text Mining. The vectorization methods are compared by measuring classification accuracy; the Naive Bayes classifier (NBC) and k-nearest neighbors (k-NN) were chosen because they are among the simplest classification methods, which minimizes the influence of the classifier choice itself on the final result. The conducted experiments provide a basis for further research toward better automatic text analysis.
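
As an illustration of the evaluation approach described in the summary, the sketch below compares two vectorization methods by the classification accuracy of two simple classifiers, Naive Bayes and k-NN, using scikit-learn. The specific vectorizers (bag-of-words counts and TF-IDF), the stand-in corpus, and all parameters are assumptions made for illustration; they are not the paper's exact experimental setup.

# Minimal sketch: compare vectorization methods by classification accuracy.
# Simple classifiers (Naive Bayes, k-NN) are used so the classifier choice
# has little influence on the comparison. Dataset, vectorizers and parameters
# are illustrative assumptions, not the paper's actual experiments.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Small public corpus used only as a stand-in for the analyzed texts.
corpus = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"])
X_train_txt, X_test_txt, y_train, y_test = train_test_split(
    corpus.data, corpus.target, test_size=0.3, random_state=42)

vectorizers = {
    "bag-of-words": CountVectorizer(),
    "tf-idf": TfidfVectorizer(),
}
classifiers = {
    "NBC": MultinomialNB(),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}

for vec_name, vectorizer in vectorizers.items():
    # Each vectorization method maps the same texts into its own vector space.
    X_train = vectorizer.fit_transform(X_train_txt)
    X_test = vectorizer.transform(X_test_txt)
    for clf_name, clf in classifiers.items():
        clf.fit(X_train, y_train)
        acc = accuracy_score(y_test, clf.predict(X_test))
        print(f"{vec_name} + {clf_name}: accuracy = {acc:.3f}")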