Latent semantic analysis (LSA): Reduction of dimensions

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

2 Scopus citations

Abstract

After building the vector space model, we can represent and compare any objects of our study. Now we can ask whether the vector space we have built can be improved. This question matters because the vector space model can have thousands of features, and many of these features may be redundant. Is there any way to discard the features that are not important? Latent semantic analysis allows us to construct a new vector space model with a smaller number of dimensions.
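The standard way LSA performs this reduction is a truncated singular value decomposition (SVD) of the term-document matrix: only the k largest singular values and their vectors are kept, yielding a k-dimensional latent space. The sketch below illustrates this with NumPy on a small hypothetical term-document matrix; the matrix values and the choice k = 2 are illustrative assumptions, not data from the chapter.

```python
import numpy as np

# Hypothetical toy term-document matrix: 6 terms x 4 documents.
# (Illustrative counts only; not taken from the chapter.)
A = np.array([
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
], dtype=float)

# SVD factorizes A = U @ diag(s) @ Vt, with singular values
# in s sorted in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Truncate: keep only the k largest singular values,
# i.e. the k most important "latent" dimensions.
k = 2
U_k, s_k, Vt_k = U[:, :k], s[:k], Vt[:k, :]

# Each document is now a point in the reduced k-dimensional space.
docs_k = np.diag(s_k) @ Vt_k   # shape (k, n_documents)
print(docs_k.shape)            # (2, 4)
```

Documents (columns of `docs_k`) can then be compared, e.g. by cosine similarity, in the smaller space instead of the original feature space.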

Original language: English
Title of host publication: SpringerBriefs in Computer Science
Publisher: Springer
Pages: 17-19
Number of pages: 3
DOIs
State: Published - 2019

Publication series

Name: SpringerBriefs in Computer Science
ISSN (Print): 2191-5768
ISSN (Electronic): 2191-5776
