Latent semantic analysis (LSA): Reduction of dimensions

Research output: Chapter in Book/Report/Conference proceeding › Chapter

1 Citation (Scopus)

Abstract

After building the vector space model, we can represent and compare any type of object in our study. Now we can ask whether the vector space we have built can be improved. This question matters because the vector space model can have thousands of features, many of which may be redundant. Is there a way to discard the features that are not important? Latent Semantic Analysis allows us to construct a new vector space model with a smaller number of dimensions.
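The dimension reduction the abstract describes is typically carried out with a truncated singular value decomposition of the term-document matrix. As a minimal sketch (the matrix below is a hypothetical toy example, not data from the chapter), keeping only the `k` largest singular values projects each document into a `k`-dimensional latent space:

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents.
# The counts are invented purely for illustration.
A = np.array([
    [2, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 2, 1, 1],
    [0, 0, 1, 2],
    [1, 0, 0, 1],
], dtype=float)

# Singular value decomposition: A = U @ diag(s) @ Vt,
# with singular values in s sorted from largest to smallest.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k largest singular values: a rank-k approximation
# of A, i.e. a new vector space with k latent dimensions.
k = 2
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T  # one k-dim row per document

print(doc_vectors.shape)  # 4 documents, each now represented in 2 dimensions
```

Documents (and, symmetrically, terms via `U`) can then be compared by cosine similarity in this reduced space, which is the usual LSA workflow.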

Original language: English
Title of host publication: SpringerBriefs in Computer Science
Publisher: Springer
Pages: 17-19
Number of pages: 3
DOI: 10.1007/978-3-030-14771-6_4
State: Published - 1 Jan 2019

Publication series

Name: SpringerBriefs in Computer Science
ISSN (Print): 2191-5768
ISSN (Electronic): 2191-5776

Cite this

Sidorov, G. (2019). Latent semantic analysis (LSA): Reduction of dimensions. In SpringerBriefs in Computer Science (pp. 17-19). (SpringerBriefs in Computer Science). Springer. https://doi.org/10.1007/978-3-030-14771-6_4