An introduction to latent semantic analysis

Bibliographic Details
Published in: Discourse Processes, Vol. 25, no. 2-3, pp. 259-284
Main Authors: Landauer, Thomas K., Foltz, Peter W., Laham, Darrell
Format: Journal Article
Language: English
Published: Taylor & Francis Group, 01.01.1998
ISSN: 0163-853X; 1532-6950
DOI: 10.1080/01638539809545028

More Information
Summary: Latent Semantic Analysis (LSA) is a theory and method for extracting and representing the contextual-usage meaning of words by statistical computations applied to a large corpus of text (Landauer & Dumais, 1997). The underlying idea is that the aggregate of all the word contexts in which a given word does and does not appear provides a set of mutual constraints that largely determines the similarity of meaning of words and sets of words to each other. The adequacy of LSA's reflection of human knowledge has been established in a variety of ways. For example, its scores overlap those of humans on standard vocabulary and subject matter tests; it mimics human word sorting and category judgments; it simulates word-word and passage-word lexical priming data; and, as reported in 3 following articles in this issue, it accurately estimates passage coherence, learnability of passages by individual students, and the quality and quantity of knowledge contained in an essay.
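
The summary describes LSA as a statistical technique that derives word meaning from the aggregate of contexts in which words do and do not occur; in practice this is commonly realized by applying a truncated singular value decomposition to a word-by-passage matrix. The sketch below is a minimal illustration of that idea, not the authors' implementation: the toy corpus, the choice of k = 2 dimensions, and the helper names are assumptions made only for demonstration.

# Minimal LSA sketch (illustrative only): build a word-by-document count
# matrix from a tiny invented corpus, reduce it with a truncated SVD, and
# compare words by cosine similarity in the latent space.
import numpy as np

# Hypothetical toy corpus; a real application would use a large text collection.
docs = [
    "human machine interface for computer applications",
    "a survey of user opinion of computer system response time",
    "the generation of random binary trees",
    "the intersection graph of paths in trees",
    "graph minors a survey",
]

# Vocabulary and word-by-document count matrix.
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
X = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        X[index[w], j] += 1

# Truncated SVD: keep k latent dimensions (k = 2 is an arbitrary choice here).
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
word_vecs = U[:, :k] * s[:k]        # word representations in the latent space
doc_vecs = Vt[:k, :].T * s[:k]      # passage/document representations

def cosine(a, b):
    # Cosine similarity between two latent-space vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Words that appear in similar contexts end up close together, even if they
# never co-occur in the same passage.
print(cosine(word_vecs[index["graph"]], word_vecs[index["trees"]]))
print(cosine(word_vecs[index["graph"]], word_vecs[index["computer"]]))

In this sketch, "graph" and "trees" score higher than "graph" and "computer" because the reduced space captures the shared contexts of the graph-theory passages; the same mechanism underlies the vocabulary-test and passage-similarity results mentioned in the summary.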