Semantic similarity measurement based on knowledge mining: an artificial neural net approach

Bibliographic Details
Published in: International Journal of Geographical Information Science (IJGIS), Vol. 26, No. 8, pp. 1415-1435
Main Authors: Li, Wenwen; Raskin, Robert; Goodchild, Michael F.
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis, 01.08.2012
ISSN: 1365-8816, 1362-3087, 1365-8824
DOI: 10.1080/13658816.2011.635595

More Information
Summary: This article presents a new approach to automatically measure semantic similarity between spatial objects. It combines a description-logic-based knowledge base (an ontology) and a multi-layer neural network to simulate the human process of similarity perception. In the knowledge base, spatial concepts are organized hierarchically and are modelled by a set of features that best represent the spatial, temporal and descriptive attributes of the concepts, such as origin, shape and function. A water body ontology is used as a case study. The neural network was designed, and human subjects' rankings of the similarity of concept pairs were collected for data training, knowledge mining and result validation. The experiment shows that the proposed method achieves good performance, in terms of both correlation and mean standard error, in measuring the agreement between the neural network's predictions and the human subjects' rankings. The application of the similarity measurement to improving relevancy ranking in a semantic search engine is introduced at the end.
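To make the described pipeline concrete, the following is a minimal, hypothetical sketch of the general idea in the summary: each spatial concept is reduced to a feature vector (standing in for attributes such as origin, shape and function), a concept pair is concatenated into one input, and a small multi-layer network is regressed onto human similarity ratings. All concept names, feature values and scores below are invented for illustration, and the sketch uses scikit-learn's MLPRegressor rather than the specific network architecture or water body ontology reported in the article.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical feature vectors for water-body concepts
# (columns loosely standing in for origin, shape, function, ...).
concepts = {
    "lake":      np.array([1.0, 0.2, 0.7, 0.1]),
    "reservoir": np.array([0.2, 0.2, 0.7, 0.9]),
    "river":     np.array([1.0, 0.9, 0.5, 0.1]),
    "canal":     np.array([0.1, 0.9, 0.5, 0.8]),
}

# Illustrative human similarity ratings for concept pairs (0 = dissimilar, 1 = identical).
pairs = [
    ("lake", "reservoir", 0.8),
    ("lake", "river", 0.5),
    ("river", "canal", 0.7),
    ("lake", "canal", 0.3),
]

# Each training example concatenates the two concepts' feature vectors.
X = np.array([np.concatenate([concepts[a], concepts[b]]) for a, b, _ in pairs])
y = np.array([score for _, _, score in pairs])

# A small multi-layer perceptron trained to reproduce the human rankings.
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

# Predict similarity for an unseen concept pair.
query = np.concatenate([concepts["reservoir"], concepts["canal"]]).reshape(1, -1)
print("predicted similarity:", model.predict(query)[0])
```

In this setup, agreement between the model's predictions and held-out human ratings would be evaluated with correlation and mean error statistics, which is the kind of validation the summary refers to.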