Self-similarity for accurate compression of point sampled surfaces

Bibliographic Details
Published in: Computer graphics forum, Vol. 33, no. 2, pp. 155-164
Main Authors: Digne, Julie; Chaine, Raphaëlle; Valette, Sébastien
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd / Wiley, 01.05.2014
ISSN: 0167-7055, 1467-8659
DOI: 10.1111/cgf.12305

Summary: Most surfaces, be it from a fine‐art artifact or a mechanical object, are characterized by a strong self‐similarity. This property finds its source in the natural structures of objects but also in the fabrication processes: regularity of the sculpting technique, or machine tool. In this paper, we propose to exploit the self‐similarity of the underlying shapes for compressing point cloud surfaces which can contain millions of points at a very high precision. Our approach locally resamples the point cloud in order to highlight the self‐similarity of the shape, while remaining consistent with the original shape and the scanner precision. It then uses this self‐similarity to create an ad hoc dictionary on which the local neighborhoods will be sparsely represented, thus allowing for a light‐weight representation of the total surface. We demonstrate the validity of our approach on several point clouds from fine‐arts and mechanical objects, as well as an urban scene. In addition, we show that our approach also achieves a filtering of noise whose magnitude is smaller than the scanner precision.
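The summary describes encoding local neighborhoods sparsely over an ad hoc dictionary. As a hedged illustration of the sparse-coding step only — not the authors' pipeline; the function `sparse_code`, the toy dictionary `D`, and the descriptor `x` below are illustrative assumptions — a minimal greedy matching-pursuit sketch in NumPy:

```python
import numpy as np

def sparse_code(D, x, n_nonzero):
    """Greedy matching pursuit: approximate descriptor x as a sparse
    combination of at most n_nonzero dictionary atoms (columns of D)."""
    residual = x.copy()
    support = []
    for _ in range(n_nonzero):
        # Pick the atom most correlated with the current residual.
        atom = int(np.argmax(np.abs(D.T @ residual)))
        if atom not in support:
            support.append(atom)
        # Re-fit coefficients on the selected atoms (least squares),
        # then update the residual.
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    out = np.zeros(D.shape[1])
    out[support] = coef
    return out

# Toy dictionary: 4 orthonormal atoms in R^8 (QR of a random matrix).
rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.standard_normal((8, 4)))

# A "neighborhood descriptor" built from atoms 0 and 2 ...
x = 1.5 * D[:, 0] - 0.5 * D[:, 2]
# ... is stored with only 2 nonzero coefficients instead of 8 samples.
c = sparse_code(D, x, n_nonzero=2)
```

The compression gain comes from storing only the few nonzero coefficients (and their atom indices) per neighborhood, rather than every raw point sample.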