A robust Graph Transformation Matching for non-rigid registration
| Published in | Image and Vision Computing, Vol. 27, No. 7, pp. 897–910 |
|---|---|
| Main Authors | , , , , , |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 01.06.2009 |
| Subjects | |
| ISSN | 0262-8856; 1872-8138 |
| DOI | 10.1016/j.imavis.2008.05.004 |
| Summary | In this paper, we propose a simple and highly robust point-matching method named Graph Transformation Matching (GTM), which relies on finding a consensus nearest-neighbour graph emerging from the candidate matches. The method iteratively eliminates dubious matches in order to obtain the consensus graph. The proposed technique is compared against both the Softassign algorithm and a combination of RANSAC with the epipolar constraint. Among these three techniques, GTM yields the best results in terms of outlier elimination. The algorithm is shown to handle difficult cases such as duplicated patterns and non-rigid deformations of objects. An execution-time comparison is also presented, in which GTM also proves superior to RANSAC at high outlier rates. To improve the performance of GTM at lower outlier rates, we present an optimised version of the algorithm. Lastly, GTM is successfully applied to the construction of mosaics of retinal images, where feature points are extracted from properly segmented binary images. The proposed method could similarly be applied to a number of other important applications. |
|---|---|
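The abstract only outlines the idea, so the following is a minimal illustrative sketch of the procedure as described there: build a k-nearest-neighbour graph over the matched points in each image and iteratively discard the candidate match whose neighbourhood structure disagrees most between the two graphs, until both graphs coincide. The choice of k, the plain Euclidean k-NN construction, and the row/column-sum removal criterion are assumptions made for illustration; the paper's actual GTM algorithm (for example, its graph construction and final consistency checks) is specified in the full text.

```python
import numpy as np

def knn_adjacency(points, k):
    """Directed k-nearest-neighbour adjacency matrix (no self-loops)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        adj[i, np.argsort(d[i])[:k]] = True
    return adj

def gtm_filter(p, q, k=4):
    """Keep only matches whose k-NN graphs agree in both images.

    p[i] and q[i] are the coordinates of the i-th candidate match in the
    first and second image, respectively.  Returns the indices of the
    surviving matches.  Illustrative sketch only, not the paper's exact GTM.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    idx = np.arange(len(p))
    while len(idx) > k:
        a_p = knn_adjacency(p[idx], k)
        a_q = knn_adjacency(q[idx], k)
        residual = np.logical_xor(a_p, a_q)   # edges present in only one graph
        if not residual.any():                # graphs agree: consensus reached
            break
        disagreement = residual.sum(axis=0) + residual.sum(axis=1)
        worst = int(np.argmax(disagreement))  # most structurally inconsistent match
        idx = np.delete(idx, worst)           # drop it and rebuild the graphs
    return idx
```

A typical call would be `survivors = gtm_filter(pts_left, pts_right, k=4)`, where `pts_left` and `pts_right` are the corresponding candidate feature coordinates produced by an initial descriptor matcher; both array names are hypothetical. Removing one match per iteration and rebuilding both graphs mirrors the iterative elimination of dubious matches described in the summary.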