Fast Asymmetric and Discrete Cross-Modal Hashing With Semantic Consistency
| Published in | IEEE Transactions on Computational Social Systems, Vol. 10, No. 2, pp. 577-589 |
|---|---|
| Main Authors | , , , , |
| Format | Journal Article |
| Language | English |
| Published | Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.04.2023 |
| ISSN | 2329-924X; 2373-7476 |
| DOI | 10.1109/TCSS.2022.3195704 | 
| Summary: | Hashing has attracted widespread attention in supervised cross-modal retrieval because of its advantages in search speed and storage. However, several issues remain to be addressed: 1) how to effectively combine sample and label semantics to learn hash codes; 2) how to reduce the high computational cost of computing a pairwise similarity matrix; and 3) how to effectively solve the discrete optimization problem. To cope with these issues, a fast asymmetric and discrete cross-modal hashing (FADCH) method is proposed in this article. First, matrix factorization is leveraged to collaboratively construct a common semantic subspace across modalities. Second, semantic consistency is preserved by aligning the common semantic subspace with a semantic representation constructed from labels, which effectively exploits the semantic complementarity of labels and samples. Third, we embed labels into the hash codes and preserve the correlation between samples from different modalities via a pairwise similarity matrix. Fourth, we adopt an asymmetric strategy with relaxation to associate the hash codes with the semantic representation, which not only avoids the difficulty of optimizing a symmetric framework but also embeds more semantic information into the Hamming space. In addition, a strong orthogonality constraint is introduced to optimize the hash codes. Finally, an effective optimization algorithm is developed to generate discrete hash codes directly while reducing the complexity from $O(n^{2})$ to $O(n)$. Experimental results on three benchmark datasets demonstrate the superiority of the FADCH method. |
|---|---|
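The record does not detail how FADCH achieves the stated reduction from $O(n^{2})$ to $O(n)$, but fast supervised hashing methods commonly get there by keeping the pairwise similarity matrix in factored form, for example $S = 2\tilde{L}\tilde{L}^{\top} - \mathbf{1}\mathbf{1}^{\top}$ built from row-normalized labels, and reordering matrix products so that the $n \times n$ matrix $S$ is never materialized. The NumPy sketch below illustrates this general trick only; the variable names, matrix sizes, and the exact form of $S$ are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical sizes: n samples, c label categories, r hash bits (illustrative only).
n, c, r = 5000, 24, 32
rng = np.random.default_rng(0)

# L: assumed n x c binary multi-label matrix, row-normalized so that inner
# products between similar label vectors are close to 1.
L = (rng.random((n, c)) < 0.1).astype(np.float64)
L_norm = L / np.maximum(np.linalg.norm(L, axis=1, keepdims=True), 1e-12)

# B: n x r hash-code matrix (placeholder +/-1 values for this sketch).
B = np.sign(rng.standard_normal((n, r)))

# Naive route: form the full n x n similarity S = 2 * L_norm @ L_norm.T - 1 1^T,
# then compute S @ B. Time and memory scale as O(n^2), which fast methods avoid.
# S = 2.0 * L_norm @ L_norm.T - np.ones((n, n))
# SB_naive = S @ B

# Linear route: never materialize S. Using the factored form,
#   S @ B = 2 * L_norm @ (L_norm.T @ B) - 1 @ (1^T @ B),
# every intermediate is at most n x c or n x r, so the cost is O(n * c * r)
# rather than O(n^2 * r).
ones_col = np.ones((n, 1))
SB_fast = 2.0 * (L_norm @ (L_norm.T @ B)) - ones_col @ (ones_col.T @ B)

print(SB_fast.shape)  # (n, r)
```

Any objective term that touches the similarity matrix only through products such as $SB$ or $B^{\top}S$ can be evaluated this way, which is what keeps the per-iteration cost linear in the number of samples; whether FADCH uses exactly this factorization is not stated in the record above.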