
NCJRS Virtual Library


Ensemble Subspace Segmentation Under Blockwise Constraints

NCJ Number
304131
Author(s)
Handong Zhao; Zhengming Ding; Yun Fu
Date Published
2018
Length
14 pages
Annotation

Graph-based subspace segmentation has received considerable attention in visual data representation, but it faces the critical problem of how to build a block-diagonal affinity matrix. The current project proposes a novel graph-based method to address this problem.

Abstract

The proposed method, called Ensemble Subspace Segmentation under Blockwise constraints (ESSB), unifies least squares regression and a locality-preserving graph regularizer in an ensemble learning framework. Specifically, compact encoding with least squares regression coefficients helps achieve a block-diagonal representation matrix over all samples. Meanwhile, the locality-preserving regularizer captures the intrinsic local structure, which further enhances the block-diagonal property. Both blockwise efforts, i.e., the least squares regression and the locality-preserving regularizer, work jointly and are formulated in the ensemble learning framework, making ESSB more robust and efficient, especially when handling high-dimensional data. Finally, an efficient optimization solution based on the inexact augmented Lagrange multiplier method is derived, together with a theoretical time complexity analysis. To demonstrate the effectiveness of the proposed method, the project considered three applications: face clustering, object clustering, and motion segmentation. Extensive results for both accuracy and normalized mutual information on four benchmarks, i.e., YaleB, ORL, COIL, and Hopkins155, are reported, along with evaluations of computational cost, demonstrating the superiority of the proposed method in both accuracy and efficiency over 12 baseline algorithms. (publisher abstract modified)
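To illustrate the core idea the abstract refers to, the sketch below shows how least squares regression coefficients can yield an approximately block-diagonal affinity matrix for subspace segmentation. This is a minimal, generic least-squares-representation baseline, not the authors' ESSB method: the `lsr_affinity` function, the ridge parameter `lam`, and the toy two-subspace data are all illustrative assumptions. In a full pipeline, spectral clustering would then be applied to the resulting affinity matrix.

```python
import numpy as np

def lsr_affinity(X, lam=0.1):
    """Affinity from least-squares representation coefficients (illustrative,
    not the ESSB formulation from the paper).

    X   : (d, n) data matrix, one sample per column.
    lam : ridge penalty keeping the linear system well conditioned.

    Solves  min_Z ||X - X Z||_F^2 + lam ||Z||_F^2  in closed form,
        Z = (X^T X + lam I)^{-1} X^T X,
    then symmetrizes |Z| into an affinity matrix. When samples lie in
    independent subspaces, the large entries of Z concentrate in
    same-subspace positions, i.e., the matrix is nearly block-diagonal.
    """
    n = X.shape[1]
    G = X.T @ X
    Z = np.linalg.solve(G + lam * np.eye(n), G)
    return 0.5 * (np.abs(Z) + np.abs(Z).T)

# Hypothetical toy data: 10 samples from each of two 1-D subspaces of R^5.
rng = np.random.default_rng(0)
b1, b2 = rng.standard_normal(5), rng.standard_normal(5)
X = np.hstack([np.outer(b1, rng.standard_normal(10)),
               np.outer(b2, rng.standard_normal(10))])
A = lsr_affinity(X)
# The two 10x10 diagonal blocks of A carry most of the affinity mass,
# so spectral clustering on A would recover the two subspaces.
```

The closed-form solve is what makes the least-squares route cheap compared with iterative sparse coding; ESSB builds on this kind of coefficient matrix and additionally enforces locality preservation within an ensemble.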