Iterative Reconstrained Low-Rank Representation via Weighted Nonconvex Regularizer

Authors: Zheng, J., Lu, C., Yu, H., Wang, W. and Chen, S.

Journal: IEEE Access

Volume: 6

Pages: 51695-51707

eISSN: 2169-3536

DOI: 10.1109/ACCESS.2018.2870371

Abstract:

Benefiting from the joint consideration of geometric structure and a low-rank constraint, the graph low-rank representation (GLRR) method has achieved state-of-the-art results in many applications. However, it suffers from several limitations: the structure of the errors must be known a priori, the graph Laplacian matrix is constructed in isolation, and the leading rank components are over-shrunk. To improve GLRR in these respects, this paper proposes a new LRR model, namely iterative reconstrained LRR via weighted nonconvex regularization (IRWNR), which exploits three distinctive properties of the representation matrix. The first characterizes the various distributions of the errors through an adaptively learned weight factor, allowing more flexible noise suppression. The second generates an accurate graph matrix from weighted observations, making it less afflicted by noisy features. The third employs a parameterized rational function to reveal the importance of different rank components, yielding a better approximation to the intrinsic subspace structure. Exploiting automatic thresholding, parallel updates, and partial SVD operations, we derive a computationally efficient low-rank representation algorithm based on an iterative reconstrained framework and the accelerated proximal gradient method. Comprehensive experiments are conducted on synthetic data, image clustering, and background subtraction, evaluated with several quantitative benchmarks such as clustering accuracy, normalized mutual information, and execution time. The results demonstrate the robustness and efficiency of IRWNR compared with other state-of-the-art models.
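To illustrate the flavor of the weighted nonconvex rank regularization described above, the following minimal sketch applies iteratively reweighted singular value thresholding, where the per-component weights come from the derivative of an illustrative rational penalty g(s) = (1 + gamma) * s / (gamma + s). This is a hedged reconstruction for intuition only: the exact rational function, parameter names (gamma, tau), and the overall solver in the paper may differ, and the paper's full method additionally involves the graph term, adaptive error weighting, and an accelerated proximal gradient scheme not shown here.

```python
import numpy as np

def rational_weight(sigma, gamma=1.0, eps=1e-8):
    # Derivative of an illustrative rational penalty g(s) = (1 + gamma) * s / (gamma + s).
    # Larger singular values receive smaller weights, so the leading rank
    # components are shrunk less (hypothetical form, not the paper's exact function).
    return gamma * (1.0 + gamma) / (gamma + sigma + eps) ** 2

def weighted_svt(X, tau, gamma=1.0):
    # One iteratively reweighted proximal step: weighted singular value
    # thresholding with weights derived from the current singular values.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    w = rational_weight(s, gamma)            # component-wise weights
    s_shrunk = np.maximum(s - tau * w, 0.0)  # shrink small components more than leading ones
    return U @ np.diag(s_shrunk) @ Vt

# Usage sketch: denoise a noisy low-rank matrix with a few reweighted steps.
rng = np.random.default_rng(0)
L = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))  # rank-5 ground truth
X = L + 0.1 * rng.standard_normal((50, 50))                      # noisy observation
Z = X.copy()
for _ in range(10):
    Z = weighted_svt(Z, tau=0.5)
print(np.linalg.norm(Z - L) / np.linalg.norm(L))  # relative recovery error
```

Because the weight decreases as the singular value grows, this kind of nonconvex surrogate penalizes small (noise-dominated) components heavily while leaving the dominant subspace directions nearly intact, which is the behavior the abstract contrasts with the over-shrinkage of convex nuclear-norm regularization.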

Source: Scopus