Connecting the Dots in Multi-Class Classification: From Nearest Subspace to Collaborative Representation


We present a novel multi-class classifier that strikes a balance between the nearest-subspace classifier and a collaborative-representation-based classifier. The nearest-subspace classifier assigns a test sample to the class that minimizes the distance between the test sample and its principal projection onto that class. The collaborative-representation-based classifier instead represents the test sample over a dictionary of all training samples from all classes, and assigns it to the class that minimizes the distance between the test sample's collaborative component and its projection onto that class. In our formulation, the sparse representation based classifier [1] and the nearest subspace classifier become special cases under different regularization parameters. We show that classification performance can be improved by optimally tuning the regularization parameter, which can be done at almost no extra computational cost. We give extensive numerical examples for digit identification and face recognition, with performance comparisons across different choices of collaborative representation, in particular when only a partial observation of the test sample is available via compressive sensing measurements.
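To make the classification rule concrete, the following is a minimal sketch of a collaborative-representation classifier with a ridge (Tikhonov) regularizer. All names (`crc_classify`, `A`, `lam`) are illustrative, not from the paper, and the specific choice of regularizer is an assumption; varying the regularization parameter `lam` is what interpolates between the regimes discussed in the abstract.

```python
import numpy as np

def crc_classify(A, labels, y, lam=0.1):
    """Sketch of collaborative-representation classification.

    A      : (d, n) dictionary whose columns are training samples from all classes
    labels : length-n array of class labels, one per column of A
    y      : length-d test sample
    lam    : regularization parameter (assumed ridge penalty; illustrative choice)
    """
    n = A.shape[1]
    # Code the test sample collaboratively over the whole dictionary:
    # alpha = argmin ||y - A a||^2 + lam ||a||^2
    alpha = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
    best_class, best_res = None, np.inf
    for c in np.unique(labels):
        mask = labels == c
        # Residual when y is reconstructed using only class c's share
        # of the collaborative representation.
        res = np.linalg.norm(y - A[:, mask] @ alpha[mask])
        if res < best_res:
            best_class, best_res = c, res
    return best_class
```

With a small `lam` the coding step approaches an unregularized least-squares fit over all classes; larger values shrink the representation, changing which class's residual dominates the decision.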