TR2012-036

Coverage Optimized Active Learning for k-NN Classifiers



Fast image recognition and classification is extremely important in various robotics applications such as exploration, rescue, and localization. k-nearest neighbor (kNN) classifiers are popular tools for classification since they involve no explicit training phase and are simple to implement. However, they often require large amounts of training data to work well in practice. In this paper, we propose a batch-mode active learning algorithm for efficient training of kNN classifiers that substantially reduces the amount of training data required. As opposed to much previous work on iterative single-sample active selection, the proposed system selects samples in batches. We propose a coverage formulation that requires the selected samples to be distributed such that, given the training budget, every data point has a labeled sample within a bounded maximum distance, so that each point has labeled neighbors in a small neighborhood. Using submodular function optimization, the proposed algorithm provides a near-optimal selection strategy for an otherwise intractable problem. Further, we employ uncertainty sampling along with coverage to incorporate model information and improve classification. Finally, we employ locality sensitive hashing for fast retrieval of nearest neighbors during classification, which provides speedups of one to two orders of magnitude and thus allows real-time classification with large datasets.
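
To make the coverage idea concrete, the following is a minimal sketch of greedy batch selection under a coverage objective: within the labeling budget, repeatedly pick the candidate whose coverage ball captures the most still-uncovered points. It is an illustration only; the Euclidean distance, the fixed coverage radius, and the name greedy_coverage_selection are assumptions rather than the paper's exact formulation. Because such a coverage objective is monotone submodular, the greedy rule carries the standard (1 - 1/e) approximation guarantee, which is what makes near-optimal batch selection tractable.

    import numpy as np

    def greedy_coverage_selection(X, budget, radius):
        # Greedily pick a batch of points to label so that as many points as
        # possible lie within `radius` of some selected (labeled) point.
        # The coverage objective is monotone submodular, so this greedy rule
        # is within a (1 - 1/e) factor of the optimal batch.
        n = X.shape[0]
        # Dense pairwise distances; for large datasets an approximate index
        # (e.g., LSH, as used for classification below) would replace this.
        dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        within = dists <= radius      # within[i, j]: candidate i covers point j
        covered = np.zeros(n, dtype=bool)
        selected = []
        for _ in range(budget):
            # Marginal gain: number of still-uncovered points each candidate covers.
            gains = within[:, ~covered].sum(axis=1)
            gains[selected] = -1      # never reselect a point
            best = int(np.argmax(gains))
            if gains[best] <= 0:      # everything reachable is already covered
                break
            selected.append(best)
            covered |= within[best]
        return selected

For example, greedy_coverage_selection(features, budget=100, radius=0.5) would return the indices of up to 100 samples to send to an annotator; an uncertainty-weighted variant would replace the raw coverage count with a gain that also rewards covering points on which the current model is uncertain.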
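The nearest-neighbor retrieval speedup can likewise be illustrated with a small locality sensitive hashing sketch. The hash family used here (signed random projections) and the class name HyperplaneLSH are assumptions for illustration; the hash family and parameters used in the paper may differ.

    import numpy as np

    class HyperplaneLSH:
        # Signed random projections: points with the same bit signature tend
        # to be close, so a query is compared only against its own bucket
        # rather than the whole labeled set.
        def __init__(self, dim, n_bits=16, seed=0):
            rng = np.random.default_rng(seed)
            self.planes = rng.standard_normal((n_bits, dim))
            self.buckets = {}

        def _key(self, x):
            # One bit per hyperplane: which side of the plane x falls on.
            return tuple((self.planes @ x > 0).astype(int))

        def index(self, X, labels):
            for x, y in zip(X, labels):
                self.buckets.setdefault(self._key(x), []).append((x, y))

        def query(self, q, k=5):
            # Candidate neighbors come from the query's bucket only.
            cand = self.buckets.get(self._key(q), [])
            cand.sort(key=lambda xy: np.linalg.norm(xy[0] - q))
            return [y for _, y in cand[:k]]

Restricting the distance computations to a single hash bucket (or a small set of buckets across several hash tables) is what yields the one to two orders of magnitude speedup over exhaustive nearest-neighbor search.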