TR2013-073

Quantized Embeddings: An Efficient and Universal Nearest Neighbor Method for Cloud-based Image Retrieval


    •  Rane, S., Boufounos, P., Vetro, A., "Quantized Embeddings: An Efficient and Universal Nearest Neighbor Method for Cloud-based Image Retrieval", SPIE Conference on Applications of Digital Image Processing, August 2013.
      @inproceedings{Rane2013aug2,
        author = {Rane, S. and Boufounos, P. and Vetro, A.},
        title = {Quantized Embeddings: An Efficient and Universal Nearest Neighbor Method for Cloud-based Image Retrieval},
        booktitle = {SPIE Conference on Applications of Digital Image Processing},
        year = 2013,
        month = aug,
        url = {https://www.merl.com/publications/TR2013-073}
      }
  • Research Areas: Computational Sensing, Digital Video

Abstract:

We propose a rate-efficient, feature-agnostic approach for encoding image features for cloud-based nearest neighbor search. We extract quantized random projections of the image features under consideration, transmit these to the cloud server, and perform matching in the space of the quantized projections. The advantage of this approach is that, once the underlying feature extraction algorithm is chosen for maximum discriminability and retrieval performance (e.g., SIFT or eigen-features), the random projections guarantee a rate-efficient representation and fast server-based matching with negligible loss in accuracy. Using the Johnson-Lindenstrauss Lemma, we show that pairwise distances between the underlying feature vectors are preserved in the corresponding quantized embeddings. We report experimental results of image retrieval on two image databases with different feature spaces: one using SIFT features and one using face features extracted using a variant of the Viola-Jones face detection algorithm. For both feature spaces, quantized embeddings enable accurate image retrieval.
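
To make the pipeline concrete, the following is a minimal NumPy sketch of quantized random embeddings, not the authors' implementation: feature vectors are projected by an i.i.d. Gaussian matrix, dithered, and uniformly scalar-quantized, and nearest neighbor matching is performed entirely in the embedded space. The dimensions, step size, and dither scheme are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def quantized_embedding(features, A, dither, step):
        # Project d-dimensional features onto k random directions, add a
        # dither, and apply uniform scalar quantization. Distances between
        # the resulting integer codes approximate distances between the
        # original feature vectors (a Johnson-Lindenstrauss-style guarantee).
        projections = features @ A.T + dither
        return np.floor(projections / step)

    # Toy setup: d = 128 (SIFT-sized descriptors), k = 32 projections.
    # A, dither, and step are illustrative choices, not the paper's values.
    d, k, step = 128, 32, 0.5
    A = rng.standard_normal((k, d))          # i.i.d. Gaussian projection matrix
    dither = rng.uniform(0.0, step, size=k)  # random dither shared by client and server

    database = rng.standard_normal((1000, d))               # server-side features
    query = database[42] + 0.01 * rng.standard_normal(d)    # noisy copy of item 42

    db_codes = quantized_embedding(database, A, dither, step)
    q_code = quantized_embedding(query[None, :], A, dither, step)

    # Server-side matching uses only the transmitted quantized codes.
    dists = np.linalg.norm(db_codes - q_code, axis=1)
    print("nearest neighbor index:", int(np.argmin(dists)))  # expected: 42

For a fixed number of projections k, the step size trades rate against fidelity: coarser quantization lowers the bit rate of the transmitted codes but loosens the distance preservation.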