TR2002-55

Boosted Dyadic Kernel Discriminants


    •  Baback Moghaddam, Gregory Shakhnarovich, "Boosted Dyadic Kernel Discriminants", Tech. Rep. TR2002-55, Mitsubishi Electric Research Laboratories, Cambridge, MA, December 2002.
      @techreport{MERL_TR2002-55,
        author = {Baback Moghaddam and Gregory Shakhnarovich},
        title = {Boosted Dyadic Kernel Discriminants},
        institution = {MERL - Mitsubishi Electric Research Laboratories},
        address = {Cambridge, MA 02139},
        number = {TR2002-55},
        month = dec,
        year = 2002,
        url = {https://www.merl.com/publications/TR2002-55/}
      }
Abstract:

We introduce a novel learning algorithm for binary classification with hyperplane discriminants based on pairs of training points from opposite classes (dyadic hypercuts). The algorithm is further extended to nonlinear discriminants using kernel functions satisfying Mercer's conditions. An ensemble of simple dyadic hypercuts is learned incrementally by means of a confidence-rated version of AdaBoost, which provides a sound strategy for searching through the finite set of hypercut hypotheses. In experiments with real-world datasets from the UCI repository, the generalization performance of the hypercut classifiers was found to be comparable to that of SVMs and k-NN classifiers. Furthermore, the computational cost of classification (at run time) was found to be similar to, or better than, that of SVMs. Like SVMs, boosted dyadic kernel discriminants tend to maximize the margin (via AdaBoost). In contrast to SVMs, however, we offer an on-line and incremental learning machine for building kernel discriminants whose complexity (number of kernel evaluations) can be directly controlled (traded off for accuracy).
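The abstract's core idea can be illustrated with a short sketch. What follows is not the authors' implementation: it uses discrete AdaBoost rather than the confidence-rated variant, an RBF kernel chosen for illustration, and a midpoint bias for each hypercut; each weak learner is a kernelized hyperplane defined by one training point from each class, and the finite pool of such pairs is searched exhaustively at every boosting round.

```python
import numpy as np

def rbf(X, Z, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix between rows of X and rows of Z.
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def train_boosted_hypercuts(X, y, T=10, gamma=1.0):
    """Boost dyadic hypercuts. y must be in {-1, +1}.

    Each weak hypothesis is h(x) = sign(K(x, x_i) - K(x, x_j) + b),
    defined by a positive example x_i and a negative example x_j,
    with b placing the cut at the pair's kernel-space midpoint.
    Returns a list of (i, j, b, alpha) tuples.
    """
    n = len(y)
    w = np.ones(n) / n                      # AdaBoost sample weights
    pos = np.where(y == 1)[0]
    neg = np.where(y == -1)[0]
    K = rbf(X, X, gamma)                    # kernel evaluations are precomputed
    ensemble = []
    for _ in range(T):
        best = None
        for i in pos:                       # exhaustive search over the finite
            for j in neg:                   # set of opposite-class pairs
                s = K[:, i] - K[:, j]
                b = -0.5 * (s[i] + s[j])    # midpoint bias (an assumption here)
                h = np.sign(s + b)
                h[h == 0] = 1
                err = w[h != y].sum()       # weighted training error
                if best is None or err < best[0]:
                    best = (err, i, j, b, h)
        err, i, j, b, h = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # discrete-AdaBoost vote weight
        w *= np.exp(-alpha * y * h)             # reweight toward mistakes
        w /= w.sum()
        ensemble.append((i, j, b, alpha))
    return ensemble

def predict(ensemble, X_train, X, gamma=1.0):
    # Classification cost scales with the number of kernel evaluations,
    # i.e. with the ensemble size T chosen at training time.
    K = rbf(X, X_train, gamma)
    F = np.zeros(len(X))
    for i, j, b, alpha in ensemble:
        F += alpha * np.sign(K[:, i] - K[:, j] + b)
    return np.sign(F)
```

Note how the accuracy/complexity trade-off mentioned in the abstract appears directly: stopping boosting after T rounds caps the number of kernel evaluations per test point at 2T (each hypercut references only its two defining training points).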