Theory and algorithm for learning with dissimilarity functions

Liwei Wang, Masashi Sugiyama, Cheng Yang, Kohei Hatano, Jufu Feng

Research output: Contribution to journal › Letter › peer-review

17 Citations (Scopus)

Abstract

We study the problem of classification when only a dissimilarity function between objects is accessible. That is, data samples are represented not by feature vectors but in terms of their pairwise dissimilarities. We establish sufficient conditions for dissimilarity functions to allow building accurate classifiers. The theory immediately suggests a learning paradigm: construct an ensemble of simple classifiers, each depending on a pair of examples; then find a convex combination of them to achieve a large margin. We next develop a practical algorithm referred to as dissimilarity-based boosting (DBoost) for learning with dissimilarity functions under theoretical guidance. Experiments on a variety of databases demonstrate that the DBoost algorithm is promising for several dissimilarity measures widely used in practice.
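The paradigm sketched in the abstract lends itself to a brief illustration. The Python sketch below is not the paper's DBoost algorithm; it assumes a simple pairwise base rule of the form h(x) = sign(d(x, x_neg) - d(x, x_pos)) and a standard AdaBoost-style reweighting to form the convex combination. The names pairwise_base_classifier and dboost_sketch, as well as the random candidate-pair sampling, are hypothetical choices made for illustration only.

```python
# Illustrative sketch (not the paper's exact DBoost algorithm): base classifiers
# built from pairs of training examples via a dissimilarity function d, combined
# with AdaBoost-style weighting into a convex-combination ensemble.
import numpy as np

def pairwise_base_classifier(d, x_pos, x_neg):
    """Base rule h(x) = sign(d(x, x_neg) - d(x, x_pos)):
    predict +1 if x is less dissimilar to the positive anchor than to the negative one."""
    def h(x):
        return 1.0 if d(x, x_neg) - d(x, x_pos) > 0 else -1.0
    return h

def dboost_sketch(d, X, y, n_rounds=50, n_candidates=20, rng=None):
    """Boost pairwise base classifiers.
    d: dissimilarity function d(a, b) >= 0; X: list of objects; y: labels in {-1, +1}."""
    rng = rng or np.random.default_rng(0)
    n = len(X)
    w = np.full(n, 1.0 / n)                     # example weights
    ensemble = []                               # list of (alpha, h)
    pos = [x for x, t in zip(X, y) if t == 1]
    neg = [x for x, t in zip(X, y) if t == -1]
    for _ in range(n_rounds):
        # sample candidate (positive, negative) anchor pairs
        candidates = [pairwise_base_classifier(d,
                                               pos[rng.integers(len(pos))],
                                               neg[rng.integers(len(neg))])
                      for _ in range(n_candidates)]
        # pick the candidate with the smallest weighted training error
        errs = [sum(w_i for w_i, x_i, y_i in zip(w, X, y) if h(x_i) != y_i)
                for h in candidates]
        k = int(np.argmin(errs))
        h, err = candidates[k], max(errs[k], 1e-12)
        if err >= 0.5:
            continue                            # no better than chance; skip this round
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, h))
        # AdaBoost-style reweighting, then renormalize
        w *= np.exp([-alpha * y_i * h(x_i) for x_i, y_i in zip(X, y)])
        w /= w.sum()
    total = sum(a for a, _ in ensemble) or 1.0
    def predict(x):
        # convex combination of base classifiers (weights normalized to sum to 1)
        score = sum(a * h(x) for a, h in ensemble) / total
        return 1 if score >= 0 else -1
    return predict

# usage: Euclidean distance as the dissimilarity on toy 2-D data
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = [rng.normal(loc=(0, 0), scale=1.0) for _ in range(40)] + \
        [rng.normal(loc=(3, 3), scale=1.0) for _ in range(40)]
    y = np.array([1] * 40 + [-1] * 40)
    d = lambda a, b: float(np.linalg.norm(np.asarray(a) - np.asarray(b)))
    clf = dboost_sketch(d, X, y, n_rounds=30, rng=rng)
    acc = np.mean([clf(x) == t for x, t in zip(X, y)])
    print(f"training accuracy: {acc:.2f}")
```

Because the method never touches feature vectors directly, the same sketch applies unchanged to any dissimilarity measure (edit distance on strings, shape distances on images, and so on), which mirrors the generality claimed in the abstract.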

Original language: English
Pages (from-to): 1459-1484
Number of pages: 26
Journal: Neural Computation
Volume: 21
Issue number: 5
DOIs
Publication status: Published - May 2009

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
