Design method of triplet decision tree classifier with division-wait mechanism

Masanobu Yoshikawa, Sadao Fujimura, Shojiro Tanaka, Ryuei Nishii

Research output: Contribution to journal › Conference article › peer-review


A multistep method for segmenting feature space with a triplet decision tree is developed, and a complementary approach that handles uncertain samples with an extended Bayesian discriminant function is introduced. The latter imposes a lower limit on the posterior probability of classification. The triplet decision tree includes a division-wait mechanism that postpones decisions on uncertain samples, i.e., those lying in marginal areas that cannot be definitively assigned to any category; a third node is generated for such samples. The triplet tree method is further improved by introducing linearly combined variables related to principal components, and this refinement yields flexible and effective segmentation. The two methods are compared on simulated data and real remotely sensed data with respect to the partitioning of feature space and classification accuracy. When normality and representability of the samples hold, the classifier with the extended quadratic discriminant function performs best. The advantage of the triplet tree appears when categories are diverse in nature or the training samples are poorly representative.
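The division-wait idea described above can be sketched as a three-way decision at a tree node: a sample is assigned to one of two candidate classes only if its posterior probability clears a lower limit; otherwise it is deferred to a third ("wait") node. The following is a minimal illustration, not the authors' implementation; the quadratic discriminant form, the threshold name `p_min`, and the two-class node setup are assumptions for the sketch.

```python
import numpy as np

def quadratic_discriminant(x, mean, cov, prior):
    """Gaussian log class-conditional density plus log-prior (up to a constant)."""
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.inv(cov) @ d) - 0.5 * logdet + np.log(prior)

def triplet_split(x, params, p_min=0.9):
    """Three-way decision at a tree node: class 0, class 1, or the wait node (2).

    params: list of (mean, cov, prior) for the two candidate classes.
    A sample whose largest posterior falls below p_min is not forced into
    either class; it is postponed to the third node for later treatment.
    """
    scores = np.array([quadratic_discriminant(x, m, c, p) for m, c, p in params])
    post = np.exp(scores - scores.max())   # softmax of log-scores
    post /= post.sum()                     # posterior probabilities
    k = int(np.argmax(post))
    return k if post[k] >= p_min else 2    # 2 = division-wait node
```

With two well-separated unit-covariance Gaussians centered at (0, 0) and (4, 4), a sample at either center is classified confidently, while a sample at the midpoint (2, 2) has posterior 0.5 for each class and is deferred to the wait node.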

Original language: English
Pages (from-to): 52-62
Number of pages: 11
Journal: Proceedings of SPIE - The International Society for Optical Engineering
Publication status: Published - Dec 1 1996
Event: Image and Signal Processing for Remote Sensing III - Taormina, Italy
Duration: Sept 23 1996 - Sept 23 1996

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering
