Boosting over non-deterministic ZDDs

Takahiro Fujita, Kohei Hatano, Eiji Takimoto

Research output: Contribution to journal › Article › peer-review

Abstract

We propose a new approach to large-scale machine learning: learning over compressed data. The idea is to first compress the training data in some way and then run machine learning algorithms directly on the compressed representation, with the hope that the computation time is significantly reduced when the training data compresses well. As a first step toward this approach, we consider a variant of the Zero-Suppressed Binary Decision Diagram (ZDD), which generalizes the ZDD by incorporating non-determinism, as the data structure for representing the training data. For the learning algorithms, we consider the boosting algorithm AdaBoost⁎ and its precursor AdaBoost. In this paper, we give efficient implementations of these boosting algorithms whose per-iteration running times are linear in the size of the given ZDD.
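To make the boosting side of the abstract concrete, the sketch below shows the standard AdaBoost loop over explicitly listed examples and a finite pool of base hypotheses. The function name `adaboost`, the stopping rule, and the toy data are illustrative assumptions, not taken from the paper; in particular, the paper's contribution is carrying out the same multiplicative weight updates over training data compressed into a non-deterministic ZDD, so that each iteration costs time linear in the ZDD size rather than in the number of examples, which this plain sketch does not reproduce.

```python
# Minimal sketch of standard AdaBoost over uncompressed data (assumption:
# not the paper's ZDD-based implementation).
import math

def adaboost(examples, labels, hypotheses, num_rounds=10):
    """examples: list of feature vectors; labels: list of +/-1 labels;
    hypotheses: list of functions mapping an example to +/-1."""
    m = len(examples)
    weights = [1.0 / m] * m          # distribution D_1 over examples
    ensemble = []                    # list of (alpha_t, h_t) pairs

    for _ in range(num_rounds):
        # Pick the base hypothesis with the smallest weighted error.
        best_h, best_err = None, float("inf")
        for h in hypotheses:
            err = sum(w for w, x, y in zip(weights, examples, labels)
                      if h(x) != y)
            if err < best_err:
                best_h, best_err = h, err
        if best_err >= 0.5:          # no hypothesis beats random guessing
            break
        alpha = 0.5 * math.log((1.0 - best_err) / max(best_err, 1e-12))
        ensemble.append((alpha, best_h))

        # Multiplicative update: D_{t+1}(i) proportional to
        # D_t(i) * exp(-alpha_t * y_i * h_t(x_i)), then renormalize.
        weights = [w * math.exp(-alpha * y * best_h(x))
                   for w, x, y in zip(weights, examples, labels)]
        z = sum(weights)
        weights = [w / z for w in weights]

    def final_classifier(x):
        score = sum(alpha * h(x) for alpha, h in ensemble)
        return 1 if score >= 0 else -1
    return final_classifier

# Toy usage: decision stumps on one-dimensional points (illustrative data).
if __name__ == "__main__":
    xs = [[-2.0], [-1.0], [1.0], [2.0]]
    ys = [-1, -1, 1, 1]
    stumps = [lambda x, t=t: 1 if x[0] > t else -1 for t in (-1.5, 0.0, 1.5)]
    clf = adaboost(xs, ys, stumps, num_rounds=5)
    print([clf(x) for x in xs])  # expected: [-1, -1, 1, 1]
```

Per iteration this loop touches every example, i.e. it runs in time linear in the number of training examples; the paper's ZDD-based implementations replace this per-example pass with a traversal of the compressed structure.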

Original language: English
Pages (from-to): 81-89
Number of pages: 9
Journal: Theoretical Computer Science
Volume: 806
DOIs
Publication status: Published - Feb 2 2020

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)
