Boosting over non-deterministic ZDDs

Takahiro Fujita, Kohei Hatano, Eiji Takimoto

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)


We propose a new approach to large-scale machine learning: learning over compressed data. The idea is to first compress the training data and then run machine learning algorithms on the compressed representation, with the hope that computation time is significantly reduced when the training data compresses well. As a first step, we consider a variant of the Zero-suppressed Binary Decision Diagram (ZDD) as the data structure for representing the training data; this variant generalizes the ZDD by incorporating non-determinism. For the learning algorithms, we consider the boosting algorithm AdaBoost* and its precursor AdaBoost. In this work, we give efficient implementations of both boosting algorithms whose running times (per iteration) are linear in the size of the given ZDD.
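The key claim, per-iteration running time linear in the size of the diagram rather than in the size of the data it encodes, rests on the standard fact that aggregate quantities over the (possibly exponentially large) family of sets a ZDD represents can be computed by dynamic programming over its nodes. The following is a minimal sketch, not the paper's actual algorithm or data structure: a hypothetical tuple-based ZDD encoding in which each internal node is `(var, lo, hi)` (lo-child taken when `var` is excluded, hi-child when included), and a memoized count of the sets in the represented family that visits each node once.

```python
# Hypothetical ZDD node encoding (illustration only, not the paper's structure):
#   terminals: the strings 'zero' (empty family) and 'one' (family {{}})
#   internal node: a tuple (var, lo, hi)

def count_sets(node, memo=None):
    """Count the sets in the family rooted at `node`.

    Each node is visited once thanks to memoization, so the running
    time is linear in the number of nodes of the diagram -- the same
    flavor of argument behind the per-iteration bound in the paper.
    """
    if memo is None:
        memo = {}
    if node == 'one':
        return 1
    if node == 'zero':
        return 0
    if id(node) in memo:
        return memo[id(node)]
    var, lo, hi = node
    # Sets excluding `var` come from `lo`; sets including it from `hi`.
    memo[id(node)] = count_sets(lo, memo) + count_sets(hi, memo)
    return memo[id(node)]

# Example: the family {{x2}, {x1}, {x1, x2}} over variables x1 < x2.
node_x2_only = (2, 'zero', 'one')   # must include x2
node_x2_opt = (2, 'one', 'one')     # x2 optional
root = (1, node_x2_only, node_x2_opt)

print(count_sets(root))
```

Boosting-specific quantities (e.g. weighted sums over all hypotheses encoded by the diagram) admit the same kind of bottom-up recursion, replacing the `+` combination with the appropriate weighted aggregate.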

Original language: English
Title of host publication: WALCOM
Subtitle of host publication: Algorithms and Computation - 12th International Conference, WALCOM 2018, Proceedings
Editors: M. Sohel Rahman, Wing-Kin Sung, Ryuhei Uehara
Publisher: Springer Verlag
Number of pages: 12
ISBN (Print): 9783319751719
Publication status: Published - 2018
Event: 12th International Conference and Workshop on Algorithms and Computation, WALCOM 2018 - Dhaka, Bangladesh
Duration: Mar 3, 2018 to Mar 5, 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10755 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Other: 12th International Conference and Workshop on Algorithms and Computation, WALCOM 2018

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)


