Magnum: Tackling high-dimensional structures with self-organization

Poyuan Mao, Yikfoong Tham, Heng Zhang, Danilo Vasconcellos Vargas

Research output: Contribution to journal › Article › peer-review

Abstract

A major challenge in dealing with real-world problems is scalability. In fact, this is partially the reason behind the success of deep learning over other learning paradigms. Here, we tackle the scalability of a novel learning paradigm proposed in 2021 based solely on self-organizing principles. This paradigm consists only of dynamical equations that self-organize with the input to create attractor-repeller points related to the patterns found in data. To achieve scalability for such a system, we propose the Magnum algorithm, which utilizes many self-organizing subsystems (Inertia-SyncMap), each with a subset of the problem's variables. The main idea is that by merging Inertia-SyncMaps, Magnum builds over time a variable correlation by consensus, capable of accurately predicting the structure of large groups of variables. Experiments show that Magnum surpasses or ties with other unsupervised algorithms in all of the high-dimensional chunking problems, each with distinct types of shapes and structural features. Moreover, Inertia-SyncMap alone outperforms or ties with other unsupervised algorithms in six out of seven basic chunking problems. Thus, this work sheds light on how self-organization learning paradigms can be scaled up to deal with high-dimensional structures and compete with current learning paradigms.
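The merging-by-consensus idea described in the abstract can be illustrated with a minimal, hypothetical sketch. This is not the paper's actual implementation: the function `merge_subsystems`, the affinity-matrix representation, and all numeric values below are assumptions for illustration only. The sketch assumes each subsystem covers a subset of variables and holds a local pairwise affinity estimate; merging averages the estimates wherever subsystems overlap.

```python
import numpy as np

def merge_subsystems(n_vars, subsystems):
    """Merge per-subsystem pairwise affinity estimates into one global
    matrix by averaging overlapping entries (a simple consensus).
    Each subsystem is a pair (variable_indices, local_affinity_matrix).
    NOTE: hypothetical sketch, not the Magnum algorithm itself."""
    total = np.zeros((n_vars, n_vars))
    count = np.zeros((n_vars, n_vars))
    for idx, local in subsystems:
        idx = np.asarray(idx)
        # Accumulate each local estimate into the global positions it covers.
        total[np.ix_(idx, idx)] += local
        count[np.ix_(idx, idx)] += 1
    # Average where at least one subsystem covered the pair; zero elsewhere.
    return np.divide(total, count, out=np.zeros_like(total), where=count > 0)

# Toy example: variables {0, 1} strongly related, {2, 3} strongly related.
subs = [
    ([0, 1, 2], np.array([[1.0, 0.9, 0.1],
                          [0.9, 1.0, 0.1],
                          [0.1, 0.1, 1.0]])),
    ([1, 2, 3], np.array([[1.0, 0.1, 0.1],
                          [0.1, 1.0, 0.9],
                          [0.1, 0.9, 1.0]])),
]
M = merge_subsystems(4, subs)
```

In this toy setup the merged matrix `M` recovers the two groups: the (0, 1) and (2, 3) entries stay high, while pairs never covered together (e.g. 0 and 3) remain zero until some subsystem observes them.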

Original language: English
Article number: 126508
Journal: Neurocomputing
Volume: 550
DOI
Publication status: Published - September 14, 2023

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
