Bayesian Likelihood Free Inference using Mixtures of Experts

Hien Duy Nguyen, Trungtin Nguyen, Florence Forbes

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We extend Bayesian Synthetic Likelihood (BSL) methods to non-Gaussian approximations of the likelihood function. In this setting, we introduce Mixtures of Experts (MoEs), a class of neural network models, as surrogate likelihoods with desirable approximation-theoretic properties. Moreover, MoEs can be estimated via Expectation-Maximization (EM) algorithm-based approaches, such as the Gaussian Locally Linear Mapping (GLLiM) model estimators that we implement. Further, we provide theoretical evidence for the ability of our procedure to estimate and approximate a wide range of likelihood functions. Through simulations, we demonstrate the superiority of our approach over existing BSL variants in terms of both posterior approximation accuracy and computational efficiency.
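The GLLiM idea behind the surrogate can be illustrated with a toy sketch (this is an illustrative assumption, not the authors' implementation): fit a Gaussian mixture on joint (parameter, summary-statistic) samples by EM, then read off the conditional density p(s | θ), which has exactly the form of a mixture of experts with Gaussian gating networks. The simulator, sample sizes, and function names below are hypothetical.

```python
# Hypothetical GLLiM-style sketch: a joint Gaussian mixture over (theta, s),
# fitted by EM, induces a mixture-of-experts conditional p(s | theta) that can
# serve as a non-Gaussian surrogate likelihood in a BSL-style posterior.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy simulator (assumption): summary s | theta ~ N(theta, 0.5^2), theta ~ U(-2, 2)
theta = rng.uniform(-2, 2, size=(2000, 1))
s = theta + 0.5 * rng.standard_normal((2000, 1))

# EM fit of a K-component Gaussian mixture on the joint vector (theta, s)
gmm = GaussianMixture(n_components=3, random_state=0).fit(np.hstack([theta, s]))

def surrogate_loglik(s_obs, th):
    """Conditional log-density log p(s_obs | th) derived from the joint GMM."""
    th = np.atleast_1d(th).astype(float)
    d_t = th.size  # dimension of theta
    log_gate = np.empty(gmm.n_components)
    expert_ll = np.empty(gmm.n_components)
    for k in range(gmm.n_components):
        mu, Sig = gmm.means_[k], gmm.covariances_[k]
        mu_t, mu_s = mu[:d_t], mu[d_t:]
        S_tt, S_ts = Sig[:d_t, :d_t], Sig[:d_t, d_t:]
        S_st, S_ss = Sig[d_t:, :d_t], Sig[d_t:, d_t:]
        # Gate: (unnormalized) posterior probability of component k given theta
        log_gate[k] = np.log(gmm.weights_[k]) + multivariate_normal.logpdf(th, mu_t, S_tt)
        # Expert: Gaussian conditional of s given theta within component k
        cond_mu = mu_s + S_st @ np.linalg.solve(S_tt, th - mu_t)
        cond_S = S_ss - S_st @ np.linalg.solve(S_tt, S_ts)
        expert_ll[k] = multivariate_normal.logpdf(s_obs, cond_mu, cond_S)
    log_gate -= np.logaddexp.reduce(log_gate)  # normalize the gates
    return np.logaddexp.reduce(log_gate + expert_ll)

# The surrogate likelihood should favor theta values near the observed summary
print(surrogate_loglik(0.5, 0.5), surrogate_loglik(0.5, -1.5))
```

Here the gating weights come from the theta-marginals of the mixture components and each expert is the within-component Gaussian conditional, which is the standard inverse-regression construction underlying GLLiM; in a BSL-style sampler this log-density would replace the Gaussian synthetic log-likelihood inside the MCMC acceptance ratio.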

Original language: English
Title of host publication: 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9798350359312
Publication status: Published - 2024
Event: 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Yokohama, Japan
Duration: Jun 30, 2024 to Jul 5, 2024

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Conference

Conference: 2024 International Joint Conference on Neural Networks, IJCNN 2024
Country/Territory: Japan
City: Yokohama
Period: 6/30/24 to 7/5/24

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
