Properties of Jeffreys Mixture for Markov Sources

Jun'ichi Takeuchi, Tsutomu Kawabata, Andrew R. Barron

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)

Abstract

We discuss the properties of Jeffreys mixture for a Markov model. First, we show that a modified Jeffreys mixture asymptotically achieves the minimax coding regret for universal data compression, without any restriction on the data sequences. Second, we give an approximation formula for the prediction probability of Jeffreys mixture for a Markov model. This formula reveals that the prediction probability of Jeffreys mixture for the Markov model with alphabet $\{0,1\}$ is not of the form $(n_{x|s} + \alpha)/(n_s + \beta)$, where $n_{x|s}$ is the number of occurrences of the symbol $x$ following the context $s \in \{0,1\}$ and $n_s = n_{0|s} + n_{1|s}$. Finally, we propose a method to compute our minimax strategy, which combines a Monte Carlo method with the approximation formula: the former is used for the earlier stages of the data, while the latter is used for the later stages.
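To make the add-β form concrete, the following minimal sketch (in Python; not the authors' algorithm) contrasts two predictors for a binary first-order Markov source: a per-context estimator of the form $(n_{x|s} + \alpha)/(n_s + 2\alpha)$ with the Krichevsky-Trofimov choice $\alpha = 1/2$, and a likelihood-weighted Monte Carlo estimate of a Bayes mixture prediction under independent Beta(1/2, 1/2) priors on the two transition probabilities. The function names and the per-state prior are illustrative assumptions; under this simplified prior the two predictions coincide by conjugacy, whereas the paper's result is that the true Jeffreys prior for the Markov family does not factor into independent per-context priors, so the exact mixture prediction deviates from the add-β form, which is what motivates combining a Monte Carlo method with the approximation formula.

```python
"""Sketch only: per-context add-beta prediction vs. a naive Monte Carlo
mixture prediction for a binary first-order Markov source. The Beta(1/2, 1/2)
per-state prior is a stand-in, not the paper's Jeffreys prior."""
import numpy as np


def add_beta_prediction(seq, alpha=0.5):
    """P(next symbol = 1 | seq) using (n_{1|s} + alpha) / (n_s + 2*alpha)."""
    counts = np.zeros((2, 2))  # counts[s, x] = occurrences of x following s
    for s, x in zip(seq[:-1], seq[1:]):
        counts[s, x] += 1
    s = seq[-1]  # current context is the last observed symbol
    return (counts[s, 1] + alpha) / (counts[s].sum() + 2 * alpha)


def monte_carlo_mixture_prediction(seq, num_samples=100_000, seed=None):
    """Likelihood-weighted Monte Carlo estimate of P(next symbol = 1 | seq)
    under independent Beta(1/2, 1/2) priors on p(1|0) and p(1|1), with the
    initial symbol treated as fixed (a simplifying assumption)."""
    rng = np.random.default_rng(seed)
    theta = rng.beta(0.5, 0.5, size=(num_samples, 2))  # theta[:, s] = p(1|s)
    log_lik = np.zeros(num_samples)
    for s, x in zip(seq[:-1], seq[1:]):
        p1 = theta[:, s]
        log_lik += np.log(p1 if x == 1 else 1.0 - p1)
    # Self-normalized importance weights proportional to the likelihood.
    w = np.exp(log_lik - log_lik.max())
    w /= w.sum()
    return float(np.sum(w * theta[:, seq[-1]]))


if __name__ == "__main__":
    seq = [0, 1, 1, 0, 1, 1, 1, 0, 1, 1]
    print("add-1/2 per context:", add_beta_prediction(seq))
    print("Monte Carlo mixture:", monte_carlo_mixture_prediction(seq, seed=0))
```

With this simplified per-context prior the two printed values agree up to Monte Carlo error; replacing the stand-in prior with the Jeffreys prior of the Markov family is exactly the setting in which the paper shows no such add-β representation exists.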

Original language: English
Article number: 6307868
Pages (from-to): 438-457
Number of pages: 20
Journal: IEEE Transactions on Information Theory
Volume: 59
Issue number: 1
DOIs
Publication status: Published - 2013

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
