## Abstract

We discuss the properties of the Jeffreys mixture for a Markov model. First, we show that a modified Jeffreys mixture asymptotically achieves the minimax coding regret for universal data compression, where no restriction is placed on the data sequences. Second, we give an approximation formula for the prediction probability of the Jeffreys mixture for a Markov model. This formula reveals that the prediction probability of the Jeffreys mixture for the Markov model with alphabet {0,1} is not of the form (n_{x|s} + α)/(n_{s} + β), where n_{x|s} is the number of occurrences of the symbol x following the context s ∈ {0,1} and n_{s} = n_{0|s} + n_{1|s}. Finally, we propose a method to compute our minimax strategy, which combines a Monte Carlo method with the approximation formula: the former is used for the earlier stages of the data, while the latter is used for the later stages.
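For context, the add-constant form (n_{x|s} + α)/(n_{s} + β) mentioned above is the familiar Laplace/Krichevsky–Trofimov style of conditional estimator for a binary first-order Markov source. The sketch below is only an illustration of that form (which the paper shows the Jeffreys mixture does *not* reduce to); the function name and the default α = 1/2, β = 1 are our own choices, not taken from the paper.

```python
def add_constant_markov_prediction(seq, context, x, alpha=0.5, beta=1.0):
    """Estimate P(x | context) from a binary sequence using the
    add-constant form (n_{x|s} + alpha) / (n_s + beta).

    seq     : list of symbols in {0, 1}
    context : the conditioning symbol s (order-1 Markov context)
    x       : the symbol whose conditional probability is estimated
    """
    # n_{x|s}: occurrences of x immediately following the context s
    n_xs = sum(1 for i in range(1, len(seq))
               if seq[i - 1] == context and seq[i] == x)
    # n_s: total occurrences of the context s (with a successor symbol)
    n_s = sum(1 for i in range(1, len(seq)) if seq[i - 1] == context)
    return (n_xs + alpha) / (n_s + beta)


# Example: in 0,1,0,1,1 the context s=1 occurs twice with a successor,
# and is followed by x=1 once, so the estimate is (1+0.5)/(2+1) = 0.5.
p = add_constant_markov_prediction([0, 1, 0, 1, 1], context=1, x=1)
```

The paper's result is that no fixed choice of α and β makes this form coincide with the Jeffreys-mixture prediction for the binary Markov model, which is why the approximation formula and the Monte Carlo stage are needed instead.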

| Original language | English |
| --- | --- |
| Article number | 6307868 |
| Pages (from-to) | 438-457 |
| Number of pages | 20 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 59 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2013 |

## All Science Journal Classification (ASJC) codes

- Information Systems
- Computer Science Applications
- Library and Information Sciences