Relative entropy under mappings by stochastic matrices

Joel E. Cohen, Yoh Iwasa, Gh Rautu, Mary Beth Ruskai, Eugene Seneta, Gh Zbaganu

Research output: Contribution to journal › Article › peer-review

53 Citations (Scopus)


The relative g-entropy of two finite, discrete probability distributions x = (x1,...,xn) and y = (y1,...,yn) is defined as Hg(x,y) = Σk xk g(yk/xk − 1), where g:(−1,∞)→R is convex and g(0) = 0. When g(t) = −log(1 + t), then Hg(x,y) = Σk xk log(xk/yk), the usual relative entropy. Let Pn = {x ∈ Rn : Σi xi = 1, xi > 0 ∀i}. Our major result is that, for any m × n column-stochastic matrix A, the contraction coefficient defined as ηg(A) = sup{Hg(Ax,Ay)/Hg(x,y) : x,y ∈ Pn, x ≠ y} satisfies ηg(A) ≤ 1 − α(A), where α(A) = minj,k Σi min(aij, aik) is Dobrushin's coefficient of ergodicity. Consequently, ηg(A) < 1 if and only if A is scrambling. Upper and lower bounds on ηg(A) are established. Analogous results hold for Markov chains in continuous time.
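The definitions above can be checked numerically. The following minimal sketch (not from the paper; function names are illustrative) computes Hg for the choice g(t) = −log(1 + t), i.e. the usual relative entropy, along with Dobrushin's coefficient α(A) for a column-stochastic matrix, so the contraction bound Hg(Ax,Ay)/Hg(x,y) ≤ 1 − α(A) can be verified on examples:

```python
import numpy as np

def relative_g_entropy(x, y, g=lambda t: -np.log1p(t)):
    """Hg(x, y) = sum_k x_k * g(y_k/x_k - 1) for distributions x, y.

    With the default g(t) = -log(1 + t) this equals the usual
    relative entropy sum_k x_k log(x_k / y_k).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.sum(x * g(y / x - 1.0)))

def dobrushin_alpha(A):
    """Dobrushin's coefficient of ergodicity for column-stochastic A:
    alpha(A) = min over column pairs (j, k) of sum_i min(a_ij, a_ik)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[1]
    return min(float(np.minimum(A[:, j], A[:, k]).sum())
               for j in range(n) for k in range(n) if j != k)

# Example: a 2x2 column-stochastic (hence scrambling) matrix.
A = np.array([[0.5, 0.3],
              [0.5, 0.7]])
alpha = dobrushin_alpha(A)          # min(0.5,0.3) + min(0.5,0.7) = 0.8
x = np.array([0.9, 0.1])
y = np.array([0.2, 0.8])
ratio = relative_g_entropy(A @ x, A @ y) / relative_g_entropy(x, y)
assert ratio <= 1.0 - alpha         # contraction bound holds
```

Since α(A) = 0.8 > 0 here, A is scrambling and the relative entropy contracts by a factor of at most 1 − α(A) = 0.2 under mapping by A, consistent with the theorem.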

Original language: English
Pages (from-to): 211-235
Number of pages: 25
Journal: Linear Algebra and Its Applications
Issue number: C
Publication status: Published - Jan 15 1993

All Science Journal Classification (ASJC) codes

  • Algebra and Number Theory
  • Numerical Analysis
  • Geometry and Topology
  • Discrete Mathematics and Combinatorics
