TY - JOUR

T1 - Relative entropy under mappings by stochastic matrices

AU - Cohen, Joel E.

AU - Iwasa, Yoh

AU - Rautu, Gh

AU - Ruskai, Mary Beth

AU - Seneta, Eugene

AU - Zbaganu, Gh

N1 - Funding Information:
Gerald S. Goodman, John Hajnal, Marius Iosifescu, and a referee made helpful comments on previous drafts. Gerald S. Goodman first made us aware of the work of Csiszár (1963) and produced a counterexample on which Remark 4.3 is based. The work of J.E.C. was supported in part by U.S. National Science Foundation grant BSR 87-05047 and the Japan Society for the Promotion of Science, and by the hospitality of Mr. and Mrs. William T. Golden, the Institute for Advanced Study, Princeton, New Jersey, and the Departments of Biophysics and Zoology, Kyoto University, Kyoto, Japan. Most of M.B.R.'s contributions to this work were performed while visiting AT&T Bell Laboratories and the Courant Institute of Mathematical Sciences of New York University; her work was partially supported by these institutions and by National Science Foundation grants DMS 88-08112 and DMS 89-08125.

PY - 1993/1/15

Y1 - 1993/1/15

N2 - The relative g-entropy of two finite, discrete probability distributions x = (x1,...,xn) and y = (y1,...,yn) is defined as Hg(x,y) = Σkxkg(yk/xk - 1), where g:(-1,∞)→R is convex and g(0) = 0. When g(t) = -log(1 + t), then Hg(x,y) = Σkxklog(xk/yk), the usual relative entropy. Let Pn = {x ∈ Rn : Σixi = 1, xi > 0 ∀i}. Our major result is that, for any m × n column-stochastic matrix A, the contraction coefficient defined as ηg(A) = sup{Hg(Ax,Ay)/Hg(x,y) : x,y ∈ Pn, x ≠ y} satisfies ηg(A) ≤ 1 - α(A), where α(A) = minj,kΣimin(aij, aik) is Dobrushin's coefficient of ergodicity. Consequently, ηg(A) < 1 if and only if A is scrambling. Upper and lower bounds on ηg(A) are established. Analogous results hold for Markov chains in continuous time.

AB - The relative g-entropy of two finite, discrete probability distributions x = (x1,...,xn) and y = (y1,...,yn) is defined as Hg(x,y) = Σkxkg(yk/xk - 1), where g:(-1,∞)→R is convex and g(0) = 0. When g(t) = -log(1 + t), then Hg(x,y) = Σkxklog(xk/yk), the usual relative entropy. Let Pn = {x ∈ Rn : Σixi = 1, xi > 0 ∀i}. Our major result is that, for any m × n column-stochastic matrix A, the contraction coefficient defined as ηg(A) = sup{Hg(Ax,Ay)/Hg(x,y) : x,y ∈ Pn, x ≠ y} satisfies ηg(A) ≤ 1 - α(A), where α(A) = minj,kΣimin(aij, aik) is Dobrushin's coefficient of ergodicity. Consequently, ηg(A) < 1 if and only if A is scrambling. Upper and lower bounds on ηg(A) are established. Analogous results hold for Markov chains in continuous time.

UR - http://www.scopus.com/inward/record.url?scp=0242380938&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0242380938&partnerID=8YFLogxK

U2 - 10.1016/0024-3795(93)90331-H

DO - 10.1016/0024-3795(93)90331-H

M3 - Article

AN - SCOPUS:0242380938

SN - 0024-3795

VL - 179

SP - 211

EP - 235

JO - Linear Algebra and Its Applications

JF - Linear Algebra and Its Applications

IS - C

ER -