The Mutual Information In The Vicinity of Capacity-Achieving Input Distributions


Cheng H., Nakiboğlu B.

2023 IEEE International Symposium on Information Theory, ISIT 2023, Taipei, Taiwan, 25 - 30 June 2023, vol. 2023-June, pp. 2111-2116

  • Publication Type: Conference Paper / Full Text Paper
  • Volume: 2023-June
  • DOI: 10.1109/isit54713.2023.10206497
  • City of Publication: Taipei
  • Country of Publication: Taiwan
  • Page Numbers: pp. 2111-2116
  • Middle East Technical University Affiliated: Yes

Abstract

The mutual information is analyzed as a function of the input distribution, using an identity due to Topsøe, for channels with finite input and output sets and (possibly multiple) linear constraints. The mutual information is bounded from above by a function that decreases quadratically with the distance to the set of all capacity-achieving input distributions, whenever this distance is below a certain threshold. Explicit expressions for the threshold and for the coefficient of the quadratic decrease are derived. A counterexample is provided demonstrating that no such quadratic bound exists in the case of infinitely many linear cost constraints. Implications of these observations for the channel coding problem and applications of the proof technique to related problems are discussed.
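The quadratic behavior described in the abstract can be illustrated numerically in a much simpler setting than the constrained channels studied in the paper. The sketch below (not taken from the paper; the channel choice and all function names are the author's own for illustration) uses a binary symmetric channel, whose unique capacity-achieving input distribution is uniform, and checks that the capacity gap C - I(p) scales quadratically with the deviation of the input distribution from uniform.

```python
# A minimal numerical sketch, assuming a binary symmetric channel BSC(eps):
# the capacity-achieving input distribution is uniform, and the gap
# C - I(p) is expected to decrease quadratically in |q - 1/2| for small
# deviations. This illustrates the kind of quadratic bound stated in the
# abstract; it is not the paper's general constrained-channel result.
import numpy as np

def binary_entropy(x):
    """Binary entropy in bits, clipped to avoid log(0)."""
    x = np.clip(x, 1e-12, 1 - 1e-12)
    return -(x * np.log2(x) + (1 - x) * np.log2(1 - x))

def mutual_information_bsc(q, eps):
    """I(X;Y) in bits for BSC(eps) with input distribution (q, 1-q)."""
    p_y0 = q * (1 - eps) + (1 - q) * eps   # output marginal P(Y=0)
    return binary_entropy(p_y0) - binary_entropy(eps)

eps = 0.1
capacity = 1.0 - binary_entropy(eps)       # achieved at q = 1/2

for delta in [0.1, 0.05, 0.025, 0.0125]:
    gap = capacity - mutual_information_bsc(0.5 + delta, eps)
    # gap / delta**2 should approach a constant as delta -> 0,
    # consistent with a quadratic decrease near the capacity-achieving input.
    print(f"delta={delta:<7} gap={gap:.6e}  gap/delta^2={gap/delta**2:.4f}")
```

Running the sketch shows the ratio gap/delta^2 settling toward a constant as delta shrinks, which is the qualitative content of the quadratic upper bound; the paper's contribution is to make the threshold and the coefficient explicit in the general setting with linear cost constraints.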