
Rest assured! Our team of information theory experts will professionally handle every challenge you encounter in your information theory studies. With broad expertise and extensive experience, we can help you complete high-quality assignments and papers and keep your studies on track!

Here are some of the topics we can help you with:

Fundamental concepts of information theory: definitions, properties, and classification of common notions such as information, entropy, and redundancy.

Information-theoretic structures: structures studied and applied in channel coding, error correction, data compression, and related areas.

Proofs and reasoning: common proof techniques and reasoning methods, such as direct proof, induction, and proof by contradiction.

Information-theoretic algorithms: applications of information theory to algorithm design and analysis, including encoding, decoding, and optimization algorithms.

Probability and statistics: probabilistic and statistical concepts and methods used in information theory, such as Markov chains, entropy rate, and information gain.

Information optimization: modeling and solving optimization problems such as channel capacity, rate optimization, and data compression.

Information theory and computer science: applications of information theory in computer science, such as data compression, signal processing, and cryptography.

Whatever information theory problem you are facing, we will do our best to provide professional help and keep your learning journey running smoothly!

Problem 1.


Exercise 1
(a) Prove that the information measure is additive: that the information gained from observing the combination of $N$ independent events, whose probabilities are $p_i$ for $i = 1, \ldots, N$, is the sum of the information gained from observing each one of these events separately and in any order.



Solution:
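For part (a), a minimal sketch of the standard argument: because the events are independent, the probability of observing all $N$ of them is the product of their individual probabilities, and the information measure $-\log_2 p$ turns that product into a sum:
$$
I\left(E_1, E_2, \ldots, E_N\right)=-\log _2 \prod_{i=1}^N p_i=-\sum_{i=1}^N \log _2 p_i=\sum_{i=1}^N I\left(E_i\right).
$$
Since addition is commutative, the same total is obtained whatever the order in which the events are observed.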
(b) By definition, $H=-\sum_i p_i \log _2 p_i$ is the entropy in bits for a discrete random variable distributed over states whose probabilities are $p_i$. So:
(i) In this case each $p_i=1 / 256$ and the ensemble entropy summation extends over 256 such equiprobable grey values, so $H=-(256)(1 / 256)(-8)=8$ bits.
(ii) Since all humans are in this category (humans $\subset$ mammals), there is no uncertainty about this classification and hence the entropy is 0 bits.
(iii) The entropy of this distribution is $-(1 / 4)(-2)-(1 / 4)(-2)-(1 / 2)(-1)=1.5$ bits.
(iv) By the definition of the median, both classes have probability $0.5$, so the entropy is 1 bit. (A quick numerical check of all four cases is sketched below.)
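As a sanity check of cases (i)-(iv), here is a short Python sketch; the distributions are simply the ones stated above, and the helper name `entropy` is just illustrative:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution, given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1 / 256] * 256))     # (i)   256 equiprobable grey values -> 8.0 bits
print(entropy([1.0]))               # (ii)  a certain outcome            -> 0.0 bits
print(entropy([0.25, 0.25, 0.5]))   # (iii) probabilities 1/4, 1/4, 1/2  -> 1.5 bits
print(entropy([0.5, 0.5]))          # (iv)  either side of the median    -> 1.0 bit
```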

(c) Consider two independent integer-valued random variables, $X$ and $Y$. Variable $X$ takes on only the values of the eight integers $\{1,2, \ldots, 8\}$ and does so with uniform probability. Variable $Y$ may take the value of any positive integer $k$, with probabilities $P\{Y=k\}=2^{-k}, k=1,2,3, \ldots$
(i) Which random variable has greater uncertainty? Calculate both entropies $H(X)$ and $H(Y)$.
(ii) What is the joint entropy $H(X, Y)$ of these random variables, and what is their mutual information $I(X; Y)$?
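A short numerical sketch for part (c), again in Python; the infinite geometric distribution of $Y$ is truncated at an arbitrary cutoff `K = 60`, so $H(Y)$ and the joint quantities are numerical approximations of the exact values:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution, given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

K = 60                                     # truncation point for the geometric distribution of Y
p_x = [1 / 8] * 8                          # X uniform on {1, ..., 8}
p_y = [2 ** -k for k in range(1, K + 1)]   # P{Y = k} = 2^{-k}

H_X = entropy(p_x)                         # 3 bits exactly
H_Y = entropy(p_y)                         # approaches 2 bits

# X and Y are independent, so the joint distribution is the product distribution.
p_xy = [px * py for px in p_x for py in p_y]
H_XY = entropy(p_xy)                       # approaches 5 bits
I_XY = H_X + H_Y - H_XY                    # approaches 0 bits

print(H_X, H_Y, H_XY, I_XY)
```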

Problem 2.

(a) Suppose that women who live beyond the age of 80 outnumber men in the same age group by three to one. How much information, in bits, is gained by learning that a person who lives beyond 80 is male?

Solution:
(a) Rewriting "lives beyond the age of 80" simply as "old", we have the conditional probabilities $p(\text{female} \mid \text{old}) = 3\,p(\text{male} \mid \text{old})$ and, of course, $p(\text{female} \mid \text{old}) + p(\text{male} \mid \text{old}) = 1$. It follows that $p(\text{male} \mid \text{old}) = 1/4$. The amount of information (in bits) gained from an observation is $-\log_2$ of its probability, so the information gained from this observation is 2 bits.
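Written out as a single calculation (the same reasoning as above, just in symbols):
$$
p(\text{male} \mid \text{old})=\frac{1}{1+3}=\frac{1}{4}, \qquad -\log _2 p(\text{male} \mid \text{old})=-\log _2 \frac{1}{4}=2 \text{ bits}.
$$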
(b) Consider $n$ discrete random variables, named $X_1, X_2, \ldots, X_n$, of which $X_i$ has entropy $H\left(X_i\right)$, the largest being $H\left(X_L\right)$. What is the upper bound on the joint entropy $H\left(X_1, X_2, \ldots, X_n\right)$ of all these random variables, and under what condition will this upper bound be reached? What is the lower bound on the joint entropy $H\left(X_1, X_2, \ldots, X_n\right)$ ?
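For reference, the standard subadditivity bounds that this question is pointing at (stated here only as a reminder, not a full derivation):
$$
H\left(X_L\right)=\max _i H\left(X_i\right) \;\leq\; H\left(X_1, X_2, \ldots, X_n\right) \;\leq\; \sum_{i=1}^n H\left(X_i\right),
$$
where the upper bound is attained exactly when the $X_i$ are mutually independent, and the lower bound is attained when every other variable is a deterministic function of $X_L$.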

E-mail: help-assignment@gmail.com  WeChat: shuxuejun

help-assignment™ is a professional assignment-writing company serving Chinese students studying abroad worldwide.
We focus on providing reliable writing services for North America, Australia, and the UK.
We specialize in assignment help for mathematics, statistics, finance, economics, computer science, and physics.
