- Information theory fundamentals: information content, entropy, mutual information, etc.
- Shannon's theorems: channel capacity, the noiseless-channel coding theorem, the noisy-channel coding theorem, etc.
- Entropy and information gain: entropy, relative entropy, conditional entropy, information gain, etc.
- Encoding and decoding: Shannon coding, Huffman coding, channel coding, error-detecting and error-correcting codes, etc.
- Other related topics, such as the Shannon limit and applications of information theory in machine learning.
Problem 1.
(a) Prove that the information measure is additive: that the information gained from observing the combination of $N$ independent events, whose probabilities are $p_i$ for $i = 1, \ldots, N$, is the sum of the information gained from observing each of these events separately and in any order.
(b) Calculate the entropy in bits for each of the following random variables:
(i) Pixel values in an image whose possible grey values are all the integers from 0 to 255 with uniform probability.
(ii) Humans classified according to whether they are, or are not, mammals.
(iii) Gender in a tri-sexed insect population whose three genders occur with probabilities $1 / 4,1 / 4$, and $1 / 2$.
(iv) A population of persons classified by whether they are older, or not older, than the population’s median age.
(c) Consider two independent integer-valued random variables, $X$ and $Y$. Variable $X$ takes on only the values of the eight integers $\{1,2, \ldots, 8\}$ and does so with uniform probability. Variable $Y$ may take the value of any positive integer $k$, with probabilities $P\{Y=k\}=2^{-k}, k=1,2,3, \ldots$
(i) Which random variable has greater uncertainty? Calculate both entropies $H(X)$ and $H(Y)$.
(ii) What is the joint entropy $H(X, Y)$ of these random variables, and what is their mutual information $I(X; Y)$?
(d) What is the maximum possible entropy $H$ of an alphabet consisting of $N$ different letters? In such a maximum entropy alphabet, what is the probability of its most likely letter? What is the probability of its least likely letter? Why are fixed length codes inefficient for alphabets whose letters are not equiprobable? Discuss this in relation to Morse Code.
Proof.
(a) For independent events the joint probability is $\prod_{i=1}^{N} p_i$, so the information gained from observing all of them is $-\log_2 \prod_{i=1}^{N} p_i = \sum_{i=1}^{N} (-\log_2 p_i)$, which is the sum of the information gained from each event separately; since addition is commutative, the order of observation does not matter.
(b) This is an entropy-calculation exercise:
(i) Pixel values are uniform over 256 grey levels, so $H = \log_2 256 = 8$ bits. (ii) Whether a human is a mammal is a certain outcome (all humans are mammals), so $H = 0$ bits. (iii) For the tri-sexed insects, apply $H = -\sum_i p_i \log_2 p_i$ with $p = (1/4, 1/4, 1/2)$, giving $H = 1.5$ bits. (iv) By the definition of the median, both outcomes have probability $1/2$, so $H = 1$ bit.
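As a quick numerical check of these four answers, here is a minimal Python sketch using only the standard library (the distributions are exactly those stated in part (b)):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([1 / 256] * 256))        # (i)   8.0 bits: 256 equiprobable grey levels
print(entropy([1.0]))                  # (ii)  0.0 bits: all humans are mammals
print(entropy([1 / 4, 1 / 4, 1 / 2]))  # (iii) 1.5 bits: tri-sexed insect genders
print(entropy([1 / 2, 1 / 2]))         # (iv)  1.0 bit:  older vs. not older than median
```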
Problem 2.
2. Show that the average codeword length of $C_1$ under $p$ is equal to $H(p)$, and thus $C_1$ is optimal for $p$. Show that $C_2$ is optimal for $q$.
Proof. This is a problem of computing the entropies of different random variables: (i) use the uniform-distribution entropy formula $H = \log_2 256$; (ii) a binary classification with a deterministic outcome, so the entropy is 0; (iii) three genders, using the entropy formula $H = -\sum_i p_i \log_2 p_i$ with $p = (1/4, 1/4, 1/2)$; (iv) a binary classification in which each outcome has probability $1/2$, computed with the binary entropy formula.
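Problem 2 as quoted does not reproduce the codes $C_1$, $C_2$ or the distributions $p$, $q$. As an illustrative assumption only, the sketch below uses the standard textbook instance: a dyadic $p = (1/2, 1/4, 1/8, 1/8)$ with $C_1$ codeword lengths $(1, 2, 3, 3)$, and a uniform $q$ with $C_2$ a fixed-length 2-bit code. It shows each average codeword length meeting the corresponding entropy exactly:

```python
from math import log2

# Illustrative assumption: the classic pairing of a dyadic source distribution p
# with its Huffman code C1, and a uniform q with a fixed-length code C2.
p = [1 / 2, 1 / 4, 1 / 8, 1 / 8]
q = [1 / 4, 1 / 4, 1 / 4, 1 / 4]
len_c1 = [1, 2, 3, 3]   # e.g. codewords 0, 10, 110, 111
len_c2 = [2, 2, 2, 2]   # e.g. codewords 00, 01, 10, 11

def entropy(dist):
    return -sum(x * log2(x) for x in dist if x > 0)

def avg_length(dist, lengths):
    return sum(x * l for x, l in zip(dist, lengths))

print(entropy(p), avg_length(p, len_c1))  # 1.75 1.75 -> C1 meets H(p): optimal for p
print(entropy(q), avg_length(q, len_c2))  # 2.0  2.0  -> C2 meets H(q): optimal for q
```

Equality is possible here because both $p$ and $q$ are dyadic, so the ideal codeword lengths $-\log_2 p_i$ and $-\log_2 q_i$ are integers.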
Problem 3.
3. Now assume that we use code $C_2$ when the distribution is $p$. What is the average length of the codewords? By how much does it exceed the entropy $H(p)$? Relate your answer to $D(p \| q)$.
Proof. (i) Compute $H(X)$ and $H(Y)$ from the definition of entropy: $H(X) = -\sum_x P(x) \log_2 P(x) = \log_2 8 = 3$ bits, and $H(Y) = -\sum_k 2^{-k} \log_2 2^{-k} = \sum_{k=1}^{\infty} k\, 2^{-k} = 2$ bits, so $X$ has the greater uncertainty. (ii) Since $X$ and $Y$ are independent, the joint entropy is $H(X, Y) = -\sum_{x,y} P(x, y) \log_2 P(x, y) = H(X) + H(Y) = 5$ bits, and the mutual information is $I(X; Y) = H(X) + H(Y) - H(X, Y) = 0$ bits, as expected for independent variables.
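A minimal sketch verifying these values numerically (the geometric series for $Y$ is truncated at $k = 60$, far beyond double precision):

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

px = [1 / 8] * 8                      # X uniform on {1, ..., 8}
py = [2 ** -k for k in range(1, 61)]  # P{Y = k} = 2^-k, truncated at k = 60

H_X, H_Y = entropy(px), entropy(py)
print(H_X, H_Y)                       # 3.0 and ~2.0 bits: X is more uncertain

# Independence makes the joint distribution a product, so H(X, Y) = H(X) + H(Y)
joint = [a * b for a in px for b in py]
print(entropy(joint))                 # ~5.0 bits
print(H_X + H_Y - entropy(joint))     # mutual information I(X; Y) ~ 0.0 bits
```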
Problem 4.
4. If we use code $C_1$ when the distribution is $q$, by how much does the average codeword length exceed $H(q)$? Relate your answer to $D(q \| p)$.
Proof. (i) The maximum entropy occurs when all $N$ letters are equiprobable, giving $H = \log_2 N$. (ii) In this maximum-entropy alphabet the most likely and least likely letters have the same probability, $1/N$. (iii) Fixed-length codes are inefficient for alphabets whose letters are not equiprobable because frequent letters receive codewords just as long as rare ones, wasting coding capacity; Morse code addresses this by assigning shorter codewords to more frequent letters (e.g. a single dot for E), which lowers the average code length.
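Problems 3 and 4 both rest on the identity $\bar{L} = H + D(\cdot \| \cdot)$ when a code whose lengths match one distribution is used under another. A minimal sketch, under the same assumed $p$, $q$, $C_1$, $C_2$ as in the sketch after Problem 2:

```python
from math import log2

# Same illustrative assumption as before: dyadic p, uniform q.
p = [1 / 2, 1 / 4, 1 / 8, 1 / 8]
q = [1 / 4, 1 / 4, 1 / 4, 1 / 4]
len_c1 = [-log2(x) for x in p]  # lengths matched to p: (1, 2, 3, 3)
len_c2 = [-log2(x) for x in q]  # lengths matched to q: (2, 2, 2, 2)

H = lambda d: -sum(x * log2(x) for x in d if x > 0)
kl = lambda a, b: sum(x * log2(x / y) for x, y in zip(a, b) if x > 0)  # KL divergence
avg = lambda d, lens: sum(x * l for x, l in zip(d, lens))

# Problem 3: using C2 under p costs exactly D(p || q) extra bits per symbol
print(avg(p, len_c2), H(p) + kl(p, q))  # 2.0 = 1.75 + 0.25

# Problem 4: using C1 under q costs exactly D(q || p) extra bits per symbol
print(avg(q, len_c1), H(q) + kl(q, p))  # 2.25 = 2.0 + 0.25
```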
Why choose our help-assignment writing team?
1. Our experts have extensive experience and can provide you with high-quality assignment and paper writing services.
2. Guaranteed on-time delivery, so you have nothing to worry about.
3. 100% original work, plagiarism-free.
4. Confidential service; your privacy is fully protected.
5. 24/7 customer support to answer your questions at any time.
Contact our help-assignment team now and let us solve your information theory problems and make your studies easier!

help-assignment™ is a professional writing service for Chinese students worldwide, providing reliable writing help in North America, Australia, and the UK, specializing in assignments in mathematics, statistics, finance, economics, computer science, and physics.