1. Foundations of information theory: self-information, entropy, mutual information, etc.
2. Shannon's theorems: channel capacity, the coding theorem for noiseless channels, the coding theorem for noisy channels, etc.
3. Entropy and information gain: entropy, relative entropy, conditional entropy, information gain, etc.
4. Encoding and decoding: Shannon coding, Huffman coding, channel coding, error-detecting and error-correcting codes, etc.
5. Other related topics, such as the Shannon limit and applications of information theory in machine learning.

1. (a) Prove that the information measure is additive: that the information gained from observing the combination of $N$ independent events, whose probabilities are $p_i$ for $i = 1, \ldots, N$, is the sum of the information gained from observing each one of these events separately and in any order.
   (b) Calculate the entropy in bits for each of the following random variables: (i) Pixel values in an image whose possible grey values are all the integers from 0 to 255 with uniform probability. (ii) Humans classified according to whether they are, or are not, mammals. (iii) Gender in a tri-sexed insect population whose three genders occur with probabilities $1/4$, $1/4$, and $1/2$. (iv) A population of persons classified by whether they are older, or not older, than the population's median age.
   (c) Consider two independent integer-valued random variables, $X$ and $Y$. Variable $X$ takes on only the values of the eight integers $\{1, 2, \ldots, 8\}$ and does so with uniform probability. Variable $Y$ may take the value of any positive integer $k$, with probabilities $P\{Y = k\} = 2^{-k}$, $k = 1, 2, 3, \ldots$ (i) Which random variable has greater uncertainty? Calculate both entropies $H(X)$ and $H(Y)$. (ii) What is the joint entropy $H(X, Y)$ of these random variables, and what is their mutual information $I(X; Y)$?
   (d) What is the maximum possible entropy $H$ of an alphabet consisting of $N$ different letters? In such a maximum-entropy alphabet, what is the probability of its most likely letter? What is the probability of its least likely letter? Why are fixed-length codes inefficient for alphabets whose letters are not equiprobable? Discuss this in relation to Morse Code.
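Parts (b) and (c) can be checked numerically. The sketch below uses only the probabilities stated above; the helper name `entropy` is our own, not part of any library:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution (0 log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# (b)(i) 256 equiprobable grey values -> log2(256) = 8 bits
h_pixels = entropy([1 / 256] * 256)

# (b)(ii) every human is a mammal, so the outcome is certain -> 0 bits
h_mammal = entropy([1.0])

# (b)(iii) genders with probabilities 1/4, 1/4, 1/2 -> 1.5 bits
h_gender = entropy([1 / 4, 1 / 4, 1 / 2])

# (b)(iv) by definition of the median, each class has probability 1/2 -> 1 bit
h_median = entropy([1 / 2, 1 / 2])

# (c)(i) X uniform on {1,...,8} -> 3 bits; for Y, -log2 P{Y=k} = k,
# so H(Y) = sum_k k * 2^-k = 2 bits (series truncated; the tail is negligible)
h_x = entropy([1 / 8] * 8)
h_y = sum(k * 2 ** -k for k in range(1, 200))

# (c)(ii) X and Y are independent, so H(X,Y) = H(X) + H(Y) and I(X;Y) = 0
h_xy = h_x + h_y
```

Note that $Y$, despite its infinite range, carries less uncertainty than $X$ (2 bits versus 3).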

2. Show that the average codeword length of $C_1$ under $p$ is equal to $H(p)$, and thus $C_1$ is optimal for $p$. Show that $C_2$ is optimal for $q$.

3. Now assume that we use code $C_2$ when the distribution is $p$. What is the average length of the codewords? By how much does it exceed the entropy $H(p)$? Relate your answer to $D(p \| q)$.

4. If we use code $C_1$ when the distribution is $q$, by how much does the average codeword length exceed $H(q)$ ? Relate your answer to $D(q \| p)$.
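The codes $C_1$, $C_2$ and distributions $p$, $q$ are defined earlier in the assignment. The sketch below assumes the standard textbook instance of this exercise ($p = (1/2, 1/4, 1/8, 1/8)$, $q$ uniform on four symbols, $C_1 = \{0, 10, 110, 111\}$, $C_2 = \{00, 01, 10, 11\}$); substitute your course's values if they differ. It checks exercises 2 through 4: each code achieves the entropy of its matching distribution, and the penalty for using the mismatched code is exactly the KL divergence.

```python
import math

p = [1 / 2, 1 / 4, 1 / 8, 1 / 8]  # assumed distribution p
q = [1 / 4, 1 / 4, 1 / 4, 1 / 4]  # assumed distribution q
len_c1 = [1, 2, 3, 3]             # codeword lengths of C1 = {0, 10, 110, 111}
len_c2 = [2, 2, 2, 2]             # codeword lengths of C2 = {00, 01, 10, 11}

def H(dist):
    """Entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in dist if pi > 0)

def D(a, b):
    """Kullback-Leibler divergence D(a || b) in bits."""
    return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

def avg_len(dist, lengths):
    """Expected codeword length under the given distribution."""
    return sum(pi * li for pi, li in zip(dist, lengths))

# Exercise 2: each code meets the entropy of its own distribution
opt_p = avg_len(p, len_c1)  # equals H(p) = 1.75 bits
opt_q = avg_len(q, len_c2)  # equals H(q) = 2 bits

# Exercises 3 and 4: the mismatch penalty equals the KL divergence
excess_pq = avg_len(p, len_c2) - H(p)  # equals D(p || q)
excess_qp = avg_len(q, len_c1) - H(q)  # equals D(q || p)
```

With these assumed values both penalties come out to $0.25$ bits, and in general the expected length of a code designed for $q$ but used on $p$ is $H(p) + D(p \| q)$.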

