Exercise 1
(a) Prove that the information measure is additive: that the information gained from observing the combination of $N$ independent events, whose probabilities are $p_i$ for $i = 1, \ldots, N$, is the sum of the information gained from observing each one of these events separately and in any order.


Solution:
(a) The information measure assigns $-\log_2(p)$ bits to the observation of an event whose probability is $p$. The joint probability of a combination of $N$ independent events whose probabilities are $p_1, \ldots, p_N$ is $\prod_{i=1}^N p_i$. Thus the information content of such a combination is:
$$-\log_2\left(\prod_{i=1}^N p_i\right)=-\log_2\left(p_1\right)-\log_2\left(p_2\right)-\cdots-\log_2\left(p_N\right)$$
which is the sum of the information content of all of the separate events.
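The additivity argument above can be checked numerically: for any independent events, the information of the joint observation equals the sum of the separate informations. A minimal sketch (the three probabilities below are illustrative values, not part of the exercise):

```python
import math

# Illustrative probabilities of three independent events.
probs = [0.5, 0.25, 0.125]

# Information gained from observing the joint event:
# -log2 of the product of the probabilities.
joint_info = -math.log2(math.prod(probs))

# Sum of the information gained from each event separately.
separate_info = sum(-math.log2(p) for p in probs)

print(joint_info, separate_info)  # both equal 6.0 bits (1 + 2 + 3)
```

The order of the observations does not matter, since both the product and the sum are commutative.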
(b) Calculate the entropy in bits for each of the following random variables:
(i) Pixel values in an image whose possible grey values are all the integers from 0 to 255 with uniform probability.
(ii) Humans classified according to whether they are, or are not, mammals.
(iii) Gender in a tri-sexed insect population whose three genders occur with probabilities $1/4$, $1/4$, and $1/2$.
(iv) A population of persons classified by whether they are older, or not older, than the population’s median age.

Solution:
(b) By definition, $H=-\sum_i p_i \log _2 p_i$ is the entropy in bits for a discrete random variable distributed over states whose probabilities are $p_i$. So:
(i) In this case each $p_i=1 / 256$ and the ensemble entropy summation extends over 256 such equiprobable grey values, so $H=-(256)(1 / 256)(-8)=8$ bits.
(ii) Since all humans are in this category (humans $\subset$ mammals), there is no uncertainty about this classification and hence the entropy is 0 bits.
(iii) The entropy of this distribution is $-(1 / 4)(-2)-(1 / 4)(-2)-(1 / 2)(-1)=1.5$ bits.
(iv) By the definition of median, both classes have probability 0.5, so the entropy is 1 bit.
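The four results above can be verified directly from the definition $H=-\sum_i p_i \log_2 p_i$. A minimal sketch (the helper function `entropy` is introduced here for illustration):

```python
import math

def entropy(probs):
    """Entropy in bits: H = -sum_i p_i * log2(p_i), with 0*log2(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# (i) 256 equiprobable grey values.
grey = entropy([1 / 256] * 256)        # 8.0 bits
# (ii) All humans are mammals: a single certain outcome.
mammal = entropy([1.0])                # 0.0 bits
# (iii) Three genders with probabilities 1/4, 1/4, 1/2.
insect = entropy([1 / 4, 1 / 4, 1 / 2])  # 1.5 bits
# (iv) Two equiprobable classes split at the median.
median = entropy([1 / 2, 1 / 2])       # 1.0 bit
```

The guard `if p > 0` implements the standard convention that zero-probability states contribute nothing to the entropy.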
