## Transcribed Text

1.) Let X and Y be random variables that take on values x1, x2, …, xr and y1, y2, …, ys,
respectively. Let Z = X + Y.
(i) Show that H(Z|X) = H(Y|X).
(ii) If X and Y are independent, show that H(Y) ≤ H(Z) and H(X) ≤ H(Z).
(iii) Under what condition does H(Z) = H(X) + H(Y)?
(v) Calculate the probability that a “tall” person (meaning taller than 6 ft) is male.
Assume that the probability of being male is p(M) = 0.5, and likewise for being female,
p(F) = 0.5. Suppose that 20% of males are T (i.e. tall), p(T|M) = 0.2, and that 6% of
females are tall, p(T|F) = 0.06. Find p(M|T). If you know that somebody is male, how
much information do you gain (in bits) by learning that he is also tall? How much do
you gain by learning that a female is tall? Finally, how much information do you gain
from learning that a tall person is female?
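As a quick numerical check on part (v), here is a short sketch using Bayes’ rule and surprisal (−log2 of a probability). The variable names are mine, not part of the problem statement:

```python
from math import log2

# Priors and likelihoods from the problem statement.
p_M, p_F = 0.5, 0.5
p_T_given_M, p_T_given_F = 0.2, 0.06

# Total probability of being tall: p(T) = p(T|M)p(M) + p(T|F)p(F).
p_T = p_T_given_M * p_M + p_T_given_F * p_F      # 0.13

# Bayes' rule: p(M|T) = p(T|M) p(M) / p(T).
p_M_given_T = p_T_given_M * p_M / p_T            # 10/13, about 0.769
p_F_given_T = 1 - p_M_given_T                    # 3/13, about 0.231

# Information gained (surprisal, in bits) on learning each fact.
info_tall_given_male   = -log2(p_T_given_M)      # about 2.32 bits
info_tall_given_female = -log2(p_T_given_F)      # about 4.06 bits
info_female_given_tall = -log2(p_F_given_T)      # about 2.12 bits
```

Note that learning a female is tall is more surprising (more bits) than learning a male is tall, because tallness is rarer among females.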
2.) Consider the Binary Symmetric Channel (BSC). It is a channel that transports 1’s and 0’s from
the transmitter (Tx) to the receiver (Rx). It occasionally makes an error, with probability p: a BSC
flips a 1 to a 0 and vice versa with equal probability. Let X and Y be binary random variables that
represent the input and output of this BSC, respectively. Let the input symbols be equally likely, and
let the output symbols depend on the input according to the channel transition probabilities.
i) What is the entropy H(X) of this binary source?
ii) What is the conditional entropy H(X|Y)?
iii) Find the values of H(X|Y) for the following values of p:
a) 0.5 b) 0.2 c) 0.8 d) 0.1 e) 0.9
iv) Find the values of the average mutual information I(X;Y) for the following values of p:
a) 0.1 b) 0.5 c) 0.9 d) 0.3 e) 0.7
v) Fill in the blanks:
For this BSC, as we increase the value of p from 0 to 0.5, the value of I(X;Y) ____
(increases/decreases). At p = 0.5, the value of H(X) = ____ and the value of
I(X;Y) = ____. For this BSC, as we increase the value of p from 0.5 to 1, the value of
I(X;Y) ____ (increases/decreases). At p = 0, the value of H(X) = ____ and the value of
I(X;Y) = ____. At p = 1, the value of H(X) = ____ and the value of I(X;Y) = ____.
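Parts ii)–v) can be checked numerically. For a BSC with crossover probability p and a uniform input, symmetry gives H(X|Y) = H_b(p), the binary entropy of p, and hence I(X;Y) = H(X) − H(X|Y) = 1 − H_b(p). A minimal sketch (the helper name `h_binary` is mine):

```python
from math import log2

def h_binary(p):
    """Binary entropy H_b(p) in bits; H_b(0) = H_b(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Uniform binary source: H(X) = 1 bit.
# For a BSC with crossover probability p and uniform input:
#   H(X|Y) = H_b(p)   and   I(X;Y) = 1 - H_b(p).
for p in (0.1, 0.2, 0.3, 0.5, 0.7, 0.8, 0.9):
    print(f"p = {p}: H(X|Y) = {h_binary(p):.3f} bits, "
          f"I(X;Y) = {1 - h_binary(p):.3f} bits")
```

Note the symmetry H_b(p) = H_b(1 − p): the channel with p = 0.9 carries exactly as much information as the one with p = 0.1, since the receiver can simply invert the output.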
vi.) Now, consider a binary symmetric communication channel whose input source is the
alphabet X = {0, 1} with probabilities {0.5, 0.5}; output alphabet Y = {0, 1}; and channel
matrix:

    [ 1 − α     α   ]
    [   α     1 − α ]

where α is the probability of transmission error.
(i) What is the entropy of the source, H(X)?
(ii) What is the probability distribution of the outputs, p(Y), and what is the
entropy of this output distribution, H(Y)?
(iii) What is the joint probability distribution for the source and the output,
p(X,Y), and what is the joint entropy, H(X,Y)?
(iv) What is the mutual information of this channel, I(X;Y)?
(v) How many values are there for α for which the mutual information of this
channel is maximal, and what are those values?
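The quantities in (ii)–(v) can be computed directly from the joint distribution p(x, y) = p(x)p(y|x). A sketch, assuming the uniform input and the channel matrix above (the function names are mine):

```python
from math import log2

def h_binary(a):
    """Binary entropy in bits; defined as 0 at a = 0 or a = 1."""
    return 0.0 if a in (0.0, 1.0) else -a * log2(a) - (1 - a) * log2(1 - a)

def bsc_quantities(alpha):
    """p(Y=0), H(Y), H(X,Y), and I(X;Y) for a BSC with uniform input."""
    # Joint distribution p(x, y) = p(x) * p(y|x).
    joint = {(0, 0): 0.5 * (1 - alpha), (0, 1): 0.5 * alpha,
             (1, 0): 0.5 * alpha,       (1, 1): 0.5 * (1 - alpha)}
    p_y0 = joint[(0, 0)] + joint[(1, 0)]       # = 0.5 for every alpha
    H_Y = h_binary(p_y0)                       # = 1 bit
    # Joint entropy H(X,Y) = -sum p log2 p = H(X) + H(Y|X) = 1 + H_b(alpha).
    H_XY = -sum(p * log2(p) for p in joint.values() if p > 0)
    # I(X;Y) = H(X) + H(Y) - H(X,Y) = 1 - H_b(alpha).
    I = 1 + H_Y - H_XY
    return p_y0, H_Y, H_XY, I
```

Since I(X;Y) = 1 − H_b(α), the mutual information is maximal (1 bit) at exactly two values, α = 0 and α = 1: a channel that always flips is as informative as a perfect one.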
