Transcribed Text
Assume suitable data wherever necessary
1. The relative entropy is a measure of the distance between two distributions. In statistics, it arises as a measure of the inefficiency of assuming that the distribution is q when the true distribution is p. For example, if we knew the true distribution of the random variable, we could construct a code with average description length H(p); if instead we used the code for a distribution q, we would need H(p) + D(p||q) bits on average.

Definition: The relative entropy between two probability mass functions p(x) and q(x) is defined as

D(p||q) = Σ_{x ∈ X} p(x) log ( p(x) / q(x) )

Relative entropy is always nonnegative, and is zero if and only if p = q. However, it is not a true distance between distributions, since it is not symmetric and does not satisfy the triangle inequality. Nonetheless, it is often useful to think of relative entropy as a "distance" between distributions.
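These properties can be illustrated with a minimal sketch (the pmfs below are made-up examples, not taken from the problem):

```python
# Sketch: relative entropy D(p||q) in bits for two pmfs on the same alphabet.
# Convention: terms with p(x) = 0 contribute 0 (since 0 log 0 = 0).
import math

def kl_divergence(p, q):
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.25, 0.25]   # made-up example pmfs
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # nonnegative
print(kl_divergence(q, p))  # differs from D(p||q): not symmetric
print(kl_divergence(p, p))  # 0.0: zero if and only if p = q
```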
a) Specify the value of D(p||q) if there is a symbol x such that:
i) p(x) = 0, q(x) = 0   ii) p(x) = 0, q(x) > 0   iii) p(x) > 0, q(x) = 0
b) Let p(0) = 1 − r, p(1) = r and q(0) = 1 − s, q(1) = s. Find the values of D(p||q) and D(q||p). Also, find the values of D(p||q) and D(q||p) when r = s.
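For part b), the divergences have a simple closed form, which can be checked numerically (the values of r and s below are arbitrary examples, not given in the problem):

```python
# Sketch: for p = (1-r, r) and q = (1-s, s),
# D(p||q) = (1-r) log2((1-r)/(1-s)) + r log2(r/s), and D = 0 when r = s.
import math

def D(p, q):
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

r, s = 0.3, 0.6  # arbitrary example values
p, q = (1 - r, r), (1 - s, s)

closed_form = (1 - r) * math.log2((1 - r) / (1 - s)) + r * math.log2(r / s)
print(D(p, q), closed_form)        # these agree
print(D((1 - r, r), (1 - r, r)))   # 0.0 when r = s
```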
2. Prove the chain rule for entropy:

H(X1, X2, ..., Xn) = Σ_{i=1}^{n} H(Xi | X_{i−1}, ..., X1)

If the average conditional mutual information of random variables X and Y given Z is defined by

I(X; Y | Z) = H(X | Z) − H(X | Y, Z)

3. Prove that

I(X1, X2, ..., Xn; Y) = Σ_{i=1}^{n} I(Xi; Y | X_{i−1}, ..., X1)
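The entropy chain rule of problem 2 can be verified numerically for n = 2; a sketch (the joint pmf values are made up for illustration):

```python
# Sketch: checking H(X1, X2) = H(X1) + H(X2 | X1) on a small made-up joint pmf.
import math

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(probs):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

H_joint = H(joint.values())

# marginal of X1
p_x1 = {0: joint[(0, 0)] + joint[(0, 1)], 1: joint[(1, 0)] + joint[(1, 1)]}

# H(X2 | X1) = sum_x1 p(x1) * H(X2 | X1 = x1)
H_cond = sum(
    p_x1[x1] * H([joint[(x1, 0)] / p_x1[x1], joint[(x1, 1)] / p_x1[x1]])
    for x1 in (0, 1)
)

print(H_joint, H(p_x1.values()) + H_cond)  # both sides of the chain rule agree
```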
Differential Entropy and Conditional Entropy for continuous random variables (extending the definitions attached):

The differential entropy of a continuous random variable X is defined as

h(X) = − ∫ p(x) log p(x) dx

The average conditional entropy of a continuous random variable X given Y is defined as

h(X | Y) = − ∫∫ p(x, y) log p(x | y) dx dy

Let X and Y be random variables with joint probability density function (pdf) p(x, y) and marginal pdfs p(x) and p(y). The average mutual information between the two continuous random variables X and Y is defined as

I(X; Y) = ∫∫ p(y | x) p(x) log [ p(y | x) p(x) / ( p(x) p(y) ) ] dx dy

The average mutual information can also be expressed as

I(X; Y) = h(X) − h(X | Y) = h(Y) − h(Y | X)
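Under this definition, a Gaussian density has the closed form h(X) = (1/2) log2(2·pi·e·sigma^2) bits, which gives a quick way to check the defining integral numerically; a sketch using crude Riemann integration (illustrative, not production numerics):

```python
# Sketch: differential entropy h(X) = -∫ p(x) log2 p(x) dx for a Gaussian,
# compared against the closed form (1/2) log2(2*pi*e*sigma^2).
import math

sigma = 2.0

def gauss(x):
    return math.exp(-x * x / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# crude Riemann sum over a wide interval covering essentially all the mass
dx = 1e-3
h_numeric = -sum(
    gauss(-20 + i * dx) * math.log2(gauss(-20 + i * dx)) * dx
    for i in range(int(40 / dx))
)

h_closed = 0.5 * math.log2(2 * math.pi * math.e * sigma**2)
print(h_numeric, h_closed)  # should agree to several decimal places
```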
4. a) Explain why the definitions of entropy, average conditional entropy, and average mutual information can be carried over from discrete random variables to continuous random variables, but the concept and physical interpretation cannot.

b) Let X and Y be random variables with joint PDF

f(x, y) = … , and 0 otherwise.

Find the marginal PDFs f_X(x) and f_Y(y).
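The joint PDF itself did not survive the transcription. As an illustration of the marginalization step only, here is a sketch using a hypothetical density f(x, y) = x + y on the unit square (an assumption, not the problem's actual PDF):

```python
# Sketch: marginalizing a hypothetical joint PDF f(x, y) = x + y on [0,1]^2.
# Analytically, f_X(x) = ∫_0^1 (x + y) dy = x + 1/2 (and symmetrically for f_Y).

def f(x, y):
    return x + y  # hypothetical stand-in density

def marginal_x(x, n=10000):
    """Numerically integrate f(x, y) over y in [0, 1] (midpoint rule)."""
    dy = 1.0 / n
    return sum(f(x, (j + 0.5) * dy) * dy for j in range(n))

print(marginal_x(0.3))  # matches x + 1/2 = 0.8
```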
Suppose a Discrete Memoryless Source (DMS) outputs a symbol every … seconds. Each symbol is chosen from a finite set of L symbols, each occurring with a given probability.

c) What is the entropy H(X) of this DMS in bits per source symbol? Explain how H(X) <= log2 L.

d) When is H(X) = log2 L? Suppose we wish to represent the 26 letters of the English alphabet in bits; what is the minimum number of bits required to uniquely represent each of the letters (such that every letter is represented using the same number of bits)?
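Both parts can be sanity-checked numerically; a minimal sketch (the skewed pmf is a made-up example):

```python
# Sketch: H(X) <= log2(L) for any pmf over L symbols, with equality for the
# uniform pmf; and a fixed-length code for 26 letters needs ceil(log2 26) bits.
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

L = 8
uniform = [1 / L] * L
skewed = [0.5, 0.2, 0.1, 0.05, 0.05, 0.05, 0.03, 0.02]  # made-up pmf

print(H(uniform), math.log2(L))   # equal: uniform maximizes entropy
print(H(skewed) <= math.log2(L))  # True for any pmf over L symbols

# fixed-length binary code for the 26-letter alphabet
print(math.ceil(math.log2(26)))   # 5 bits per letter
```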
c) Write a code, in Python or in any programming language, to call functions which will play the role of tables for the z, t, chi-square and F distributions. Namely, for z, write a code segment that will take:
(a) z0 as input and give P(z < z0) as output;
(b) alpha as input and give z0 as output, where P(z < z0) = alpha.
Same for the rest, except it will also take the degree(s) of freedom (dof) as input.
(This part is related to sampling distributions of statistics, which should be implemented on a personal computer.)
***
4) c.
Z table
i) z0 as input:
from scipy.stats import norm
z0 = float(input("Z0="))
value = norm.cdf(z0)   # P(Z < z0)
print(value)
ii) alpha as input:
from scipy.stats import norm
alpha = float(input("Alpha="))
value = norm.ppf(alpha)   # z0 such that P(Z < z0) = alpha
print(value)
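A quick way to sanity-check the two z segments (assuming scipy is installed): cdf and ppf are inverse functions, so feeding P(Z < z0) back through ppf should recover z0.

```python
from scipy.stats import norm

z0 = 1.5
alpha = norm.cdf(z0)      # P(Z < 1.5), roughly 0.9332
z_back = norm.ppf(alpha)  # recovers z0
print(alpha, z_back)
```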
t Table
i) z0 as input:
from scipy.stats import t
Z0 = float(input("Z0="))
df = int(input("Degree of Freedom="))
value = t.cdf(Z0, df)   # P(T < z0) for the given dof
print(value)
ii) alpha as input:
from scipy.stats import t
alpha = float(input("Alpha="))
df = int(input("Degree of Freedom="))
value = t.ppf(alpha, df)   # z0 such that P(T < z0) = alpha
print(value)
Chi-Square Distribution
i) z0 as input: …
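The transcription cuts off here. Following the same cdf/ppf pattern as the z and t segments, the chi-square and F parts would presumably look like the sketch below (fixed example values replace the input() calls so the snippet runs non-interactively; the dof values are arbitrary):

```python
# Sketch of the remaining two tables: chi-square takes one dof, F takes two.
from scipy.stats import chi2, f

# chi-square: value -> probability, then probability -> value
df = 3                          # arbitrary example dof
x0 = 2.5
p_chi = chi2.cdf(x0, df)        # P(X < x0)
x_back = chi2.ppf(p_chi, df)    # inverse: recovers x0

# F: numerator and denominator dof
dfn, dfd = 5, 10                # arbitrary example dofs
f0 = 2.0
p_f = f.cdf(f0, dfn, dfd)       # P(F < f0)
f_back = f.ppf(p_f, dfn, dfd)   # inverse: recovers f0

print(p_chi, x_back, p_f, f_back)
```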