
ENTROPY

Presented By:
Vibha Jain
Assistant Professor
Manipal University, Jaipur
Entropy
▪ Entropy is a measure of the average information content per source symbol.
▪ Entropy is the average information provided by the source.
▪ Let the source transmit messages M = {m1, m2, …, mq} with probabilities of occurrence
P = {P1, P2, …, Pq}.
▪ The total probability of all messages must be 1: P1 + P2 + … + Pq = Σ_{i=1}^{q} Pi = 1

▪ Let n be the total number of messages transmitted by the source. Out of these n messages:

▪ Let n1 be the number of times m1 is transmitted.
▪ Let n2 be the number of times m2 is transmitted.
. . .
▪ Let nq be the number of times mq is transmitted.
▪ i.e., n = n1 + n2 + … + nq
Entropy…

▪ Information transmitted by a single message mk is Ik = log2(1/Pk) bits
▪ Information transmitted when mk is sent nk times is nk · Ik = nk · log2(1/Pk) bits

▪ This means that:

▪ total information transmitted by m1 (transmitted n1 times) is n1 · log2(1/P1)
▪ total information transmitted by m2 (transmitted n2 times) is n2 · log2(1/P2)
. . .
▪ total information transmitted by mq (transmitted nq times) is nq · log2(1/Pq)
Thus, the total information transmitted by the source is:
= n1 · log2(1/P1) + n2 · log2(1/P2) + . . . + nq · log2(1/Pq)
= Σ_{i=1}^{q} ni · log2(1/Pi)     where ni is the number of times message mi is transmitted
Therefore,
                 Total information
Entropy = ------------------------------
                Number of messages

        = (1/n) · Σ_{i=1}^{q} ni · log2(1/Pi)

        = Σ_{i=1}^{q} (ni/n) · log2(1/Pi)

Since ni/n is the probability ➔ Pi

        = Σ_{i=1}^{q} Pi · log2(1/Pi)

Thus,
Entropy H(S) = Σ_{i=1}^{q} Pi · log2(1/Pi) bits/symbol
Entropy is the average amount of information that we obtain from the source.
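As a quick sanity check of this formula, here is a minimal Python sketch (not part of the original slides; the helper name `entropy` is an illustrative choice) that computes H(S) from a list of symbol probabilities:

```python
import math

def entropy(probs):
    """Shannon entropy H(S) = sum of P_i * log2(1/P_i), in bits/symbol.
    Terms with P_i = 0 contribute nothing (P*log2(1/P) -> 0 as P -> 0)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Equiprobable three-symbol source: H = log2(3), about 1.585 bits/symbol
print(round(entropy([1/3, 1/3, 1/3]), 3))
```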
Example 1 Entropy…
Find the entropy of the following:
1) S = {s1, s2, s3, s4} ; P= {1/4, 1/4, 1/4, 1/4}
2) S = {s1, s2, s3, s4} ; P= {49/100, 49/100, 1/100, 1/100}
3) S = {s1, s2, s3, s4} ; P= {299/300, 1/300, 1/300, 1/300}
Solution: Entropy H(S) = Σ_{i=1}^{q} Pi · log2(1/Pi)
1) H(S) = 4 x (1/4 log2(4))
= 2 bits/symbol
2) H(S) = 2 x (49/100 log2(100/49)) + 2 x (1/100 log2(100))
= 0.98 log2(100/49) + 0.02 log2(100)
= 1.141 bits/symbol
3) H(S) = 299/300 log2(300/299) + 3 x (1/300 log2(300))
≈ 0.997 log2(300/299) + 0.01 log2(300)
≈ 0.087 bits/symbol
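These three results can be checked numerically; a short sketch, assuming Python and an inline helper H (illustrative name), reproduces them, including the corrected value for case 3:

```python
import math

def H(probs):
    """Entropy in bits/symbol: sum of P_i * log2(1/P_i)."""
    return sum(p * math.log2(1 / p) for p in probs)

print(round(H([1/4] * 4), 3))                          # 2.0
print(round(H([49/100, 49/100, 1/100, 1/100]), 3))     # 1.141
print(round(H([299/300, 1/300, 1/300, 1/300]), 3))     # 0.087
```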
Example 2 Entropy…

Find the entropy H(S) for the probabilities {1/2, 1/4, 1/4}
Solution: Entropy H(S) = Σ_{i=1}^{q} Pi · log2(1/Pi)
H(S) = 1/2 log2(2) + 2 x (1/4 log2(4))
= 0.5 + 1
= 1.5 bits/symbol

Example 3 Entropy…
Find the entropy H(S) for the probabilities {1/2, 1/4, 1/8, 1/8}
Solution: Entropy H(S) = Σ_{i=1}^{q} Pi · log2(1/Pi)
H(S) = 1/2 log2(2) + 1/4 log2(4) + 2 x (1/8 log2(8))
= 0.5 + 0.5 + 3/4
= 7/4 = 1.75 bits/symbol
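Examples 2 and 3 can be verified the same way with a couple of one-liners (a sketch, assuming Python):

```python
import math

print(sum(p * math.log2(1 / p) for p in [1/2, 1/4, 1/4]))        # 1.5 bits/symbol
print(sum(p * math.log2(1 / p) for p in [1/2, 1/4, 1/8, 1/8]))   # 1.75 bits/symbol
```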
Properties of Entropy

▪ The entropy of a discrete memoryless source is bounded as follows:

0 ≤ H(S) ≤ log2(m)

Where m is the number of symbols of the source.

1) H(S) = 0 if some probability Pk = 1. This lower bound on entropy corresponds to no uncertainty.

2) H(S) = log2(m) if Pk = 1/m for all symbols, i.e. the symbols are equally probable. This upper bound on entropy corresponds to maximum uncertainty.
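A minimal numeric illustration of these bounds, assuming Python (nothing here is from the slides): random distributions over m = 4 symbols always land between 0 and log2(4) = 2 bits.

```python
import math, random

random.seed(0)                                         # repeatable example
m = 4
for _ in range(5):
    weights = [random.random() for _ in range(m)]
    probs = [w / sum(weights) for w in weights]        # a valid probability distribution
    h = sum(p * math.log2(1 / p) for p in probs)       # entropy in bits/symbol
    assert 0.0 <= h <= math.log2(m) + 1e-12            # 0 <= H(S) <= log2(m)
    print(round(h, 3))
```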
Properties of Entropy… LOWER BOUND

▪ 1] Entropy is zero, i.e. H(S) = 0, if the event is sure.

If an event is sure, then each probability is either P = 0 or P = 1.
Let us have two messages with P1 = 0 and P2 = 1.
If P = 0:
Entropy H(S) = Σ_{i=1}^{q} Pi · log2(1/Pi) ➔ the term P · log2(1/P) → 0 as P → 0, so H = 0 --------- (1)
If P = 1:
Entropy H(S) = Σ_{i=1}^{q} Pi · log2(1/Pi)
= 1 · log2(1/1) = 1 · 0 ➔ H = 0 --------- (2)
If the event is sure there is no information, i.e. the information is already known, thus the entropy is zero.
▪ The maximum value of H for two messages is obtained when P1 = 1/2 and P2 = 1/2:
Entropy H(S) = Σ_{i=1}^{q} Pi · log2(1/Pi)
= 1/2 log2(2) + 1/2 log2(2)
= log2(2) = 1 ➔ Hmax
[Plot: the binary entropy H versus P rises from 0 at P = 0 to its maximum of 1 bit at P = 1/2 and falls back to 0 at P = 1.]
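In place of the plot, a small sketch (assuming Python; `H2` is an illustrative name for the binary entropy) tabulates H for a few values of P and shows the shape described above:

```python
import math

def H2(p):
    """Binary entropy in bits; the P = 0 and P = 1 terms contribute 0."""
    return sum(x * math.log2(1 / x) for x in (p, 1 - p) if x > 0)

for p in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(p, round(H2(p), 3))   # rises from 0, peaks at 1 bit when P = 0.5, falls back to 0
```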
Basic Properties of Entropy… UPPER BOUND

▪ 2] When Pk = 1/M for all M symbols, the symbols are equally probable. So, H = log2(M).

Given the probability of every symbol is Pk = 1/M:
Entropy H(S) = Σ_{i=1}^{M} Pi · log2(1/Pi)
= Σ_{i=1}^{M} (1/M) · log2(1/(1/M))
= 1 · log2(M)
= log2(M) = Hmax bits/symbol
[Figure: upper bound of entropy, Hmax = log2(M), reached for equiprobable symbols.]
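A quick numeric check of the upper bound (a sketch, assuming Python): for M equiprobable symbols the entropy equals log2(M).

```python
import math

for M in (2, 4, 8, 26):
    probs = [1 / M] * M                                # equiprobable source
    h = sum(p * math.log2(1 / p) for p in probs)       # entropy in bits/symbol
    print(M, round(h, 4), round(math.log2(M), 4))      # the two values agree for every M
```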
ENTROPY…
Source Efficiency, Redundancy, and Information Rate

▪ Source Efficiency (η)

η = H / Hmax

Where,
H = calculated entropy of the source
Hmax = log2(m) ➔ maximum entropy

▪ Redundancy (Re)

Re = 1 – η

▪ Information Rate (R)

R = r · H ➔ bits/second

Where,
H = entropy → bits/message
r = rate at which messages are generated → messages/second
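These definitions map directly onto code; a hedged sketch, assuming Python, with illustrative function names (`efficiency`, `redundancy`, `information_rate` are not from the slides):

```python
import math

def efficiency(h, m):
    """Source efficiency eta = H / Hmax, where Hmax = log2(m)."""
    return h / math.log2(m)

def redundancy(h, m):
    """Redundancy Re = 1 - eta."""
    return 1.0 - efficiency(h, m)

def information_rate(h, r):
    """Information rate R = r * H in bits/second (r in messages/second, H in bits/message)."""
    return r * h

# Example: H = 1.1568 bits/message over m = 3 symbols, r = 1000 messages/second
print(round(efficiency(1.1568, 3), 2),
      round(redundancy(1.1568, 3), 2),
      round(information_rate(1.1568, 1000), 1))        # 0.73 0.27 1156.8
```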
Assignment 2.1
For a discrete memoryless source there are three symbols with P1 = α and P2 = P3. Find
the entropy of the source.
Solution:
Given, P1 = α and P2 = P3
As we know, the total probability = 1
i.e. P1 + P2 + P3 = 1
α + P2 + P3 = 1
α + 2·P2 = 1
P2 = (1 – α)/2
Thus, P2 = P3 = (1 – α)/2
Since, Entropy H = Σ_{i=1}^{M} Pi · log2(1/Pi)
= P1 · log2(1/P1) + P2 · log2(1/P2) + P3 · log2(1/P3)
= α · log2(1/α) + 2 x ((1 – α)/2 · log2(2/(1 – α)))
= α · log2(1/α) + (1 – α) · log2(2/(1 – α))
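A numeric spot check of this closed form at one arbitrary value of α (α = 0.4 is only an example), comparing it with the direct entropy sum (a sketch, assuming Python):

```python
import math

alpha = 0.4                                            # any value in (0, 1)
probs = [alpha, (1 - alpha) / 2, (1 - alpha) / 2]
direct = sum(p * math.log2(1 / p) for p in probs)      # H computed term by term
closed = alpha * math.log2(1 / alpha) + (1 - alpha) * math.log2(2 / (1 - alpha))
print(round(direct, 6), round(closed, 6))              # both print the same value
```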
Assignment 2.2
Show that the entropy of the source with the following probability distribution is [2 – (2+n)/2^n]:

S    S1     S2     S3     ...    Sn
P    1/2    1/4    1/8    ...    1/2^n
Solution:
Using the standard series identity

[2 – (2+n)/2^n] = 1/2 + 2/4 + 3/8 + . . . + n/2^n

and the given distribution P = {1/2, 1/4, 1/8, ..., 1/2^n}.
Since,
Entropy H = Σ_{i=1}^{n} Pi · log2(1/Pi)
= P1 · log2(1/P1) + P2 · log2(1/P2) + P3 · log2(1/P3) + . . . + Pn · log2(1/Pn)
= (1/2) · log2(2) + (1/4) · log2(4) + (1/8) · log2(8) + . . . + (1/2^n) · log2(2^n)
= 1/2 + 2/4 + 3/8 + . . . + n/2^n
= [2 – (2+n)/2^n] bits/symbol
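The identity can be spot-checked for a few values of n, taking the probabilities exactly as listed on the slide (a sketch, assuming Python):

```python
import math

for n in (1, 2, 5, 10):
    probs = [1 / 2**i for i in range(1, n + 1)]        # 1/2, 1/4, ..., 1/2^n
    h = sum(p * math.log2(1 / p) for p in probs)       # direct entropy sum
    formula = 2 - (2 + n) / 2**n                       # claimed closed form
    print(n, round(h, 6), round(formula, 6))           # the two columns match
```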
Assignment 2.3
A source emits three messages with probabilities P1 = 0.7, P2 = 0.2, P3 = 0.1. Calculate:
1) Source Entropy
2) Maximum Entropy
3) Source Efficiency
4) Redundancy
Solution:
1) H = Σ_{i=1}^{M} Pi · log2(1/Pi)
= (0.7) · log2(1/0.7) + (0.2) · log2(1/0.2) + (0.1) · log2(1/0.1)
= 1.1568 bits/message
2) Hmax = log2(m)
= log2(3) = log(3)/log(2) = 1.585 bits/message
3) η = H / Hmax
= 1.1568 / 1.585 = 0.73 (dimensionless, i.e. 73% efficient)
4) Re = 1 – η
= 1 – 0.73 = 0.27 (i.e. 27% redundancy)
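All four quantities can be reproduced in a few lines (a sketch, assuming Python):

```python
import math

probs = [0.7, 0.2, 0.1]
h = sum(p * math.log2(1 / p) for p in probs)   # source entropy, about 1.1568 bits/message
h_max = math.log2(len(probs))                  # maximum entropy, about 1.585 bits/message
eta = h / h_max                                # source efficiency, about 0.73
print(round(h, 4), round(h_max, 3), round(eta, 2), round(1 - eta, 2))
```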
Assignment 2.4
A discrete source emits one of six symbols once every millisecond (ms). The
symbol probabilities are ½, ¼, 1/8, 1/16, 1/32 and 1/32. Find the source entropy
and the information rate.
Solution:
▪ Since Entropy H = Σ_{i=1}^{M} Pi · log2(1/Pi)
= (1/2) · log2(2) + (1/4) · log2(4) + (1/8) · log2(8) + (1/16) · log2(16) + (1/32) · log2(32) + (1/32) · log2(32)
= 1/2 + 2/4 + 3/8 + 4/16 + 2 x 5/32
= 1.9375 bits/message
▪ Since Information Rate R = r · H
Given the rate at which messages are generated is r = 1/T, where T = time period
r = 1/10^-3 = 10^3 messages/second
R = r · H = 10^3 x 1.9375 = 1937.5 bits/second
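A final sketch (assuming Python) reproducing the entropy and the information rate with the 1 ms symbol period given above:

```python
import math

probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
h = sum(p * math.log2(1 / p) for p in probs)   # 1.9375 bits/message
r = 1 / 1e-3                                   # one message every millisecond -> 1000 messages/second
R = r * h                                      # information rate in bits/second
print(h, R)                                    # 1.9375 1937.5
```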
Types of Entropy

The different types of entropy are:

▪ Marginal Entropy

▪ Joint Entropy

▪ Conditional Entropy
