ITC Important Questions

The document covers information theory concepts such as entropy, joint entropy, conditional entropy, mutual information, and Shannon's source coding and channel coding theorems. Its questions ask the reader to calculate entropies for various probability distributions, apply Huffman and Shannon-Fano coding to data sources, describe the binary symmetric and binary erasure channels, explain linear block codes via generator and parity check matrices, and describe Hamming codes, cyclic codes, and convolutional codes using syndrome decoding, the Viterbi algorithm, trellises, and state diagrams. It also asks the reader to determine generator and parity check matrices, encode and decode codewords, and list the advantages and disadvantages of different coding schemes.

Uploaded by

Asa

1. What do you mean by entropy?

2. What do you mean by differential entropy and joint entropy?
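Questions 1 and 2 are definitional, but discrete entropy is easy to check numerically. A minimal sketch (the function name is my own):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # a fair coin → 1.0 bit
print(entropy([0.25] * 4))    # four equiprobable symbols → 2.0 bits
```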


3. Explain Shannon’s channel coding theorem.
4. Define joint entropy and conditional entropy and also derive their equations.
5. A discrete source transmits messages x1, x2, and x3 with probabilities 0.3, 0.4, and 0.3.
The source is connected to the channel given in the figure. Calculate all the entropies.
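The channel diagram for question 5 is not reproduced here, so this sketch computes only the source entropy H(X) for the stated probabilities; the joint and conditional entropies would need the channel's transition probabilities.

```python
import math

# Source probabilities from question 5; the channel itself is not given.
p = [0.3, 0.4, 0.3]
H_X = -sum(pi * math.log2(pi) for pi in p)
print(round(H_X, 4))   # ≈ 1.571 bits/symbol
```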

6. Explain mutual information and derive its expression.


7. Find the mutual information for the channel shown.
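The channel figure for question 7 is likewise not reproduced, so as an illustration the sketch below evaluates I(X;Y) = H(Y) - H(Y|X) for a binary symmetric channel; the function names and the crossover probability are my own choices:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(p, q=0.5):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC with crossover p and P(X=0) = q."""
    p_y0 = q * (1 - p) + (1 - q) * p   # P(Y = 0)
    return h2(p_y0) - h2(p)

print(bsc_mutual_info(0.1))   # ≈ 0.531 bits: a noisy channel conveys < 1 bit
```

With equiprobable inputs this reduces to the familiar I = 1 - H(p): a noiseless BSC (p = 0) conveys 1 bit per use, and a useless one (p = 0.5) conveys none.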

8. A discrete memoryless source produces data with the following probabilities. Apply the
Shannon-Fano coding procedure and calculate the efficiency. Take M = 2.

P(X) = {0.4, 0.2, 0.12, 0.08, 0.08, 0.08, 0.04}
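A sketch of one common Shannon-Fano procedure for question 8. Where the two halves balance equally, this implementation splits at the earlier index; that tie-break is an implementation choice, so other texts may give different codewords (the efficiency here works out to about 97.6 %):

```python
import math

def shannon_fano(probs):
    """Shannon-Fano codes; symbols are processed in descending probability.
    Ties at the split point break toward the earlier index (a choice)."""
    items = sorted(enumerate(probs), key=lambda t: -t[1])
    codes = [""] * len(probs)

    def split(group):
        if len(group) == 1:
            return
        total = sum(p for _, p in group)
        run, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            run += group[i - 1][1]
            diff = abs(2 * run - total)        # |left half - right half|
            if diff < best_diff:
                best_diff, best_i = diff, i
        for j, (idx, _) in enumerate(group):
            codes[idx] += "0" if j < best_i else "1"
        split(group[:best_i])
        split(group[best_i:])

    split(items)
    return codes

p = [0.4, 0.2, 0.12, 0.08, 0.08, 0.08, 0.04]
codes = shannon_fano(p)
L = sum(pi * len(c) for pi, c in zip(p, codes))
H = -sum(pi * math.log2(pi) for pi in p)
print(codes)
print(L, H / L)   # average length 2.48 bits, efficiency ≈ 97.6 %
```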


9. Explain Shannon’s source coding theorem.
10. State and prove the Shannon-Hartley theorem, and explain the noise-bandwidth trade-off.
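The Shannon-Hartley capacity C = B log2(1 + S/N) is a one-liner to evaluate; the channel numbers below are illustrative, not from the question:

```python
import math

def capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustration: a 3.1 kHz telephone channel at 30 dB SNR (S/N = 1000).
print(round(capacity(3100, 1000)))   # ≈ 30898 bits/s
```

The trade-off in the question is visible directly in the formula: the same capacity can be reached with more bandwidth and a lower SNR, or less bandwidth and a higher SNR.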

11. Apply the Huffman coding procedure for the following message ensemble. Take M = 2.

P(X) = {0.4, 0.2, 0.12, 0.08, 0.08, 0.08, 0.04}
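A heap-based Huffman sketch for question 11. The exact codewords depend on how ties between equal probabilities are broken, but the average length is the Huffman optimum either way, and for this ensemble it matches the Shannon-Fano result of 2.48 bits:

```python
import heapq, itertools, math

def huffman(probs):
    """Binary Huffman coding; returns one codeword per symbol index."""
    counter = itertools.count()                  # tie-breaker for the heap
    heap = [(p, next(counter), {i: ""}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)          # two least probable nodes
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return [heap[0][2][i] for i in range(len(probs))]

p = [0.4, 0.2, 0.12, 0.08, 0.08, 0.08, 0.04]
codes = huffman(p)
L = sum(pi * len(c) for pi, c in zip(p, codes))
H = -sum(pi * math.log2(pi) for pi in p)
print(codes)
print(L, H / L)   # average length 2.48 bits, efficiency ≈ 97.6 %
```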


12. What do you mean by the binary symmetric channel and the binary erasure channel?
13. Define the following terms:
Code word, block length, code rate, channel data rate, code vector.
14. What are perfect and systematic block codes?
15. Give the matrix description of linear block codes.
16. For a (7,4) block code generated by [G] below, explain how the error syndrome helps in
correcting a single-bit error.

17. Describe the syndrome decoding method for correcting errors, with a suitable diagram and
description.
18. What are Hamming codes?
19. For the (7,4) Hamming code, the parity check matrix H is given below.

a) Construct the generator matrix.
b) Find the code word that begins with 1010.
c) If the received code word Y is 0111100, decode it.
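The H matrix for question 19 is not reproduced here, so the sketch below assumes one common systematic form of the (7,4) Hamming code, with G = [I4 | P] and H = [Pᵀ | I3]; the particular P is my own assumption, and a different H would give different codewords. It illustrates the syndrome-decoding idea from questions 16-17: a non-zero syndrome equals the column of H at the error position.

```python
# One common systematic (7,4) Hamming code; this P is an assumption, since
# the matrices in the question are not reproduced here.
P = [[1, 1, 0],
     [0, 1, 1],
     [1, 1, 1],
     [1, 0, 1]]
# G = [I4 | P], H = [P^T | I3]
G = [[int(i == j) for j in range(4)] + P[i] for i in range(4)]
H = [[P[j][i] for j in range(4)] + [int(i == j) for j in range(3)] for i in range(3)]

def encode(msg):
    """Codeword c = m G (mod 2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def decode(recv):
    """Correct a single-bit error via the syndrome s = H r^T (mod 2)."""
    r = list(recv)
    s = [sum(h * x for h, x in zip(row, r)) % 2 for row in H]
    if any(s):
        for i in range(7):                 # find the column of H equal to s
            if [H[k][i] for k in range(3)] == s:
                r[i] ^= 1                  # flip the erroneous bit
                break
    return r[:4]                           # systematic: first 4 bits = message

cw = encode([1, 0, 1, 0])
cw[2] ^= 1                                 # inject a single-bit error
print(decode(cw))                          # → [1, 0, 1, 0]
```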

20. What do you mean by cyclic codes? Write down the basic properties of cyclic codes.
21. Draw the block diagram of the encoder for systematic (n,k) cyclic codes and explain it briefly.
22. Let the polynomial G(x) = x^8 + x^5 + x^4 + x + 1 be the generator polynomial of a cyclic
code over GF(2) with block length 15.
a) Find the generator matrix G
b) Find the parity check matrix H
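Taking the stated g(x) and n = 15 at face value, k = n - deg g = 7, and the rows of the non-systematic generator matrix are simply the shifted coefficient vectors of x^i · g(x). A minimal sketch:

```python
# g(x) = x^8 + x^5 + x^4 + x + 1 over GF(2), block length n = 15.
g = [1, 1, 0, 0, 1, 1, 0, 0, 1]   # coefficients of g(x), constant term first
n = 15
k = n - (len(g) - 1)              # k = n - deg g = 7

# Non-systematic generator matrix: row i holds the coefficients of x^i * g(x).
G = [[0] * i + g + [0] * (n - len(g) - i) for i in range(k)]
for row in G:
    print(row)
```

For part (b), H can be built the same way from the reversed coefficients of the parity-check polynomial h(x) = (x^n + 1)/g(x), obtained by polynomial division over GF(2).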
23. Explain the decoding process for cyclic codes.
24. Describe the Viterbi algorithm for decoding convolutional codes.
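The specific convolutional code in questions 24-27 is not given, so the sketch below uses the classic rate-1/2, constraint-length-3 code with octal generators (7, 5) as an illustration. It shows the core of hard-decision Viterbi decoding: one surviving (metric, path) pair per trellis state, extended by both possible input bits at each step.

```python
# Rate-1/2, K=3 convolutional code with octal generators (7, 5) -- an
# illustrative choice, since the question does not specify the code.
G1, G2 = 0b111, 0b101

def parity(x):
    return bin(x).count("1") % 2

def encode(bits):
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state                  # [current bit | 2-bit state]
        out += [parity(reg & G1), parity(reg & G2)]
        state = reg >> 1
    return out

def viterbi(received):
    """Hard-decision Viterbi: keep the best (metric, path) per state."""
    survivors = {0: (0, [])}                    # encoder starts in state 0
    for i in range(0, len(received), 2):
        r, nxt = received[i:i + 2], {}
        for state, (metric, path) in survivors.items():
            for b in (0, 1):                    # branch on both input bits
                reg = (b << 2) | state
                expect = [parity(reg & G1), parity(reg & G2)]
                m = metric + sum(x != y for x, y in zip(expect, r))
                ns = reg >> 1
                if ns not in nxt or m < nxt[ns][0]:
                    nxt[ns] = (m, path + [b])   # survivor for next state
        survivors = nxt
    return min(survivors.values())[1]           # best final metric wins

r = encode([1, 0, 1, 1])
r[2] ^= 1                                       # one channel error
print(viterbi(r))                               # → [1, 0, 1, 1]
```

The same state/branch structure is what the code tree, trellis, and state diagram of question 26 depict graphically.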
25. State the advantages and disadvantages of convolutional codes.
26. Explain the code tree, trellis, and state diagram for a convolutional encoder with suitable
diagrams.
27. Give the matrix description and generating function for convolutional codes.
