Information Theory
The uncertainty function as a measure of information. Properties of the uncertainty function. Source coding: Huffman codes. Optimum codes. Shannon's first theorem. Channel capacity. Shannon's second theorem. Error-correcting codes. Lower and upper bounds. The sphere-packing bound. Hamming codes. Linear codes. Cyclic codes, BCH codes. Error-correcting capability of BCH codes. Shift registers for encoding and decoding cyclic codes. Continuous channel with Gaussian noise. Average error probability.
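For orientation, the uncertainty function listed above is standardly taken to be the entropy of a discrete source: for symbol probabilities p_1, ..., p_n it is H = -sum_i p_i log_2 p_i, measured in bits per symbol (the base-2 logarithm is an assumption of the usual convention, not fixed by this outline).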