**Course title:** Information Theory

**Code:** 3ФЕИТ10З040

**Number of credits (ECTS):** 6

**Weekly number of classes:** 3+1+1+0

**Prerequisite for enrollment:** Completed course: Mathematics 2

**Course Goals (acquired competencies):** Knowledge of the properties of random signals, their autocorrelation functions and spectra, and their transmission through telecommunication systems. Establishing a statistical model of the basic components for information transmission and processing in a telecommunication system.

**Total available number of classes:** 180

**Course Syllabus:** Introduction. Probability. Random variables. Description of random variables. Functional transformations of random variables. Statistical ensembles of random signals. Statistical mean values and their physical interpretation. Basic types of distributions of random variables. Distribution of the sum and product of random variables. Correlation functions and spectra of random signals: definitions and properties. The Wiener–Khinchin theorem. Procedures for the experimental determination of correlations and spectra of random signals. Correlations and spectra of selected random signals. White Gaussian noise. Transmission of random signals through a linear transmission system. General statistical model of a communication system. Definition of information. Information sources. Entropy. Types of information sources and their entropy. Information rate. Source coding: principles and basic characteristics. The source coding theorem (Shannon's first theorem). Procedures for optimal coding (Fano, Huffman). Efficiency of entropy coding. Statistical model of the transmission channel. Mutual information. Channel capacity. Properties of symmetric channels. Reliability of channel transmission. Probability of error. Statistical decision theory. Optimum decision rule. Decision criteria (Bayes, minimax, Neyman–Pearson). Channel coding: principles and basic characteristics. The channel coding theorem (Shannon's second theorem). Basic examples.
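As an illustrative sketch of two of the listed topics (the distribution, function names, and channel parameter below are illustrative choices, not part of the syllabus): computing the entropy of a discrete source, the average codeword length of a Huffman code, the source coding bound of Shannon's first theorem, and the capacity of a binary symmetric channel.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy of a discrete distribution, in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths produced by Huffman's algorithm (binary code)."""
    # Heap entries: (probability, unique tiebreak id, list of symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside
        # them moves one level deeper, so its codeword grows by one bit.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

def bsc_capacity(p):
    """Capacity (bits/use) of a binary symmetric channel, crossover p."""
    return 1 - entropy([p, 1 - p])

# Example source with five symbols.
probs = [0.4, 0.2, 0.2, 0.1, 0.1]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H = {H:.3f} bits/symbol, average Huffman length L = {L:.3f} bits")
# Source coding bound (Shannon's first theorem): H <= L < H + 1.
assert H <= L < H + 1
print(f"BSC capacity at p = 0.1: {bsc_capacity(0.1):.3f} bits/use")
```

For this source, H ≈ 2.122 bits/symbol and the Huffman code achieves L = 2.2 bits/symbol, in agreement with the bound H ≤ L < H + 1.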