Information, transmission, and error-correcting codes

Degree course: 
First cycle degree in COMPUTER SCIENCE
Academic year when starting the degree: 
2016/2017
Year: 
2
Academic year in which the course will be held: 
2017/2018
Course type: 
Compulsory subjects, characteristic of the class
Credits: 
6
Period: 
Second semester
Standard lectures hours: 
52
Detail of lecture hours: 
Lectures (40 hours), exercises (12 hours)
Requirements: 

Basics of algebra, linear algebra, and calculus. Ability to read/understand English textbooks.

Final Examination: 
Oral

The exam consists of a written and an oral part. In the written exam, students must
show their understanding of the topics covered in the course by applying theoretical
results to simple, idealized practical scenarios. Passing the written exam grants
access to the oral examination, whose first question is always an open question on a
topic of the student's choice. In the oral exam, students are required to show a
solid understanding of the discrete probabilistic modeling of information and coding,
and to motivate its conceptual underpinnings. The overall examination is passed with
a score of 18 (out of 30) or higher.

Assessment: 
Final grade

This course introduces the basics of Shannon's information theory (entropy, mutual information, information compression, lossless transmission), along with its connections to the theory of error-correcting codes. The necessary tools from basic probability are also introduced. In summary, the course objectives and expected outcomes are the following:
• Become familiar with the basics of discrete probability and discrete probabilistic models;
• Understand the connections between information and compression;
• Understand the connections between information and transmission;
• Understand the basics of error-correcting codes.

Introduction to probability (20h lectures + 6h exercises):
Introduction to discrete probability and combinatorics: sample spaces, events, probability functions, conditional probability, independence; discrete random variables and densities; Bernoulli, binomial, and geometric random variables; expectation and variance of a random variable; the Chebyshev inequality; expectation and variance of functions of random variables; the (weak) law of large numbers.
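
To make the probabilistic toolkit concrete, here is a minimal Python sketch (illustrative only; the parameters p, n, eps, and the number of repetitions are arbitrary choices, not course material) that simulates Bernoulli trials and compares the empirical frequency of a large deviation of the sample mean with the bound given by the Chebyshev inequality, in the spirit of the weak law of large numbers:

    import random

    # Arbitrary illustration parameters.
    p = 0.3      # success probability of each Bernoulli trial
    n = 2000     # number of trials per repetition
    eps = 0.02   # deviation threshold
    runs = 500   # number of independent repetitions

    # Chebyshev: P(|mean - p| >= eps) <= Var(mean) / eps^2 = p(1-p) / (n * eps^2).
    chebyshev_bound = p * (1 - p) / (n * eps ** 2)

    # Empirical frequency of the same deviation event.
    deviations = 0
    for _ in range(runs):
        sample_mean = sum(random.random() < p for _ in range(n)) / n
        if abs(sample_mean - p) >= eps:
            deviations += 1

    print(f"Chebyshev bound:     {chebyshev_bound:.3f}")
    print(f"Empirical frequency: {deviations / runs:.3f}")

The empirical frequency stays below the Chebyshev bound (typically well below it, since the bound is loose), and both vanish as n grows, which is exactly the content of the weak law.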
Introduction to information theory and error-correcting codes (20h lectures + 6h exercises):
Uncertainty and associated information; additivity of information; discrete memoryless information sources; entropy and its properties; source coding; uniquely decodable codes; prefix codes; the Kraft and McMillan inequalities; entropy and average code length; noiseless coding; compact codes; mutual information and information channels; conditional and joint entropy; channel capacity; decoding and error probability; minimum Hamming distance; the reliability vs. redundancy trade-off; channel capacity as an upper bound on reliable transmission. Introduction to linear error-correcting codes: encoding and decoding; systematic codes; shortened and extended codes; Hamming codes; Reed-Muller codes.
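
The source-coding part of the syllabus lends itself to a short worked example. The following Python sketch (the four-symbol source and the prefix code are invented for illustration) computes the entropy of a discrete memoryless source, verifies the Kraft inequality for a binary prefix code, and compares the average code length with the entropy lower bound from the noiseless coding theorem:

    import math

    # A discrete memoryless source over four symbols
    # (the probabilities are invented for illustration).
    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

    # A binary prefix code for the source (also illustrative).
    code = {"a": "0", "b": "10", "c": "110", "d": "111"}

    # Entropy H(X) = -sum p log2 p, a lower bound on the average length
    # of any uniquely decodable binary code (noiseless coding theorem).
    entropy = -sum(p * math.log2(p) for p in probs.values())

    # Kraft inequality: sum 2^(-l_i) <= 1 must hold for any prefix code.
    kraft_sum = sum(2 ** -len(w) for w in code.values())

    # Average code length under the source distribution.
    avg_length = sum(probs[s] * len(code[s]) for s in code)

    print(f"Entropy:        {entropy:.3f} bits/symbol")
    print(f"Kraft sum:      {kraft_sum:.3f} (<= 1: a prefix code with these lengths exists)")
    print(f"Average length: {avg_length:.3f} bits/symbol")

Since the chosen probabilities are dyadic, the average length matches the entropy exactly; for general sources the average length of an optimal code exceeds H(X), but can be brought within one bit of it.

For the error-correcting part, a compact sketch of the classic Hamming (7,4) code (a standard textbook construction, written from scratch here for illustration) shows encoding, single-error correction by syndrome decoding, and recovery of the data bits:

    def hamming74_encode(d):
        """Encode 4 data bits as a 7-bit Hamming codeword.

        Parity bits sit at (1-indexed) positions 1, 2, 4;
        data bits at positions 3, 5, 6, 7.
        """
        p1 = d[0] ^ d[1] ^ d[3]   # covers positions 1, 3, 5, 7
        p2 = d[0] ^ d[2] ^ d[3]   # covers positions 2, 3, 6, 7
        p3 = d[1] ^ d[2] ^ d[3]   # covers positions 4, 5, 6, 7
        return [p1, p2, d[0], p3, d[1], d[2], d[3]]

    def hamming74_decode(c):
        """Correct at most one bit error and return the 4 data bits."""
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        syndrome = s1 + 2 * s2 + 4 * s3   # 0 means "no error detected"
        if syndrome:
            c = c[:]                      # work on a copy
            c[syndrome - 1] ^= 1          # the syndrome is the error position
        return [c[2], c[4], c[5], c[6]]

    # Round trip with one injected bit error.
    data = [1, 0, 1, 1]
    codeword = hamming74_encode(data)
    codeword[5] ^= 1                      # flip one bit of the channel output
    assert hamming74_decode(codeword) == data

The minimum Hamming distance of this code is 3, which is precisely what makes single-error correction possible.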

- Paolo Baldi, Calcolo delle probabilità, McGraw-Hill (Chapters 1 and 2);
- Norman Abramson, Information Theory and Coding, McGraw-Hill (Chapters 1-6);
- E. Angeleri, Informazione: significato e universalità, UTET, 2000 (Section 6.3);
- Lecture slides, available on the e-learning website.

Lectures (52h), including 12h of exercises.
