Document Type

Syllabus

Description

Information theory deals with encoding data so that it can be transmitted correctly and efficiently. Statistics and machine learning deal with estimating models of data and predicting future observations. Is there any relationship between the two? It turns out, perhaps not surprisingly, that the most compact encoding of the data is given by the probabilistic model that describes it best. In other words, there is a fundamental link between information and probability.
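
To make this link precise in one line: if data X drawn from a distribution p is encoded with the idealized Shannon code lengths -log q(x) of a candidate model q, then the expected code length decomposes as

\[
\mathbb{E}_{X \sim p}\!\left[-\log q(X)\right] \;=\; H(p) \;+\; D_{\mathrm{KL}}(p \,\|\, q) \;\geq\; H(p),
\]

where H is the entropy and D_KL the Kullback-Leibler divergence; equality holds exactly when q = p, so the best-fitting model yields the shortest code.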

This course starts with the basic notions of information theory and explores its relationship to machine learning and statistics. The course will have a strong theoretical component, but will also focus on applications and computing. The topics to be covered are:

• Entropy, mutual information, Kullback-Leibler and Bregman divergences;
• Source-channel models; boosting and optimal betting strategy;
• Maximum likelihood and Bayesian inference;
• Channel capacity, rate-distortion and the information bottleneck method;
• Maximum entropy principle, information geometry and alternating algorithms;
• Large deviations, coding theory and approximate inference in graphical models.

Publication Date

Fall 2007

College

College of Engineering and Computer Science

Department

Computer Science

Course Number

CS 790

