Eberly College of Science Mathematics Department

Meeting Details

For more information about this meeting, contact Alexei Novikov, Anna Mazzucato, Victor Nistor, or Manfred Denker.

Title: Useful information and small probabilities
Seminar: Probability and Financial Mathematics Seminar
Speaker: Yuri Suhov, Penn State University
Abstract: One of the most famous (and practically useful) results of Shannon's information theory is the Noiseless Coding Theorem, which provides the basis for data compression. In short, the theorem says that discarding data with low information content enables us to reduce the amount of memory used by a factor involving the information/entropy rate of the source (but not more). However, in modern practice we are often inundated with information that is of little use to us (if any). Consequently, we may be interested in storing only those data which carry a certain weight/utility, which is typically context-dependent. This leads to the idea of {\it selected} data compression and the problem of a further reduction in the memory used. Here the concept of {\it weighted} information/entropy emerges, in conjunction with large-deviation probabilities, and allows us to assess the amount of memory needed to store the data that are relevant (i.e., have a high utility rate). This is joint work with I. Stuhl (University of Denver). I will not assume prior knowledge of probability or information theory and plan to introduce the relevant concepts and facts in the course of the presentation.
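For orientation, a minimal sketch of the quantities involved, assuming the standard definitions rather than the speaker's exact notation: for a source with symbol distribution $p$, Shannon's entropy is $H(p) = -\sum_x p(x)\log p(x)$, and the Noiseless Coding Theorem states that a block of $n$ source symbols can be compressed to roughly $nH(p)$ bits (with base-2 logarithms), but not fewer, while keeping the error probability vanishingly small. The {\it weighted} entropy referred to in the abstract is usually taken to be $H^{\mathrm{w}}_{\varphi}(p) = -\sum_x \varphi(x)\,p(x)\log p(x)$, where $\varphi \ge 0$ is a context-dependent weight (utility) function on the symbols; setting $\varphi \equiv 1$ recovers the ordinary entropy.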

Room Reservation Information

Room Number: MB106
Date: 02/05/2016
Time: 2:30pm - 3:30pm