Information, simplicity and relevance
Jean-Louis Dessalles (Télécom ParisTech)

Claude Shannon based the notion of information on the idea of surprise, measured in bits as the logarithm of inverse probability. His definition enabled the revolution in digital telecommunications. Extending the notion of information to fields such as biology or human communication, however, has proven problematic: probability is not always calculable, or even definable. Replacing probability with Kolmogorov complexity has proven useful for addressing structured domains, but it leads to the conclusion that random objects are maximally informative, whereas for a biologist, random DNA contains no information. I propose to remain faithful to Shannon's basic hypothesis and to define relevant information in terms of surprise. Surprise is defined as a drop in Kolmogorov complexity (computed with bounded resources). This definition proves useful for extending the notion of information to non-human observers (e.g. in biology). It is also essential for defining the notion of relevance and for making predictions about human communication.

Jean-Louis Dessalles is a Lecturer at Télécom ParisTech, Université Paris-Saclay. For a decade, he has been developing the Theory of Simplicity, which serves as a basis for modeling narrative interest and argumentative relevance. He is the author or co-author of several books, including "La pertinence et ses origines cognitives" (Hermes-science, 2008) and "Le fil de la vie" (Odile Jacob, 2016).
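The two notions of surprise mentioned in the abstract can be sketched in Python. This is only an illustrative approximation, not the speaker's own formalism: Shannon surprisal is the standard -log2(p), and since true Kolmogorov complexity is uncomputable, the compressed length produced by zlib is used here as a crude, commonly used stand-in. The function names are mine.

```python
import math
import os
import zlib

def surprisal_bits(p: float) -> float:
    """Shannon's surprise: an event of probability p carries -log2(p) bits."""
    return -math.log2(p)

def complexity_proxy(data: bytes) -> int:
    """Crude stand-in for Kolmogorov complexity: length in bits of the
    zlib-compressed data. (True Kolmogorov complexity is uncomputable.)"""
    return 8 * len(zlib.compress(data))

# A fair coin toss carries exactly one bit of surprise.
print(surprisal_bits(0.5))  # 1.0

# A highly regular string compresses far better than typical random bytes,
# so its complexity proxy is much lower. This is why equating information
# with raw Kolmogorov complexity makes random data look maximally informative.
regular = b"AB" * 500
random_like = os.urandom(1000)
print(complexity_proxy(regular) < complexity_proxy(random_like))  # True
```

The last comparison illustrates the tension the talk addresses: under a pure complexity measure, the incompressible random bytes score highest, even though a biologist would say a random sequence carries no information.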