2016 marks the centenary of the birth of Claude Shannon, one of the leading figures of the digital revolution we are living through. John von Neumann, Alan Turing, and other visionaries gave us computers that could process information, but it was Claude Shannon who introduced the modern concept of information.

Information theory grew out of a two-part paper that Shannon published in 1948, while he was a researcher at Bell Laboratories. In this paper, Shannon showed how the previously vague notion of information could be precisely defined and quantified. He demonstrated the essential unity of all information media, pointing out that texts, telephone signals, radio waves, photographs, films, and so on can all be encoded in the universal language of binary digits, or bits, a term his paper was the first to use. Shannon put forward the idea that once information has been made digital, it can be transmitted (or stored) with an error rate made arbitrarily small, even over an imperfect channel subject to noise and errors. This conceptual leap led directly to robust and fast means of communication and storage.

Lecture in the series "A text, a mathematician" of the Société Mathématique de France (French Mathematical Society), given on April 13, 2016 at the National Library of France.
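To give a feel for the claim that redundancy can drive the error rate of digital transmission down, here is a minimal sketch in Python (an illustrative choice, not anything presented in the lecture) of a repetition code sent over a simulated binary symmetric channel. Repetition coding is far cruder than the codes whose existence Shannon's noisy-channel theorem guarantees, since it sacrifices transmission rate, but it shows the principle: structured redundancy lets the receiver recover the message despite random bit flips.

```python
import random

def noisy_channel(bits, p=0.1):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def encode_repetition(bits, n=5):
    """Encode by repeating each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(received, n=5):
    """Decode by majority vote over each block of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

if __name__ == "__main__":
    random.seed(0)
    message = [random.randint(0, 1) for _ in range(10_000)]
    for n in (1, 3, 5, 9):
        decoded = decode_repetition(noisy_channel(encode_repetition(message, n)), n)
        errors = sum(m != d for m, d in zip(message, decoded))
        print(f"repetition n={n}: residual bit error rate = {errors / len(message):.4f}")
```

Running this shows the residual error rate shrinking as the repetition factor grows, at the cost of sending many more bits. Shannon's deeper result is that, below the channel capacity, cleverer codes can make the error rate arbitrarily small without the transmission rate collapsing to zero.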