Here's a quick introduction to Claude Shannon's information theory, and why it's fascinating from both a philosophical and an engineering perspective.

• Introduction - 0:00
• Quiz 1 - 8:21
• Magic trick - 12:39
• Explanation by information theory - 20:37
• Fundamental postulate of information theory - 27:21
• Quiz 2 - 32:00
• Shannon's entropy - 37:06
• Quiz 3 - 42:38
• Calculating log2 on a calculator - 51:35
• Quiz 4 - 57:15
• Joint and conditional entropy - 1:06:36
• Quiz 5 - 1:21:58
• The average length of a code is greater than the entropy - 1:37:53
• Quiz 6 - 1:58:54
• Jensen's inequality - 2:10:50
• Kullback-Leibler (KL) divergence - 2:17:18
• Quiz 7 - 2:29:30
• Mutual information - 2:36:17
• Quiz 8 - 2:46:40
• The entropy diagram - 2:56:30
• Quiz 9 - 2:59:34
• Quiz 10 - 3:08:21
• The final result of the live audience - 3:16:28
• Mutual information does not imply correlation - 3:19:00

Link for the quizzes: bit.ly/S4A_live2_Q1
Link for the correction: bit.ly/score_bayes

#COVID19 #cds #cafédessciences

Twitter: / le_science4all
Facebook: / science4allorg
Tipeee: https://www.tipeee.com/science4all
My goodies: https://shop.spreadshirt.fr/science4all
My upcoming dates: https://www.dropbox.com/s/t3abghdmh59...
The formula of knowledge (my 1st book): https://laboutique.edpsciences.fr/pro...
The fabulous construction site (my 2nd book, with El Mahdi El Mhamdi): https://laboutique.edpsciences.fr/pro...
Probably? in audio: http://playlists.podmytube.com/UC0NCb...
Me in podcast with Mr Phi:
YouTube version: / @axiome7403
Audio version: http://feeds.feedburner.com/Axiome
Subtitles on other videos: http://www.youtube.com/timedtext_cs_p...