## Information 9. The Manifold Things Information Measures


Showing Revision 4 created 04/16/2017 by seonaid.

1. I've told you a bit about information.
2. Bits. Labels. Probabilities.
3. I is equal to minus the sum over i of
P sub i, log to the base 2 of P sub i:
I = −Σᵢ Pᵢ log₂ Pᵢ
4. The fundamental formula of
information theory
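That formula can be sketched in a few lines of Python. This is a minimal sketch, not from the lecture; the function name `shannon_information` is mine:

```python
import math

def shannon_information(probs):
    """I = -sum_i p_i * log2(p_i), in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit.
print(shannon_information([0.5, 0.5]))        # 1.0

# A biased coin carries less than 1 bit.
print(shannon_information([0.9, 0.1]))
```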
5. I told you about mutual information
6. which is if I have two variables,
7. such as the input and output to a channel.
8. The mutual information tells you... is
9. equal to the amount of information that is
10. shared in common between input and output.
11. It is the information
12. that passes through
13. or gets through the channel.
14. And in fact, thanks to Claude Shannon,
15. we know it is equal to
16. the practical channel capacity:
17. if I take the input probabilities, or
18. frequencies, that maximize the mutual
19. information, that maximized mutual information is the
20. rate at which information can be sent
21. reliably down this channel. You cannot
22. send information at a higher rate, and
23. you can send information at that rate.
24. This is a tremendously practical
25. application of information theory. Because
26. it tells us that if we have noisy channels
27. or lossy channels, channels where we're
28. using sound, channels where we're using light,
29. channels where we're using electromagnetic
30. radiation, channels where we send information
31. through the mail, any such channel has
32. a capacity, and Shannon's theorem tells us
33. what that capacity is, it tells us that we
34. can't surpass it, and it tells us how to
35. achieve it. And this is at the basis of
36. the application of information theory to
37. practical communications, for instance via
38. fiber-optic cables.
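Shannon's result can be illustrated numerically for the simplest noisy channel, the binary symmetric channel, where each bit is flipped with probability f. This sketch is mine, not from the lecture; it sweeps input distributions and shows the maximum mutual information matches the known capacity 1 − h₂(f):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(p_in, f):
    """I(X;Y) = H(Y) - H(Y|X) for a binary symmetric channel
    with input P(x=1) = p_in and flip probability f."""
    p_y1 = p_in * (1 - f) + (1 - p_in) * f   # P(y = 1)
    return h2(p_y1) - h2(f)                   # H(Y|X) = h2(f) for this channel

# Sweep input distributions: the maximum (the capacity) sits at p_in = 0.5.
f = 0.1
best = max(mutual_information_bsc(p / 100, f) for p in range(101))
print(best, 1 - h2(f))   # both about 0.531 bits per symbol
```

No input distribution does better than the uniform one here, matching Shannon's claim that the maximized mutual information is the reliable transmission rate.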
39. So, there are some fun examples of this.
40. A nice way to look at this picture is that
41. here we have this channel. We have x in...
42. we have P(x_i). Here we have
43. the output, P(y_j | x_i):
y_j out, given x_i in.
44. And then we have the associated mutual
information.
45. So here we have I(x), this is the
information in.

46. Here we have I(y), this is the
information that comes out.
47. The information that goes through
48. the channel like this is the mutual
49. information between the input and
50. the output. We can also look at
51. some things that I'm going to call "loss,"
52. and another thing that I'm going to call
53. "noise." So, what is loss? Loss is
54. information that goes into the channel,
55. but does not come out. Like the roaches
56. going into a roach motel. So, what is that?
57. It's information that we don't know about
58. the input, given that we know the output.
59. So, if we know the output, this is
60. residual stuff that went in, bits that
61. went in, that never came out. Similarly,
62. the noise is information that came out
63. that didn't go in. So noise is stuff where
64. if we know exactly what went in, it's
65. residual bits of information that came
66. out that had no explanation in terms of
67. what went in. So we have a nice picture
68. in terms of the whole set of processes
69. that are going on in information. We have
70. the information going in, we have the
71. information going out. We have the loss,
72. which is information that goes in that
73. doesn't come out. We have noise, which is
74. information that came from nowhere
75. that didn't go in - of course, it actually
76. comes from physical processes. And finally
77. we have the mutual information, which is
78. the information that actually goes through
79. the channel and that represents the
80. channel capacity.
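The picture described above — input information splitting into what gets through plus loss, and output information splitting into what gets through plus noise — can be checked numerically from a joint distribution. The joint distribution here is a made-up example of mine; the identities themselves are standard (loss = H(X|Y), noise = H(Y|X)):

```python
import math
from collections import defaultdict

def H(dist):
    """Entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A hypothetical joint distribution P(x, y) for a noisy, lossy channel.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

px = defaultdict(float)
py = defaultdict(float)
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

Hx, Hy, Hxy = H(px), H(py), H(joint)
mutual = Hx + Hy - Hxy        # I(X;Y): what gets through the channel
loss   = Hxy - Hy             # H(X|Y): bits that went in but never came out
noise  = Hxy - Hx             # H(Y|X): bits that came out but never went in

assert abs(Hx - (mutual + loss)) < 1e-12    # input info  = through + lost
assert abs(Hy - (mutual + noise)) < 1e-12   # output info = through + noise
```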
81. So, I also talked a bit about computation.
82. So, if you have a digital computer. Here is
83. what digital computers looked like when
84. I was a kid... You had, like, a tape thing,
85. you had a bunch of little lights on the
86. front and switches, and then you
87. read the tape, and then it spewed out some
88. output, maybe on some paper tape -
you could even
89. put some input on paper tape - it would
90. have some memory like this. All a digital
91. computer does is
92. break up information
93. into bits, which are the smallest chunks of
94. information, typically called 0 and 1, or
95. true and false, in a digital computer,
96. and then flip those bits
97. in a systematic fashion.
98. So for all their power and
99. all their stupidity, all that these
100. digital computers that we have, including
101. things like our smart phones, as well as
102. our desktops and supercomputers, all
103. they're doing is registering and storing
104. information as bits and then flipping
105. those bits in a systematic fashion.
108. One important fact is that any digital computation
109. can be written in some kind of
circuit diagram.
110. Here's x, here's y, here's z. Here's
111. something where I make a copy of x,
112. I take an OR gate... This is "OR",
you will recall.
113. Here's a copy of X, here's X here.
114. This is X OR Y,
115. also written X ∨ Y.
116. And here I can say,
117. for example, take an AND gate, and
118. I can here send this through a NOT gate
119. And then I can combine them in another
120. AND gate. And in the end, I think that
121. what I have is NOT X AND Z AND
(X OR Y).
122. So, when I have a digital computer,
123. what happens is that it takes bits of
124. information, it performs simple AND, OR,
NOT, and copy operations, and
125. by doing these sequentially, in whatever
126. order you want to do it, you end up
127. evaluating arbitrary logical expressions...
128. NOT X AND Z AND (X OR Y)... whatever
129. that means, I have no idea what it means.
130. But it is what it is, it means what it is.
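The gates described above can be sketched directly in code. This is a minimal sketch of the lecture's example expression, NOT X AND Z AND (X OR Y); the gate and wire names are mine:

```python
# Build the expression from only AND, OR, NOT, and copy operations,
# the same primitives the lecture names.
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def circuit(x, y, z):
    x_copy = x              # the "copy" operation: fan x out to two wires
    w1 = OR(x_copy, y)      # X OR Y
    w2 = AND(NOT(x), z)     # NOT X AND Z
    return AND(w2, w1)      # NOT X AND Z AND (X OR Y)

# Truth table over all 8 input combinations.
for x in (0, 1):
    for y in (0, 1):
        for z in (0, 1):
            print(x, y, z, "->", circuit(x, y, z))
```

Evaluating the gates in sequence like this is exactly the "flip bits in a systematic fashion" the lecture describes.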
131. So, if we talk about digital computation,
132. all digital computers are is taking
133. information and processing it.
134. And if we put together computation
135. and communication,
136. and probabilities,
137. what we find is that, taking together
138. the idea of information, processing
information as computation, and
139. sending information reliably from
140. one place to another as communication,
141. this information refers, at bottom, to the
142. probabilities of events... being sunny,
143. being rainy. Probability that a photon
144. going into a channel makes it out the
145. other side. Probability of 0,
146. probability of 1, probability of heads,
147. probability of tails... but when we put
together these three pieces
148. interlocking, what we get is the theory
149. of information.
150. And I hope that in the course
151. of these brief lectures here, I've been
152. able to convince you that these remarkable
153. processes that are going on all
154. around us, the result of the
155. information processing revolution that began
156. in the mid-twentieth century and continues,
157. in fact at an accelerating rate,
158. to this day, can be understood with
159. a simple set of mathematical ideas that
160. are interlinked with each other, and give
161. a set of ideas of very profound richness
162. and impact on human society with
163. implications for... I don't know what!
164. Thank you for your attention.
Do well on the homework;
165. the exam will be multiple choice. I am sure
166. you will all do well.