English subtitles

04_Unsupervised Learning

Showing Revision 1 created 01/18/2014 by Cogi-Admin.

  1. >> What about unsupervised learning?
  2. >> Right, so unsupervised learning we don't get those
  3. examples. We have just essentially something like inputs and
  4. we have to derive some structure from them just
  5. by looking at the relationship between the inputs themselves.
  6. >> Right, so give me an example of that.
  7. >> So, when you are studying different kinds of animals say, even as a kid.
  8. >> Mm-hm.
  9. >> You might start to say oh, there's these animals that all look kind of the
  10. same, they're four legged. I'm going to call all
  11. of them dogs. Even if they happen to
  12. be horses or cows or whatever, but I
  13. have developed, without anyone telling me, this sort
  14. of notion that all these belong in the
  15. same class, and it's different from things like trees.
  16. >> Which don't have four legs.
  17. >> Well some do, but I mean they have, they both bark, is all I'm saying.
  18. >> [LAUGH] Did I really set you up for that?
  19. >> Not on purpose.
  20. >> I'm sorry, I want to
  21. apologize to each and every one of you for that. But that was pretty good.
  22. Michael is very good at word play, which I guess is often unsupervised as well.
  23. >> No, I get a lot of supervision! [LAUGH]
  24. >> You certainly get a lot of feedback.
  25. >> Yeah, that's right. It's like, please stop doing that.
  26. >> So if supervised learning is about function approximation,
  27. then unsupervised learning is about description. It's about taking
  28. a set of data and figuring out how you
  29. might divide it up in one way or the other.
  30. >> Or maybe even summarization. It's not just
  31. a description, but it's a shorter description.
  32. >> Yeah, it's usually concise. Compression.
  33. >> A compact description. So I might take a bunch
  34. of pixels like I have here, and might say, male.
  35. >> [LAUGH] Wait, wait, wait, wait. I am pixels now?
  36. >> As far as we can tell.
  37. >> That's fine.
  38. >> I however, am not pixels. I know I am not
  39. pixels. I am pretty sure that the rest of you are pixels.
  40. >> That's right.
  41. >> So, I have a bunch of pixels and I might say, male. And or I
  42. might say female, or I might say dog, or I might say tree, but the point
  43. is, I don't have a bunch of labels that say
  44. dog, tree, male or female. I just decide that pixels
  45. like this belong with pixels like this, as opposed to
  46. pixels like something else that I'm pointing to behind me.
  47. >> Yeah, we're living in a world right now
  48. that is devoid of any other objects. Oh, chairs.
  49. >> Chairs right?
  50. >> Chairs.
  51. >> So these pixels are very different from those pixels, because of where
  52. they are relative to the other pixels, say. Right? So if you were looking.
  53. >> I'm not sure that's helping me understand unsupervised learning.
  54. >> Go out and, go outside and
  55. look at a crowd of people and try to decide how you might divide
  56. them up. Maybe you'll divide them up
  57. by ethnicity, maybe you'll divide them up by
  58. whether they have purposely shaven their hair in order to mock the bald, or
  59. whether they have curly hair. Maybe you'll
  60. divide them up by whether they have goatees.
  61. >> Facial hair.
  62. >> Or whether they have grey hair. There's lots
  63. of things that you might do in order to
  64. >> Did you just point at me and say grey hair?
  65. >> I was pointing and your head happened to be there.
  66. >> Pixels, it's, it's two-dimensional.
  67. >> Oh, come on, where's the
  68. grey hair?
  69. >> Right there. It's right where your spit curl is.
  70. >> All right.
  71. >> Okay, so, imagine you're dividing the world up that way. You could divide
  72. it up male/female, you could divide it
  73. up short/tall, wears hats/doesn't wear hats. All kinds
  74. of ways you can divide it up, and no one's telling you the right way
  75. to divide it up, at least not
  76. directly. That's unsupervised learning, that's description, because now,
  77. >> Mm.
  78. >> Rather than having to send pixels of everyone, or having
  79. to do a complete description of this crowd, you can say there
  80. were 57 males, and 23 females, say. Or,
  81. there were mostly people with beards, or whatever.
  82. >> I like summarization for that.
  83. >> I like summarization for that, it's
  84. a nice concise description. That's unsupervised learning.
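(Editor's aside, not part of the lecture: the crowd-dividing example above is essentially clustering. A minimal sketch of that idea, using k-means on made-up 2-D points; no labels are given, and the "description" is just which cluster each point falls into.)

```python
# Editor's illustration: group unlabeled 2-D points into two clusters
# with k-means. The points and initial centers below are made up.

def kmeans(points, centers, iters=10):
    for _ in range(iters):
        # assign each point to its nearest center
        groups = [[] for _ in centers]
        for p in points:
            d = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers]
            groups[d.index(min(d))].append(p)
        # move each center to the mean of its group
        centers = [
            (sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
            if g else c
            for g, c in zip(groups, centers)
        ]
    return centers, groups

# two obvious blobs: "these pixels" versus "those pixels"
points = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
          (5.0, 5.1), (5.2, 5.0), (5.1, 4.9)]
centers, groups = kmeans(points, centers=[(0.0, 0.0), (1.0, 1.0)])
print(len(groups[0]), len(groups[1]))  # → 3 3, each blob in its own cluster
```

No one told the algorithm which points "belong together"; the grouping falls out of the relationships between the inputs themselves, which is the point of the passage above.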
  85. >> Good, very good. And it's different than supervised learning.
  86. >> In fact. It's different than supervised learning. And it's
  87. different in a couple of ways. One way that it's different
  88. is, all of those ways that we could have just
  89. divided up the world, in some sense they're all equally good.
  90. So I could divide up by sex. Or I could divide up by height. Or I can
  91. divide up by clothing or whatever and they're
  92. all equally good, absent some other signal later telling
  93. you how you should be dividing up the
  94. world. But supervised learning directly tells you there's a
  95. signal, this is what it ought to be,
  96. and that's how you train. Those are very different.
  97. >> Now, but I can imagine ways that unsupervised
  98. learning could be helpful in the supervised setting.
  99. Alright, so if I do get a nice
  100. description and it's the right kind of description,
  101. it might help me map to, it might help me do the function approximation better.
 102. >> Right. So instead of taking pixels as
 103. input and then labels like male or female, I can
  104. just simply take a summarization of you like, how much
  105. hair you have, your relative height to weight. And various things
  106. like that that might help me do it. That's right.
  107. And by the way, in practice, this tends to turn out
  108. to be things like density estimation. We do end up
  109. turning it into statistics at the end of the day. Often.
  110. >> It was statistics from the beginning. But when you say
  111. density estimation.
  112. >> Yes.
  113. >> Are you saying I'm stupid?
  114. >> No.
  115. >> Alright, so what is density estimation?
  116. >> Well, they'll have to take the class to find out.
  117. >> I see.
  118. >> Okay.
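(Editor's aside: the lecture leaves "density estimation" as a teaser, but one of its simplest forms is easy to sketch. A hedged illustration, assuming nothing beyond the idea of fitting a distribution to unlabeled data: estimate a mean and variance from samples, then ask how likely a new observation is. The numbers are made up.)

```python
# Editor's illustration of the simplest density estimation: fit one
# Gaussian to unlabeled 1-D samples and evaluate its density.
import math

def fit_gaussian(xs):
    # maximum-likelihood estimates of mean and variance
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, var

def density(x, mu, var):
    # probability density of x under N(mu, var)
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

heights = [60, 62, 63, 65, 65, 66, 68, 71]  # hypothetical summary data
mu, var = fit_gaussian(heights)
print(round(mu, 2))  # sample mean
print(density(mu, mu, var) > density(mu + 10, mu, var))  # density peaks at the mean
```

The fitted (mu, var) pair is exactly the kind of short, compact description of the data the dialogue calls summarization: two numbers standing in for the whole sample.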