
Decision Tree Parameters - Intro to Machine Learning


Showing Revision 3 created 05/25/2016 by Udacity Robot.

And the answer, of course, is this one, because I only have one sample in this node. Any of these other nodes I can continue to split, because they all have more than two samples in them. So I can keep growing my tree if I like, and continue on, maybe to quite some depth, until I get a very complicated decision boundary. You can see how, once the tree starts to get really deep and really complex, that's where you start to get decision boundaries that do all these complicated things, and you probably end up overfitting your data.
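The idea above can be sketched in code. This is a minimal sketch, assuming scikit-learn (the library used elsewhere in this course); the dataset, parameter values, and variable names here are illustrative choices, not the instructor's exact example. It contrasts an unconstrained tree, which keeps splitting nodes until they are nearly pure, with one whose `min_samples_split` parameter forbids splitting small nodes, keeping the tree shallower and the boundary simpler.

```python
# Sketch: how min_samples_split limits tree depth (assumes scikit-learn).
from sklearn.datasets import make_moons
from sklearn.tree import DecisionTreeClassifier

# A noisy two-class dataset with a curved boundary (illustrative choice).
X, y = make_moons(n_samples=200, noise=0.3, random_state=42)

# Unconstrained tree: with the default min_samples_split=2, any node with
# two or more samples can be split, so the tree grows deep and carves out
# a very complicated decision boundary around the training points.
deep = DecisionTreeClassifier(random_state=42).fit(X, y)

# Constrained tree: a node with fewer than 50 samples is never split,
# so growth stops earlier and the boundary stays simpler.
shallow = DecisionTreeClassifier(min_samples_split=50,
                                 random_state=42).fit(X, y)

print("unconstrained depth:", deep.get_depth())
print("constrained depth:  ", shallow.get_depth())
```

The unconstrained tree memorizes the training set, which is exactly the overfitting the transcript warns about; raising `min_samples_split` trades some training accuracy for a boundary that generalizes better.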