Bias-Variance and Features Quiz Solution - Intro to Machine Learning

In the case where you're using lots of features to get the most tightly fitting regression or classifier that you can, that's a classic high-variance situation. You want to be careful that you're not overfitting to the data when you do this.

So another way to frame this whole discussion about the number of features you should be using is in terms of the bias-variance dilemma: how many features should you use so that you're balancing these two concerns? You want the accurate description that comes with having enough variance in your model; you want it to be able to fit your data in an accurate and true way. But you want to do that with the minimum number of features needed.

So there's a tradeoff between the goodness of the fit and the simplicity of the fit, that is, the number of features you have to use to achieve that goodness of fit. What that means is you want to fit an algorithm with few features but, in the case of a regression, with a large R-squared, or equivalently a low sum of squared residual errors. That's the sweet spot that you want to find.
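
A minimal sketch of this tradeoff, not part of the lesson itself: fit a linear regression with an increasing number of features and compare the training R-squared (which can only go up as features are added) with a cross-validated R-squared (which levels off or drops once the extra features are just noise). The synthetic data, the feature counts tried, and the 5-fold cross-validation are illustrative assumptions, not the course's actual quiz code.

```python
# Sketch only: synthetic data where just the first few features are informative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

n_samples, n_total_features, n_informative = 100, 30, 5  # assumed sizes
X = rng.normal(size=(n_samples, n_total_features))
true_coef = np.zeros(n_total_features)
true_coef[:n_informative] = rng.normal(size=n_informative)
y = X @ true_coef + rng.normal(scale=1.0, size=n_samples)

for k in (1, 5, 10, 20, 30):
    X_k = X[:, :k]                      # use only the first k features
    model = LinearRegression().fit(X_k, y)
    train_r2 = model.score(X_k, y)      # goodness of fit on the training data
    cv_r2 = cross_val_score(model, X_k, y, cv=5, scoring="r2").mean()
    print(f"{k:2d} features: train R^2 = {train_r2:.3f}, CV R^2 = {cv_r2:.3f}")
```

In a run like this, the training R-squared keeps creeping up with every added feature, while the cross-validated score typically peaks near the number of genuinely informative features and then stops improving or degrades. That peak is the sweet spot the video describes: the best fit you can get with the fewest features needed to get it.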