Bias-variance tradeoff

The bias-variance tradeoff refers to the problem of minimizing two different sources of error when training a supervised learning model:

  1. Bias - Bias is a systematic error, often caused by the algorithm making an overly strong assumption about the training data (for example, assuming a linear relationship where none exists). High bias is associated with underfitting.

  2. Variance - Variance comes from high sensitivity to small differences in the training data, so the learned model changes substantially from one training set to another. High variance is associated with overfitting.

It is typically difficult to minimize bias and variance simultaneously: making a model more flexible tends to reduce bias but increase variance, and vice versa.
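The tradeoff can be made concrete with a small experiment. The sketch below (an illustrative example, not from the original text) fits polynomials of different degrees to many noisy samples of a known function, then estimates bias² (squared gap between the average prediction and the true function) and variance (spread of predictions across training sets). The function, sample sizes, and degrees are all arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_fn(x):
    # The "ground truth" we are trying to learn.
    return np.sin(2 * np.pi * x)

x_test = np.linspace(0, 1, 50)

def bias_variance(degree, n_trials=200, n_samples=30, noise=0.2):
    """Estimate bias^2 and variance for a polynomial fit of a given degree."""
    preds = np.empty((n_trials, x_test.size))
    for t in range(n_trials):
        # Each trial draws a fresh noisy training set.
        x_train = rng.uniform(0, 1, n_samples)
        y_train = true_fn(x_train) + rng.normal(0, noise, n_samples)
        coeffs = np.polyfit(x_train, y_train, degree)
        preds[t] = np.polyval(coeffs, x_test)
    mean_pred = preds.mean(axis=0)
    bias_sq = np.mean((mean_pred - true_fn(x_test)) ** 2)  # systematic error
    variance = np.mean(preds.var(axis=0))                  # sensitivity to data
    return bias_sq, variance

for d in (1, 3, 9):
    b, v = bias_variance(d)
    print(f"degree {d}: bias^2 = {b:.3f}, variance = {v:.3f}")
```

A rigid degree-1 fit shows large bias and small variance (underfitting), while a flexible degree-9 fit shows small bias and large variance (overfitting); an intermediate degree balances the two.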