
Super Kai (Kazuya Ito)


Overfitting vs Underfitting in PyTorch


Overfitting:

  • is the problem where a model makes accurate predictions on the train data but poor predictions on new data (including test data), so the model fits the train data much better than new data.
  • occurs because:
    • train data is small (not enough), so the model can only learn a small number of patterns.
    • train data is imbalanced (biased), containing a lot of specific (limited), similar or identical samples but not a lot of varied data, so the model can only learn a small number of patterns.
    • train data has a lot of noise (noisy data), so the model learns the patterns of the noise rather than the patterns of normal data. *Noise (noisy data) means outliers, anomalies or sometimes duplicated data.
    • the training time is too long, with too many epochs.
    • the model is too complex.
  • can be mitigated by:
    1. Using larger train data.
    2. Using a lot of varied data.
    3. Reducing noise.
    4. Stopping training early (early stopping).
    5. Ensemble learning.
    6. Regularization (e.g. Dropout, weight decay) to reduce model complexity. *A minimal PyTorch sketch of 4. and 6. follows this list.
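
The sketch below shows 4. (early stopping) and 6. (a Dropout layer plus weight_decay in the optimizer) in PyTorch. The toy tensors, layer sizes, learning rate and patience value are illustrative assumptions only, not a recommended setup:

```python
import torch
from torch import nn

# Hypothetical toy data: 64 train and 32 validation samples with 10 features.
X_train, y_train = torch.randn(64, 10), torch.randint(0, 2, (64,)).float()
X_val, y_val = torch.randn(32, 10), torch.randint(0, 2, (32,)).float()

model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # 6. Regularization: Dropout randomly zeroes activations.
    nn.Linear(32, 1)
)

loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01,
                             weight_decay=1e-4)  # 6. Regularization: L2 penalty.

best_val_loss, patience, bad_epochs = float("inf"), 5, 0

for epoch in range(100):
    model.train()
    optimizer.zero_grad()
    loss = loss_fn(model(X_train).squeeze(1), y_train)
    loss.backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(X_val).squeeze(1), y_val).item()

    # 4. Early stopping: stop when validation loss stops improving.
    if val_loss < best_val_loss:
        best_val_loss, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            print(f"Early stopping at epoch {epoch}")
            break
```

Both Dropout and weight_decay reduce the effective complexity of the model, while early stopping limits the training time, so all three address causes of overfitting listed above.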

Underfitting:

  • is the problem where a model cannot make accurate predictions for either the train data or new data (including test data), so the model fits neither the train data nor new data.
  • occurs because:
    • the model is too simple (not complex enough).
    • the training time is too short, with too few epochs.
    • excessive regularization is applied.
  • can be mitigated by:
    1. Increasing model complexity.
    2. Increasing the training time with a larger number of epochs.
    3. Decreasing regularization. *A minimal PyTorch sketch of 1., 2. and 3. follows this list.
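
The sketch below contrasts a model that is too simple with a more complex one and shows where the number of epochs and the regularization strength are set. The layer sizes, epoch count and weight_decay value are illustrative assumptions only:

```python
import torch
from torch import nn

# A model that is too simple (a single linear layer) tends to underfit.
too_simple = nn.Sequential(nn.Linear(10, 1))

# 1. Increasing model complexity: add hidden layers and units.
more_complex = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1)
)

# 3. Decreasing regularization: lower or remove weight_decay (and any Dropout layers).
optimizer = torch.optim.Adam(more_complex.parameters(), lr=0.01, weight_decay=0.0)

# 2. Increasing the training time: train for more epochs, e.g. 300 instead of 30.
epochs = 300
```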

Overfitting and Underfitting are a trade-off:

Too much overfitting mitigation (4., 5. and 6.) leads to underfitting with high bias and low variance, while too much underfitting mitigation (1., 2. and 3.) leads to overfitting with low bias and high variance, so their mitigation should be balanced. *A minimal sketch for diagnosing which problem you have follows the memos below.

*Memos:

  • You can say Bias and Variance are a trade-off because reducing bias increases variance while reducing variance increases bias, so they should be balanced.
  • Low bias means high accuracy while high bias means low accuracy.
  • Low variance means high precision while high variance means low precision.
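
A rough way to tell which problem you have is to compare the loss on the train data with the loss on new data (validation or test data). The helper below and its thresholds are illustrative assumptions; what counts as a "high" loss depends on the task:

```python
def diagnose(train_loss: float, val_loss: float, gap_ratio: float = 1.5) -> str:
    # High loss on both sets: the model fits neither -> likely underfitting.
    if train_loss > 0.5 and val_loss > 0.5:  # 0.5 is an arbitrary example threshold
        return "likely underfitting (high bias, low variance)"
    # Low train loss but a much higher validation loss -> likely overfitting.
    if val_loss > train_loss * gap_ratio:
        return "likely overfitting (low bias, high variance)"
    return "reasonably balanced"

print(diagnose(train_loss=0.05, val_loss=0.62))  # likely overfitting ...
print(diagnose(train_loss=0.71, val_loss=0.74))  # likely underfitting ...
```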

