When an AI Model is too Good or too Bad…and How to Correct it

Overfitting

  • Overfitting occurs when the model is highly accurate on the data it was trained on, but it does not predict well on data it has not seen before: validation and test accuracy lag well behind training accuracy.
  • To correct overfitting, the following can be tried:
    • Adding more data: this increases the size of the training set and gives the algorithm more examples to learn from.
      • The problem is that new data is not always available, so data augmentation techniques (rotating, cropping, flipping, zooming) can be applied to create new samples from existing ones, as in the first sketch after this list.
    • Reducing the complexity of the model:
      • removing some layers from the model.
      • reducing the number of nodes / neurons in the layers. When nodes are deactivated at random during training, the technique is called dropout: the dropped-out nodes are temporarily excluded from producing a prediction on the data (see the second sketch below).
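A minimal sketch of the augmentation techniques named above, assuming Keras preprocessing layers (the post names no library; the image size and transform factors here are illustrative):

```python
# Sketch: image data augmentation with Keras preprocessing layers.
# Each layer applies a random transformation during training only,
# creating new variants of existing images rather than new data.
import tensorflow as tf
from tensorflow.keras import layers

augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),  # flipping
    layers.RandomRotation(0.1),       # rotating by up to +/-10% of a full turn
    layers.RandomZoom(0.2),           # zooming in or out by up to 20%
    layers.RandomCrop(28, 28),        # cropping to a 28x28 window (inputs must be larger)
])

# Applied per batch, e.g. in a tf.data pipeline:
# train_ds = train_ds.map(lambda x, y: (augment(x, training=True), y))
```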
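And a minimal sketch of dropout, again assuming Keras; the layer widths and the 0.5 rate are illustrative choices, not values from the post:

```python
# Sketch: dropout between dense layers. During each training step,
# Dropout(0.5) randomly zeroes half of the previous layer's outputs,
# so those nodes do not participate in that prediction. At inference
# time dropout is disabled and all nodes contribute.
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])
```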

Underfitting

  • Here, the model performs poorly on the training set itself, so both training and validation accuracy are low. Underfitting is usually more of a challenge to correct than overfitting.
  • The main solution to attempt is to increase the complexity of the model. For artificial neural networks, this means adding neurons to each layer or increasing the number of layers, as in the sketch below.
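A minimal sketch of increasing model capacity, assuming Keras; the specific depths and widths are illustrative:

```python
# Sketch: fighting underfitting by adding capacity.
import tensorflow as tf
from tensorflow.keras import layers

# A model that underfits: too small to capture the pattern.
small = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    layers.Dense(8, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

# A higher-capacity model: more neurons per layer and more layers.
larger = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    layers.Dense(256, activation="relu"),
    layers.Dense(128, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
```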