Underfitting, Overfitting—And Finally Recovering

So I haven't been feeling well, but thankfully I've just recovered. Here's a quick summary of what I've learnt so far.

  • Underfitting: This occurs when the model doesn't fit the training set well. We can also say the model has high bias.

  • "Normal": note that the normal is in quotation, since I'm not certain there's a term for it. But it's simply when the training set fits the model just fine. The beautiful thing about this is that the more model can be used and will work well on test set that are novel to it. Hence, making the model more generalizable.

  • Overfitting: This is when the model fits the training data extremely well, noise and all; too much of everything is bad, isn't it? This is also where high variance occurs. There's a small sketch after this list showing all three cases.
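
To see all three in one place, here's a minimal sketch in plain NumPy; the noisy sine data and the degrees 1/3/15 are just assumptions I picked for illustration, not anything from a specific course or library:

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)

# Toy data (assumed for illustration): a noisy sine curve,
# split into a training set and a novel test set.
x_train = np.sort(rng.uniform(0, 3, 20))
y_train = np.sin(x_train) + rng.normal(0, 0.1, size=20)
x_test = np.sort(rng.uniform(0, 3, 20))
y_test = np.sin(x_test) + rng.normal(0, 0.1, size=20)

for degree in (1, 3, 15):
    # Fit a polynomial of the given degree to the training set only.
    p = Polynomial.fit(x_train, y_train, degree)

    # Mean squared error on the data the model saw vs. data it didn't.
    train_mse = np.mean((p(x_train) - y_train) ** 2)
    test_mse = np.mean((p(x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE = {train_mse:.4f}, test MSE = {test_mse:.4f}")
```

Running it, you should typically see degree 1 with high error on both sets (underfitting / high bias), degree 3 with low error on both (the "normal" case), and degree 15 with near-zero training error but a worse test error (overfitting / high variance); the exact numbers depend on the random seed.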