Revision as of 01:42, 17 April 2008 by Ebernard (Talk)


In statistics, overfitting means that some of the relationships that appear statistically significant are actually just noise. An overfitted model has more degrees of freedom than the available data can support. Consequently, the model does not replicate well and does a poor job of predicting future responses.

In machine learning, overfitting occurs when training runs for too long or the training set is too small. As a result, the learner may adjust to very specific random features of the training data that have no causal relation to the target function. This can lead to a situation in which the training error keeps decreasing while the test error increases.
