Tag: error

Bias and Variance – the struggle of daily life

Bias refers to the error introduced by approximating a real-life problem, which may be extremely complicated, with a much simpler model. Variance refers to the amount by which the model's predictions would change if we estimated it using a different training data set. The challenge lies in finding a method for which…
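These two definitions can be made concrete with a small simulation (a sketch of my own, not code from the post): repeatedly fit a deliberately simple linear model on fresh training sets drawn from a quadratic target, then measure how far the average prediction sits from the truth (bias) and how much individual predictions scatter around that average (variance).

```python
# Sketch: estimate bias and variance of a linear model fit to a quadratic
# target, using only the standard library. All names here are illustrative.
import random
import statistics

def true_f(x):
    return x * x  # the complicated "real-life problem" (here just a quadratic)

def fit_line(points):
    # Ordinary least-squares fit of y = a*x + b, our "much simpler model".
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    a = sum((x - mx) * (y - my) for x, y in points) / sxx
    return a, my - a * mx

random.seed(0)
x0 = 0.0  # evaluate bias and variance at a single test point
preds = []
for _ in range(200):  # 200 independent training sets
    xs = [random.uniform(0, 1) for _ in range(20)]
    train = [(x, true_f(x) + random.gauss(0, 0.1)) for x in xs]
    a, b = fit_line(train)
    preds.append(a * x0 + b)

bias = statistics.mean(preds) - true_f(x0)   # systematic error of the model class
variance = statistics.pvariance(preds)       # sensitivity to the training set
```

Because a straight line cannot represent the quadratic target, the bias stays noticeably non-zero no matter how many training sets we draw, while the variance shrinks as each training set gets larger.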

February 14, 2019

Understanding the Bias-Variance Trade-off

George Box once said, “All models are wrong, but some are useful.” From a supervised machine learning perspective, all models have errors, and to make our models useful, we have to minimize such errors. More specifically, we have to minimize two major sources of error: bias and variance. Prior to applying a machine learning algorithm,…

February 14, 2019

Discuss the trade-off between bias and variance

Bias is error due to erroneous or overly simplistic assumptions in the learning algorithm you’re using. This can lead to the model underfitting your data, making it hard to achieve high predictive accuracy and to generalize from the training set to the test set. Variance is error due to too much…

February 14, 2019

Bias/Variance Trade-off

The bias/variance trade-off is an important concept in learning theory. This post discusses the topic from both theoretical and practical perspectives. The main goal of any learning algorithm is to predict and generalise well. More formally, this goal is equivalent to ‘minimise expected error on unseen data’ – thus taking a closer look at the components of…
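Those components can be checked numerically (a toy sketch under my own assumptions, not code from the post): when the prediction is independent of a fresh observation's noise, the expected squared error on unseen data decomposes into bias² + variance + irreducible noise.

```python
# Sketch: verify the error decomposition empirically with a trivial
# constant-prediction "model". All numeric choices below are illustrative.
import random
import statistics

random.seed(1)
noise_sd = 0.1
x0, true_y = 0.5, 0.25  # evaluate at x0 where f(x) = x^2 gives 0.25

preds, errors = [], []
for _ in range(5000):
    # "Train": predict the mean of noisy samples of f at random inputs,
    # a constant model with non-trivial bias and variance.
    xs = [random.uniform(0, 1) for _ in range(10)]
    sample = [x * x + random.gauss(0, noise_sd) for x in xs]
    pred = statistics.mean(sample)
    preds.append(pred)
    y_new = true_y + random.gauss(0, noise_sd)  # fresh unseen observation
    errors.append((pred - y_new) ** 2)

bias_sq = (statistics.mean(preds) - true_y) ** 2
variance = statistics.pvariance(preds)
expected_error = statistics.mean(errors)
# expected_error should be close to bias_sq + variance + noise_sd ** 2
```

The noise term is the floor no model can beat; bias and variance are the two components the learner can actually trade against each other.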

February 13, 2019