Mean Squared Error aka MSE - short version

Expected value of the square of the difference between the estimator and the parameter. The MSE measures how far the estimator is off from what it is trying to estimate, on average over repeated experiments. It is a summary measure of the accuracy of the estimator. It combines any tendency of the estimator to overshoot or undershoot the truth (bias) with the variability of the estimator (standard error, SE).
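The bias/variability decomposition above can be checked numerically. The sketch below (all numbers are illustrative assumptions, not from the text) estimates a known mean from repeated samples, computes the MSE directly, and confirms it equals squared bias plus variance:

```python
import random

random.seed(0)

TRUE_MEAN = 5.0      # the parameter being estimated (assumed for illustration)
N_TRIALS = 20000     # number of repeated experiments
SAMPLE_SIZE = 10     # observations per experiment

# Record the estimator's value (here: the sample mean) in each experiment.
estimates = []
for _ in range(N_TRIALS):
    sample = [random.gauss(TRUE_MEAN, 2.0) for _ in range(SAMPLE_SIZE)]
    estimates.append(sum(sample) / SAMPLE_SIZE)

# MSE: average squared difference between the estimator and the parameter.
mse = sum((e - TRUE_MEAN) ** 2 for e in estimates) / N_TRIALS

# Decomposition: MSE = bias^2 + variance.
mean_est = sum(estimates) / N_TRIALS
bias = mean_est - TRUE_MEAN
variance = sum((e - mean_est) ** 2 for e in estimates) / N_TRIALS

print(round(mse, 4), round(bias ** 2 + variance, 4))
```

The identity MSE = bias² + variance holds exactly (it is algebraic, not approximate), so the two printed values agree to rounding.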

Mean Squared Error aka MSE - long version

The mean squared error (MSE) of an estimator is one of many ways to quantify the difference between values implied by an estimator and the true values of the quantity being estimated. MSE is a risk function, corresponding to the expected value of the squared error loss or quadratic loss. MSE measures the average of the squares of the "errors." The error is the amount by which the value implied by the estimator differs from the quantity to be estimated. The difference arises because of randomness or because the estimator doesn't account for information that could produce a more accurate estimate.
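In the squared-error-loss framing, computing MSE from a set of estimates and true values is a one-liner. A minimal sketch (the numbers are made up for illustration):

```python
# MSE as the average of squared errors between estimated and true values.
predictions = [2.5, 0.0, 2.1, 7.8]
true_values = [3.0, -0.5, 2.0, 8.0]

errors = [p - t for p, t in zip(predictions, true_values)]
mse = sum(e ** 2 for e in errors) / len(errors)
print(mse)  # → 0.1375
```

Because each error is squared, large deviations are penalized far more heavily than small ones, which is what makes MSE a quadratic loss.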
