The expected value of the square of the difference between the estimator and the parameter. The MSE measures how far the estimator is from what it is trying to estimate, on average over repeated experiments. It is a summary measure of the accuracy of the estimator: it combines any tendency of the estimator to overshoot or undershoot the truth (bias) with the variability of the estimator (standard error, SE).
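In symbols, writing $\hat\theta$ for the estimator and $\theta$ for the parameter (notation assumed here, not taken from the original text), this is the standard bias-variance decomposition:

$\mathrm{MSE}(\hat\theta) = E\big[(\hat\theta - \theta)^2\big] = \operatorname{Bias}(\hat\theta)^2 + \operatorname{SE}(\hat\theta)^2,$

where $\operatorname{Bias}(\hat\theta) = E[\hat\theta] - \theta$ and $\operatorname{SE}(\hat\theta)^2$ is the variance of the estimator.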
The mean squared error (MSE) of an estimator is one of many ways to quantify the difference between the values implied by an estimator and the true values of the quantity being estimated. MSE is a risk function, corresponding to the expected value of the squared error loss (quadratic loss). It measures the average of the squares of the "errors," where the error is the amount by which the value implied by the estimator differs from the quantity to be estimated. The difference arises because of randomness, or because the estimator does not account for information that could produce a more accurate estimate.
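The following is a minimal sketch, not from the original text, of how the MSE can be approximated by simulation: the sample mean is used as an estimator of a known true mean, the experiment is repeated many times, and the squared errors are averaged. The distribution, sample size, and number of experiments are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

true_mean = 5.0          # the parameter being estimated (assumed for illustration)
n = 30                   # sample size in each experiment
n_experiments = 100_000  # number of repeated experiments

# Each row is one experiment; the estimator is the sample mean of that row.
samples = rng.normal(loc=true_mean, scale=2.0, size=(n_experiments, n))
estimates = samples.mean(axis=1)

errors = estimates - true_mean    # error = estimator minus parameter
mse = np.mean(errors ** 2)        # average of the squared errors

bias = np.mean(errors)            # tendency to overshoot or undershoot
variance = np.var(estimates)      # variability of the estimator
print(mse, bias**2 + variance)    # the two quantities agree: MSE = bias^2 + variance
```

For the sample mean of a normal sample with standard deviation 2 and n = 30, the bias is zero and the MSE is approximately 4/30, which the simulated average of squared errors should reproduce.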