By the authors
Moumen T. El-Melegy and Mohammed H. Essai
Electrical Engineering Department, Assiut University, Assiut 71516, Egypt
Electrical Engineering Department, AL-Azhar University, Qena, Egypt
ABSTRACT
Mean squared error (MSE) is the preferred measure in many data-modeling techniques: it
optimizes the fit of a model to a data set by minimizing the sum of squared residuals. The
basic challenge to this technique is the presence of gross errors, which usually appear as outliers.
Robust statistics offers various techniques, e.g., L-estimators, W-estimators, and M-estimators,
for estimating the parameters of a parametric model while dealing with deviations from idealized
assumptions. In this paper we focus on one popular robust technique, M-estimators,
to minimize the influence of gross errors on the accuracy of training artificial feedforward
neural networks that are often used as universal function approximators. Most supervised
neural networks (NNs) are trained by minimizing the mean squared error over the training set. In
the presence of outliers, the resulting NN model can differ significantly from the underlying
system that generates the data. We use several M-estimators to make NN training more
efficient and robust, and report simulation results to analyze and evaluate the effectiveness of this
approach.
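The core idea of the abstract can be sketched for the simpler case of fitting a line to data with outliers: an M-estimator replaces the squared loss with a function ρ that grows more slowly for large residuals, and the resulting fit is typically computed by iteratively reweighted least squares (IRLS). The sketch below is a minimal illustration using the Huber ρ with hypothetical helper names; it is not the paper's neural-network training procedure, only the estimation principle it builds on.

```python
import numpy as np

def huber_rho(r, c=1.345):
    """Huber M-estimator loss: quadratic near zero, linear in the tails,
    so gross errors contribute far less than under squared error."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r**2, c * a - 0.5 * c**2)

def huber_weights(r, c=1.345):
    """IRLS weights w(r) = psi(r)/r derived from the Huber rho."""
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def irls_line_fit(x, y, c=1.345, iters=50):
    """Fit y = w*x + b by iteratively reweighted least squares,
    downweighting residuals that are large relative to a robust scale."""
    w, b = 0.0, 0.0
    for _ in range(iters):
        r = y - (w * x + b)
        # robust scale estimate from the median absolute deviation
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        wt = huber_weights(r / s, c)
        X = np.column_stack([x, np.ones_like(x)])
        # weighted normal equations
        A = X.T @ (wt[:, None] * X)
        rhs = X.T @ (wt * y)
        w, b = np.linalg.solve(A, rhs)
    return w, b

# demo: true line y = 2x + 1 with small noise plus a few gross errors
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(50)
y[::10] += 30.0  # 10% of the points become outliers

w_ls, b_ls = np.polyfit(x, y, 1)     # ordinary least squares (MSE)
w_rb, b_rb = irls_line_fit(x, y)     # Huber M-estimator fit
```

Running the demo, the least-squares intercept is pulled toward the outliers, while the M-estimator fit stays close to the underlying line, because each outlier's IRLS weight shrinks roughly as c/|r| once its residual exceeds the threshold.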