What Is MSE?
Mean Squared Error (MSE) is a metric that measures the average squared difference between predicted and actual values. It quantifies how well a model's predictions match the observed data. Lower MSE indicates better prediction accuracy.
MSE is the most common loss function in regression analysis and machine learning. It penalizes larger errors more heavily due to squaring, making it sensitive to outliers. RMSE (root MSE) converts the error back to the original units for easier interpretation.
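As a minimal sketch of the definitions above (the function names and sample numbers are illustrative), MSE averages the squared residuals and RMSE takes the square root to return to the original units:

```python
def mse(y_true, y_pred):
    # MSE = (1/n) * sum of (actual - predicted)^2
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n

def rmse(y_true, y_pred):
    # RMSE = sqrt(MSE), expressed in the same units as the data
    return mse(y_true, y_pred) ** 0.5

y_true = [3.0, 5.0, 2.0, 7.0]
y_pred = [2.5, 5.0, 3.0, 8.0]
print(mse(y_true, y_pred))   # 0.5625
print(rmse(y_true, y_pred))  # 0.75
```

The residuals here are 0.5, 0, -1, and -1; squaring them before averaging is what makes the two unit errors dominate the result.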
Error Metrics
Comparison of Metrics
| Metric | Range | Outlier Sensitivity | Units |
|---|---|---|---|
| MSE | [0, ∞) | High | Squared units |
| RMSE | [0, ∞) | High | Original units |
| MAE | [0, ∞) | Moderate | Original units |
| R² | (-∞, 1] | Moderate | Unitless |
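All four metrics in the table can be computed from the same residuals. A small illustrative helper (the function name and return layout are this sketch's own, not any particular library's API):

```python
def regression_metrics(y_true, y_pred):
    n = len(y_true)
    residuals = [t - p for t, p in zip(y_true, y_pred)]
    ss_res = sum(r ** 2 for r in residuals)          # residual sum of squares
    mean_y = sum(y_true) / n
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)  # total sum of squares
    mse = ss_res / n
    return {
        "MSE": mse,                                  # squared units
        "RMSE": mse ** 0.5,                          # original units
        "MAE": sum(abs(r) for r in residuals) / n,   # original units
        "R2": 1 - ss_res / ss_tot,                   # unitless, at most 1
    }

m = regression_metrics([3.0, 5.0, 2.0, 7.0], [2.5, 5.0, 3.0, 8.0])
print(m)
```

For this sample, MSE is 0.5625, RMSE is 0.75, MAE is 0.625, and R² is roughly 0.85, which matches the ranges and units listed above.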
Frequently Asked Questions
What is a good MSE value?
There is no universal "good" MSE because it depends on the scale of your data. Compare MSE across models on the same dataset, or use RMSE for interpretation in the original units. An RMSE of 5 on a 0-100 scale represents far smaller relative error than the same RMSE on a 0-10 scale.
Why use MSE over MAE?
MSE penalizes large errors more heavily, which is desirable when large errors are particularly costly. MAE treats all errors equally and is more robust to outliers. The choice depends on your application.
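The outlier sensitivity described above is easy to demonstrate: replacing one small error with a single large one inflates MSE far more than MAE (a toy sketch with illustrative error values):

```python
def mse(errors):
    # squaring makes large errors dominate the average
    return sum(e ** 2 for e in errors) / len(errors)

def mae(errors):
    # absolute value weights all errors linearly
    return sum(abs(e) for e in errors) / len(errors)

clean = [1.0, -1.0, 1.0, -1.0]
with_outlier = [1.0, -1.0, 1.0, 10.0]  # one error grows 10x

print(mse(clean), mae(clean))                # 1.0 1.0
print(mse(with_outlier), mae(with_outlier))  # 25.75 3.25
```

One 10-unit error multiplies MSE by nearly 26 while MAE only roughly triples, which is why MAE is considered the more robust choice when outliers are expected.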
Can R-squared be negative?
Yes, R² can be negative when the model performs worse than simply predicting the mean for all observations. This indicates a very poor model. Ideally, R² should be close to 1.
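A quick sketch makes this concrete: R² is 1 minus the ratio of the model's squared error to the squared error of always predicting the mean, so a constant prediction far from the mean drives it below zero (values here are illustrative):

```python
def r_squared(y_true, y_pred):
    # R^2 = 1 - SS_res / SS_tot
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

y_true = [1.0, 2.0, 3.0, 4.0, 5.0]
print(r_squared(y_true, [3.0] * 5))   # 0.0  (always predicting the mean)
print(r_squared(y_true, [10.0] * 5))  # -24.5 (much worse than the mean)
```

Predicting the mean for every observation gives exactly R² = 0; any model that does worse than that baseline lands in negative territory.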