In this video, we will cover machine learning regression metrics such as Root Mean Squared Error (RMSE), Mean Squared Error (MSE), Mean Absolute Error (MAE), and the coefficient of determination (R-squared).
After we train a machine learning model, we would like to assess its performance by comparing the model's predictions to the actual (true) values.
Mean Absolute Error (MAE) is obtained by taking the absolute differences between the model predictions and the true (actual) values and averaging them.
MAE is a measure of the average magnitude of the error generated by the regression model.
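A minimal sketch of the calculation, using two small NumPy arrays of made-up true and predicted values (the numbers are only for illustration):

import numpy as np
from sklearn.metrics import mean_absolute_error

y_true = np.array([3.0, 5.0, 2.5, 7.0])   # actual (true) values
y_pred = np.array([2.5, 5.0, 4.0, 8.0])   # model predictions

# MAE = average of the absolute differences
print(np.mean(np.abs(y_true - y_pred)))      # manual calculation
print(mean_absolute_error(y_true, y_pred))   # same result via scikit-learn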
Mean Squared Error (MSE) is very similar to the Mean Absolute Error (MAE), but instead of averaging absolute values, the squared differences between the model predictions and the true values are averaged.
MSE values are generally larger than MAE values since the residuals are squared.
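Using the same illustrative arrays as above (again, made-up numbers), MSE just squares the differences before averaging:

import numpy as np
from sklearn.metrics import mean_squared_error

y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.5, 5.0, 4.0, 8.0])

# MSE = average of the squared differences
print(np.mean((y_true - y_pred) ** 2))      # manual calculation
print(mean_squared_error(y_true, y_pred))   # same result via scikit-learn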
Root Mean Squared Error (RMSE) represents the standard deviation of the residuals (i.e., the differences between the model predictions and the true values).
RMSE is easier to interpret than MSE because its units match the units of the output.
RMSE provides an estimate of how widely the residuals are dispersed.
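RMSE is simply the square root of MSE; a quick sketch with the same made-up arrays:

import numpy as np
from sklearn.metrics import mean_squared_error

y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.5, 5.0, 4.0, 8.0])

# RMSE = square root of the MSE, so it is in the same units as y
rmse = np.sqrt(mean_squared_error(y_true, y_pred))
print(rmse)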
R-squared, or the coefficient of determination, represents the proportion of the variance of y that is explained by the independent variables in the model. The best possible score is 1.0.
If R² = 0.8, this means that 80% of the variation in university admission is explained by the GRE score (assuming a simple linear regression model).
It provides an indication of goodness of fit and therefore a measure of how well unseen samples are likely to be predicted by the model.
A constant model that always predicts the expected value of y, disregarding the input features, would get an R² score of 0.0.
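A sketch of how R² behaves, again with made-up numbers; note how a constant model that always predicts the mean of y scores exactly 0.0:

import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.5, 5.0, 4.0, 8.0])

print(r2_score(y_true, y_pred))                  # closer to 1.0 is better
baseline = np.full_like(y_true, y_true.mean())   # constant model: always predict mean(y)
print(r2_score(y_true, baseline))                # 0.0 by definition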
#machinelearning #regression