This section explains the key metrics used to evaluate the performance and accuracy of your Marketing Mix Models.
Understanding these metrics lets you make informed decisions to optimize your marketing strategies. Below, we describe each statistic, what constitutes a good or bad result, and how to interpret it to draw actionable insights for your marketing efforts.
R² (R-Squared)
What it means
R² indicates how well the model’s predictions match the actual data. It represents the proportion of the variance in the dependent variable that is predictable from the independent variables.
Good vs. Bad Results
An R² value close to 1 suggests a good fit, meaning the model explains a high proportion of the variance. A value near 0 indicates a poor fit.
Interpretation and Decisions
A high R² means your model is capturing the underlying patterns well. Low R² might indicate missing variables or that the model isn’t complex enough. Consider adding more relevant variables or trying different model types.
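As a rough, self-contained illustration (toy sales figures, not output from your model), R² can be reproduced from actual and predicted values; scikit-learn is used here purely for convenience.

```python
# Illustrative only: R² from actual vs. predicted values (made-up numbers).
import numpy as np
from sklearn.metrics import r2_score

y_actual = np.array([120.0, 135.0, 150.0, 160.0, 155.0])     # observed outcome, e.g. weekly sales
y_predicted = np.array([118.0, 140.0, 147.0, 158.0, 150.0])  # model predictions

print(r2_score(y_actual, y_predicted))  # ≈ 0.94, i.e. the model explains most of the variance
```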
ExpVariance (Explained Variance)
What it means
Similar to R², it measures the proportion of the variance in the dependent variable that the model explains. The two differ only when the model's errors have a non-zero mean, i.e. when the predictions are systematically biased.
Good vs. Bad Results
High values close to 1 are good; values near 0 are not.
Interpretation and Decisions
Use this to corroborate your R² findings. If both are high, your model is reliable. If not, review your model’s variables.
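The sketch below (same toy numbers as above, scikit-learn for convenience) computes explained variance next to R² so you can compare the two directly.

```python
# Illustrative only: explained variance alongside R² on the same toy data.
import numpy as np
from sklearn.metrics import explained_variance_score, r2_score

y_actual = np.array([120.0, 135.0, 150.0, 160.0, 155.0])
y_predicted = np.array([118.0, 140.0, 147.0, 158.0, 150.0])

print(explained_variance_score(y_actual, y_predicted))  # ≈ 0.95
print(r2_score(y_actual, y_predicted))                  # ≈ 0.94; the gap comes from the slight bias in the predictions
```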
MAPE (Mean Absolute Percentage Error)
What it means
MAPE shows the average absolute percentage error between the predicted and actual values.
Good vs. Bad Results
Lower MAPE values indicate better model accuracy.
Interpretation and Decisions
A low MAPE (e.g., below 10%) suggests accurate predictions, helping you trust the model's output for decision-making. High MAPE indicates the model’s predictions may not be reliable, prompting a review of data quality or model choice.
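A minimal sketch of the calculation with made-up numbers (the percentage shown is not from your model):

```python
# Illustrative only: MAPE = mean of |actual - predicted| / |actual|, reported as a percentage.
import numpy as np

y_actual = np.array([120.0, 135.0, 150.0, 160.0, 155.0])
y_predicted = np.array([118.0, 140.0, 147.0, 158.0, 150.0])

mape = np.mean(np.abs((y_actual - y_predicted) / y_actual)) * 100
print(f"{mape:.1f}%")  # ≈ 2.4%; note that MAPE becomes unstable when actual values are at or near zero
```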
MAE (Mean Absolute Error)
What it means
MAE measures the average magnitude of errors in a set of predictions, without considering their direction.
Good vs. Bad Results
Lower MAE values signify better predictive accuracy.
Interpretation and Decisions
A low MAE indicates that your predictions are close to the actual values on average. High MAE values suggest the need to refine your model for better accuracy.
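For intuition, the sketch below (toy numbers, scikit-learn for convenience) shows that MAE is expressed in the same units as the outcome you are modeling, which makes it easy to communicate.

```python
# Illustrative only: MAE is the average absolute miss, in the target's own units.
import numpy as np
from sklearn.metrics import mean_absolute_error

y_actual = np.array([120.0, 135.0, 150.0, 160.0, 155.0])
y_predicted = np.array([118.0, 140.0, 147.0, 158.0, 150.0])

print(mean_absolute_error(y_actual, y_predicted))  # 3.4, i.e. the model misses by 3.4 units on average
```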
Weighted MAPE
What it means
This is MAPE adjusted to give more weight to errors in predictions of higher importance.
Good vs. Bad Results
Lower values indicate better performance, particularly in key areas.
Interpretation and Decisions
Use this to focus on improving model accuracy for the most critical predictions. A high Weighted MAPE might suggest re-evaluating high-impact variables.
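One common weighting scheme, shown in the sketch below, divides the total absolute error by the total of the actual values so that high-volume periods count more; the exact weights used in your report may differ, so treat this purely as an illustration.

```python
# Illustrative sketch only: a common weighted MAPE that weights errors by the size of the actuals.
# The exact weighting used by your model's report may differ.
import numpy as np

y_actual = np.array([120.0, 135.0, 150.0, 160.0, 155.0])
y_predicted = np.array([118.0, 140.0, 147.0, 158.0, 150.0])

wmape = np.sum(np.abs(y_actual - y_predicted)) / np.sum(np.abs(y_actual)) * 100
print(f"{wmape:.1f}%")  # ≈ 2.4%
```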
MSE (Mean Squared Error)
What it means
MSE measures the average squared difference between the predicted and actual values. Because the errors are squared, large errors are penalized much more heavily than small ones.
Good vs. Bad Results
Lower MSE values indicate better model performance.
Interpretation and Decisions
A low MSE suggests good predictive accuracy. High MSE indicates large errors in predictions, prompting a review of model assumptions and data preprocessing.
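A minimal sketch with toy numbers (scikit-learn for convenience); note that the result is in squared units of the target, which is why RMSE below is often easier to read.

```python
# Illustrative only: squaring the residuals means one large miss dominates many small ones.
import numpy as np
from sklearn.metrics import mean_squared_error

y_actual = np.array([120.0, 135.0, 150.0, 160.0, 155.0])
y_predicted = np.array([118.0, 140.0, 147.0, 158.0, 150.0])

print(mean_squared_error(y_actual, y_predicted))  # 13.4, in squared target units
```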
MedianAE (Median Absolute Error)
What it means
MedianAE is the median of the absolute errors between the predicted and actual values, so it is largely unaffected by a small number of extreme errors.
Good vs. Bad Results
Lower MedianAE values indicate better model performance.
Interpretation and Decisions
Use this metric to understand the typical magnitude of errors. High MedianAE might suggest that the model needs adjustments to improve typical prediction accuracy.
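The sketch below (toy numbers with one deliberately bad prediction) shows why MedianAE describes the typical error rather than the worst one.

```python
# Illustrative only: the median absolute error is barely moved by a single extreme miss.
import numpy as np
from sklearn.metrics import median_absolute_error, mean_absolute_error

y_actual = np.array([120.0, 135.0, 150.0, 160.0, 155.0])
y_predicted = np.array([118.0, 140.0, 147.0, 158.0, 200.0])  # last prediction is a large outlier

print(median_absolute_error(y_actual, y_predicted))  # 3.0, the typical miss
print(mean_absolute_error(y_actual, y_predicted))    # 11.4, the mean is pulled up by the outlier
```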
MaxError
What it means
MaxError indicates the largest single error in the predictions.
Good vs. Bad Results
Lower MaxError values are better, as they show fewer extreme prediction errors.
Interpretation and Decisions
While focusing on overall model accuracy, pay attention to MaxError to ensure there are no significant outliers that could impact decision-making.
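Continuing the same toy example (one deliberately bad prediction), MaxError isolates that single worst miss:

```python
# Illustrative only: MaxError reports the single largest absolute prediction error.
import numpy as np
from sklearn.metrics import max_error

y_actual = np.array([120.0, 135.0, 150.0, 160.0, 155.0])
y_predicted = np.array([118.0, 140.0, 147.0, 158.0, 200.0])  # last prediction is a large outlier

print(max_error(y_actual, y_predicted))  # 45.0, driven entirely by the outlier
```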
100-Weighted MAPE
What it means
This is another form of weighted MAPE scaled by a factor of 100.
Good vs. Bad Results
Lower values signify better accuracy.
Interpretation and Decisions
Similar to Weighted MAPE, but scaled for easier interpretation. Use it to gauge accuracy in critical areas of your model.
RMSE (Root Mean Squared Error)
What it means
RMSE is the square root of the average squared differences between predicted and actual values.
Good vs. Bad Results
Lower RMSE values indicate better model performance.
Interpretation and Decisions
A low RMSE suggests high accuracy and reliability in predictions. High RMSE values indicate the model’s predictions are far from the actual values, necessitating further model refinement.
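As a final sketch (same toy numbers, scikit-learn plus NumPy for convenience), RMSE is simply the square root of MSE, which brings the error back into the target's own units:

```python
# Illustrative only: RMSE = sqrt(MSE), comparable to MAE in scale but more sensitive to large errors.
import numpy as np
from sklearn.metrics import mean_squared_error

y_actual = np.array([120.0, 135.0, 150.0, 160.0, 155.0])
y_predicted = np.array([118.0, 140.0, 147.0, 158.0, 150.0])

print(np.sqrt(mean_squared_error(y_actual, y_predicted)))  # ≈ 3.66, in the same units as the target
```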
These explanations should help you understand the significance of each metric when evaluating your Marketing Mix Model, identify areas for improvement, and make informed decisions based on your model's performance.