Model Selection Metrics §
Metrics: §
- $R^2$ -> equivalent to SSE, RMSE; focuses on prediction performance (numeric sketch after this list)
- adjusted $R^2$ -> evaluates both SSE and the degrees of freedom $(n-p)$; in MLR, adjusted $R^2$ is always smaller than (or equal to) $R^2$
- t-test p-values (or F-test p-values) -> focus on the statistical significance of predictors
- AIC and BIC:
    - Akaike's Information Criterion (AIC):
        - $AIC = 2p - 2\ln\hat{L}$ ($p$: number of parameters, $\hat{L}$ is the sample likelihood of the model)
        - $2p$ is a penalty on the number of predictors; adding more predictors may increase AIC
        - AIC tends to choose the model with a large likelihood without adding too many predictors
        - a smaller AIC indicates a better model
    - Bayesian Information Criterion (BIC):
        - $BIC = p\ln n - 2\ln\hat{L}$
        - $\ln n$ is a much larger penalty (than 2) on adding predictors to the model
    - Select the model with the smallest AIC/BIC; they agree most of the time. When they disagree, BIC picks shorter models and is therefore more popular in MLR (sketch after this list)
- Mallow's $C_p$:
    - a good $C_p$ is small and close to $p$, the number of parameters in the candidate model
    - then choose the one with the smallest $C_p$ (sketch after this list)
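
A minimal numpy sketch of the $R^2$ / adjusted-$R^2$ comparison above; the simulated data and coefficient values are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 4                                   # n observations, p coefficients (incl. intercept)
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 2.0, -0.5, 0.0]) + rng.normal(size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit
sse = np.sum((y - X @ beta) ** 2)              # sum of squared errors
sst = np.sum((y - y.mean()) ** 2)              # total sum of squares

r2 = 1 - sse / sst
adj_r2 = 1 - (sse / (n - p)) / (sst / (n - 1)) # penalizes the lost degrees of freedom
print(f"R^2 = {r2:.3f}, adjusted R^2 = {adj_r2:.3f}")   # adjusted R^2 <= R^2
```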
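
A sketch of the AIC/BIC formulas above under the usual Gaussian-error likelihood; the helper name `aic_bic` is my own, and $p$ here counts only the regression coefficients (also counting $\sigma^2$ adds the same constant to every candidate model, so the ranking is unchanged):

```python
import numpy as np

def aic_bic(y, X):
    """AIC and BIC for an OLS fit with Gaussian errors (illustrative sketch)."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sse = np.sum((y - X @ beta) ** 2)
    # maximized Gaussian log-likelihood, with sigma^2 replaced by its MLE, SSE/n
    loglik = -0.5 * n * (np.log(2 * np.pi) + np.log(sse / n) + 1)
    aic = 2 * p - 2 * loglik           # penalty of 2 per parameter
    bic = p * np.log(n) - 2 * loglik   # penalty of ln(n) per parameter (> 2 once n >= 8)
    return aic, bic

# Fit each candidate design matrix and keep the model with the smallest AIC/BIC.
```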
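
Mallow's $C_p$ is commonly computed by scaling the candidate model's SSE by the full model's MSE; a sketch under that convention (the function name is my own):

```python
import numpy as np

def mallows_cp(y, X_sub, X_full):
    """Cp for a candidate subset model, using the full model's MSE (illustrative sketch)."""
    n, p = X_sub.shape                             # p = parameters in the candidate model
    k = X_full.shape[1]                            # parameters in the full model
    b_sub, *_ = np.linalg.lstsq(X_sub, y, rcond=None)
    b_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)
    sse_sub = np.sum((y - X_sub @ b_sub) ** 2)
    mse_full = np.sum((y - X_full @ b_full) ** 2) / (n - k)
    return sse_sub / mse_full - n + 2 * p          # good candidates have Cp close to p

# Among candidates with Cp near p, keep the one with the smallest Cp.
```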
Key words: §