Predictive Confidence provides an estimate of how accurate the model is. Predictive Confidence is a number between 0 and 1. Oracle Data Miner displays Predictive Confidence as a percentage. For example, a Predictive Confidence of 59 means that the Predictive Confidence is 59 percent (0.59).
Predictive Confidence indicates how much better the predictions made by the tested model are than predictions made by a naive model. The naive model always predicts the mean for numerical targets and the mode for categorical targets.
Predictive Confidence is defined by the following formula:
Predictive Confidence = MAX[(1 - Error of Model / Error of Naive Model), 0] x 100
Where:
Error of Model is (1 - Average Accuracy/100)
Error of Naive Model is (Number of target classes - 1) / Number of target classes
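The formula above can be sketched as a small function. This is an illustrative sketch, not Oracle Data Miner code; the function name and parameters are chosen here for clarity.

```python
def predictive_confidence(average_accuracy_pct, n_target_classes):
    """Illustrative sketch of the Predictive Confidence formula."""
    # Error of Model: the fraction of predictions the model gets wrong.
    error_of_model = 1 - average_accuracy_pct / 100
    # Error of Naive Model: always predicting the mode misclassifies
    # (k - 1) / k of a k-class target.
    error_of_naive = (n_target_classes - 1) / n_target_classes
    # Clamp at 0: a model that does worse than the naive model scores 0.
    return max(1 - error_of_model / error_of_naive, 0) * 100

# Example: a model with 80 percent average accuracy on a 5-class target.
# Error of Model = 0.2, Error of Naive Model = 0.8, so the model has
# removed 75 percent of the naive model's error.
print(predictive_confidence(80, 5))
```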
A Predictive Confidence of 0 indicates that the predictions of the model are no better than the predictions made by the naive model.
A Predictive Confidence of 1 indicates that the predictions are perfect.
A Predictive Confidence of 0.5 indicates that the model has reduced the error of the naive model by 50 percent.