The Performance tab provides an overall summary of the performance of each model generated, showing test results for several common test metrics. It displays the following:
All Measures (default). The Measure list enables you to select which measures to display; by default, all measures are displayed. The selected measures are displayed as graphs. If you are comparing test results for two or more models, each model's graphs appear in a different color.
Cost, if you specified costs or the system calculated them
In the Sort By fields, you can specify the sort attribute and the sort order. The first list selects the sort attribute: measure, creation date, or name (the default). The second list selects the sort order: ascending or descending (the default).
Below the graphs, the Models table supplements the information presented in the graphs; you can minimize the table using the splitter line. The Models table summarizes the data in the histograms:
Name, the name of the model, along with the color used for that model in the graphs
Predictive Confidence percent
Overall Accuracy percent
Average Accuracy percent
Cost, if you specified costs (for Decision Tree models, costs are calculated by Oracle Data Miner)
Algorithm (used to build the model)
Creation date
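To see why the table reports both Overall Accuracy and Average Accuracy, it helps to compute the two from the same confusion matrix. The sketch below is illustrative only (it is not Oracle Data Miner's implementation, and the matrix values are invented): overall accuracy weights every test case equally, while average accuracy weights every class equally, so the two diverge on imbalanced data.

```python
# Hypothetical confusion matrix: rows = actual class, columns = predicted class.
confusion = [
    [90, 10],   # actual class 0: 90 predicted correctly, 10 misclassified
    [ 5,  5],   # actual class 1:  5 predicted correctly,  5 misclassified
]

total = sum(sum(row) for row in confusion)
correct = sum(confusion[i][i] for i in range(len(confusion)))

# Overall accuracy: fraction of all test cases predicted correctly.
overall_accuracy = correct / total

# Average accuracy: mean of the per-class accuracies, so a small class
# counts as much as a large one.
per_class = [confusion[i][i] / sum(confusion[i]) for i in range(len(confusion))]
average_accuracy = sum(per_class) / len(per_class)

print(f"Overall accuracy: {overall_accuracy:.0%}")  # 86%
print(f"Average accuracy: {average_accuracy:.0%}")  # 70%
```

Here the model looks strong overall (95 of 110 cases correct) but gets only half of the minority class right, which the average accuracy exposes.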
By default, results for the selected models are displayed. To change which models are displayed, deselect any models for which you do not want to see results. When you deselect a model, both its histogram and its summary information are removed.