When you compare test results for two or more Regression models, each model has a color associated with it, and that color indicates the results for that model. For example, if model M1 has purple associated with it, then the bar graphs on the Performance tab for M1 are displayed in purple.
By default, test results for all models in the node are compared. If you do not want to compare all test results, then click the edit icon. The Edit Test Results Selection dialog box opens. Deselect the results that you do not want to see, and click OK when you have finished.
Compare Test Results opens in a new tab. Results are displayed in two tabs:
Performance tab: The following metrics are compared on the Performance tab:
Predictive Confidence
Mean Absolute Error
Mean Predicted Value
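The metrics above can be illustrated with a short sketch. This is not the product's implementation; the function names and sample values below are hypothetical, and the formulas are the standard definitions of these metrics.

```python
# Illustrative sketch of the Performance tab metrics for one
# regression model's test results. Names and data are examples only.

def mean_absolute_error(actual, predicted):
    """Average absolute difference between actual and predicted values."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mean_predicted_value(predicted):
    """Average of the model's predictions over the test cases."""
    return sum(predicted) / len(predicted)

# Hypothetical test-set values for one model:
actual = [10.0, 12.0, 9.5, 11.0]
predicted = [9.0, 12.5, 10.0, 10.5]

print(mean_absolute_error(actual, predicted))  # 0.625
print(mean_predicted_value(predicted))         # 10.5
```

When several models are compared, each model's test set yields one value per metric, and those values are what the Performance tab displays side by side.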
By default, test results for all models are compared. To edit the list of models, click the edit icon above the pane that lists the models to open the Edit Test Selection (Classification and Regression) dialog box.
Residual tab: Displays the residual plot for each model.
You can compare two plots side by side. By default, test results for all models are compared.
To edit the list of models, click the edit icon above the pane that lists the models to open the Edit Test Selection (Classification and Regression) dialog box.
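The values plotted on the Residual tab can be sketched as follows. This is an assumption about the standard definition of a residual (actual value minus predicted value), not the product's code; the sample data is hypothetical.

```python
# Illustrative sketch: residuals for one regression model's test
# results, one residual per test case. Data is hypothetical.
actual = [10.0, 12.0, 9.5, 11.0]
predicted = [9.0, 12.5, 10.0, 10.5]

# Residual = actual - predicted; the Residual tab plots these values.
residuals = [a - p for a, p in zip(actual, predicted)]
print(residuals)  # [1.0, -0.5, -0.5, 0.5]
```

Comparing two residual plots side by side makes it easy to see which model's errors are smaller and whether either model's residuals show a pattern (for example, growing with the predicted value).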