10.3 Evaluate a HeatWave AutoML model

Evaluating a HeatWave AutoML model produces scores for the model, predictions, and explanations. A what-if analysis compares the baseline results against alternative feature values.

During testing, the test table can include a column of known values. After evaluation, if a prediction does not match the known value, the MySQL HeatWave Console marks the prediction in red.

  1. Select the HeatWave AutoML tab in the MySQL HeatWave Console, and then click Connect to MySQL DB System.

  2. Select a DB System from the drop-down menu.

  3. Enter a MySQL user name and password for the DB System.

  4. Click Connect.

  5. Select a model from the list. The lower pane shows details for the model: Training table, Target column, Description, Selected algorithm, Machine learning task, and Training score.

  6. Click Evaluate.

  7. Select table: The prompt shows the name of the training table as a reminder. Select a test table that is compatible with the training table, and click Next.

    The Evaluate Model: model name dialog opens.

  8. Model score:

    • Select one of the Scoring metrics, and click Calculate score. The Results field shows the score.
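
    The console's Model score step corresponds to the ML_SCORE routine in SQL. The following is a minimal sketch, assuming a model handle stored in @model and placeholder schema, table, and column names; the trailing options argument may not be present in older HeatWave releases:

    ```sql
    -- Load the model into HeatWave memory before scoring
    -- (the handle value shown here is illustrative).
    SET @model = 'mlcorpus.census_train_user1_1636729526';
    CALL sys.ML_MODEL_LOAD(@model, NULL);

    -- Score the model against a test table with a chosen metric;
    -- the score is returned in the @score session variable.
    CALL sys.ML_SCORE('mlcorpus.census_test', 'revenue', @model,
                      'accuracy', @score, NULL);
    SELECT @score;
    ```

    The metric name must match the machine learning task; for example, accuracy applies to classification models, while a regression model would use a metric such as r2.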

  9. Explain model:

    • Feature Importance: Shows the name of each feature and its relative importance to the model.
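
    Feature importance can also be produced in SQL with the ML_EXPLAIN routine, which trains a model explainer for a loaded model. The following sketch uses placeholder names; the available explainer types and default behavior vary by HeatWave release:

    ```sql
    -- Train a model explainer; Permutation Importance ranks each
    -- feature by its contribution to the model's predictions.
    -- Table, column, and handle names are placeholders.
    SET @model = 'mlcorpus.census_train_user1_1636729526';
    CALL sys.ML_MODEL_LOAD(@model, NULL);
    CALL sys.ML_EXPLAIN('mlcorpus.census_train', 'revenue', @model,
                        JSON_OBJECT('model_explainer', 'permutation_importance'));
    ```

    The computed importances are stored with the model in the model catalog, which is what the console reads to render the Feature Importance chart.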

  10. Predictions:

    • Click Generate Predictions to create a table of predictions for each row in the test table.

    • Select a prediction row from the table, and click Explain Prediction.

      The dialog shows the Selected row and Feature Importance. The Notes explain which feature had the largest impact on the prediction, and might also identify the feature that had the largest impact against the prediction.

    • Click Back to return to the predictions table.

    • Select a prediction row from the table, and click What If.

    • Values for comparison from included features: Click the i icon to show information for that feature column.

      This includes the Minimum, Mean, and Maximum values, and a bar chart of the value distribution.

    • Click Back to return to the predictions table.

    • Select a prediction row from the table, and click What If.

    • Values for comparison from included features: Adjust the values, and click Create.

      The dialog shows the Comparison data and Feature Importance. The Notes provide explanations for the Baseline and the Comparison. Both explain which feature had the largest impact on that prediction, and might also identify the feature that had the largest impact against that prediction.

    • Click Back to return to the predictions table.
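
The Predictions, Explain Prediction, and What If steps above map onto the ML_PREDICT_TABLE, ML_EXPLAIN_ROW, and ML_PREDICT_ROW routines. The following is a sketch with placeholder table names and feature values; exact argument lists (notably the trailing options parameter) differ between HeatWave releases:

```sql
-- Bulk predictions: write one prediction per test-table row
-- to a new output table.
CALL sys.ML_PREDICT_TABLE('mlcorpus.census_test', @model,
                          'mlcorpus.census_predictions', NULL);

-- Explain one prediction: per-feature attributions for a single
-- row supplied as a JSON object of feature values.
SELECT sys.ML_EXPLAIN_ROW(
    JSON_OBJECT('age', 39, 'workclass', 'State-gov', 'education', 'Bachelors'),
    @model, NULL);

-- What-if analysis: alter one or more feature values and
-- re-predict to compare against the baseline prediction.
SELECT sys.ML_PREDICT_ROW(
    JSON_OBJECT('age', 55, 'workclass', 'State-gov', 'education', 'Bachelors'),
    @model, NULL);
```

The console performs the same comparison when you adjust values and click Create: it predicts once with the row's original values (the Baseline) and once with your adjusted values (the Comparison).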