Describe the bug
In the calibration curve page, a `scores_df` table is generated to show supporting model-evaluation metrics alongside the calibration curves.
I noticed that my ROC AUC score was unusually low and found that the metric was being computed from `y_pred` (the hard class predictions) instead of `y_prob` (the predicted probabilities). This is incorrect and may confuse users in the future.
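For context, here is a minimal sketch (on synthetic data, not the page's actual dataset or code) showing how scoring with hard predictions deflates ROC AUC relative to scoring with predicted probabilities:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_pred = clf.predict(X_test)               # hard 0/1 labels
y_prob = clf.predict_proba(X_test)[:, 1]   # positive-class probabilities

print("ROC AUC from y_pred:", roc_auc_score(y_test, y_pred))  # typically much lower
print("ROC AUC from y_prob:", roc_auc_score(y_test, y_prob))  # the intended score
```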
Steps/Code to Reproduce
Not applicable.
Expected Results
Expected a ROC AUC in the 0.65-0.75 range for my application.
Actual Results
Observed a ROC AUC in the 0.50-0.55 range instead, consistent with scoring hard predictions (which collapse the ROC curve to a single operating point).
Versions
Not directly relevant since this concerns the documentation page, but scikit-learn 1.2.1.