{"id":1043,"hash":"fd4600cd0d2ebe9f59e85434a0a00748e5665872c79e44ef570645664cdd12e5","pattern":"UndefinedMetricWarning: F-score is ill-defined and being set to 0.0 in labels with no predicted samples","full_message":"I'm getting this weird error:\n\nclassification.py:1113: UndefinedMetricWarning: F-score is ill-defined and being set to 0.0 in labels with no predicted samples.\n'precision', 'predicted', average, warn_for)\n\nbut then it also prints the f-score the first time I run:\n\nmetrics.f1_score(y_test, y_pred, average='weighted')\n\nThe second time I run it, it provides the score without the error. Why is that?\n\n>>> y_pred = test.predict(X_test)\n>>> y_test\narray([ 1, 10, 35,  9,  7, 29, 26,  3,  8, 23, 39, 11, 20,  2,  5, 23, 28,\n       30, 32, 18,  5, 34,  4, 25, 12, 24, 13, 21, 38, 19, 33, 33, 16, 20,\n       18, 27, 39, 20, 37, 17, 31, 29, 36,  7,  6, 24, 37, 22, 30,  0, 22,\n       11, 35, 30, 31, 14, 32, 21, 34, 38,  5, 11, 10,  6,  1, 14, 12, 36,\n       25,  8, 30,  3, 12,  7,  4, 10, 15, 12, 34, 25, 26, 29, 14, 37, 23,\n       12, 19, 19,  3,  2, 31, 30, 11,  2, 24, 19, 27, 22, 13,  6, 18, 20,\n        6, 34, 33,  2, 37, 17, 30, 24,  2, 36,  9, 36, 19, 33, 35,  0,  4,\n        1])\n>>> y_pred\narray([ 1, 10, 35,  7,  7, 29, 26,  3,  8, 23, 39, 11, 20,  4,  5, 23, 28,\n       30, 32, 18,  5, 39,  4, 25,  0, 24, 13, 21, 38, 19, 33, 33, 16, 20,\n       18, 27, 39, 20, 37, 17, 31, 29, 36,  7,  6, 24, 37, 22, 30,  0, 22,\n       11, 35, 30, 31, 14, 32, 21, 34, 38,  5, 11, 10,  6,  1, 14, 30, 36,\n       25,  8, 30,  3, 12,  7,  4, 10, 15, 12,  4, 22, 26, 29, 14, 37, 23,\n       12, 19, 19,  3, 25, 31, 30, 11, 25, 24, 19, 27, 22, 13,  6, 18, 20,\n        6, 39, 33,  9, 37, 17, 30, 24,  9, 36, 39, 36, 19, 33, 35,  0,  4,\n        1])\n>>> metrics.f1_score(y_test, y_pred, average='weighted')\nC:\\Users\\Michael\\Miniconda3\\envs\\snowflakes\\lib\\site-packages\\sklearn\\metrics\\classification.py:1113: UndefinedMetricWarning: F-score is ill-defined and being 
set to 0.0 in labels with no predicted samples.\n  'precision', 'predicted', average, warn_for)\n0.87282051282051276\n>>> metrics.f1_score(y_test, y_pred, average='weighted')\n0.87282051282051276\n>>> metrics.f1_score(y_test, y_pred, average='weighted')\n0.87282051282051276\n\nAlso, why is there a trailing 'precision', 'predicted', average, warn_for) in the message? There is no opening parenthesis, so why does it end with a closing one? I am running sklearn 0.18.1 using Python 3.6.0 in a conda environment on Windows 10.\n\nI also looked at here and I don't know if it's the same bug. This SO post doesn't have a solution either.","ecosystem":"pypi","package_name":"scikit-learn","package_version":null,"solution":"As mentioned in the comments, some labels in y_test don't appear in y_pred. Specifically in this case, label '2' is never predicted:\n\n>>> set(y_test) - set(y_pred)\n{2}\n\nThis means that there is no F-score to calculate for this label, and thus the F-score for this label is considered to be 0.0. Since you requested an average of the scores, you must take into account that a score of 0 was included in the calculation, which is why scikit-learn shows you that warning.\n\nThis also explains why you don't see the message the second time. As I mentioned, it is a warning, which is treated differently from an error in Python. The default behavior in most environments is to show a specific warning only once. This behavior can be changed:\n\nimport warnings\nwarnings.filterwarnings('always')  # \"error\", \"ignore\", \"always\", \"default\", \"module\" or \"once\"\n\nIf you set this before importing the other modules, you will see the warning every time you run the code.\n\nThere is no way to avoid seeing this warning the first time, aside from setting warnings.filterwarnings('ignore'). 
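As a minimal, stdlib-only sketch of this filter behavior (not from the original answer; the warning text is just a stand-in, and catch_warnings is used only to count what would be shown):

```python
import warnings

def compute_score():
    # Stand-in for the scikit-learn call that triggers UndefinedMetricWarning.
    warnings.warn("F-score is ill-defined", UserWarning)
    return 0.0

# "once": duplicates of the same warning are shown only the first time.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("once")
    compute_score()
    compute_score()
print(len(caught))  # 1 in a fresh interpreter: the duplicate is suppressed

# "always": the warning is reported on every occurrence.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    compute_score()
    compute_score()
print(len(caught))  # 2: every occurrence is reported
```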
What you can do is decide that you are not interested in the scores of labels that were never predicted, and explicitly specify the labels you are interested in (the labels that were predicted at least once):\n\n>>> import numpy as np\n>>> metrics.f1_score(y_test, y_pred, average='weighted', labels=np.unique(y_pred))\n0.91076923076923078\n\nThe warning will be gone.","confidence":0.95,"source":"stackoverflow","source_url":"https://stackoverflow.com/questions/43162506/undefinedmetricwarning-f-score-is-ill-defined-and-being-set-to-0-0-in-labels-wi","votes":146,"created_at":"2026-04-19T04:52:13.800539+00:00","updated_at":"2026-04-19T04:52:13.800539+00:00"}