{"id":1022,"hash":"277ee8a76ec6415fe315d3ff642fc37e22f1e4629b02de54e41619cd9a36c182","pattern":"PyTorch BERT TypeError: forward() got an unexpected keyword argument &#39;labels&#39;","full_message":"Training a BERT model using PyTorch transformers (following the tutorial here).\n\nFollowing statement in the tutorial\n\nloss = model(b_input_ids, token_type_ids=None, attention_mask=b_input_mask, labels=b_labels)\n\nleads to\n\nTypeError: forward() got an unexpected keyword argument 'labels'\n\nHere is the full error,\n\nTypeError                                 Traceback (most recent call last)\n<ipython-input-53-56aa2f57dcaf> in <module>\n     26         optimizer.zero_grad()\n     27         # Forward pass\n---> 28         loss = model(b_input_ids, token_type_ids=None, attention_mask=b_input_mask, labels=b_labels)\n     29         train_loss_set.append(loss.item())\n     30         # Backward pass\n\n~/anaconda3/envs/systreviewclassifi/lib/python3.6/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)\n    539             result = self._slow_forward(*input, **kwargs)\n    540         else:\n--> 541             result = self.forward(*input, **kwargs)\n    542         for hook in self._forward_hooks.values():\n    543             hook_result = hook(self, input, result)\n\nTypeError: forward() got an unexpected keyword argument 'labels'\n\nI cant seem to figure out what kind of argument the forward() function expects.\n\nThere is a similar problem here, but I still do not get what the solution is.\n\nSystem information:\n\nOS: Ubuntu 16.04 LTS\nPython version: 3.6.x\nTorch version: 1.3.0\nTorch Vision version: 0.4.1\nPyTorch transformers version: 1.2.0","ecosystem":"pypi","package_name":"pytorch","package_version":null,"solution":"As far as I know, the BertModel does not take labels in the forward() function. 
Check out the forward function parameters.\n\nI suspect you are trying to fine-tune BertModel for a sequence classification task, and the API provides a class for exactly that: BertForSequenceClassification. As you can see from its forward() function definition:\n\ndef forward(self, input_ids, attention_mask=None, token_type_ids=None,\n            position_ids=None, head_mask=None, labels=None):\n\nPlease note that the forward() method returns the following.\n\nOutputs: `Tuple` comprising various elements depending on the configuration (config) and inputs:\n        **loss**: (`optional`, returned when ``labels`` is provided) ``torch.FloatTensor`` of shape ``(1,)``:\n            Classification (or regression if config.num_labels==1) loss.\n        **logits**: ``torch.FloatTensor`` of shape ``(batch_size, config.num_labels)``\n            Classification (or regression if config.num_labels==1) scores (before SoftMax).\n        **hidden_states**: (`optional`, returned when ``config.output_hidden_states=True``)\n            list of ``torch.FloatTensor`` (one for the output of each layer + the output of the embeddings)\n            of shape ``(batch_size, sequence_length, hidden_size)``:\n            Hidden-states of the model at the output of each layer plus the initial embedding outputs.\n        **attentions**: (`optional`, returned when ``config.output_attentions=True``)\n            list of ``torch.FloatTensor`` (one for each layer) of shape ``(batch_size, num_heads, sequence_length, sequence_length)``:\n            Attentions weights after the attention softmax, used to compute the weighted average in the self-attention heads. \n\nHope this helps!","confidence":0.95,"source":"stackoverflow","source_url":"https://stackoverflow.com/questions/58454157/pytorch-bert-typeerror-forward-got-an-unexpected-keyword-argument-labels","votes":26,"created_at":"2026-04-19T04:52:12.289683+00:00","updated_at":"2026-04-19T04:52:12.289683+00:00"}