
Transformers model from Hugging Face throws error that specific classes couldn't be loaded

Full error message
Hi, after running the code below, I get the following error.

ValueError: Could not load model facebook/bart-large-mnli with any of the following classes: (<class 'transformers.models.auto.modeling_tf_auto.TFAutoModelForSequenceClassification'>,).

import tensorflow as tf
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

Could someone please help.
Thank you!

I had the same issue! Somebody commented that you need to have PyTorch installed (https://github.com/huggingface/transformers/issues/16849). To sum it up:

Some models only exist as PyTorch models (e.g. deepset/roberta-base-squad2). Calling pipeline() selects the framework (TF or PyTorch) based on what is installed on your machine (or venv, in my case). If both are installed, PyTorch is selected. If you don't have PyTorch installed, you get the above-mentioned error.

Installing PyTorch solved the issue for me! In the GitHub issue, another workaround is mentioned: load the model in TF with from_pt=True, save a personal copy as a TF model with save_pretrained, and publish it with push_to_hub.
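To illustrate the selection behavior described above, here is a minimal sketch. The function pick_framework and its parameters are hypothetical, not the actual transformers internals; the real library performs a similar availability check when pipeline() is called without an explicit framework argument.

```python
import importlib.util

def pick_framework(has_torch=None, has_tf=None):
    """Rough sketch of how pipeline() chooses a backend (hypothetical helper).

    If availability flags are not given, detect installed packages.
    """
    if has_torch is None:
        has_torch = importlib.util.find_spec("torch") is not None
    if has_tf is None:
        has_tf = importlib.util.find_spec("tensorflow") is not None
    if has_torch:
        return "pt"  # PyTorch wins when both frameworks are available
    if has_tf:
        return "tf"  # TensorFlow is used only when PyTorch is absent
    raise RuntimeError("Neither PyTorch nor TensorFlow is installed")
```

Since facebook/bart-large-mnli ships PyTorch weights, a TF-only environment falls into the "tf" branch and the load then fails with the ValueError above; installing PyTorch flips the result to "pt" and the load succeeds.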
