Here is the code I am running:
``` python
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
from datasets import load_dataset
dataset = load_dataset("yelp_review_full")
dataset["train"][100]
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
def tokenize_function(examples):
    # pad/truncate every review to the model's maximum input length
    return tokenizer(examples["text"], padding="max_length", truncation=True)
tokenized_datasets = dataset.map(tokenize_function, batched=True)
from transformers import AutoModelForSequenceClassification
model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=5)
from transformers import TrainingArguments
training_args = TrainingArguments(output_dir="test_trainer")
```
When I run it, I hit an SSL problem in the transformers/datasets stack:
```
Traceback (most recent call last):
  File "/Users/luojiayun/Desktop/transformers/rerate_chn.py", line 6, in <module>
    dataset = load_dataset("yelp_review_full")
  File "/Users/luojiayun/anaconda3/lib/python3.10/site-packages/datasets/load.py", line 2519, in load_dataset
    builder_instance = load_dataset_builder(
  File "/Users/luojiayun/anaconda3/lib/python3.10/site-packages/datasets/load.py", line 2192, in load_dataset_builder
    dataset_module = dataset_module_factory(
  File "/Users/luojiayun/anaconda3/lib/python3.10/site-packages/datasets/load.py", line 1843, in dataset_module_factory
    raise e1 from None
  File "/Users/luojiayun/anaconda3/lib/python3.10/site-packages/datasets/load.py", line 1779, in dataset_module_factory
    raise ConnectionError(f"Couldn't reach '{path}' on the Hub ({type(e).__name__})")
ConnectionError: Couldn't reach 'yelp_review_full' on the Hub (SSLError)
```
Following a suggestion from StackOverflow, I added this at the top of the script:
```
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
```
but it did not fix the problem.
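One thing I have since read (unverified, so treat this as a sketch): `ssl._create_default_https_context` only affects the standard-library `urllib`, while `datasets` downloads through `requests`/`huggingface_hub`, which ignore that patch and instead read CA-bundle environment variables. The certificate path below is an assumption — it would need to be the root CA that dev-sidecar exported on the machine.

```python
import os

# Assumed location of dev-sidecar's exported root certificate — substitute
# the actual path on your machine.
ca_cert = os.path.expanduser("~/.dev-sidecar/dev-sidecar.ca.crt")

# requests (used by huggingface_hub / datasets) honors these variables
# instead of ssl._create_default_https_context.
os.environ["REQUESTS_CA_BUNDLE"] = ca_cert
os.environ["CURL_CA_BUNDLE"] = ca_cert

# These must be set before the first import of datasets/transformers, e.g.:
# from datasets import load_dataset
# dataset = load_dataset("yelp_review_full")
```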
I am using dev-sidecar and have already added its root certificate as prompted — could that be related?
If not, I would be very grateful if someone could point me toward a solution.
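In case it matters, one workaround I am considering is sidestepping the proxy entirely by pointing the hub client at a mirror via `HF_ENDPOINT` (which `huggingface_hub` reads). The URL below is a third-party community mirror I have seen cited, not an official endpoint, so this is only a sketch:

```python
import os

# Must be set before importing datasets/transformers so the hub client
# picks it up. hf-mirror.com is an unofficial community mirror (assumption:
# it serves this dataset).
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

# from datasets import load_dataset
# dataset = load_dataset("yelp_review_full")
```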