Error with GPT-2 model

Hi everyone,
I get an error when I try to deploy an app that uses GPT-2, an NLP model downloaded from Hugging Face. I'm not sure of the reason. I'm guessing the model can't be downloaded once the app is running in the cloud?
Here is the message:

```
File "/home/appuser/venv/lib/python3.7/site-packages/streamlit/", line 354, in _run_script
    exec(code, module.__dict__)
File "/app/robot/", line 25, in <module>
    tokenizer = load_tokenizer()
File "/home/appuser/venv/lib/python3.7/site-packages/streamlit/legacy_caching/", line 574, in wrapped_func
    return get_or_create_cached_value()
File "/home/appuser/venv/lib/python3.7/site-packages/streamlit/legacy_caching/", line 558, in get_or_create_cached_value
    return_value = func(*args, **kwargs)
File "/app/robot/", line 17, in load_tokenizer
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
File "/home/appuser/venv/lib/python3.7/site-packages/transformers/", line 1732, in from_pretrained
File "/home/appuser/venv/lib/python3.7/site-packages/transformers/", line 1929, in cached_path
File "/home/appuser/venv/lib/python3.7/site-packages/transformers/", line 2178, in get_from_cache
    "Connection error, and we cannot find the requested files in the cached path."
```

Hi @Maxime_tut,

That was most likely a transient network error while connecting to the Hugging Face servers. Could you try re-deploying your app? I was able to deploy a fork of your repo successfully.
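Since the failure is a transient connection error, one option is to retry the download a few times before giving up. Here's a minimal sketch of that idea; the `load_with_retry` helper and its parameters are illustrative, not part of Streamlit or transformers:

```python
import time


def load_with_retry(loader, retries=3, delay=2.0):
    """Call loader(), retrying on ConnectionError with a growing pause.

    loader: a zero-argument callable that performs the download.
    retries: total number of attempts before the error is re-raised.
    delay: base pause in seconds between attempts.
    """
    for attempt in range(retries):
        try:
            return loader()
        except ConnectionError:
            if attempt == retries - 1:
                raise  # out of attempts, surface the original error
            time.sleep(delay * (attempt + 1))


# In the app, this could wrap the Hugging Face download, e.g.:
# tokenizer = load_with_retry(lambda: GPT2Tokenizer.from_pretrained("gpt2"))
```

Combined with Streamlit's caching (as in your `load_tokenizer` function), the download then only has to succeed once per deployment.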

Best, :balloon:

Oh yes, you're right, it works now. Thanks a lot :slight_smile:
