Hi guys,
I’ve added transformers to my requirements.txt file, but I get ModuleNotFoundError: No module named 'transformers'
when I try to deploy on Streamlit sharing.
Not sure what I’m doing wrong here. Here’s the app if you want to have a glance:
https://share.streamlit.io/charlywargnier/gpuoneshottransformersstreamlit/main/app.py
Thanks,
Charly
Check the sidebar for an error message. We launch the app even if a pip failure occurs, so there’s probably a message saying a driver was missing, or something else that kept the requirements from installing properly.
I see, thanks Randy!
So I’ve checked the logs and fixed the errors I found there. No more ModuleNotFoundError: No module named 'transformers' — great news!
The app shows another issue, however: the installation seems to keep going and never stops.
I pasted the log here: https://pastebin.com/HzEcvN1k
I can’t seem to find any error here, except:
[manager] Error checking Streamlit healthz: Get "http://localhost:8501/healthz": read tcp 127.0.0.1:48364->127.0.0.1:8501: read: connection reset by peer
I’ll keep trying various things over the weekend!
Thanks,
Charly
Quick heads-up @randyzwitch
15 tries later, it’s still not working, and the installation seems to keep looping…
(Latest error log is here, as well as the repo)
I think I know where the issue is coming from, but I’m not sure how to fix it.
When I run the same script in Google Colab, Colab downloads all the needed weights/models.
I believe we need a way to upload these models to Streamlit sharing, but I’m not sure how to do it.
Maybe via an extra text file?
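For context, transformers normally saves downloaded weights to a local cache directory rather than the repo itself. A quick way to see where (assuming the usual TRANSFORMERS_CACHE convention and the default ~/.cache/huggingface location used by recent transformers versions — treat the exact path as an assumption):

```python
import os

# transformers caches downloaded model weights locally; the
# TRANSFORMERS_CACHE environment variable overrides the default
# location (assumed here to be ~/.cache/huggingface/transformers).
default_cache = os.path.join(
    os.path.expanduser("~"), ".cache", "huggingface", "transformers"
)
cache_dir = os.environ.get("TRANSFORMERS_CACHE", default_cache)
print(cache_dir)
```

On a fresh deploy this cache starts out empty, so the weights get re-downloaded each time the app boots.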
Thanks,
Charly
Do you have a feeling for the cumulative size of the models? I’m wondering if you’re exceeding the VM size, and the app is caught in an unfortunate deploy cycle.
It relies on GPU PyTorch. To give you an idea of the size, here’s the error I get on Heroku’s free tier (capped at 500 MB):
Compiled slug size: 863.7M is too large (max is 500M).
Thanks,
Charly
Currently, the limit is 800 MB.
Thanks Randy.
So I’ve reduced the size to 247 MB (according to Heroku) by installing CPU-only wheels.
It works fine on Heroku, yet I still get the same looping issue on Streamlit sharing…
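For anyone hitting the same size wall: a CPU-only pin in requirements.txt might look something like this (the version number and wheel index here are illustrative assumptions, not the app’s actual file):

```text
# CPU-only PyTorch wheels are much smaller than the default GPU builds
-f https://download.pytorch.org/whl/torch_stable.html
torch==1.8.1+cpu
transformers
streamlit
```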
How did you solve this import error? I’m getting the same one on my local computer; I’ve reinstalled Streamlit three times now.
Did anyone find a solution for this?
Hi @someshfengde
Could you please describe your issue in more detail and perhaps share a screenshot of the error you’re running into? That will help us reproduce it. Have you checked for a typo in your requirements.txt file?
As an example, I created a sample app using the transformers module and successfully deployed it on sharing:
https://share.streamlit.io/snehankekre/transformers-test/main/transformers_sharing.py
Best,
Snehan
Hey Snehan!
I got an error on your app when I tried it this morning:
https://share.streamlit.io/snehankekre/transformers-test/main/transformers_sharing.py
Is this expected?
Thanks
Charly
Thanks for flagging, @Charly_Wargnier
I tested it locally but didn’t do so on sharing. I fixed the error by including tensorflow in my requirements.txt file.
It should work now:
Best,
Snehan
Hi @someshfengde,
Apps deployed on Streamlit sharing have some resource limits. One of them is 1 GB of RAM per app. I suspect the GPT-2 model takes up much more than 1 GB of RAM when loaded into memory, leading to the error.
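As a rough sanity check (the parameter counts below are approximate published figures, so treat them as assumptions), the float32 weights alone already approach or exceed that limit:

```python
# Back-of-the-envelope RAM estimate for float32 model weights.
# Actual memory use is higher once activations and framework
# overhead are included.
PARAM_COUNTS = {
    "distilgpt2": 82_000_000,   # approximate
    "gpt2": 124_000_000,        # approximate
    "gpt2-large": 774_000_000,  # approximate
}

def weight_mb(params: int, bytes_per_param: int = 4) -> float:
    """Size of the raw weight tensors in megabytes (float32 = 4 bytes)."""
    return params * bytes_per_param / 1e6

for name, n in PARAM_COUNTS.items():
    print(f"{name}: ~{weight_mb(n):.0f} MB of weights")
```

By this estimate, gpt2 alone is roughly 500 MB of weights before any overhead, which is why a smaller model such as distilgpt2 is a safer fit.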
Best,
Snehan
Ohh, thanks for the reply! I’ll try to build the app with a smaller model.
@Charly_Wargnier That’s expected. I created the app temporarily to demonstrate a fix for the No module named 'transformers' error. I want to avoid a demo eating up sharing resources.