App doesn't deploy in Streamlit Sharing - Possible issue with deploying PyTorch

Hello!

So my app currently doesn’t deploy in Streamlit Sharing.

For context, here’s the link to the app: https://share.streamlit.io/charlywargnier/pegasus_paraphrasing_streamlit-test/main/app.py

… the related GitHub repo: https://github.com/CharlyWargnier/pegasus_Paraphrasing_Streamlit-test

… and the full issue log: https://pastebin.com/2mjt9ATH

The main error seems to be: [manager] Error checking Streamlit healthz: Get "http://localhost:8501/healthz": dial tcp 127.0.0.1:8501: connect: connection refused

I believe this may be related to an incorrect PyTorch install, but I may be wrong. The PyTorch install line in requirements.txt is currently:

--find-links https://download.pytorch.org/whl/torch_stable.html
torch==1.7.1+cpu
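For anyone reproducing this, here's a minimal requirements.txt sketch for a CPU-only torch pin — pip generally expects `--find-links` on its own line, with the pinned requirement below it (the extra package lines are illustrative assumptions, not copied from the repo):

```text
# Extra wheel index for CPU-only PyTorch builds
--find-links https://download.pytorch.org/whl/torch_stable.html
torch==1.7.1+cpu

# Illustrative app dependencies -- not taken from the actual repo
transformers
streamlit
```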

Any idea @amey-st or anyone else maybe?

Many thanks :pray:
Charly

I suspect the issue might be one of exhausting the available disk storage. PyTorch itself is a pretty big library, but then there’s also this:

Downloading: 100%|██████████| 2.28G/2.28G [00:53<00:00, 42.4MB/s][2020-12-14 23:39:42.789394]

The model itself is 2.28 GB; add a GB or two of installed dependencies, and the container crashes.
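A quick way to sanity-check this theory locally is to compare free disk space against the model size — a minimal stdlib sketch, where the 2.28 GB figure is the only number taken from the log above:

```python
import shutil

# Approximate size of the Pegasus model download, per the log above (~2.28 GB)
MODEL_SIZE_BYTES = int(2.28 * 1024**3)

# Free space on the volume where the model cache would live
usage = shutil.disk_usage(".")
free_gb = usage.free / 1024**3
print(f"Free disk space: {free_gb:.1f} GB")

if usage.free < MODEL_SIZE_BYTES:
    print("Not enough room for the model download alone")
```

On a constrained container, remember the comparison should also account for the space pip already consumed installing torch and its dependencies.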

Thanks for the prompt feedback Randy!

I’ll see if I can use anything lighter. What’s the max disk storage space allowed? (cc: @amey-st)

Thanks,
Charly

Hi Randy

So, as discussed, I’ve tried a lighter model.

The app did work once (hooray!), but it’s not working anymore, and I get a similar error:

Error checking Streamlit healthz: Get "http://localhost:8501/healthz": dial tcp 127.0.0.1:8501: connect: connection refused

Log trace: https://pastebin.com/7jT96YXj

So it looks like this may still be a memory-related issue on S4.

Not sure if there would be any workaround if it is indeed memory-related, as I believe this Transformers model is the lightest available in the Hugging Face library.

Note that the app/repo are different from the ones I shared on Monday.

Thanks,
Charly

cc: @amey-st

Hi @Charly_Wargnier! The app is crashing because, even with the lighter model, it’s still exceeding the memory limits. Is there any room for optimizing the memory utilization (e.g. using a smaller dataset)?

Cheers,
Amey

Thanks for confirming Amey!

I believe I’ll have to rule out deploying this app on S4, as this is the lightest possible model. :frowning:

Out of interest, do you know how much RAM I’d need to run that app smoothly? Even an estimate would help, as I’m trying to assess the cost of deploying on Heroku, AWS, or GCP.

Thanks very much! :pray:

Charly

Hey @Charly_Wargnier, I’m having a bit of trouble installing your app locally (torch isn’t being friendly), but hopefully these help:

  1. The RAM limit on Streamlit Sharing is (currently) 3GB, so at the very minimum you probably need 4GB.
  2. If you run your app locally, you can check how much RAM Streamlit is using via Task Manager on Windows or Activity Monitor on macOS.
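As a rough answer to the cost question, here's a back-of-envelope sketch — assuming fp32 weights at 4 bytes per parameter and roughly 568M parameters for PEGASUS-large (both assumptions, not figures from this thread):

```python
# Back-of-envelope RAM estimate for serving a transformer model.
# Assumptions: ~568M parameters (PEGASUS-large), fp32 weights (4 bytes each).
params = 568_000_000
bytes_per_param = 4

weights_gb = params * bytes_per_param / 1024**3
print(f"Weights alone: ~{weights_gb:.1f} GB")

# Activations, the tokenizer, Python, and Streamlit itself add overhead;
# 2x-3x the raw weight size is a common rule of thumb for headroom.
print(f"Rough RAM budget: ~{2 * weights_gb:.0f}-{3 * weights_gb:.0f} GB")
```

The weights alone come out around 2 GB, which lines up with the 2.28 GB download in the earlier log and with the 3 GB platform limit being too tight.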

Hope this helps!

Thanks Guido!

Streamlit is running on localhost, but I can’t seem to find any RAM data for it.

I’m on Windows; do you know where I’d need to check?

Thanks,
Charly

You would need to find a process in the processes list from your screenshot that is named python, then check how much memory it’s using.
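If eyeballing Task Manager is fiddly, another option is to have the app print its own peak memory — a stdlib sketch using the `resource` module (Unix only, so it helps on a Linux/macOS box or inside the deployment container, not on Windows):

```python
import resource
import sys

# Peak resident set size of the current Python process.
# Note: ru_maxrss is reported in kilobytes on Linux, bytes on macOS.
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
if sys.platform == "darwin":
    peak_mb = peak / 1024**2
else:
    peak_mb = peak / 1024
print(f"Peak memory so far: {peak_mb:.1f} MB")
```

Dropping a line like this at the end of the app script would show the high-water mark after the model has loaded and run once.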
