App hanging on "Your app is in the oven" using conda after fixing torch GPU error

Hello,

This is my first time using Streamlit, and I am stuck at the deployment step. The app runs fine locally, but when I deployed it to the cloud I first ran into a problem with torch trying to use CUDA-specific code.

I followed this discussion: https://github.com/pytorch/pytorch/issues/26340 and added a custom pipenv index to grab the CPU-only PyTorch build from there (Added torch cpu version · damianr13/Racoont-AI@ef7e112 · GitHub), but that resulted in a dependency conflict which made pipenv fail to install the dependencies. Locally it only worked by running pipenv install --skip-lock, and I did not find a way to pass extra args to the pipenv command on Streamlit Cloud.
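For reference, the change looked roughly like the Pipfile sketch below. The index URL here is PyTorch's current CPU-only wheel index and the pins are illustrative; the exact contents of my commit may differ:

```toml
# Pipfile (sketch) — the pytorch-cpu source URL and package pins are illustrative

[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[[source]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"  # CPU-only wheels, no CUDA payload
verify_ssl = true

[packages]
streamlit = "*"
torch = {version = "*", index = "pytorch-cpu"}  # resolve torch from the CPU index only
```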

Then I followed this post: Managing your Streamlit dependencies using conda, and switched my package manager from pipenv to conda. Now the app hangs.
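The conda setup I switched to is along these lines (a minimal sketch; the environment name and version pins are illustrative):

```yaml
# environment.yml (sketch) — name and pins are illustrative
name: racoont-ai
channels:
  - pytorch
  - conda-forge
dependencies:
  - python=3.9
  - pytorch
  - cpuonly          # meta-package on the pytorch channel that selects CPU-only builds
  - pip
  - pip:
      - streamlit    # Streamlit itself installed via pip inside the conda env
```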

Is there any known solution for running code that depends on PyTorch on Streamlit Cloud?

Can you share the link to your app?

Sure.

Streamlit URL: https://damianr13-racoont-ai-main-0afsgp.streamlit.app/

Link to the repo: GitHub - damianr13/Racoont-AI

Your app is most likely hanging due to hitting the resource limit. Here’s a graph of the app’s memory usage over the past six hours:

[memory usage chart]

I see. I don’t think I had access to that chart.

Given that the app doesn’t even start, all this RAM is probably consumed by conda / pip. Do you have any advice on how I could handle this?

Yup, this chart is from our backend. It looks like your dependencies total nearly 3 GB. I’d recommend either running the app locally or on a platform where you can pay for increased resources beyond 3 GB. Alternatively, if this app is for a nonprofit or educational organization, let us know and we can see what we can do in terms of resources.
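If you do want to try squeezing under the limit with pip, a requirements.txt along these lines pulls the CPU-only torch build, which is far smaller than the default CUDA wheel. This is only a sketch: the version pin is illustrative, and it relies on the fact that +cpu builds exist only on PyTorch's own index, so pip cannot fall back to the CUDA wheel on PyPI:

```
# requirements.txt (sketch) — version pin is illustrative
--extra-index-url https://download.pytorch.org/whl/cpu
torch==2.1.0+cpu   # +cpu wheels live only on the extra index, so the CUDA build can't be picked
streamlit
```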

I understand. I will try running it somewhere else then. Thank you very much for the support.

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.