Model size too large?

Hi everyone,
I am deploying my first streamlit app on the Community cloud and am constantly encountering errors. My first obstacle was that the requirements were not read in, but I could solve that problem. The app now loads all required libraries, which is great. However, I have another problem which I am stuck at for a month now.
When I want to run my app, the following error pops up directly after deployment.

After a few experiments, I assume it could be because of the size of the machine learning model I use, which is slightly larger than 1 GB. What confuses me is that the error message suggests the app is running locally; at least, that's how I interpret the "system memory" part of the error message.

Help would be much appreciated, as I have already tried a lot of things to troubleshoot the app.

  1. Are you running your app locally or is it deployed?
    Deployed

  2. If your app is deployed:
    a. Is it deployed on Community Cloud or another hosting platform? → Community Cloud

    b. Share the link to the public deployed app. → https://lid-app.streamlit.app/

  3. Share the link to your app’s public GitHub repository → GitHub - Wolftess/LID: a language identification system working for over 400 languages, including several German dialects like Tyrolean, Swabian and Low Saxon.

  4. Share the full text of the error message (not a screenshot). → see above

  5. Share the Streamlit and Python versions. → Streamlit (latest version), Python 3.10.12

Thanks a lot!

You might have exceeded the resource limits.

Have a look at this guide on resource limits.

Please post your error message as text, not as an image.

This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.