Model size too large?

Hi everyone,
I am deploying my first Streamlit app on Community Cloud and keep running into errors. My first obstacle was that the requirements were not read in, but I was able to solve that problem. The app now loads all required libraries, which is great. However, I have hit another problem that I have been stuck on for a month now.
When I try to run my app, the following error pops up right after deployment.

After a few experiments, I assume it could be because of the size of the machine learning model, which is a bit larger than 1 GB. What confuses me is that the error message seems to indicate that the app is running locally; at least that's what I attribute the "system memory" part of the error message to.

Help would be much appreciated as I already tried a lot to troubleshoot the app.

  1. Are you running your app locally or is it deployed?
    Deployed

  2. If your app is deployed:
    a. Is it deployed on Community Cloud or another hosting platform? → Community Cloud

    b. Share the link to the public deployed app. → https://lid-app.streamlit.app/

  3. Share the link to your app's public GitHub repository → GitHub - Wolftess/LID: A language identification system working for over 400 languages, including several German dialects like Tyrolean, Swabian and Low Saxon.

  4. Share the full text of the error message (not a screenshot). → see above

  5. Share the Streamlit and Python versions. → latest Streamlit version, python_version = "3.10.12"

Thanks a lot!

You might have exceeded the resource limits.

Have a look at this guide on resource limits.
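One thing worth checking: if the 1 GB model is loaded fresh on every script rerun, memory use can easily exceed the Community Cloud limit. Wrapping the loader in st.cache_resource so the model is created only once per container often helps. Here is a minimal sketch, assuming a fastText-style language-ID model; the filename, loader, and widget are placeholders, not taken from your repo:

```python
import streamlit as st

@st.cache_resource  # cache the loaded model across reruns and sessions
def load_model(path: str):
    # Placeholder loader: swap in whatever your repo actually uses.
    # fastText is only an example of a typical language-ID model format.
    import fasttext
    return fasttext.load_model(path)

model = load_model("lid_model.bin")  # hypothetical filename

text = st.text_area("Enter text to identify")
if text:
    # fastText's predict() cannot handle newlines, so strip them first
    labels, probs = model.predict(text.replace("\n", " "))
    st.write(labels[0], float(probs[0]))
```

If the model is already cached like this and the app still crashes, the model plus its dependencies may simply be too large for Community Cloud.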

Also, please post your error message as text, not as an image.