I noticed that when I store a relatively large, complicated object to
session_state from a form callback, Streamlit chokes and exits after a while. It doesn't show me any errors indicating what's gone wrong, even though I have been running with
--logger.level=debug. Any idea what could be going wrong?
- Streamlit version: 1.15.1
- Python version: 3.8.10
- Using Conda? PipEnv? PyEnv? Pex? Conda
- OS version: macOS 12.6.1
- Browser version: Firefox 107
Is this large object better cached than stored in session state?
Other than that, I’m not sure without more information about your script.
I can’t do that because the object represents a large model on a GPU.
Is it possible to see more logs beyond starting Streamlit with --logger.level=debug?
I still can't say much without specifics about your script (code snippets, GitHub links, console logs), but I do know that other people have cached models. Searching the forum for "cache model" turns up lots of cases, including instances where Streamlit staff have very kindly inspected the setup closely to debug complicated ones.
It was my (non-expert) understanding that caching a model is more in line with best practice generally, but perhaps someone with a deeper understanding can speak to your case of using session state for that.
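For what it's worth, here is a minimal sketch of the "cache the heavy object" pattern that Streamlit's resource cache provides (st.cache_resource on recent versions, st.experimental_singleton around 1.15): the loader runs once per process, and subsequent script reruns reuse the same object instead of rebuilding it. To keep the sketch self-contained it uses plain functools.lru_cache instead of Streamlit's decorator, and load_model with its checkpoint argument are hypothetical stand-ins for a real GPU model loader.

```python
import functools

# In a Streamlit app this decorator would be @st.cache_resource
# (or @st.experimental_singleton on ~1.15); lru_cache illustrates
# the same load-once, reuse-everywhere behavior.
@functools.lru_cache(maxsize=None)
def load_model(checkpoint: str):
    # Expensive construction happens only on the first call with
    # a given argument; later calls return the cached object.
    print(f"loading model from {checkpoint}")
    return {"checkpoint": checkpoint, "weights": [0.0] * 4}  # placeholder object

m1 = load_model("weights.pt")
m2 = load_model("weights.pt")
assert m1 is m2  # the second "rerun" got the same cached instance
```

The key property for a large model is identity: every rerun of the script gets the same in-memory object, so the GPU allocation happens exactly once rather than being recreated (or serialized into session state) on each interaction.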
Thank you so much @mathcatsand! I'll try to put together a small project to reproduce the issue, but I'm dealing with a big codebase so it's not easy. However, I think this could be related to the Ray Core library (https://www.ray.io/ray-core). The object I'm trying to save/cache instantiates a Ray cluster, and my guess is that that doesn't play nicely with Streamlit's async/await model.