I am running code that keeps reading processed images from an AI program, which writes them into a temp.jpeg file. The program continuously overwrites this .jpeg file, and Streamlit continuously reads it to stream the results in a designated video area. The code I am using is as follows:
import time
import streamlit as st

vid_area = st.empty()
while True:
    # Re-read the file each iteration; the AI program keeps overwriting it
    with open('temp.jpeg', 'rb') as f:
        frame = f.read()
    if frame:
        vid_area.image(frame)
    time.sleep(1 / 30)  # roughly match a 30 fps writer
I’m also open to using any other method to accomplish the same task. The expected behavior is that the images are streamed continuously to the viewer.
Without st.cache_resource, the images are streamed correctly, but the memory will eventually run out. With st.cache_resource, the first image is read but it will not change.
- Streamlit version: 1.21.0
- Python version: 3.8.10
- Using PyEnv
- OS version: Ubuntu
It appears that as the app runs, memory is exhausted over time owing to the ever-growing cache. What you can do is look into limiting or clearing the cache.
Please see the section on Controlling cache size and duration in the Streamlit docs. For example, setting
ttl=3600 expires each cached entry 3600 seconds after it is stored. This parameter applies to both st.cache_data and st.cache_resource.
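To illustrate what ttl means, here is a toy, hand-rolled TTL cache. This is only a sketch of the semantics; Streamlit implements all of this internally, so in practice you would just pass ttl to the decorator:

```python
import time

class TTLCache:
    """Toy cache where each entry expires ttl seconds after it is stored."""

    def __init__(self, ttl):
        self.ttl = ttl
        self.store = {}  # key -> (value, stored_at)

    def get(self, key, compute):
        entry = self.store.get(key)
        if entry is not None and time.monotonic() - entry[1] < self.ttl:
            return entry[0]  # entry still fresh: reuse it
        value = compute()    # expired or missing: recompute and restore
        self.store[key] = (value, time.monotonic())
        return value

cache = TTLCache(ttl=0.2)
first = cache.get("frame", lambda: "v1")
second = cache.get("frame", lambda: "v2")  # within ttl: still "v1"
time.sleep(0.3)
third = cache.get("frame", lambda: "v3")   # ttl elapsed: recomputed
```

Because expired entries are replaced rather than accumulated, the cache stops growing without bound.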
Hope this helps!
Thank you for your help, but this does not seem to solve my problem. I’m processing a video that outputs every frame, and I’m using Streamlit to read each frame and stream it. So the ttl of each cache entry would need to be about 0.033 s to keep up with a 30 fps video. Doing this caused:
Using ‘max_entries’ doesn’t feel right either, because the function takes no input and I only ever read the one temp file.
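I think this also explains why the first image never updates: a cached function with no arguments has a single cache key, so it always returns its first result. A quick sketch using functools.lru_cache as a stand-in for st.cache_resource (just an analogy, not Streamlit’s actual machinery):

```python
import functools

calls = []

@functools.lru_cache(maxsize=None)
def read_frame():
    # Stand-in for reading temp.jpeg; would return new bytes each call
    calls.append(1)
    return f"frame-{len(calls)}"

a = read_frame()
b = read_frame()  # same zero-argument key: the cached "frame-1" is returned
```

The function body runs only once; every later call hits the cache, just like my stale first frame.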
I hope this gives more context. I’m still new to Streamlit, so I’m open to other suggestions.
This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.