I am running code that keeps reading processed images from an AI program, which writes them into a temp.jpeg file. The program continuously writes to this .jpeg file, and Streamlit continuously reads it to stream the results in a designated video area.
The code I am using is as follows:
import streamlit as st
import cv2

vid_area = st.empty()

@st.cache_resource
def read_image():
    try:
        # Open first to confirm the file exists and is readable,
        # then decode it (cv2.imread; cv2.imopen does not exist)
        with open('temp.jpeg', 'rb'):
            return cv2.imread('temp.jpeg')
    except OSError:
        return None

while True:
    frame = read_image()
    if frame is not None:
        vid_area.image(frame, channels='BGR')
I’m also open to using any other method to accomplish the same task.
Expected behavior:
The images are streamed to the preferred host
Actual behavior:
Without st.cache_resource, the images are streamed correctly, but memory eventually runs out. With st.cache_resource, the first image is read but it never updates.
It appears that as the app is used, memory eventually runs out owing to the ever-growing cache. What you can do is look into clearing the cache.
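To make that suggestion concrete, here is a minimal, stdlib-only sketch of the clear-before-read pattern using functools.lru_cache, whose cache_clear() plays the same role as the .clear() method Streamlit exposes on functions decorated with st.cache_resource. FRAME_PATH and the helper names are placeholders, not Streamlit API:

```python
import functools
import os
import tempfile

# Placeholder path standing in for the writer's temp.jpeg
FRAME_PATH = os.path.join(tempfile.gettempdir(), "temp.jpeg")

@functools.lru_cache(maxsize=1)
def read_frame_bytes():
    # Read whatever the writer last produced; None if unreadable
    try:
        with open(FRAME_PATH, "rb") as f:
            return f.read()
    except OSError:
        return None

def latest_frame():
    # Drop the stale entry before each read so the cache never
    # grows and never pins an old frame in memory
    read_frame_bytes.cache_clear()
    return read_frame_bytes()
```

With Streamlit, the analogous call would be read_image.clear() before re-invoking the cached function, or st.cache_resource.clear() to flush every cached resource at once.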
Thank you for your help, but this does not seem to solve my problem. I'm processing a video that outputs every frame, and I'm using Streamlit to read all the frames and stream them. So the 'ttl' of each cache entry is only about 0.033 s to keep up with a 30 fps video. Doing this caused:
KeyError: 'd41d8cd98f00b204e9800998ecf8427e'
Using 'max_entries' doesn't feel right either, because the function takes no input and there is only the one temp file to read.
Hope that this gives more context. I’m still new to Streamlit, so I’m open to other suggestions.
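One alternative to a millisecond-scale ttl is to key the cache on the file's modification time, so a re-read happens only when the writer actually replaces the frame. Below is a minimal stdlib sketch of the idea; FRAME_PATH is a placeholder, and with Streamlit the equivalent would be an st.cache_data function that takes the mtime as an argument (optionally with max_entries=1 to keep the cache bounded):

```python
import os
import tempfile

# Placeholder path standing in for the writer's temp.jpeg
FRAME_PATH = os.path.join(tempfile.gettempdir(), "temp.jpeg")

# Single-entry cache keyed by the file's modification time
_cache = {"mtime": None, "data": None}

def read_if_changed(path=FRAME_PATH):
    # Re-read the file only when its mtime changes; otherwise
    # return the frame already held in memory
    try:
        mtime = os.path.getmtime(path)
    except OSError:
        return None
    if mtime != _cache["mtime"]:
        with open(path, "rb") as f:
            _cache["data"] = f.read()
        _cache["mtime"] = mtime
    return _cache["data"]
```

Called inside the display loop, this keeps exactly one frame in memory no matter how long the app runs, and it never serves a stale frame once the writer updates the file.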