Video frames rendered with st.image() do not play at the same speed on cloud platforms as they do locally

This is a repeat of a topic that was created here: and also an issue that was raised here:
I'm not able to find a proper solution or workaround for this. It most certainly seems like a Streamlit bug.

Hi @derickcjohn. You can't get live frame detection on Streamlit Cloud, but you can get it on your own host. This is due to the latency of the model output. If your model is small, i.e. you use only a cv2 Haar cascade frontal-face classifier, then you can see live frame detection. If the model size increases, you can't see live frame detection.

Hi @Guna_Sekhar_Venkata, I'm not running any model; I'm just outputting a video as frames using st.image(). There is no latency here, since the video is already present in my repo. In the issue linked above, you can see from the logs that the frames are being generated, but st.image() is not able to render them.
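For context, here is a minimal sketch of the kind of playback loop being described: reading a local video file and rendering it frame by frame with st.image(). The file path, FPS fallback, and helper names are my own assumptions, not the original poster's exact code.

```python
# Minimal sketch (assumed code, not the poster's repo): play a local
# video frame-by-frame with st.image(), paced to the source FPS.
import time

def frame_interval(fps: float, fallback: float = 30.0) -> float:
    """Seconds to sleep between frames; guards against the 0 FPS some
    containers report via OpenCV."""
    if not fps or fps <= 0:
        fps = fallback
    return 1.0 / fps

def play_video(path: str) -> None:
    # Heavy imports kept inside the function so the helper above works
    # even without OpenCV/Streamlit installed.
    import cv2
    import streamlit as st

    cap = cv2.VideoCapture(path)
    delay = frame_interval(cap.get(cv2.CAP_PROP_FPS))
    placeholder = st.empty()  # reuse one slot instead of stacking images
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV decodes to BGR; st.image expects RGB by default.
        placeholder.image(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        time.sleep(delay)  # pace playback to the source frame rate
    cap.release()
```

Locally this plays at roughly the source speed; on Streamlit Cloud the same loop stutters because each placeholder.image() call has to push the frame over the network to the browser.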

Thanks @derickcjohn for correcting my statement.
Happy Streamlit-ing :balloon:

Hey @derickcjohn, I have seen that tagged post. Then it is definitely a Streamlit Cloud issue; I have also faced it many times. Because of that, I ask users for a single image for object detection instead of live webcam frames. Feel free to look at my application:

Streamlit cloud application

Thanks for your response, @Guna_Sekhar_Venkata, and that's an excellent app you have. I'm actually trying to stream frames (after running an object detection model on a live video stream) from my local system to the cloud, so I'm also looking for a method to do that.
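One possible direction for that last point (an assumption on my part, not an established Streamlit feature): JPEG-encode each detection frame on the local machine and POST it to a small HTTP endpoint, then have the cloud app poll that endpoint and display the latest frame with st.image(). The endpoint URL and function names below are hypothetical.

```python
# Sketch of the local "sender" side: push JPEG-encoded frames to a
# hypothetical ingest endpoint that a cloud-hosted app can poll.
import urllib.request

INGEST_URL = "https://example.com/ingest"  # hypothetical endpoint

def build_frame_request(jpeg_bytes: bytes,
                        url: str = INGEST_URL) -> urllib.request.Request:
    """Wrap one encoded frame in a POST request."""
    return urllib.request.Request(
        url,
        data=jpeg_bytes,
        headers={"Content-Type": "image/jpeg"},
        method="POST",
    )

def send_frames(video_source=0) -> None:
    # OpenCV import kept local so the helper above has no hard dependency.
    import cv2

    cap = cv2.VideoCapture(video_source)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # ... run the object-detection model on `frame` here ...
        ok, buf = cv2.imencode(".jpg", frame)
        if ok:
            urllib.request.urlopen(build_frame_request(buf.tobytes()))
    cap.release()
```

The trade-off is that every frame makes a full HTTP round trip, so throughput is limited; for genuinely low-latency streaming a WebRTC-based approach (e.g. the community streamlit-webrtc component) is usually suggested instead.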