Upload File: 400 Bad Request, maximum recursion depth exceeded

Hi, just to clarify: this issue only occurs when I deploy the Streamlit app on AWS Fargate. It works fine both locally and on an AWS EC2 instance. However, I have been unable to find a way to debug it, and I am wondering if the community can help.

The error is a 400 Bad Request when re-uploading a CSV file. The file is only 4 KB. The error is intermittent, and the upload seems to succeed only if I manually click the “x” button and then refresh the browser.

Checking the logs revealed that there were issues removing the previous file in the session, with the error “RecursionError: maximum recursion depth exceeded”.

2021-02-01T17:23:11.829+08:00
  File "/usr/local/lib/python3.7/site-packages/streamlit/uploaded_file_manager.py", line 178, in remove_files
    self._remove_files(session_id, widget_id)
  File "/usr/local/lib/python3.7/site-packages/streamlit/uploaded_file_manager.py", line 163, in _remove_files
    self.update_file_count(session_id, widget_id, 0)
  File "/usr/local/lib/python3.7/site-packages/streamlit/uploaded_file_manager.py", line 229, in update_file_count
    self._on_files_updated(session_id, widget_id)
  File "/usr/local/lib/python3.7/site-packages/streamlit/uploaded_file_manager.py", line 81, in _on_files_updated
    self.on_files_updated.send(session_id)
  File "/usr/local/lib/python3.7/site-packages/blinker/base.py", line 267, in send
    for receiver in self.receivers_for(sender)]
  File "/usr/local/lib/python3.7/site-packages/blinker/base.py", line 267, in <listcomp>
    for receiver in self.receivers_for(sender)]
  File "/usr/local/lib/python3.7/site-packages/streamlit/server/server.py", line 266, in on_files_updated
    self._uploaded_file_mgr.remove_session_files(session_id)
  File "/usr/local/lib/python3.7/site-packages/streamlit/uploaded_file_manager.py", line 196, in remove_session_files
    self.remove_files(*files_id)
RecursionError: maximum recursion depth exceeded

The code snippet is below. I tried setting a higher recursion limit, but it had no effect.

import sys

import pandas as pd
import streamlit as st

# Raising the recursion limit did not help
sys.setrecursionlimit(10000)


def main():
    st.title('MVP Demo Site')
    uploaded_file = st.file_uploader("Upload CSV Data", type=['csv'])
    if uploaded_file is not None:
        df = pd.read_csv(uploaded_file)
        st.dataframe(df)


main()

Help? :frowning:

Hi @Jake -

It appears this is a bug:

Best,
Randy

Hi @randyzwitch thanks for zeroing in.

Our cloud engineer mentioned that because the Streamlit app holds the state of files in the file uploader, it can only run in a single container rather than being scaled out. After turning off the load balancer, everything works fine.

Hope this can help someone who encountered the same issue.

On a separate note, does anyone know if it can work at scale and overcome the file uploader issue?

1 Like

I’ll have to pass this on to one of our engineers who has more experience with these sorts of deployment issues.

Hi @Jake !

because the Streamlit app holds the state of files in the file uploader, it can only run in a single container rather than being scaled out

If you want to run multiple replicas of the same app, you will have to configure sticky routing between a browser session and a serving container. This is because Streamlit uses WebSocket connections for rendering the app but it uses an HTTP connection for the file uploader widget. For the app to work correctly, both WebSocket and HTTP connections must be established with the same container instance.

Hope this helps,
Amey

2 Likes

Hi @Jake, I have a similar issue with an app deployed on ECS using Fargate, so I am interested in how you solved the issue in the end. Are you recommending that removing the ALB is a possible solution?

Hi, I’m having the same issue on Cloud Run (GCP); how could it be fixed?

1 Like

I have the same issue, any update here?

Same issue here on GCP Cloud Run

This fixed it for me:

If you want to run multiple replicas of the same app, you will have to configure sticky routing between a browser session and a serving container. This is because Streamlit uses WebSocket connections for rendering the app but it uses an HTTP connection for the file uploader widget. For the app to work correctly, both WebSocket and HTTP connections must be established with the same container instance.

On GCP:
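For Cloud Run specifically, one option is to enable session affinity on the service. This is a sketch only: the service name and region below are placeholders, and Cloud Run's session affinity is best-effort rather than guaranteed.

```shell
# Route requests from the same client to the same container instance
# (best-effort). Service name and region are placeholders.
gcloud run services update my-streamlit-app \
  --region us-central1 \
  --session-affinity
```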

1 Like

Hi, if anyone is using Kubernetes, just add the following four lines under the annotations section of the Ingress manifest for the Streamlit frontend app. It will fix the issue.

nginx.ingress.kubernetes.io/affinity: "cookie"
nginx.ingress.kubernetes.io/session-cookie-name: "route"
nginx.ingress.kubernetes.io/session-cookie-expires: "172800"
nginx.ingress.kubernetes.io/session-cookie-max-age: "172800"

For those who have this issue when hosting on AWS EC2 instances, it is caused by running the app spread across multiple instances. Like @amey-st mentions, this is due to Streamlit forming connections to multiple instances at once, causing the app to malfunction. A simple way around this (assuming your EC2 instances sit behind a load balancer) is to enable “sticky sessions” in the settings for your load balancer.
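If you prefer the CLI to the console, the same setting can be sketched with the AWS CLI. The target group ARN below is a placeholder; substitute the target group that serves your Streamlit app.

```shell
# Enable load-balancer-generated cookie stickiness on the ALB target group.
# The ARN is a placeholder; replace it with your own target group's ARN.
aws elbv2 modify-target-group-attributes \
  --target-group-arn arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/streamlit-tg/0123456789abcdef \
  --attributes \
    Key=stickiness.enabled,Value=true \
    Key=stickiness.type,Value=lb_cookie \
    Key=stickiness.lb_cookie.duration_seconds,Value=172800
```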

@JackLangerman @jakeoliverlee Do you know how to fix the same issue on GCP App Engine? I am running Streamlit on App Engine and get a 404 error whenever I click to download the file.