Streamlit app on Google Cloud Run Cannot upload more than 32 MB file

I was able to create a simple file uploader app that accepts files and places them in the proper Google Cloud Storage bucket folder. A requirement for this app is that it handles large images/files (>32 MB). It runs locally just fine, but when I deploy the Streamlit app on Google Cloud Run (managed) with an `st.file_uploader("Choose a file(s)", accept_multiple_files=True)` widget:

I get an `AxiosError` with status 413 for any image over 32 MB.

Now I know this is a Google Cloud Run (managed) issue, since the 32 MB request size is one of its documented limits. Searching around yields workarounds like generating signed URLs, but since this error is hit before the file is even accessible on the Streamlit Python side, does anyone know a workaround?

Some of the things I could imagine would be chunking the file request, or generating a signed URL and sending the file directly to a Google bucket, but all of these options would require custom overloading of the `file_uploader` widget, which I feel isn't a good idea.

On the deployment side there are options like putting the app behind Nginx or enabling HTTP/2, since the request size limit does not apply to HTTP/2 traffic, but that deployment is much more complex than the simple solution I'm looking for.
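For reference, switching an existing Cloud Run service to end-to-end HTTP/2 is a single `gcloud` flag; a sketch is below. The service name is a placeholder, and this assumes the container actually serves HTTP/2 cleartext (h2c), which is why it isn't a drop-in fix for a stock Streamlit server.

```shell
# Enable end-to-end HTTP/2 for an existing Cloud Run service.
# "my-streamlit-app" is a placeholder service name; the container
# behind it must speak h2c for requests to succeed afterwards.
gcloud run services update my-streamlit-app --use-http2
```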

We are currently facing the same problem. Were you able to get around it? Kind regards!

@robin1010101 The easiest way to solve this problem is to upload the files to Cloud Storage via signed URLs. This restriction is in place to keep large file uploads from blocking the API.
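A minimal sketch of the signed-URL approach, assuming the `google-cloud-storage` package, working default credentials, and placeholder bucket/object names. The backend mints a short-lived URL, and the client `PUT`s the file straight to the bucket, so the bytes never pass through Cloud Run:

```python
from datetime import timedelta

from google.cloud import storage


def make_upload_url(bucket_name: str, blob_name: str) -> str:
    """Generate a V4 signed URL that lets a client PUT a file
    directly into Cloud Storage, bypassing Cloud Run's 32 MB limit."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=15),
        method="PUT",
        content_type="application/octet-stream",
    )


# The client side would then upload with something like:
#   requests.put(url, data=file_bytes,
#                headers={"Content-Type": "application/octet-stream"})
```

The service account running the app needs permission to sign URLs (e.g. the Service Account Token Creator role) for `generate_signed_url` to work on Cloud Run.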

I created a component to solve this problem.
You can use it to split files into chunks of a configurable size and upload them from Streamlit.
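For anyone not using that component, the underlying idea can be sketched in plain Python: split the payload into pieces that each stay under the 32 MB request limit, send them one at a time, and reassemble on the server. The chunk size and function names here are illustrative, not part of any library:

```python
def split_into_chunks(data: bytes, chunk_size: int = 30 * 1024 * 1024) -> list[bytes]:
    """Split a byte string into chunks of at most chunk_size bytes,
    so each individual request stays under Cloud Run's 32 MB limit."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]


def reassemble(chunks: list[bytes]) -> bytes:
    """Concatenate received chunks back into the original payload."""
    return b"".join(chunks)


# Example with a tiny payload and a tiny chunk size:
payload = b"x" * 100
chunks = split_into_chunks(payload, chunk_size=32)
# 100 bytes at 32 bytes/chunk -> 4 chunks (32, 32, 32, 4)
assert reassemble(chunks) == payload
```

In a real deployment the server would also need to track chunk order per upload (e.g. an index in each request) before reassembling.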