Bypassing the 200MB size limit for multi-file uploads?

Hello! :wave:

I’d like to build a utility via Streamlit for server log files. One of the challenges is that the files I need to concatenate are ~500MB each (that is, above the 200MB file size limit).

Is there a way to increase that cap?

No worries if not, I can build a connector to my Google Cloud buckets, but I thought it may be a handy thing to have :slight_smile:


You can update the 200MB file size limit with server.maxUploadSize. The value is in MB and defaults to 200. If you have problems, let me know!


Superb, thanks Karrie! :raised_hands:

Hi Karrie,

I’ve 2 more questions :slight_smile:

I understand that server.maxUploadSize would need to be passed as a command-line argument, such as:

streamlit run app.py --server.maxUploadSize=1024

However, how would it work for:

  • A deployment in Heroku?
  • A deployment in Streamlit Sharing?

I’ve also stumbled upon this note from @nthmost:

Caution: The maxUploadSize of 50MB was set to try to ensure a good working experience with Streamlit. So you can set this limit arbitrarily high, but we can’t make any guarantees about how well the Streamlit app will work for you.

Is there such a thing as a cumulative upload size beyond which the app might not work properly?


Yup, it works on Streamlit Sharing if you set the config option in .streamlit/config.toml.
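For example, a snippet along these lines (the 1024 here is just an illustrative value in MB, not a recommendation):

```toml
# .streamlit/config.toml
[server]
maxUploadSize = 1024
```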


This works in Heroku too. But there you can also pass the setting as a CLI argument, since you have access to the run command.


Thanks @thiago, that’s useful.

Do you know what the max value is, if there is such a cap?

To make the app I have in mind useful, ideally I’d need at least a couple of gigabytes to be uploaded.


The main limitation I can think of is that uploaded files are currently stored in memory.

The files stay in memory for as long as the file uploader widget shows that file on its UI.
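For anyone reading along: since the uploaded files already sit in memory, one way to keep the concatenation step itself from doubling that footprint is to stream each file-like object into the output in fixed-size chunks instead of calling `.read()` on everything at once. A rough sketch, where the `BytesIO` objects stand in for the results of a hypothetical `st.file_uploader(..., accept_multiple_files=True)` call:

```python
import io

def concat_in_chunks(files, out, chunk_size=16 * 1024 * 1024):
    """Stream each file-like object into `out` in fixed-size chunks,
    so only one chunk at a time is held by this function (the uploaded
    files themselves still live in Streamlit's memory)."""
    for f in files:
        f.seek(0)
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)

# Simulated "uploads": in a real app these would come from st.file_uploader.
uploads = [io.BytesIO(b"log line A\n"), io.BytesIO(b"log line B\n")]
merged = io.BytesIO()
concat_in_chunks(uploads, merged)
print(merged.getvalue())  # b'log line A\nlog line B\n'
```

Writing `out` to a temporary file on disk instead of another `BytesIO` would keep the merged result out of memory entirely.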


Ok great. Thank you.

I’ll experiment with various sizes and will report accordingly. :slight_smile: