How to upload a large dataset on Heroku?

Hi,

I deployed my first Streamlit app on Heroku, but when I try to upload a CSV file using the file uploader, even a file larger than 10MB gives me a timeout error or the upload simply stops.
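For context, the upload part of the app is essentially the standard st.file_uploader pattern, something like this simplified sketch (not my exact code):

```python
import pandas as pd
import streamlit as st

# Simplified sketch of the upload flow: read the uploaded CSV into a DataFrame
uploaded_file = st.file_uploader("Upload a CSV file", type="csv")
if uploaded_file is not None:
    df = pd.read_csv(uploaded_file)
    st.write(df.head())
```

Do you have any advice on how to solve the problem? Thanks!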

Hi @Sun91, welcome to the Streamlit community!

This sounds like you’re over your allotted disk space on Heroku, which can happen if you install some of the larger, popular machine learning packages. There’s nothing you can do here outside of using fewer packages, using a smaller dataset, or paying for better hosting.

Best,
Randy

Hi @randyzwitch, thank you for the tips! Actually, the problem is that I can load larger files of around 100-200MB without any problem by providing their URL, but when I try to upload the CSV files from my PC using the file uploader it doesn't work. I also found that Heroku has a 30-second request timeout, so the problem may be that it simply takes too long for my connection to upload those files through the browser.
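For reference, loading by URL looks roughly like this (the URL below is just a placeholder):

```python
import pandas as pd

# Reading from a URL works even for 100-200MB files, because the download
# happens on the server side instead of going through the browser upload
# (which runs into Heroku's 30-second request timeout).
url = "https://example.com/path/to/data.csv"  # placeholder, not a real file
df = pd.read_csv(url)
```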
I believe that uploading directly to AWS S3, as suggested in the Heroku documentation, should solve the problem, but I couldn't find any guide on how to implement it in Streamlit. Any help would be greatly appreciated! Thanks!

OK, in that case, if it's not a Heroku image size issue, then given that 30-second request timeout you won't be able to transfer files that large through the Streamlit app via the browser either.

Can you do a text_input where the user can supply an S3 file address? If so, you can use the boto package (and maybe even requests/urllib) to download the file, then process it.
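A rough sketch of that idea is below, using boto3 (the current version of boto); the bucket/key parsing and credential handling are assumptions to adapt to your own setup:

```python
import boto3
import pandas as pd
import streamlit as st

# Sketch of the text_input + boto3 idea: the user pastes an S3 address,
# the app downloads the object server-side and reads it with pandas.
s3_path = st.text_input("S3 file address (e.g. s3://my-bucket/data.csv)")

if s3_path.startswith("s3://"):
    bucket, _, key = s3_path[len("s3://"):].partition("/")
    s3 = boto3.client("s3")  # credentials come from env vars or an IAM role
    obj = s3.get_object(Bucket=bucket, Key=key)
    df = pd.read_csv(obj["Body"])  # Body is a file-like streaming object
    st.write(df.head())
```

The same general pattern works with requests/urllib if the file is reachable over a plain HTTPS URL instead of S3 credentials.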