Uploading large files via Streamlit to S3

Hi,

What would be the best way to upload really large files (20-100 GB+) to an S3 bucket using Streamlit?

I have a fully functional Streamlit → S3 upload page (using st.file_uploader), but because each file is read into Streamlit's memory before being sent off to S3, the app really starts to struggle and sometimes stalls or times out completely on very large files. A simplified version of the setup is below.
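
For reference, the relevant part looks roughly like this (simplified; the bucket name is a placeholder):

```python
import boto3
import streamlit as st

BUCKET = "my-upload-bucket"  # placeholder

s3 = boto3.client("s3")

uploaded = st.file_uploader("Choose a file")
if uploaded is not None:
    # By the time this runs, st.file_uploader has already pulled the
    # entire file into the Streamlit server's memory; upload_fileobj
    # then streams that in-memory object to S3.
    s3.upload_fileobj(uploaded, BUCKET, uploaded.name)
    st.success(f"Uploaded {uploaded.name} to s3://{BUCKET}/{uploaded.name}")
```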

Is there a way to avoid having Streamlit read the file at all and instead hand the upload off to the S3 bucket directly, like a normal frontend upload?
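
I was imagining something along the lines of presigned URLs, where Streamlit only generates the URL and the actual bytes go from the client straight to S3 (rough, untested sketch):

```python
import boto3

s3 = boto3.client("s3")

# Presigned PUT URL so a client can upload directly to S3 without the
# data passing through Streamlit. Bucket name and key are placeholders.
url = s3.generate_presigned_url(
    "put_object",
    Params={"Bucket": "my-upload-bucket", "Key": "big-file.bin"},
    ExpiresIn=3600,  # URL valid for one hour
)

# The client could then upload with e.g.:
#   curl -X PUT --upload-file big-file.bin "<url>"
# Note: a single PUT to S3 is capped at 5 GB, so files in my size range
# would presumably need presigned multipart uploads instead.
```

Not sure how to wire something like that into a Streamlit page, though.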

I am running Streamlit on an EC2 instance in AWS. Scaling the instance up with more memory is not a viable solution; I tried 128 GB memory instances and still mostly ended up with timeouts.

Thanks for any help with this.
