Hi,
What would be the best way to upload really large files (20-100GB+) to an S3 bucket using Streamlit?
I have a fully functional Streamlit → S3 upload page (using st.file_uploader), but because the file is read into Streamlit's memory before being sent off to S3, it really starts to struggle and sometimes stops or times out completely on very large files.
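For context, the current page is essentially this (a minimal sketch, assuming boto3 and a placeholder bucket name "my-bucket"):

```python
import boto3
import streamlit as st

s3 = boto3.client("s3")

uploaded = st.file_uploader("Choose a file to upload")
if uploaded is not None:
    # uploaded is an UploadedFile (file-like), so the whole file is
    # already sitting in the Streamlit server's memory at this point.
    s3.upload_fileobj(uploaded, "my-bucket", uploaded.name)
    st.success(f"Uploaded {uploaded.name}")
```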
Is there a way to avoid having Streamlit read the file and instead pass it straight to the S3 bucket, like a normal frontend upload would?
I am running Streamlit on an EC2 instance in AWS. Scaling up the instance with more memory is not a viable solution; I tried 128GB-memory instances and mostly ended up with timeouts.
Thanks for any help with this.