Store files on the server (Streamlit) and access them at app runtime

Hi Community,

Is there a possibility to store media data (e.g. images) on the Streamlit server (in a static/media folder) while running the app (on sharing)? Of course I don’t want to use the Streamlit server as file storage (which would be nice though, watch out for irony), but I think it would be nice to be able to store data for the runtime.

In my particular case I would like to save images (currently this works, e.g. saving to “/images/cropped/circle_Josiah_Willard_Gibbs.png”), but I get a 500 “Internal Server Error” when I try to load the image later. Why is that, and is there another way or place where this works?
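
For reference, a rough sketch of what I am doing (the crop step is simplified; Pillow is assumed, and the source path is hypothetical):

import os
import streamlit as st
from PIL import Image

# make sure the target folder exists, then save the processed image
os.makedirs("images/cropped", exist_ok=True)
img = Image.open("images/raw/Josiah_Willard_Gibbs.png")  # hypothetical source path
img.save("images/cropped/circle_Josiah_Willard_Gibbs.png")

# later in the same app run: loading by relative path is where
# the 500 error shows up on sharing
st.image("images/cropped/circle_Josiah_Willard_Gibbs.png")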

Hi Chris, you can simply store the data in an S3 or Google Cloud bucket for easy access.

Hi @Cervone23, thanks for your reply.
Do you have experience with this, and could you give me some example code?
I have had, let’s say, not-so-good experiences with AWS, so I am a bit cautious.

Keen to get more info about this, interested too! 🙂

@chris_klose thanks for the kind reply. Yes, it is possible to upload files to and retrieve files from cloud locations.

I created a post about retrieving CSV data from a Google Cloud link here: https://discuss.streamlit.io/t/google-drive-csv-file-link-to-pandas-dataframe/8057?u=cervone23
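
The gist of that post, as a minimal sketch (the URL is a placeholder for any publicly readable CSV link):

import pandas as pd

# pandas can read a CSV straight from a public URL
url = "https://storage.googleapis.com/bucket_name/data.csv"
df = pd.read_csv(url)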

You can also upload media files to GCP by creating a Google Cloud bucket and accessing each file through a simple URL. The link can easily be converted into a download button for users on the front end of a Streamlit app (see the sketch after the upload snippet below), and I can provide more code for that if needed!

# upload file to GCP
from google.cloud import storage

bucket_name = 'bucket_name'
blob_name = 'blob name'
path_to_file = 'file'


def upload_to_bucket(blob_name, path_to_file, bucket_name):
    """Upload data to a Google Cloud bucket and return its public URL."""
    # Explicitly use service account credentials by specifying the
    # private key file.
    storage_client = storage.Client.from_service_account_json(
        'XXXXXX_gcloud.json'
    )
    # buckets = list(storage_client.list_buckets())
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(blob_name)
    blob.upload_from_filename(path_to_file)
    # make the blob publicly readable and return its URL
    blob.make_public()
    return blob.public_url
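
And a rough sketch of the Streamlit side, including the download button mentioned above (untested; requests is assumed, and st.download_button needs Streamlit >= 0.88):

import requests
import streamlit as st

# upload the file and get its public link
public_url = upload_to_bucket('blob name', 'file', 'bucket_name')

# display the media straight from the URL
st.image(public_url)

# or let users download it from the front end
data = requests.get(public_url).content
st.download_button("Download image", data, file_name="image.png")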

Hope this helps! @Charly_Wargnier @chris_klose

Sounds great! Thanks @Cervone23!

Can you upload all the files from a Google Cloud folder at once too?

Thanks,
Charly

@Charly_Wargnier Of course! I’m not entirely sure of the cleanest way, but a for loop of some sort can work here =)
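
Something along these lines, for example (an untested sketch reusing upload_to_bucket from above; the local folder name is hypothetical):

import os

folder = 'images/cropped'  # hypothetical local folder to upload
for fname in os.listdir(folder):
    # upload each file under its own name and print the public link
    url = upload_to_bucket(fname, os.path.join(folder, fname), 'bucket_name')
    print(url)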

This is super helpful, thank you!