Exporting a data frame uploaded on Streamlit Cloud to an AWS bucket

Hello everyone,

I need to upload the data frame that users upload to my Streamlit Cloud application to an AWS bucket. I checked previous posts on how to do this; however, since I have to deploy my code on GitHub, I am receiving many emails from AWS warning me that I must not post my tokens publicly (which is expected).
Can anyone provide me with sample code showing how to deploy code on GitHub that saves a data frame to an AWS bucket without exposing my access tokens?

Hi @gwk01,

Welcome to the Streamlit forum! :wave: :smile:

Here’s a working example of using a Streamlit app to upload a dataframe to an AWS bucket:
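Something along these lines should work. It's a minimal sketch that assumes boto3 is installed; the bucket name (my-bucket) and the secret key names (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY) are placeholders you'd replace with your own, and the credentials themselves come from the Secrets management feature described just below:

```python
# streamlit_app.py
import io

import boto3
import pandas as pd
import streamlit as st

# Build an S3 client from credentials stored in Streamlit secrets.
# The secret key names are placeholders; use whatever names you
# define in your secrets.toml.
s3 = boto3.client(
    "s3",
    aws_access_key_id=st.secrets["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=st.secrets["AWS_SECRET_ACCESS_KEY"],
)

uploaded_file = st.file_uploader("Upload a CSV file", type="csv")

if uploaded_file is not None:
    df = pd.read_csv(uploaded_file)
    st.dataframe(df)

    if st.button("Save to S3"):
        # Serialize the dataframe to CSV in memory and upload it.
        buffer = io.StringIO()
        df.to_csv(buffer, index=False)
        s3.put_object(
            Bucket="my-bucket",       # placeholder bucket name
            Key="uploads/data.csv",   # placeholder object key
            Body=buffer.getvalue(),
        )
        st.success("Dataframe saved to S3!")
```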

It is indeed a huge security risk to post your tokens publicly, which is why Streamlit and Community Cloud have a Secrets management feature.

When developing your app locally, you can create a .streamlit/secrets.toml file at the root of your app's directory containing your tokens. For example:

```toml
# .streamlit/secrets.toml
AWS_TOKEN = "thisismytoken"
AWS_OTHER_SECRET = "myothersecret"
```

If you create the above secrets, you can reference them in your app like so:

```python
# streamlit_app.py
import streamlit as st

# Display secrets
st.write(st.secrets["AWS_TOKEN"])
st.write(st.secrets["AWS_OTHER_SECRET"])
```

The above will output:

```
thisismytoken
myothersecret
```
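Note that st.secrets also supports attribute notation and nested TOML tables, so you can group related secrets under a section if you prefer. For example, with the same values grouped under an [aws] table:

```toml
# .streamlit/secrets.toml
[aws]
AWS_TOKEN = "thisismytoken"
AWS_OTHER_SECRET = "myothersecret"
```

```python
# Both notations read the same value:
st.write(st.secrets["aws"]["AWS_TOKEN"])
st.write(st.secrets.aws.AWS_OTHER_SECRET)
```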

When deploying remotely, however, add this file to your .gitignore so you don’t commit your secrets to GitHub! On Community Cloud, you can paste in your secrets using the Secrets management console.
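For example, the relevant .gitignore entry is just:

```
# .gitignore
.streamlit/secrets.toml
```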

Also, check out the following as an example of how to use secrets:
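For instance, here is one more sketch of using secrets, under the assumption that the s3fs package is installed: pandas can then write a dataframe straight to an S3 path, with the credentials read from st.secrets via storage_options. The bucket name and secret key names below are placeholders:

```python
import pandas as pd
import streamlit as st

df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})

# pandas delegates s3:// paths to s3fs; the credentials come from
# st.secrets rather than being hard-coded in the repository.
df.to_csv(
    "s3://my-bucket/data.csv",  # placeholder bucket and key
    index=False,
    storage_options={
        "key": st.secrets["AWS_ACCESS_KEY_ID"],
        "secret": st.secrets["AWS_SECRET_ACCESS_KEY"],
    },
)
```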
