Running an app deployed in Snowflake
The app is deployed in a private cloud and is not publicly available.
Error message:
Failed to save CSV file to S3: Could not connect to the endpoint URL: "https://xxx-snowflake-data-bucket.s3.ca-central-1.amazonaws.com/test/SQL_export.csv"
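To try to narrow it down, I put together a minimal check with short timeouts (just a sketch; the credentials below are placeholders). My thinking is that head_bucket transfers no data, so an EndpointConnectionError here would point at connectivity while a ClientError would point at credentials/permissions:

import boto3
from botocore.config import Config

# Sketch: short timeouts and a single attempt so a network problem fails fast
cfg = Config(connect_timeout=5, read_timeout=5, retries={'max_attempts': 1})
s3 = boto3.client(
    's3',
    region_name='ca-central-1',
    aws_access_key_id='XXXXXXXXXXXXXXX',          # placeholder
    aws_secret_access_key='XXXXXXXXXXXXXXXXXXX',  # placeholder
    config=cfg,
)
# EndpointConnectionError -> connectivity; ClientError -> permissions
s3.head_bucket(Bucket='xxx-snowflake-data-bucket')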
Streamlit app code:
import streamlit as st
import pandas as pd
from snowflake.snowpark.context import get_active_session
import boto3

# Streamlit page configuration
st.set_page_config(page_title='Snowflake Data Exporter', layout='wide')

# Get the active Snowflake session
session = get_active_session()
user_name = st.session_state.get('username', 'Unknown User')

# S3 bucket details
s3_bucket_name = 'xxx-snowflake-data-bucket'
s3_folder_path = 'test/'

# Execute a query and return the results as a pandas DataFrame
def run_query(query):
    result = session.sql(query).collect()
    return pd.DataFrame(result)

st.title('Snowflake Data Exporter')

# Input for a custom query
query = st.text_area('Enter your query below', height=150)

if st.button('Run Query'):
    if query:
        df = run_query(query)
        st.write(df)

        csv_data = df.to_csv().encode('utf-8')
        s3_file_path = f'{s3_folder_path}SQL_export.csv'

        # Initialize the boto3 client with AWS credentials
        s3 = boto3.client(
            's3',
            region_name='ca-central-1',  # bucket region
            aws_access_key_id='XXXXXXXXXXXXXXX',
            aws_secret_access_key='XXXXXXXXXXXXXXXXXXX',
            # aws_session_token='YOUR_SESSION_TOKEN',  # uncomment for temporary credentials
        )

        try:
            # Upload the CSV data to S3
            s3.put_object(Bucket=s3_bucket_name, Key=s3_file_path, Body=csv_data)
            st.success(f'CSV file saved to {s3_file_path}')
        except Exception as e:
            st.error(f'Failed to save CSV file to S3: {e}')
    else:
        st.warning('Please enter a query to run.')
Environment: Python 3.8 / Streamlit 1.22.0 / boto3 1.29.1
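I also wondered whether unloading through a Snowflake external stage would avoid the direct S3 call from inside the app entirely. This is just a sketch of what I had in mind; MY_S3_STAGE is a placeholder for an external stage that would first need to be created over the bucket:

# Sketch: unload the query results to S3 via a Snowflake external stage
# instead of calling boto3 from inside the app. Assumes MY_S3_STAGE
# (a placeholder name) already exists as an external stage on the bucket.
copy_stmt = f"""
    COPY INTO @MY_S3_STAGE/test/SQL_export.csv
    FROM ({query})
    FILE_FORMAT = (TYPE = CSV)
    HEADER = TRUE
    SINGLE = TRUE
    OVERWRITE = TRUE
"""
session.sql(copy_stmt).collect()

If that is the recommended pattern for apps running inside Snowflake, I would be happy to go that route instead.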
I'm not sure why I am having so many issues writing to my S3 bucket.
Any additional help or pointers would be much appreciated.
Shawn