Problem with resource limits

Good day everyone. I would like to ask why I'm having an error like this.

Hi @Bosti07! Welcome to the Streamlit community! :tada: :partying_face: :tada:

Apps deployed on Streamlit Sharing get up to 1 CPU, 800 MB of RAM, and 800 MB of dedicated storage in a shared execution environment.

What is the size of your dataset? If it's on the order of the Synthetic Financial Datasets For Fraud Detection dataset, uploading subsets of that size a couple of times while simultaneously training your models is bound to exhaust the allotted 800 MB of RAM. It might help to cache the dataset with a shorter TTL.
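For instance, here's a minimal sketch of caching with a TTL. The load_data name and the transactions.csv path are placeholders; ttl and max_entries are parameters of st.cache:

import pandas as pd
import streamlit as st

# ttl is in seconds: cached entries expire after 10 minutes, and
# max_entries=1 keeps only the most recent dataframe so older copies
# are evicted instead of piling up in RAM.
@st.cache(ttl=600, max_entries=1)
def load_data(path):
    return pd.read_csv(path)

df = load_data("transactions.csv")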

Happy Streamlit-ing! :balloon:
Snehan

Thank you so much for your reply.

Good day @snehankekre. How can I read a CSV file from the uploaded file? I have a code snippet, but I'm getting an error.

@Bosti07 You can avoid the error by first reading the file with pd.read_csv() and then returning the dataframe df. st.file_uploader returns a file-like object, which pd.read_csv() can read directly. Here's an example:

import streamlit as st
import pandas as pd

# Cache the parsed dataframe so the CSV is read only once per uploaded
# file, not on every script rerun.
@st.cache
def read_file(uploaded_file):
    df = pd.read_csv(uploaded_file)
    return df

def main():
    uploaded_file = st.file_uploader("Upload file", type=".csv")
    if uploaded_file:
        st.markdown("Uploaded file:")
        df = read_file(uploaded_file)
        st.dataframe(df)

if __name__ == "__main__":
    main()

Hope this helps! :wink:

Best, :balloon:
Snehan

Wow, it really works for me. Thank you so much.

Hi @snehankekre. Can I limit the upload file size from 200 MB to 100 MB?

Thank you so much.

You can change the default limit for st.file_uploader by setting the following option to 100 in the [server] section of your config.toml:

[server]
# Max size, in megabytes, for files uploaded with the file_uploader.
# Default: 200
maxUploadSize = 100
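
If you'd rather not edit config.toml, the same option can also be passed as a command-line flag when launching the app (your_app.py is a placeholder):

streamlit run your_app.py --server.maxUploadSize 100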

Happy Streamlit-ing! :balloon:
Snehan

Thank you so much. I'm gonna try it now.

Good day @snehankekre. Is there a way I can view whether I'm already hitting the limit on my resources?

Thank you.