Swap: Error Code: Out of Memory

Hi Team,
Hope you are doing well.
I have an issue using Streamlit.
I have written a script where I upload a CSV or Excel file that contains a message column.
I am running a Hugging Face offline model for sentiment prediction. There are more than 100,000 (1 lakh) messages; after processing around 60% of the data in about 40 minutes, it gives a memory error and stops abruptly.

Could you please help me update the config somewhere so that it never stops and continues running until the sentiment prediction completes?

If I run the same code in a Jupyter notebook, it takes around 1.5 hours and completes. I want the same thing to happen in the Streamlit UI.

Here is the code snippet where the issue occurs.

    import time  # needed for start_time below
    # progress_apply requires: from tqdm import tqdm; tqdm.pandas()

    if filename.split('.')[-1] == 'csv':
        with st.spinner("Loading the File....Thanks for your patience :)"):
            data = pd.read_csv(filename)
        st.success("File has been loaded.")
    start_time = time.time()
    st.write("Sentiment Prediction Started....")
    print("Sentiment Prediction Started....")
    with st.spinner("Sentiment Prediction in progress....."):
        data.loc[:, 'finiteautomata_Sentiment'] = data.loc[:, 'Cleaned Message'].progress_apply(pred_senti)

There is a lot of other code, but this is the main part; the rest is interlinked package calls. Please let me know if you have any questions.
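One way the memory growth could be bounded (a sketch, assuming the `pred_senti` function and `Cleaned Message` column from the snippet above; a trivial stand-in replaces the real model call here) is to run the predictions in chunks, append each chunk's results to disk, and free the chunk before processing the next one:

```python
import gc
import pandas as pd

def pred_senti(text: str) -> str:
    # Stand-in for the author's Hugging Face model call.
    return "POS" if "good" in text else "NEG"

CHUNK_SIZE = 10_000  # tune to the VM's available RAM

def predict_in_chunks(data: pd.DataFrame, out_path: str) -> None:
    """Apply pred_senti chunk by chunk, appending each chunk's results to
    disk so only one chunk of predictions lives in memory at a time."""
    for start in range(0, len(data), CHUNK_SIZE):
        chunk = data.iloc[start:start + CHUNK_SIZE].copy()
        chunk["finiteautomata_Sentiment"] = chunk["Cleaned Message"].apply(pred_senti)
        chunk.to_csv(out_path, mode="a", header=(start == 0), index=False)
        del chunk
        gc.collect()  # release the chunk before loading the next one
```

This also gives a natural place to update a Streamlit progress bar between chunks, and the partial results survive on disk if the run is interrupted.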


Hi @Praveen_v, I’m not sure we can necessarily help you with this problem. I understand the Jupyter notebook seemingly can do it, but if Streamlit is running out of memory, I’m not sure there is anything we can do about that. I could suggest forcefully calling garbage collection, or maybe using st.cache, but I’m out of ideas beyond that. You could also get a machine with more RAM, possibly?
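The two ideas above (load-once caching and explicit garbage collection) can be sketched without Streamlit installed; here `functools.lru_cache` stands in for `st.cache`, and the model is a trivial placeholder rather than the real Hugging Face pipeline:

```python
import gc
from functools import lru_cache

# Placeholder for the expensive Hugging Face model load. In the actual app
# the decorator would be @st.cache(allow_output_mutation=True) so the model
# loads once per server process instead of on every Streamlit rerun.
@lru_cache(maxsize=1)
def load_model():
    return lambda text: [{"label": "POS" if "good" in text else "NEG"}]

def pred_senti(text: str) -> str:
    return load_model()(text)[0]["label"]

# After a large batch of predictions, explicitly release unreferenced objects.
gc.collect()
```

Note that caching the model load helps with repeated reruns of the script, not with the peak memory of a single 100,000-row pass, so it may not be enough on its own.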

Hi @willhuang, thanks for your response. But I am running this in a VM with 16 GB of RAM. After hitting the issue once, I cleared Chrome’s cache and other history and restarted, but it still has the same issue.

There must be some way to run long operations on Streamlit; I’m not sure how to achieve it.
My file is around 85 MB, the newly generated file is approximately 50 MB, and the VM has 16 GB of RAM, so everything should be sufficient.
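One reason an 85 MB file can still exhaust memory: the on-disk size understates the in-RAM footprint, because pandas string columns carry per-object Python overhead, and the model adds its own weights and activations on top. A small illustration using `memory_usage(deep=True)` (the column name mirrors the one in the snippet above):

```python
import pandas as pd

# 1,000 copies of a short message, mimicking a text column read from a CSV.
df = pd.DataFrame({"Cleaned Message": ["some example message text"] * 1000})

# Rough number of bytes these strings would occupy in the CSV itself.
on_disk_bytes = sum(len(s) for s in df["Cleaned Message"])

# Bytes actually held in RAM, counting per-string Python object overhead.
in_memory_bytes = df["Cleaned Message"].memory_usage(deep=True)

print(on_disk_bytes, in_memory_bytes)  # in-memory is noticeably larger
```

On top of this, every intermediate copy pandas makes during the apply, plus the model’s buffers, adds to the peak, so 16 GB can be consumed faster than the file size suggests.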

Also, I don’t think st.cache applies here, as there is only a single file being loaded and processed.

Even if it does apply, please let me know where I should use st.cache.

Hi @pk80103, unfortunately, I don’t think there are any specific Streamlit features that can solve this problem if you’re loading one file and doing operations on that file.

I think the only thing I could recommend is closing all of your other applications if they’re running. If nothing else is running, I’m not sure Streamlit is the tool to solve this, unfortunately. However, maybe someone else on the forum can offer a solution.

Hi @willhuang, thanks again for responding. I understand that you are trying to help. If you know someone who could help with this matter, could you please tag them?

Ah… just got this error as well.
I have already implemented caching everywhere I can across the app.

Are there any other solutions?
