I have been testing and memory-profiling an app of mine, both with and without caching via st.experimental_singleton.
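For context, this is roughly how I'm measuring memory on each rerun (a sketch only; the psutil-based helper and where I log it are my illustration, not exact code from the app):

import os

import psutil

def rss_mb() -> float:
    # Resident set size of the Streamlit server process, in MB.
    return psutil.Process(os.getpid()).memory_info().rss / 1024 ** 2

# Logged at the end of each script run to track usage over reruns.
print(f"RSS: {rss_mb():.0f} MB")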
Locally, I notice that just launching the shell of my app allocates about 200 MB of memory. As soon as I enable a class that my script calls to build some objects, a large amount of additional memory is taken up for the first permutation of the class's output. Note that many Streamlit elements are embedded in the class. Some example pseudocode:
import streamlit as st

st.set_page_config(page_title="My App", layout="wide", initial_sidebar_state='expanded')

class StElems:
    def __init__(self, param_1, param_2):
        self.param_1 = param_1
        self.param_2 = param_2
        # ... other setup / object building ...

    def build(self):
        if self.param_1:
            ...  # data processing + Streamlit output to an st.container
        if self.param_2:
            ...  # data processing + Streamlit output to an st.container

with st.sidebar:
    st_param_1 = st.checkbox('Param 1', value=False)
    st_param_2 = st.checkbox('Param 2', value=False)

class_instance = StElems(st_param_1, st_param_2)
class_instance.build()
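For reference, the "with caching" variant I tried looks roughly like this (a sketch only; build_heavy_objects and its body are placeholders standing in for the expensive processing StElems actually does):

import streamlit as st

@st.experimental_singleton
def build_heavy_objects(param_1: bool, param_2: bool):
    # Placeholder for the expensive data processing; the real work lives in StElems.
    return {
        "param_1_data": list(range(1_000_000)) if param_1 else [],
        "param_2_data": list(range(1_000_000)) if param_2 else [],
    }

# Cached across reruns (and sessions) for the same argument combination,
# so only the rendering work should repeat on each rerun.
data = build_heavy_objects(st_param_1, st_param_2)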
Why does re-running the app within a session (by toggling param 1 and param 2) result in more memory being allocated each time, rather than the previous run's memory being released on the rerun?
Relatedly, I came across this post: My Experience Deploying an App With Streamlit Sharing | by Amin Yamlahi | Geek Culture | Medium. It seems to describe a similar issue, and their solution was to delete the variables/class objects. Does that make sense?
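If I understand the post correctly, the idea is something like the following (a sketch of my reading of it, not the author's exact code), applied at the end of the script run:

import gc

class_instance.build()

# Drop references to the heavy objects once rendering is done
# and ask the garbage collector to reclaim them.
del class_instance
gc.collect()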
A tangential question: when multiple users/sessions hit the Streamlit server, it appears that a child process is created. How do those child processes get killed? In the example above, I've also noticed that no memory is relinquished after a session/browser tab is closed.
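For what it's worth, this is roughly how I've been watching the server's process tree and memory (a sketch using psutil, run outside the app; the matching on "streamlit" in the command line is just my heuristic):

import psutil

# Find Streamlit server processes and print their RSS, plus that of any children.
for proc in psutil.process_iter(["pid", "cmdline"]):
    cmdline = " ".join(proc.info["cmdline"] or [])
    if "streamlit" in cmdline:
        print(proc.pid, f"{proc.memory_info().rss / 1024 ** 2:.0f} MB")
        for child in proc.children(recursive=True):
            print("  child:", child.pid, f"{child.memory_info().rss / 1024 ** 2:.0f} MB")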
Would appreciate any insight.