Spawning Process(es) Within a Streamlit Script Can Lead to a Corrupted User Session if a Script Run is Interrupted/Rerun

I’m trying to understand the expectations around Streamlit rerunning the script in reaction to new user input; for instance, when the script is rerun because the user changes the selection in a selectbox.

Often, when a user does this while the script is still running, the app locks up and never finishes the run or responds to new user input. A full page refresh is necessary to recover.

Potentially of note: my app spawns several processes (via multiprocessing.pool.Pool) that initiate sometimes long-running (10s - 2m) queries against external databases.
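The pattern looks roughly like this; a minimal sketch, where `run_query` is a hypothetical stand-in for the actual database calls:

```python
# Minimal sketch of the pattern described; run_query is a hypothetical
# stand-in for the real 10s-2m external database queries.
import multiprocessing.pool

def run_query(query):
    # In the real app this would hit an external database.
    return f"result of {query}"

queries = ["q1", "q2", "q3"]
with multiprocessing.pool.Pool(processes=3) as pool:
    # Blocks until every query completes; a rerun while this call is in
    # flight is where the lock-up occurs.
    results = pool.map(run_query, queries)
```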

When my app locks up and stops responding in this situation, I typically see Streamlit print “Stopping…” to the console where I started Streamlit. When I press CTRL+C in the same console, I just get more “Stopping…” output, and I eventually have to manually kill the original Streamlit process and its children.

Is there something my script should do to respond to the fact that Streamlit is restarting the script? Is there a signal my script can handle so it can cancel all of these queries?



I’ve been playing around with this some more and I think I’ve isolated the issue.

Instead of using a process pool, I tried a thread pool, and the problem doesn’t seem to occur, although my app runs noticeably slower… :frowning:
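The swap is essentially a one-line change, since multiprocessing.pool.ThreadPool exposes the same interface as Pool (`run_query` is again a hypothetical stand-in):

```python
# ThreadPool has the same API as multiprocessing.pool.Pool but runs the
# workers as threads inside the Streamlit process, so no child processes
# are left hanging when a script run is stopped abruptly.
from multiprocessing.pool import ThreadPool

def run_query(query):
    # Hypothetical stand-in for an external database query.
    return f"result of {query}"

with ThreadPool(processes=3) as pool:
    results = pool.map(run_query, ["q1", "q2"])
```

The slowdown is expected for CPU-bound work, since the threads contend for the GIL, whereas separate processes do not.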

I believe this comes down to how Streamlit kills an in-process script run. The kill appears to be quite abrupt, with no signals sent to the running process (at least, none that Python would natively interrupt for). Normally, I imagine killing the script this way is probably fine, but if a multiprocessing.pool.Pool has been instantiated, it leaves the child processes hanging. I think this then corrupts the session, since the session is seemingly blocked from resetting because of the hanging child processes.

So, I think this means Streamlit does not really support spawning child processes within a script if that script can be rerun within the same session. I’ll try to file a GitHub issue for this later.




We’ve also observed this behavior when trying to run calculations via a multiprocessing pool, and we’ve had to adopt a more convoluted architecture: a separate web service that runs the multi-threaded computation and returns the result. I’m curious whether the Streamlit team or other users have any recommendations here.

Hey @rmartin16 - welcome to Streamlit, and thanks for the detailed post!

Indeed, this is something we do not handle well. App scripts run within Streamlit’s process, and there’s no way - currently - for Streamlit to know about other threads or processes created by an app script, or to shut them down.

(I don’t think there’s any way for a Python signal to be delivered to a specific thread - as far as I understand, signals in Python are always handled by the main thread, so installing an app-specific signal handler wouldn’t work here because Streamlit can’t deliver that signal to the script thread.)
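This restriction is easy to verify: Python only allows installing signal handlers from the main thread, and attempting it from a worker thread (like the one Streamlit runs the script in) raises a ValueError:

```python
# Demonstrates that signal handlers can only be installed from the main
# thread; attempting it from a worker thread raises ValueError.
import signal
import threading

errors = []

def try_install():
    try:
        signal.signal(signal.SIGTERM, lambda signum, frame: None)
    except ValueError as exc:
        errors.append(exc)

t = threading.Thread(target=try_install)
t.start()
t.join()
# errors now holds exactly one ValueError
```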

I wonder if we could add something like st.signal_handler(handler_function), which would allow app scripts to register a “Streamlit signal handler” that gets called when the script is being terminated early, so that multi-threaded or multi-process Streamlit apps can handle cleanup tasks properly. I’m just thinking out loud here - this is not currently on any roadmap - but might that solve your use case?
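As a sketch of how such a handler might be used for the pool case (to be clear, st.signal_handler is purely hypothetical - it is the API being proposed, not something Streamlit provides - only the cleanup logic itself uses real APIs):

```python
# Purely hypothetical sketch: st.signal_handler does not exist in
# Streamlit; it is the API proposed above. The cleanup body itself
# uses only real multiprocessing APIs.
import multiprocessing.pool

pool = multiprocessing.pool.Pool(processes=2)

def cleanup():
    # Terminate and reap the worker processes so nothing is left
    # hanging when the script run is torn down.
    pool.terminate()
    pool.join()

# st.signal_handler(cleanup)  # proposed registration, not a real API

# Called directly here just to show the cleanup itself works:
cleanup()
```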


Yeah, I think so. Such a signal would allow my script to at least attempt to terminate anything it may have spun up, although managing this use case is usually overly complicated, and a casual user of Streamlit may not make the effort to handle it. So, as a separate consideration, it might be nice for Streamlit to simply ensure these processes/threads are properly terminated. Even now that I’m using threads instead of processes, I’m seeing a build-up of threads associated with the main Streamlit process; I assume the threads are being orphaned similarly to how the processes were (maybe… not sure here). Thanks for all this, though; great stuff.