App runs slowly when multiple users access it simultaneously

Hi,

First post here. Please let me begin by thanking the developers for building this awesome tool and the community for providing learning material!

I’m developing an app for my team of roughly 20-30 people. Currently, the app is hosted on one of the laptops in the office. The laptop is on 24/7 and pretty much idle most of the time.

When testing the app, we noticed it’s quite slow when multiple people use it simultaneously (running queries, refreshing pages, etc.). Is this normal? Users visit the app via its IP address on the intranet.

I’m aware that Streamlit creates a separate session for each user, but it seems those sessions are all served by the same Python backend (single-threaded)?

For example, if person A runs something and person B immediately runs something else, the backend handles the requests sequentially (person A first, then person B), not in parallel. It feels like there’s a “job queue” and the jobs have to run one by one for my application, but I’m not sure if that’s the right interpretation or if there’s a problem in my setup/code.

My app uses the @st.cache decorator when loading data from a SQLite database; all other queries are then performed on the fly (depending on user input) in pandas and plotted with pyecharts and Plotly.

Since the dashboard will be shared among 20-30 people, it will be extremely slow if everyone has to wait until all the other requests finish… Is there any way to solve this? Thanks!

Hi @jay_pio,

Thanks for posting!

I’ll share some helpful thoughts that @tim posted in this thread:

Streamlit scales differently than your typical web app. The big difference is that, when a user connects to Streamlit, we spin up a thread just for them and run your app.py script for that user. And then Streamlit will re-run that script for that user each time they do something interactive (like pressing a button or sliding a slider); each time the script is re-run, we spin a thread up and down again to run it.

This is all to say, Streamlit itself is multi-threaded (so you’ll benefit from running it on a machine with lots of beefy cores), and its scaling challenges come from the CPU/GPU/memory costs of running Streamlit app scripts very frequently.

I’m not a web-scaling expert by any means, but what this means is that traditional scaling solutions (like putting a caching server in front of your app) probably won’t have the same result for your Streamlit app.

To help your Streamlit app scale, you’ll want to focus on having your app.py script run quickly and efficiently. The best tool to start with is probably Streamlit’s built-in caching utility, which is all about not re-running code that doesn’t need to be re-run!

I’d also recommend checking out these discussions:

Caroline :balloon:

Hey there @jay_pio! Did you find a solution to this? I’m having the exact same issue, so it would be great to know if you found a way (and how) :slight_smile: Thanks in advance!