What's a good webserver for Streamlit (non-WSGI) + FastAPI (ASGI), dockerized?


I am looking for deployment methods for a Streamlit + FastAPI application. Advice here, or just pointing me to a particular stack that definitely works or to a tutorial, would be very much appreciated.


I have a Streamlit + FastAPI app locally. Only one page of the multi-page app uses the FastAPI backend; the others are mostly Streamlit + scripts.

Structurally, I followed what seems to be a well-known example by Davide Fiocco: there are two containers, and they communicate over a bridge network (configured in the docker-compose file).
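For anyone unfamiliar with that layout, the two-container wiring is usually declared in docker-compose rather than in either Dockerfile. A minimal sketch (service names, paths, and ports here are illustrative, not from my actual project):

```yaml
# Hypothetical docker-compose.yml for the two-container setup.
services:
  frontend:
    build: ./streamlit        # assumed directory layout
    ports:
      - "8501:8501"           # Streamlit's default port
    networks: [appnet]
  backend:
    build: ./fastapi          # assumed directory layout
    networks: [appnet]        # reachable from frontend as http://backend:8000

networks:
  appnet:
    driver: bridge            # user-defined bridge gives name-based DNS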

I was able to more or less get this up behind Apache by following a tutorial, but it hangs on the "Please wait…" page forever, and the troubleshooting steps from the Streamlit docs do nothing.
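For what it's worth, the eternal "Please wait…" page is the classic symptom of the proxy not forwarding Streamlit's WebSocket connection, which Apache only does with `mod_proxy_wstunnel` enabled and an explicit `ws://` rule. A hedged vhost sketch (host/port are assumptions; the WebSocket path is `/_stcore/stream` on recent Streamlit versions, `/stream` on older ones):

```apache
# Assumes mod_proxy, mod_proxy_http, and mod_proxy_wstunnel are enabled,
# and Streamlit is reachable at localhost:8501.
<VirtualHost *:80>
    ProxyPreserveHost On

    # Streamlit's live-update channel is a WebSocket; without this rule
    # the browser sticks on "Please wait...".
    ProxyPass        /_stcore/stream ws://localhost:8501/_stcore/stream
    ProxyPassReverse /_stcore/stream ws://localhost:8501/_stcore/stream

    # Everything else over plain HTTP.
    ProxyPass        / http://localhost:8501/
    ProxyPassReverse / http://localhost:8501/
</VirtualHost>
```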

Instead of hacking away at that, I'm questioning whether this deployment via networked Docker containers + Apache even makes sense, since a few questions came up as I researched solutions:

FastAPI seems to be commonly served via Nginx/Uvicorn/Gunicorn to make use of its ASGI concurrency. Does Davide's setup, where the FastAPI container only talks to the Streamlit frontend, still let it use that potential? And would that combo even work for my Streamlit + FastAPI app, given Streamlit doesn't support WSGI? (Though there is at least one example of a community member making it work.)

I'm also wondering what you all think of Traefik. After reading some performance test results, it seems I should stick with Nginx if I can (although Traefik's configuration does look far friendlier…). So: do I drop Apache and look for a solution built on Nginx/Uvicorn/Gunicorn for my Streamlit + FastAPI app? And how do I handle the fact that Streamlit doesn't speak WSGI?
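The pattern I keep seeing suggested is: don't put Streamlit behind a WSGI/ASGI server at all — let each app run its own native server inside its container, and have Nginx route to both. A sketch of what that config might look like (upstream names, ports, and the `/api/` prefix are assumptions):

```nginx
# One Nginx entry point, two upstreams on the compose bridge network.
upstream streamlit_app { server frontend:8501; }
upstream fastapi_app   { server backend:8000; }

server {
    listen 80;

    # FastAPI (ASGI): served by uvicorn/gunicorn inside its container.
    location /api/ {
        proxy_pass http://fastapi_app/;
    }

    # Streamlit: runs its own Tornado server; needs WebSocket upgrade
    # headers or it hangs on "Please wait...".
    location / {
        proxy_pass http://streamlit_app;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```

Under this split, the WSGI question dissolves: only FastAPI sits behind an ASGI worker (e.g. `gunicorn -k uvicorn.workers.UvicornWorker`), while Streamlit is just reverse-proxied as-is.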

More detail about my app in case it helps:

The Streamlit frontend serves an OpenAI LLM gateway, sending requests to a FastAPI backend that handles the logic. Other pages pass around more complex objects that can't go through the API, so they live on their own individual pages. The features are fairly computationally expensive, so I do care about speed and am concerned about the ability to serve multiple users.
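The call pattern on the gateway page looks roughly like this. This is a sketch, not my actual code: the backend URL, `/chat` route, and payload shape are assumptions; the key point is that the frontend reaches the backend by its compose service name over the bridge network.

```python
# Sketch of the Streamlit page calling the FastAPI backend.
# "backend", port 8000, and the /chat route are hypothetical names.
import requests

BACKEND_URL = "http://backend:8000"  # compose service name resolves on the bridge network


def build_payload(prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Assemble the JSON body sent to the FastAPI gateway route."""
    return {"prompt": prompt, "model": model}


def ask_llm(prompt: str) -> str:
    """POST the prompt; FastAPI handles the OpenAI call and the logic."""
    resp = requests.post(
        f"{BACKEND_URL}/chat",
        json=build_payload(prompt),
        timeout=60,  # LLM responses can be slow
    )
    resp.raise_for_status()
    return resp.json()["answer"]
```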
