How to use Streamlit with Nginx?

Hi Everyone,

While I'm comfortable with Python, networking is new to me.
In my corporate environment, I can only share https addresses (not http) with other users.

So I installed nginx, got a certificate, and got it running. For now I am using the default http://localhost:8501 for my Streamlit app and it runs fine.
My goal is for nginx to reroute everything to https://localhost:8501.

Is there anyone out there who has a complete nginx conf file that will accomplish this?

Thanks for taking the time to set this up; sadly, I cannot get it to work.

When I try the conf file in your guide, it does not work. I get an error on the ‘upstream’ parameter. Is that the entire .conf file? All I want is to go from http://localhost:8501 to https://localhost:8501.

(Also, not a big deal, but your Streamlit application example lacks some imports and a quotation mark.)

Could you share your conf file?

For https you have to include the SSL directives inside the server{} block of your nginx conf file:

ssl_certificate     www.example.com.crt;
ssl_certificate_key www.example.com.key;
ssl_protocols       TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers         HIGH:!aNULL:!MD5;
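
Putting those directives together with a reverse proxy to Streamlit, a minimal complete server block might look roughly like this (just a sketch, assuming the certificate files above and Streamlit running locally on the default port 8501):

    server {
        listen 443 ssl;
        server_name www.example.com;

        ssl_certificate     www.example.com.crt;
        ssl_certificate_key www.example.com.key;
        ssl_protocols       TLSv1.2;
        ssl_ciphers         HIGH:!aNULL:!MD5;

        location / {
            proxy_pass http://localhost:8501/;
            # WebSocket support, which Streamlit needs for its stream endpoint
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_set_header Host $host;
            proxy_read_timeout 86400;
        }
    }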

Thank you so much Wilson

I just want to add my two cents.

I configured Streamlit at %userprofile%/.streamlit/config.toml

[server]
baseUrlPath = "dashboard"

And my Nginx at D:\nginx\conf\nginx.conf


    server {
        listen       443 ssl default_server;
        server_name  www.example.com;
        index        index.php index.html index.htm;

        ssl_certificate      example.com.crt;
        ssl_certificate_key  example.com.key;

        ssl_protocols        TLSv1.2;

        ssl_session_cache    shared:SSL:10m;
        ssl_session_timeout  10m;

        ssl_ciphers  AES256-SHA;
        ssl_prefer_server_ciphers  on;

        client_max_body_size 2G;

        # Apache server
        location / {
            proxy_pass https://localhost:5555/;
        }

        # Streamlit server
        location /dashboard {
            proxy_pass http://localhost:8501/dashboard;
        }
        location /dashboard/static {
            proxy_pass http://localhost:8501/dashboard/static/;
        }
        location /dashboard/healthz {
            proxy_pass http://localhost:8501/dashboard/healthz;
        }
        location /dashboard/vendor {
            proxy_pass http://localhost:8501/dashboard/vendor;
        }
        location /dashboard/stream {
            proxy_pass http://localhost:8501/dashboard/stream;
            proxy_set_header   Host      $host;
            proxy_set_header   X-Real-IP $remote_addr;
            proxy_set_header   X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header   X-Forwarded-Proto $scheme;
            proxy_buffering    off;
            proxy_http_version 1.1;
            # Also requires websocket:
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_read_timeout 86400;
        }
    }
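
For completeness: with baseUrlPath set in config.toml, the app itself is started normally, for example like this (the script name is a placeholder), and is then reachable through Nginx at https://www.example.com/dashboard:

streamlit run your_app.py --server.port 8501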

For the 403 error, you also need the --server.enableCORS false option.
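
For example, appended to the run command (the script name is a placeholder):

streamlit run your_app.py --server.enableCORS false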

I was using Nginx as my reverse proxy for the longest time, but I recently switched over to Traefik. I have a post about it; it has made life so much easier.

Ref: Deploying Streamlit with Traefik and Docker

Hello everyone! I am trying to deploy a Streamlit application on a Linux (Ubuntu 18.04.4) machine. I am not deploying it via Docker; I am just editing the Nginx configuration file at /etc/nginx/nginx.conf. Here is the full file:

events {
	worker_connections 768;
	# multi_accept on;
}

http {
    server {
        listen 80;
        listen 443 ssl;

        server_name ar-hand-api-stg.centralus.cloudapp.azure.com;
        index index.php index.html index.htm;

        ssl_certificate /etc/nginx/ssl/domain-crt.crt;
        ssl_certificate_key /etc/nginx/ssl/domain-key.key;

        ssl_protocols       TLSv1.2;

        ssl_session_cache    shared:SSL:10m;
        ssl_session_timeout  10m;

        ssl_ciphers  AES256-SHA;
        ssl_prefer_server_ciphers  on;
		
        client_max_body_size 2G;

        # Streamlit server
        location /dashboard {
            proxy_pass http://localhost:8501/dashboard;
        }
        location ^~ /static {
            proxy_pass http://127.0.0.1:8501/dashboard/static;
        }
        location ^~ /healthz {
            proxy_pass http://127.0.0.1:8501/dashboard/healthz;
        }
        location ^~ /vendor {
            proxy_pass http://127.0.0.1:8501/dashboard/vendor;
        }
        location /dashboard/stream {
            proxy_pass http://localhost:8501/dashboard/stream;
            proxy_http_version 1.1; 
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header Host $host;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_read_timeout 86400;
        }
    }
}

I am running Streamlit with the following command:

streamlit run streamlit_app.py --server.port 8501 --server.baseUrlPath /dashboard/ --server.enableCORS false --server.enableXsrfProtection false --server.headless=true

And then I try to connect to my web app at the following link: https://ar-hand-api-stg.centralus.cloudapp.azure.com/dashboard/

This results in the following errors in the browser console (I also attach a screenshot):

Unchecked runtime.lastError: The message port closed before a response was received.

ar-hand-api-stg.centralus.cloudapp.azure.com/:1          GET https://ar-hand-api-stg.centralus.cloudapp.azure.com/dashboard/vendor/bokeh/bokeh-2.4.1.min.js net::ERR_INCOMPLETE_CHUNKED_ENCODING 200 (OK)

bokeh-widgets-2.4.1.min.js:38 Uncaught TypeError: bokeh.register_plugin is not a function
    at bokeh-widgets-2.4.1.min.js:38:20
    at bokeh-widgets-2.4.1.min.js:43:1
    at bokeh-widgets-2.4.1.min.js:32:3
    at bokeh-widgets-2.4.1.min.js:33:3

ar-hand-api-stg.centralus.cloudapp.azure.com/:1          GET https://ar-hand-api-stg.centralus.cloudapp.azure.com/dashboard/static/js/5.df97478a.chunk.js net::ERR_INCOMPLETE_CHUNKED_ENCODING 200 (OK)

Can anyone please help me here? I've already tried a dozen different Nginx and Streamlit configurations, and nothing really works. I would really appreciate any help!

UPD: I solved my problem and successfully deployed my Streamlit app with Nginx.

Details are here.

Hello, is this post related to deploying a Streamlit app over the https protocol?
I need some help in this regard. I have created a Streamlit app on an AWS EC2 instance. However, the address appears as http, but I need it to be in the https format. Could you please share your insight with me on how to do it?
Thank you.

Hello,
this configuration had been working for me without problems since the first day, but today I noticed that after updating my project it no longer works and returns a 404. I saw that the endpoint names have changed. I tried changing /stream to /_stcore/stream, but it still doesn't work. Do you know what other changes I should make in my Nginx config?

        location / {
            proxy_pass http://localhost:8501/dashboard;
        }
        location /dashboard {
            proxy_pass http://localhost:8501/dashboard;
        }
        location ^~ /static {
            proxy_pass http://127.0.0.1:8501/dashboard/static;
        }
        location ^~ /healthz {
            proxy_pass http://127.0.0.1:8501/dashboard/healthz;
        }
        location ^~ /vendor {
            proxy_pass http://127.0.0.1:8501/dashboard/vendor;
        }
        location /dashboard/stream {
            proxy_pass http://localhost:8501/dashboard/stream;
            proxy_http_version 1.1;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header Host $host;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_read_timeout 86400;
        }
    }

Thank you very much.
Best regards.

@evtrompa is it possible you updated streamlit? I just spent all day trying to figure out why my nginx configuration was suddenly causing an endless Please wait… error. It turns out it was because the endpoint /stream was changed to /_stcore/stream in new-ish versions of streamlit, which I found here: Any changes regarding websocket for Streamlit v1.14 vs. 1.18?. I adjusted my configuration and now things seem to be working fine. Sharing in case it helps with your issue and because the documentation about this change still seems sparse.
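
For reference, the adjusted WebSocket location ends up looking roughly like this (a sketch for an app served at the root path; prepend your baseUrlPath if you use one):

    location /_stcore/stream {
        proxy_pass http://localhost:8501/_stcore/stream;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_read_timeout 86400;
    }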

TL;DR - Clear the browser cache or try from your phone first.

It only worked AFTER I cleared the browser cache; before that I was getting redirect errors.

This is my nginx config.

HTTP server block

server {
    listen 80;
    listen [::]:80;
    server_name xxxxxxxxx.com www.xxxxxxxxx.com;
    return 301 https://$host$request_uri;
}

HTTPS server block

server {
    listen 443 ssl;
    listen [::]:443 ssl;
    server_name xxxxxx.com www.xxxxxxxxxx.com;

    ssl_certificate /etc/letsencrypt/live/xxxxx.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/xxxxxxx.com/privkey.pem;
    client_max_body_size 250M;

    location / {
        proxy_pass http://localhost:8501;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

Thanks sir… it is working for me.

After struggling with this for a few days, these are the settings I finally got working to connect an app deployed on an AWS EC2 instance with NGINX.

Within the conf file

server {
    listen 80;
    server_name domain-name.com;
    location / {
        proxy_pass http://public-ip-address:8501/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}

If you have an SSL certificate obtained from letsencrypt, use this instead

server {
    listen 80;
    server_name domain-name.com; 

    location / {
        return 301 https://$host$request_uri;
    }
}

server {
    listen 443 ssl;
    server_name domain-name.com;

    ssl_certificate /etc/letsencrypt/live/domain-name.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/domain-name.com/privkey.pem;
   
    location / {
        proxy_pass http://public-ip-address:8501/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
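
(If you still need to obtain the certificate, certbot with the nginx plugin can generate those files for you; a typical invocation, assuming certbot is already installed, looks something like this:)

sudo certbot --nginx -d domain-name.com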


Don’t forget to restart NGINX afterwards with

sudo service nginx restart
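
It can also help to check the configuration for syntax errors before restarting:

sudo nginx -t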

I hope this helps :smiling_face:


Thank you very much. This works for me.
:star_struck:

Hello @Marc, @nthmost, @yahayakenny, @Mac_Jones, @evtrompa,
I am trying the same thing: I want to use Nginx to route to different web applications running on different ports.
Therefore I made my configuration like this:
[configuration screenshot]

It didn't work for me with /app as the route, but when I used "/" as the location instead, the application ran on that endpoint. I'm not sure why it was not working with /app as the endpoint. I might be missing a few tweaks in the config; can any of you please help with this? That would be great.

Thank you in advance :slight_smile:

Thanks, it is working for me.

It worked out that way for me in the end:

       location / {
           proxy_pass http://localhost:8501/dashboard;
       }
       location /dashboard {
           proxy_pass http://localhost:8501/dashboard;
       }
       location ^~ /static {
           proxy_pass http://127.0.0.1:8501/dashboard/static;
       }
       location ^~ /healthz {
           proxy_pass http://127.0.0.1:8501/dashboard/healthz;
       }
       location ^~ /vendor {
           proxy_pass http://127.0.0.1:8501/dashboard/vendor;
       }
       # Update to reflect the endpoint name change
       location /dashboard/_stcore/stream {
           proxy_pass http://localhost:8501/dashboard/_stcore/stream;
           proxy_http_version 1.1;
           proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
           proxy_set_header Host $host;
           proxy_set_header Upgrade $http_upgrade;
           proxy_set_header Connection "upgrade";
           proxy_read_timeout 86400;
       }