Deployment with Docker + AWS EC2

Thank you for developing streamlit as it helps me to deploy my data science project easily.

Newbie here.

The above has been successful, and I was wondering whether there is a need to use nginx/gunicorn to serve more people. It’s just supposed to be a simple machine learning portfolio website with some interactive charts, and perhaps a way to learn more about data engineering. I have deployed with nginx and gunicorn via Heroku and it’s very easy, but I think AWS should be a little more difficult.

Sample: https://go.aws/2A6vUqI
docker pull andrewng88/streamlit:firsttry
(if you guys want to play with it)

Also, I’m using the Amazon Linux AMI (free tier) at the moment. Do I need to change to something else?

I did a search and some nginx configs pop up, but I’m really unsure how to set this up.

I just added this to my Dockerfile.

# streamlit-specific commands
RUN mkdir -p /root/.streamlit
RUN bash -c 'echo -e "\
[general]\n\
email = \"\"\n\
" > /root/.streamlit/credentials.toml'
RUN bash -c 'echo -e "\
[server]\n\
enableCORS = false\n\
" > /root/.streamlit/config.toml' 
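For context, here is a rough sketch of how those commands might fit into a complete Dockerfile; the base image, app filename (app.py), and requirements file are assumptions on my part, not from the post above:

```dockerfile
# Sketch of a full Streamlit Dockerfile (app.py and requirements.txt are assumed names)
FROM python:3.8-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# streamlit-specific commands (same as the snippet above)
RUN mkdir -p /root/.streamlit
RUN bash -c 'echo -e "\
[general]\n\
email = \"\"\n\
" > /root/.streamlit/credentials.toml'
RUN bash -c 'echo -e "\
[server]\n\
enableCORS = false\n\
" > /root/.streamlit/config.toml'

EXPOSE 8501
CMD ["streamlit", "run", "app.py"]
```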

Please be gentle as I’m really in the learning stage and would want to present a good portfolio.

Thank you.


Hi @Andrew_Ng, glad to hear Streamlit is helping you build your project!

The size of the machine you need is a function of what your app does. If this is a pretty simple app, using the free tier is a great starting place.

In terms of deployment with Docker and nginx, I’ll have to defer to others in the community as I’ve never done it myself. In general, the workflow is that you publish the container’s port (usually 8501) on the AWS machine, then point nginx at the Streamlit URL (usually on port 8501) as a reverse proxy.
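For anyone who lands here later, a reverse-proxy config along these lines is a common starting point; the server_name and file path are placeholders I’ve assumed, and the Upgrade/Connection headers are needed because Streamlit communicates over websockets:

```nginx
# /etc/nginx/conf.d/streamlit.conf  (path and domain are assumptions)
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://localhost:8501;
        # Streamlit uses websockets, so upgrade the connection
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```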

I’ll try to find someone internally who might be able to help as well.

Best,
Randy

Hi @Andrew_Ng! I’m not super familiar with the setup you are proposing, but there are some tutorials that walk through the steps to deploy a Streamlit app on the AWS free tier, in case you haven’t seen them already: https://towardsdatascience.com/how-to-deploy-a-streamlit-app-using-an-amazon-free-ec2-instance-416a41f69dc3

If you are going to run the Streamlit app in a Docker container, one thing I can think of is that you’ll need to forward a port from the host (i.e., the EC2 instance) to the container. For example, if you run docker run -p 8501:8501 <image name> ..., it will make the app available at port 8501 on the EC2 instance. See https://docs.docker.com/engine/reference/commandline/run/#publish-or-expose-port--p---expose for more details.

Cheers,
Amey


Maybe I should rephrase the question.

I managed to deploy via Docker on port 8501 at AWS. It was quite painful but a very rewarding experience.

I was wondering whether there is a need to forward to port 80, and also whether Streamlit is able to serve many requests like a normal website.

I’m just wondering if it’s possible, and whether nginx or gunicorn are some of the technologies I should be looking at.

Thanks :slight_smile:

There’s nothing special about port 80 or 443 other than convention. So if you want to have your app running at andrewdomain.com, then yes, you’d need to use Apache or nginx.

This is a more interesting question…like everything else, the answer is probably an unsatisfying “maybe”! It depends on what the app is doing. If you’re doing really heavy computations, then you’d probably want a few instances running behind a load balancer, so that users aren’t affected by other people’s computations. But by the same token, if you are using st.cache() everywhere you can, and content is being served from RAM, then you’re going to do pretty well on the scalability side.
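To illustrate the caching point with something runnable anywhere: st.cache memoizes a function so that repeated calls with the same inputs are served from memory instead of recomputed. The sketch below uses the standard library’s functools.lru_cache as a stand-in for st.cache, purely to show the effect (Streamlit itself isn’t involved, and expensive_computation is a made-up example):

```python
import functools

call_count = 0  # tracks how many times the function body actually runs

@functools.lru_cache(maxsize=None)
def expensive_computation(n):
    """Stand-in for a heavy computation you'd wrap with st.cache."""
    global call_count
    call_count += 1
    return sum(i * i for i in range(n))

first = expensive_computation(10_000)
second = expensive_computation(10_000)  # same input: served from the cache
```

After both calls the function body has run only once, which is why an app that caches aggressively holds up much better when many users hit it at once.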

So my answer to your overall question is “You shouldn’t worry about it until you know you have a problem”. Which of course applies to so many computer engineering problems :slight_smile:

Thanks for the reply.

So by default we use 8501. Can I just change it to, say, port 88 in the Dockerfile and also in docker run so that it will serve on 88?

P.S.: Can I also use Streamlit to serve output as an API? For example, if I predict spam vs. not spam and the feature is [1,2,3], the output will be 1. Hope this link can describe better what I mean.

Thanks

When you are using Docker, you have the option to map the container port to whatever port you want on the host. See the Docker documentation for the details on port forwarding.

This is not currently possible, but we have several open feature requests for this type of functionality on GitHub. Hopefully it’s something we can make available in the future :slight_smile:

@randyzwitch

Thanks for the prompt reply.

Assuming the usual setup where I EXPOSE 8501 etc. in the Dockerfile, all I have to do is

docker container run -p 8080:8501 -d streamlit:app
(internal Streamlit port 8501 mapped to external port 8080)

So if I have 10 GB on an AWS EC2 instance, can I run two instances of Streamlit that are independent of each other?

Thanks

Yes, that should be it, unless I’m misunderstanding the problem.

You can have as many Docker containers as you want, each running Streamlit and forwarding to different ports, and they will be independent. Whether the EC2 instance you are running on is large enough to support that depends on what your app does.
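As a concrete sketch of running two independent apps on one host (the container names and image tags here are made up for illustration):

```shell
# Two independent Streamlit containers on one EC2 instance,
# each publishing its internal 8501 on a different host port
docker run -d --name portfolio -p 8501:8501 streamlit:app
docker run -d --name spam-demo -p 8502:8501 streamlit:other-app
```

Each container gets its own Streamlit process and its own memory, so one app’s load doesn’t directly affect the other.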

OK thanks, it’s clear now. Cheers.
