Thank you for developing Streamlit; it helps me deploy my data science projects easily.
Newbie here.
The above has been successful, and I was wondering whether I need to use nginx/gunicorn to serve more people. It's just supposed to be a simple machine learning portfolio website with some interactive charts, and perhaps a way to learn more about data engineering. I have deployed nginx and gunicorn via Heroku, and it's very easy, but I think AWS will be a little more difficult.
sample https://go.aws/2A6vUqI
docker pull andrewng88/streamlit:firsttry (if you guys want to play with it)
Also, I'm using a Linux AMI (free tier) at the moment. Do I need to change to something else?
I did a search and some nginx configs pop up, but I'm really unsure how to set them up.
Hi @Andrew_Ng, glad to hear Streamlit is helping you build your project!
The size of the machine you need is a function of what your app does. If this is a pretty simple app, using the free tier is a great starting place.
In terms of deployment with Docker and nginx, I'll have to defer to others in the community, as I've never done it myself. In general, the workflow is: expose the Docker container's port (usually 8501) on the AWS machine, then point nginx at the Streamlit URL (usually on port 8501) as a reverse proxy.
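For reference, a minimal sketch of what that reverse-proxy config could look like. This assumes the app listens on localhost:8501 and uses the andrewdomain.com name mentioned later in the thread; adapt both to your setup. Streamlit keeps a WebSocket open to the browser, so the upgrade headers are forwarded:

```nginx
server {
    listen 80;
    server_name andrewdomain.com;

    location / {
        proxy_pass http://127.0.0.1:8501;
        proxy_http_version 1.1;
        # Streamlit uses a WebSocket connection, so forward the upgrade headers
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_read_timeout 86400;
    }
}
```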
I'll try to find someone internally who might be able to help as well.
If you are going to run the Streamlit app in a Docker container, one thing I can think of is that you'll need to forward a port from the host (i.e., the EC2 instance) to the container. For example, if you run docker run -p 8501:8501 <image name> ..., it will make the app available at port 8501 on the EC2 instance. See https://docs.docker.com/engine/reference/commandline/run/#publish-or-expose-port--p---expose for more details.
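To make that concrete, here is a minimal Dockerfile sketch for a Streamlit app. The file names (app.py, requirements.txt) and the Python version are assumptions, not from the original image:

```dockerfile
FROM python:3.9-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# Streamlit's default port
EXPOSE 8501
CMD ["streamlit", "run", "app.py", "--server.port=8501", "--server.address=0.0.0.0"]
```

You would then build and run it with something like `docker build -t my-streamlit .` followed by `docker run -p 8501:8501 my-streamlit`.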
There's nothing special about ports 80 or 443 other than convention. So if you want to have your app running at andrewdomain.com, then yes, you'd need to use Apache or nginx as a reverse proxy.
This is a more interesting question… like everything else, the answer is probably an unsatisfying "maybe"! It depends on what the app is doing. If you're doing really heavy computations, then you'd probably want to have a few instances running behind a load balancer, so that users wouldn't be affected by other people's computations. But by the same token, if you are using st.cache() everywhere you can, and content is being served from RAM, then you're going to do pretty well on the scalability side.
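The caching idea can be illustrated with plain memoization: st.cache keeps results keyed by the function's inputs in RAM and reuses them. This sketch uses the standard library's functools.lru_cache to show the principle (not Streamlit's actual implementation):

```python
from functools import lru_cache


@lru_cache(maxsize=None)
def expensive_computation(n):
    # Stand-in for a slow data load or model inference step
    return sum(i * i for i in range(n))


expensive_computation(10_000)  # first call: actually computed
expensive_computation(10_000)  # repeat call: served from the in-memory cache
```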
So my answer to your overall question is "You shouldn't worry about it until you know you have a problem", which of course applies to so many computer engineering problems.
So by default we use 8501. Can I just change it to, say, port 88 in the Dockerfile and also in docker run, so that it will serve on 88?
P.S.: Can I also use Streamlit to serve an API? For example, if I predict spam vs. not spam and the features are [1, 2, 3], the output would be 1. Hope this link describes what I mean better.
This is not currently possible, but we have several open feature requests for this type of functionality on GitHub. Hopefully it's something we can make available in the future.
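In the meantime, a common workaround is to run a tiny separate service next to the Streamlit app for the JSON endpoint. This is a minimal sketch using only the Python standard library; the predict function is a toy stand-in for a real spam model, and the port and payload shape are assumptions:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def predict(features):
    """Toy stand-in for a real spam classifier (assumption, not an actual model)."""
    return 1 if sum(features) > 5 else 0


class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read a JSON request body such as {"features": [1, 2, 3]}
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"prediction": predict(payload["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging


# To run it standalone (port 8000 is arbitrary):
# HTTPServer(("0.0.0.0", 8000), PredictHandler).serve_forever()
```

POSTing {"features": [1, 2, 3]} to it returns {"prediction": 1} with this toy rule; a real deployment would swap in the trained model.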
Yes, that should be it, unless Iām misunderstanding the problem.
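One detail worth noting: you often don't need to change Streamlit's internal port at all, because Docker can map host port 88 to container port 8501. A sketch, reusing the image name from earlier in the thread (app.py in the second command is an assumption):

```shell
# Map host port 88 to the container's default Streamlit port 8501
docker run -p 88:8501 andrewng88/streamlit:firsttry

# Or, if you want Streamlit itself to listen on 88 inside the container,
# override the command and map 88 to 88:
docker run -p 88:88 andrewng88/streamlit:firsttry \
    streamlit run app.py --server.port=88 --server.address=0.0.0.0
```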
You can have as many Docker containers as you want, each running Streamlit and forwarding to different ports, and they will be independent. Whether the instance you are running on is large enough to support that depends on what your apps do.