I got my app to work (well, sort of) - shoutout to @snehankekre and @blackary for the help on a previous post.
Good stuff to know
I started learning machine learning and coding in Python roughly 3 months ago, and this is my first Streamlit app. I am trying to create a simple license plate detector that returns the plate as a string, looks it up in a license plate database, and performs a Google search for the car's brand, model and year.
The app right now shows an error when no picture is uploaded (that's okay, because the functionality is there once a picture is uploaded).
My issue
However, whenever a picture is uploaded, the model starts training, which takes 10-12 minutes every time. This obviously isn't ideal, and I would like to know if there is a solution such that the model could be loaded into the app once and then used to predict on new pictures coming in.
Thanks for sharing your question and congrats on getting your app live!
Streamlit Community Cloud now serves as a free resource for users to get started with Streamlit and deploy 1 private app + unlimited public apps. As such, it is not designed for ML training jobs. It is expected behavior that model training takes a long time.
I would like to know if there is a solution such that the model could be uploaded into the app and then predict on new pictures coming in.
Yes, that is possible. You could upload the model to GitHub (using Git LFS if it exceeds 25 MB) and, in your app, cache it with @st.experimental_singleton. The solution you propose would work too, albeit it might not be the ideal user experience: it is possible for users to upload models with st.file_uploader and for the app to use that model for inference.
Ideally you want to train the model offline (not on Community Cloud), save it, perhaps even compress/distill the Keras model, upload it to a service like GitHub, AWS, or GCP, and then download/load the model in your app. Running ML training jobs on Community Cloud would exhaust your app's 1 GB of RAM fairly quickly.
Figuring out how to refactor your code, though, is your prerogative. All the best, we can't wait to see what you build!
Thanks for the insightful perspective! I figured there’d be a smarter way of doing this.
Seems like uploading the model to my GitHub repo is the way to go - but then how do I save the model? I have it functioning locally, and the app works as well. How do I go about exporting my model as a file? Uploading it is not the issue, but I'm curious as to what type of file a machine learning model is.
Thanks for your help - you have been an awesome source of ideas/solutions during this!
Uploading it is not the issue, but I’m curious as to what type of file a machine learning model is.
Ah, I see! Thanks for clarifying! From your code, it looks like you're using Keras/TensorFlow. I'd suggest poring over documentation and tutorials on saving/loading models, plus searching GitHub for examples of loading saved TF/Keras models. It should be just 1-3 lines of extra code:
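For a Keras model it really is a couple of lines: `model.save` writes the architecture, weights, and optimizer state to a single file (an HDF5 `.h5` file here, which answers the "what type of file" question), and `tf.keras.models.load_model` restores it. A sketch with a tiny stand-in model in place of your detector:

```python
import tensorflow as tf

# Stand-in model - substitute your trained license-plate detector.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

# One file holding architecture + weights + optimizer state.
model.save("plate_model.h5")

# Later (e.g. in the Streamlit app), load it back.
restored = tf.keras.models.load_model("plate_model.h5")
```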
You probably should not rely on the following since it’s from 2 years ago, but here’s an example from one of my repos on saving TF models.
I cannot thank you enough for the help you have given me in my first baby steps toward learning data science and machine learning. It feels marvelous to know that I started three to four months ago and am coming up with this now thanks to your help.
Glad you've got it working! It's all you! You can pay it forward by dropping by the forums any time to help other community members get unstuck, or by sharing other cool Streamlit apps you've built with those still unaware of Streamlit.