Proper way to do model serving on Streamlit?

Good day to all,
Please advise: what is the proper way to do model serving in Streamlit?
Let's say I have a model that does credit scoring, and a system that originates input data for the model.
The flow of input data is stochastic/on demand.
What is the proper way of interfacing that input data with the model wrapped inside a Streamlit app?

Thanks, Paul.

Hi Paul!

Thanks for posting! Let’s dig a bit deeper into what you’re hoping to do.

Are you looking for a way to serve the credit scoring model so that another machine can consume the result? Or, are you looking for a way to have a human understand and interpret the results of the model outputs for a particular input?

Do you need a way to have a human input the model’s inputs? Or, are you trying to pass in the inputs programmatically?

Best,
John

Hi John, thanks for a prompt response.
My question was about the data feed to a Streamlit app with the model inside, i.e. it relates to your clarifying question:

Or, are you trying to pass in the inputs programmatically?

When another system has new data for scoring, it would call the Streamlit app, passing the data, and either get a response back or have the Streamlit app update its GUI with the new score on a dashboard.

This one:

Do you need a way to have a human input the model’s inputs?

is straightforward with Streamlit input controls.

This one:

Are you looking for a way to serve the credit scoring model so that another machine can consume the result?

is better off with a RESTful API, e.g. Flask or Spring Boot.
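For completeness, the machine-to-machine case Paul describes could be sketched in Flask roughly as below. The `/score` route, the payload fields, and the `score` function are all hypothetical placeholders, not part of the original thread.

```python
# Minimal Flask sketch of machine-to-machine model serving.
# The /score route, payload fields, and score() are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

def score(income: float, debt: float) -> float:
    """Placeholder for the real credit-scoring model."""
    return max(0.0, min(1.0, income / (income + debt + 1e-9)))

@app.route("/score", methods=["POST"])
def score_endpoint():
    payload = request.get_json(force=True)
    result = score(float(payload["income"]), float(payload["debt"]))
    return jsonify({"score": result})

# Exercise the endpoint in-process with Flask's test client;
# in production this would run behind a WSGI server instead.
client = app.test_client()
resp = client.post("/score", json={"income": 3000.0, "debt": 500.0})
print(resp.get_json())
```

The calling system then POSTs its data to the endpoint and consumes the JSON response, with no GUI involved.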

Regards, Paul.

Hey Paul - there's currently no good way to get a Streamlit app to automatically re-run in response to some external, non-user input.

It’s something we’re currently thinking about, however! There’s an open feature request here; feel free to follow it, and chime in if you have input on the direction you’d like to see Streamlit take here!