Streamlit deployment with an MLflow model

I have a Streamlit app that works locally, but I have a problem with deployment.
I have an ML model loaded from an MLflow workflow:

model_name = "sk-learn-LGBMClassifier"
stage = "Production"
loaded_model = mlflow.sklearn.load_model(model_uri=f"models:/{model_name}/{stage}")

But I can't find a way to use this model, because it is saved locally.
Thanks for your help.


Hi @JohanRocheteau, and welcome to our forums! :raised_hands:

To deploy your Streamlit app with an MLflow model on Streamlit Community Cloud, make sure your ML model is accessible from the Streamlit app once it is deployed.

One way to achieve this is by saving your model to a remote server or storage solution. Here are some options you might consider:

  • AWS S3
  • Google Cloud Storage
  • Azure Blob Storage

I hope this helps! Feel free to reach out if you have any other questions.


This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.