I have been updating my apps to recent Streamlit versions. There, the old st.cache has been deprecated in favor of st.cache_resource and st.cache_data. Unfortunately, this seems to imply that hash_funcs are also deprecated. I find this very unfortunate, and I wonder whether an alternative is even possible, or whether the deprecation means that I won't be able to use caching for my function in the future.
import streamlit as st


class Predictor:
    def __init__(self, model_name):
        self.model_name = model_name
        self.model = get_model(model_name)


@st.cache_data(show_spinner=False)
def predict(query: str, predictor: Predictor):
    # ... some code that uses predictor to predict on query and returns a prediction.
    # While predictor might not be hashable, a hash_func would be very useful here,
    # because then we could simply specify {Predictor: lambda predictor: predictor.model_name},
    # since model_name is the distinguishing factor.
    ...


@st.cache_resource(show_spinner=False)
def get_model(model_name: str, no_cuda: bool = False):
    # ... some PyTorch code that returns a model specific to model_name
    ...
I do not see how I can use the cache in predict with the current implementation, because I cannot use hash_funcs. I cannot simply ignore the predictor argument, because different predictors/models should of course return different results. With hash_funcs I could easily distinguish between them by their model_name property.
What would be the current recommended way of using cache in the predict function above? Note that Predictor actually has a lot of input arguments so I would not like to have the Predictor init inside the predict function.
I don’t know if this is the official solution; it’s more a hack that came to my mind:
@st.cache_data(show_spinner=False)
def predict(query: str, _predictor: Predictor, model_name: str):
    # _predictor: the leading underscore tells Streamlit not to hash this argument
    # model_name: just pass predictor.model_name to make the cache key unique
    ...
I see what you mean, thanks. That would work, but it is of course not optimal (and linters won’t like the unused argument). I was hoping there was a better solution, but thanks for the suggestion.
If a more optimal/official solution is available from others I am glad to hear it.
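For what it’s worth, more recent Streamlit releases reintroduced a hash_funcs parameter on st.cache_data and st.cache_resource (around 1.25, if I remember right), which restores the original approach: @st.cache_data(hash_funcs={Predictor: lambda p: p.model_name}). Conceptually it behaves like this toy memoizer (my own sketch, not Streamlit code), which looks up a hash function by argument type:

```python
from functools import wraps

def cache_data(hash_funcs=None):
    """Toy memoizer mimicking hash_funcs semantics: an argument whose type
    appears in hash_funcs is reduced to a hashable key by that function."""
    hash_funcs = hash_funcs or {}

    def decorator(fn):
        store = {}

        @wraps(fn)
        def wrapper(*args):
            key = tuple(
                hash_funcs[type(a)](a) if type(a) in hash_funcs else a
                for a in args
            )
            if key not in store:
                store[key] = fn(*args)
            return store[key]

        wrapper._store = store  # exposed only for illustration
        return wrapper

    return decorator

class Predictor:
    def __init__(self, model_name):
        self.model_name = model_name

@cache_data(hash_funcs={Predictor: lambda p: p.model_name})
def predict(query, predictor):
    return f"prediction for {query!r} by {predictor.model_name}"
```

With this, two Predictor instances that share a model_name hit the same cache entry, while different model names produce separate entries — exactly the behavior the question asks for.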