I have trained a model with the Keras framework, exported it with
model.save('model.hdf5'), and now I want to integrate it with the awesome Streamlit.
Obviously, I do not want to reload the model every time the end user submits a new input; I want to load it once and for all. So my code looks something like this:
    import streamlit as st
    from keras.models import load_model

    @st.cache
    def load_my_model():
        model = load_model('model.hdf5')
        model.summary()
        return model

    if __name__ == '__main__':
        st.title('My first app')
        sentence = st.text_input('Input your sentence here:')
        model = load_my_model()
        if sentence:
            y_hat = model.predict(sentence)
With that I get:

    streamlit.errors.UnhashableType: <exception str() failed>
I then tried
@st.cache(allow_output_mutation=True), but when I run a query on the Streamlit page I get:

    TypeError: Cannot interpret feed_dict key as Tensor: Tensor Tensor("input_1:0", shape=(?, 80), dtype=int32) is not an element of this graph.
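For reference, the workaround I keep seeing suggested for this "not an element of this graph" error with TF1-era Keras (a sketch only; I have not verified it on these exact versions) is to capture the default graph inside the cached loader and re-enter it before predicting:

```python
import streamlit as st
import tensorflow as tf
from keras.models import load_model

@st.cache(allow_output_mutation=True)
def load_my_model():
    model = load_model('model.hdf5')
    model._make_predict_function()  # build the predict function up front
    graph = tf.get_default_graph()  # remember the graph the model was loaded into
    return model, graph

if __name__ == '__main__':
    st.title('My first app')
    model, graph = load_my_model()
    sentence = st.text_input('Input your sentence here:')
    if sentence:
        with graph.as_default():    # run predict inside the model's own graph
            y_hat = model.predict(sentence)
```

Is something along these lines the intended pattern, or is there a cleaner way?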
(Of course, without any cache decorator the model loads and works fine.)
How should I properly load and cache a trained Keras model?
Python ver: 2.7 (unfortunately)
Keras ver: 2.1.3
Tensorflow ver: 1.3.0
Streamlit ver: 0.55.2
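(An aside, in case the snippet above is a simplification: the error shows the model's input is shape (?, 80), dtype int32, so the raw sentence presumably has to be tokenized and padded to 80 integer ids before model.predict. In Keras that would be Tokenizer plus pad_sequences; the pure-Python sketch below, with a made-up word_index, only illustrates the shape requirement.)

```python
def encode(sentence, word_index, maxlen=80):
    # Map each word to an integer id (0 for unknown words), then
    # truncate/pad to exactly `maxlen` ids, matching the (?, 80) input.
    ids = [word_index.get(w, 0) for w in sentence.lower().split()][:maxlen]
    return ids + [0] * (maxlen - len(ids))

word_index = {'hello': 1, 'world': 2}  # made-up vocabulary for illustration
vector = encode('hello world', word_index)
print(len(vector))  # -> 80
```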