UnhashableTypeError: Cannot hash object of type _thread.lock, found in the body of load_detector_model(). While caching the body of load_detector_model(), Streamlit encountered an object of type _thread.lock, which it does not know how to hash

Here's the Google Colab link to replicate the error.

The error has been occurring only on Google Colab, and I am not able to figure out why.

I am trying to use st.cache() on load_detector_model and on the inference function, so that after predicting on the cropped image they won't reload when I change the parameters (the sliders in the side menu).
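Roughly, this is what I'm doing (a simplified sketch; the model path and the second function name are placeholders, not my exact app code):

    import numpy as np
    import streamlit as st
    from keras_retinanet import models

    @st.cache(suppress_st_warning=True, allow_output_mutation=True)
    def load_detector_model(model_path="snapshots/plate_detector.h5"):
        # Load the keras-retinanet inference model once per session
        return models.load_model(model_path, backbone_name="resnet50")

    @st.cache(suppress_st_warning=True, allow_output_mutation=True)
    def run_inference(model, image):
        # Predict boxes / scores / labels on a single preprocessed image
        return model.predict_on_batch(np.expand_dims(image, axis=0))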

Here's the full traceback:

File "/content/Automatic-License-Plate-Recognition/app.py", line 53, in <module>
    @st.cache(suppress_st_warning=True, allow_output_mutation=True)
File "/usr/lib/python3.6/copyreg.py", line 65, in _reduce_ex
    raise TypeError("can't pickle %s objects" % base.__name__)

Also, would you kindly point me towards resources that can help me display a generated / cropped image on another Streamlit page? As soon as I change the page, the previous elements seem to get discarded.
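In case it helps to see what I mean, here is a rough sketch of the pattern I'm considering, keeping the crop in st.session_state so it survives page switches (placeholder names, a dummy image in place of the real detector output, and it assumes a Streamlit version that has st.session_state):

    import numpy as np
    import streamlit as st

    if "cropped_plate" not in st.session_state:
        st.session_state["cropped_plate"] = None

    page = st.sidebar.radio("Page", ["Detect", "Results"])

    if page == "Detect":
        # Stand-in for the real detector output: a dummy image crop
        crop = (np.random.rand(100, 200, 3) * 255).astype(np.uint8)
        if st.button("Run detection"):
            st.session_state["cropped_plate"] = crop
            st.image(crop, caption="Cropped plate (just detected)")
    else:
        if st.session_state["cropped_plate"] is not None:
            # The crop persists across pages because it lives in session_state
            st.image(st.session_state["cropped_plate"], caption="Cropped plate (from the Detect page)")
        else:
            st.write("No plate detected yet.")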

It's a problem specific to keras-retinanet.

I found an UGLY workaround…

# _SYMBOLIC_SCOPE is a thread-local flag in the Keras TensorFlow backend; Streamlit
# runs the script outside the main thread, so the flag has to be set there as well.
import keras.backend.tensorflow_backend as tb
tb._SYMBOLIC_SCOPE.value = True


Thanks for coming back around on these threads @dracarys3 and following up. The next person running into an awkward problem like this thanks you 🙂
