Deployment Troubleshooting

I am trying to deploy a lung abnormality detection app where the user submits a radiograph and the model detects abnormalities. The app works beautifully on my local computer, but fails at deployment with the error below. Anyone have any ideas?

Current working directory: /mount/src/chest-x-ray-abnormality-detection-multi-label-cnn

Model file path: /mount/src/chest-x-ray-abnormality-detection-multi-label-cnn/final_df.pkl

FileNotFoundError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you’re on Streamlit Cloud, click on ‘Manage app’ in the lower right of your app).

Traceback:

File "/home/adminuser/venv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 534, in _run_script
    exec(code, module.__dict__)
File "/mount/src/chest-x-ray-abnormality-detection-multi-label-cnn/lung_abnormality_detection_app.py", line 23, in <module>
    loaded_model = joblib.load(model_file)
                   ^^^^^^^^^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/joblib/numpy_pickle.py", line 648, in load
    obj = _unpickle(fobj)
          ^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/joblib/numpy_pickle.py", line 577, in _unpickle
    obj = unpickler.load()
          ^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/pickle.py", line 1213, in load
    dispatch[key[0]](self)
File "/usr/local/lib/python3.11/pickle.py", line 1590, in load_reduce
    stack[-1] = func(*args)
                ^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/scikeras/_saving_utils.py", line 49, in unpack_keras_model
    model: keras.Model = load_model(temp_dir)
                         ^^^^^^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/keras/src/saving/saving_api.py", line 238, in load_model
    return legacy_sm_saving_lib.load_model(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/keras/src/utils/traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
File "/home/adminuser/venv/lib/python3.11/site-packages/tensorflow/python/saved_model/load.py", line 991, in load_partial
    raise FileNotFoundError(
  • Where do you deploy? Streamlit Cloud?
  • Can you please share a link to your public github repo?

Hi @Kendall_McNeil :wave:

Welcome to our community!

First, verify the file path to your trained model (final_df.pkl). Given the FileNotFoundError, the path may differ between your local development environment and the deployment environment.

Could you please confirm that the model file is indeed located at the specified path in the deployment environment?
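A quick way to confirm this inside the deployed app is to print the resolved path and check whether the file actually exists there (a minimal sketch; it assumes final_df.pkl sits next to the app script, as your log output suggests):

```python
import os

# Resolve the model path relative to the current working directory,
# then confirm the file is actually present in the deployment container.
model_file = os.path.join(os.getcwd(), "final_df.pkl")

print("Current working directory:", os.getcwd())
print("Model file path:", model_file)
print("File exists:", os.path.exists(model_file))
```

If "File exists" prints False on Streamlit Cloud, the file was not pushed to the repo or lives at a different path.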

Best wishes,
Charly

Hello! Thanks for the help, as I am new to Streamlit. Here is my GitHub repo: kmcneil901/Chest-X-Ray-Abnormality-Detection-Multi-Label-CNN (the objective of the project is to detect 14 common thoracic lung abnormalities in chest x-rays by building a Convolutional Neural Network (CNN)).

I am almost positive the model is at the current path, as it was pushed to the repo. That is where it should be for the deployment environment, correct? I printed the path, and it looks correct for both local and deployment.

After some more troubleshooting, I am now receiving the following error:

  File "/home/adminuser/venv/lib/python3.11/site-packages/joblib/numpy_pickle.py", line 648, in load
    obj = _unpickle(fobj)
          ^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.11/site-packages/joblib/numpy_pickle.py", line 577, in _unpickle
    obj = unpickler.load()
          ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/pickle.py", line 1213, in load
    dispatch[key[0]](self)
  File "/usr/local/lib/python3.11/pickle.py", line 1590, in load_reduce
    stack[-1] = func(*args)
                ^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.11/site-packages/scikeras/_saving_utils.py", line 49, in unpack_keras_model
    model: keras.Model = load_model(temp_dir)
                         ^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.11/site-packages/keras/src/saving/saving_api.py", line 238, in load_model
    return legacy_sm_saving_lib.load_model(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.11/site-packages/keras/src/utils/traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/home/adminuser/venv/lib/python3.11/site-packages/tensorflow/python/saved_model/load.py", line 991, in load_partial
    raise FileNotFoundError(
FileNotFoundError: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for ram://10766e14d8b74364a5223461a674b378/variables/variables
 You may be trying to load on a different device from the computational device. Consider setting the `experimental_io_device` option in `tf.saved_model.LoadOptions` to the io_device such as '/job:localhost'.
  • The file path to the model is not the problem
  • Your requirements.txt file will not work on streamlit cloud

See my debug branch for some quick fixes:
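The debug branch itself is linked above rather than reproduced here, but for context: Streamlit Cloud installs dependencies from a plain requirements.txt at the repo root. A minimal set covering the packages that appear in the traceback might look like this (unpinned and illustrative only; the actual branch may pin versions differently):

```text
streamlit
tensorflow
scikeras
joblib
numpy
```

Local exports such as `pip freeze` output with Windows-only or conda-only packages typically fail to install on Streamlit Cloud, which is why a hand-trimmed list like this is safer.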

Hi - Thank you so much for looking into my repo and cleaning it up a bit! Was it working on your end? Unfortunately, I am still receiving the error below even after using the cleaned up versions you provided. Any ideas?

FileNotFoundError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you’re on Streamlit Cloud, click on ‘Manage app’ in the lower right of your app).

Traceback:

File "/home/adminuser/venv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 534, in _run_script
    exec(code, module.__dict__)
File "/mount/src/chest-x-ray-abnormality-detection-multi-label-cnn/lung_abnormality_detection_app.py", line 10, in <module>
    loaded_model = joblib.load(model_path)
                   ^^^^^^^^^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/joblib/numpy_pickle.py", line 658, in load
    obj = _unpickle(fobj, filename, mmap_mode)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/joblib/numpy_pickle.py", line 577, in _unpickle
    obj = unpickler.load()
          ^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/pickle.py", line 1213, in load
    dispatch[key[0]](self)
File "/usr/local/lib/python3.11/pickle.py", line 1590, in load_reduce
    stack[-1] = func(*args)
                ^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/scikeras/_saving_utils.py", line 49, in unpack_keras_model
    model: keras.Model = load_model(temp_dir)
                         ^^^^^^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/keras/src/saving/saving_api.py", line 238, in load_model
    return legacy_sm_saving_lib.load_model(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.11/site-packages/keras/src/utils/traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
File "/home/adminuser/venv/lib/python3.11/site-packages/tensorflow/python/saved_model/load.py", line 991, in load_partial
    raise FileNotFoundError(

and this is in the manage app message:

File "/home/adminuser/venv/lib/python3.11/site-packages/keras/src/utils/traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/home/adminuser/venv/lib/python3.11/site-packages/tensorflow/python/saved_model/load.py", line 991, in load_partial
    raise FileNotFoundError(
FileNotFoundError: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for ram://ec7c434f0b9b4065a57b2ee83122cb7a/variables/variables
 You may be trying to load on a different device from the computational device. Consider setting the `experimental_io_device` option in `tf.saved_model.LoadOptions` to the io_device such as '/job:localhost'.

Yes, it worked on my local Windows computer; I haven't tried it on Streamlit Cloud yet.

On your local computer or on Streamlit Cloud?

It is working great on my local still. But I am receiving this error when I attempt to deploy.

I also got the same error when I tried it in a local Docker container.
I think the reason is probably that you are using pickle/joblib instead of Keras' own methods. I would save the original model as HDF5 (.h5) with Keras and also load the model with Keras.

Here are some hints:
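A sketch of that approach (filenames are illustrative, and a tiny stand-in model is built here instead of the actual CNN; for the real app, the save step would run once locally, where the pickle still loads, via something like `joblib.load("final_df.pkl")` on the SciKeras wrapper, after which the .h5 file is committed and loaded with Keras in the deployed app):

```python
import numpy as np
from tensorflow import keras

# Stand-in for the trained CNN (locally you would instead extract the
# Keras model from the SciKeras wrapper that the pickle contains).
model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(2, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Save once with Keras' own HDF5 format instead of pickling the wrapper...
model.save("cnn_model.h5")

# ...then, in the Streamlit app, load it back with Keras directly,
# so no pickle/joblib machinery is involved at deployment time.
loaded_model = keras.models.load_model("cnn_model.h5")

# Sanity check: the round trip preserves predictions exactly.
x = np.zeros((1, 4), dtype="float32")
assert np.allclose(model.predict(x), loaded_model.predict(x))
```

Because the .h5 file is a self-contained artifact on disk, it avoids the `ram://` temp-directory indirection that scikeras' pickling relies on, which is what the TensorSliceReader error points at.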

THIS WORKED! Thank you thank you thank you! It has been a long 48 hours of debugging and I am super thankful it is now working. Appreciate your time and support.


This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.

Glad you got it sorted in the end, @Kendall_McNeil!

Kudos to @Franky1 for the support here!

Best,
Charly