Deploying a Streamlit app that uses a TensorFlow model saved in SavedModel format

I’m trying to build an app that loads a model saved in TensorFlow’s SavedModel format. The app renders correctly and loads the model when I run it locally. I successfully pushed the model to GitHub using Git LFS (because the model is larger than 1 GB). The app deployed successfully, but the moment I try to load the model in the deployed app, it crashes instantly. The deployment logs after the crash are shown below. Please suggest anything I can do to avoid these issues.
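For reference, tracking a large SavedModel directory with Git LFS comes down to a `.gitattributes` entry like the one below (the `saved_model/` path is an assumption; substitute your actual model directory):

```
saved_model/** filter=lfs diff=lfs merge=lfs -text
```

This is what `git lfs track "saved_model/**"` writes for you; the `.gitattributes` file itself must be committed alongside the model so GitHub stores the large files as LFS pointers.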

2023-07-04 17:04:10.968606: I tensorflow/tsl/cuda/cudart_stub.cc:28] Could not find cuda drivers on your machine, GPU will not be used.
2023-07-04 17:04:11.007242: I tensorflow/tsl/cuda/cudart_stub.cc:28] Could not find cuda drivers on your machine, GPU will not be used.
2023-07-04 17:04:11.008726: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-07-04 17:04:11.978552: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
[17:06:41] ❗️ Streamlit server consistently failed status checks

Hey @Siddharth_Reddy,

Thanks for sharing this question! It sounds like your app is most likely crashing on Community Cloud due to resource usage exceeding the 1GB limit – if you can share the link to the app, I can double-check / share more info on what the resource usage for that app looks like.

Hi Caroline, thank you for your reply. I fixed my issue, and yes, you are correct: my app was exceeding the 1 GB resource usage limit. I have now deployed my app to Hugging Face Spaces and it works perfectly!

