Streamlit app crashes when loading model from GCS bucket

I trained a TensorFlow model and saved it as model.h5 in a GCS bucket; it's quite big, around 1.4 GB. When I try to load it into my Streamlit app deployed on Community Cloud, the app crashes. It works fine locally (e.g. streamlit run my_app.py), where load_model takes about 30 seconds. Is it because the .h5 file is too big? What can I do to load it faster?

My code:

import gcsfs
import h5py
import streamlit as st
from keras.models import load_model

@st.cache_resource
def model_loading():
    # Open the model file directly from the GCS bucket.
    fs = gcsfs.GCSFileSystem(project=PROJECT_NAME, token=CREDENTIALS)
    with fs.open(MODEL_PATH, 'rb') as model_file:
        # Wrap the remote file in an h5py File object so Keras can read it.
        model_gcs = h5py.File(model_file, 'r')
        model = load_model(model_gcs)
    return model

Hi @sandyocean

There’s a similar discussion at Large file preventing streamlit deployment, and a possible solution is to use Git LFS (links are provided therein).
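For illustration, here is a minimal sketch of what the loading code might look like once the model file is committed to the app repository via Git LFS (git lfs install, git lfs track "*.h5", then commit .gitattributes and the model file). The file name model.h5 and its location at the repo root are assumptions:

# Minimal sketch, assuming model.h5 is tracked with Git LFS and checked out
# alongside the app, so it can be loaded from a local path.
import streamlit as st
from keras.models import load_model

@st.cache_resource  # load the model once per container and reuse it across sessions
def model_loading():
    return load_model("model.h5")

model = model_loading()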

Best regards,
Chanin


Thank you so much! This worked nicely.


@sandyocean Glad to hear that it worked out!

Best regards,
Chanin

This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.