I deployed the model as required, but it doesn’t load properly, and I can’t tell what went wrong:
────────────────────────────────────────────────────────────────────────────────────────
[16:30:27] 🐍 Python dependencies were installed from /app/lut_streamlit/requirements.txt using pip.
Check if streamlit is installed
Streamlit is already installed
[16:30:28] 📦 Processed dependencies!
Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.
Note that I have no problem running Streamlit locally; this only happens with the Streamlit Cloud deployment.
I think the reason is:
- My model requires a GPU for inference, but CUDA is not available on Streamlit Cloud. So I added a check for whether CUDA is available, falling back to CPU when it isn’t (see the sketch after this list), but the app still shows no response.
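For context, here is a minimal sketch of the kind of device check I mean (assuming a PyTorch model; `load_model` and the `model.pth` path are illustrative placeholders, not my actual code):

```python
import streamlit as st
import torch

# Streamlit Community Cloud machines have no GPU, so CUDA will never be
# available there; fall back to CPU instead of assuming "cuda" exists.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

@st.cache_resource  # load the model once and reuse it across reruns
def load_model():
    # "model.pth" is a placeholder for the actual checkpoint file.
    model = torch.load("model.pth", map_location=device)
    model.eval()
    return model

model = load_model()
```

The important part is `map_location=device`, so a checkpoint saved on a GPU machine can still be deserialized on a CPU-only cloud instance.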
My GitHub repository: https://github.com/XLR-man/LUT_streamlit
The entry point is runstreamlit.py.
Deployed app: https://xlr-man-lut-streamlit-runstreamlit-fl31om.streamlit.app/
Whether I open the app from my own computer or from someone else’s, it shows a blank page, and the logs display no errors.