My app works fine on localhost, but when I deploy it to the cloud it throws some errors.
Here is the repo, where I included a requirements file listing the modules used. What do you think is the reason?
I think it is an incompatibility between some of the packages.
Any specific reason why you chose such an ancient version of Streamlit?
If not, either use the latest versions in your requirements.txt
file or just leave the version pins out:
pandas
streamlit
openpyxl
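If you do decide to pin versions later, one quick way to capture the versions that already work on your machine is the standard-library importlib.metadata (Python 3.8+). A minimal sketch; the package list simply mirrors the three entries above:

```python
# Print each locally installed dependency in "name==version" form,
# ready to paste into requirements.txt.
from importlib.metadata import PackageNotFoundError, version

for pkg in ["pandas", "streamlit", "openpyxl"]:
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"# {pkg} is not installed locally")
```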
That worked, thanks @Franky1
I used an old version because there was a conflict between the click and Streamlit packages…
That was a temporary glitch in one of the Streamlit versions about a year ago, but it was fixed long ago.
I am able to run the app on localhost, but I get an error while deploying on Streamlit Cloud.
I tried downgrading the version as well, but it's not working.
Here is the error
ModuleNotFoundError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you're on Streamlit Cloud, click on "Manage app" in the lower right of your app).
Traceback:
File "/home/appuser/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 565, in _run_script
exec(code, module.__dict__)
File "/app/streamlit_project/app.py", line 4, in <module>
from streamlit_lottie import st_lottie
Hi @Aditya_Singh, this seems unrelated to the original post, but most likely the issue is that you haven't put streamlit-lottie in a requirements.txt file. Look through App dependencies - Streamlit Docs for more details.
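For anyone landing here with the same traceback: the fix is a requirements.txt in the repo root that lists every third-party package the app imports, for example (versions deliberately unpinned, matching the advice earlier in this thread):
streamlit
streamlit-lottie
requests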
Hi @blackary Yeah, I resolved that issue. I added a requirements.txt with streamlit-lottie and requests.
It works just fine
I faced the same issue: I could run streamlit_app.py on the local server successfully, but when I tried to run it on Streamlit Cloud it could not be deployed, and I've been facing this problem even after rebooting the app several times:
[10:31:26] 🐍 Python dependencies were installed from /app/unimate-chatbot/requirements.txt using pip.
Check if streamlit is installed
Streamlit is already installed
[10:31:27] 📦 Processed dependencies!
Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.
Downloading (…)lve/main/config.json: 100%|██████████| 907/907 [00:00<00:00, 234kB/s][2023-11-02 10:33:42.429564]
Downloading pytorch_model.bin: 100%|██████████| 498M/498M [00:01<00:00, 279MB/s][2023-11-02 10:33:44.694558]
Downloading (…)neration_config.json: 100%|██████████| 119/119 [00:00<00:00, 19.9kB/s][2023-11-02 10:33:46.741112]
Downloading (…)okenizer_config.json: 100%|██████████| 727/727 [00:00<00:00, 846kB/s][2023-11-02 10:33:46.917972]
Downloading (…)olve/main/vocab.json: 100%|██████████| 999k/999k [00:00<00:00, 14.9MB/s][2023-11-02 10:33:47.156517]
Downloading (…)olve/main/merges.txt: 100%|██████████| 456k/456k [00:00<00:00, 48.1MB/s][2023-11-02 10:33:47.333449]
Downloading (…)cial_tokens_map.json: 100%|██████████| 438/438 [00:00<00:00, 525kB/s][2023-11-02 10:33:47.844490]
/home/appuser/venv/lib/python3.9/site-packages/transformers/models/auto/modeling_auto.py:1499: FutureWarning: The class `AutoModelWithLMHead` is deprecated and will be removed in a future version. Please use `AutoModelForCausalLM` for causal language models, `AutoModelForMaskedLM` for masked language models and `AutoModelForSeq2SeqLM` for encoder-decoder models.
warnings.warn(
Downloading (…)lve/main/config.json: 100%|██████████| 903/903 [00:00<00:00, 1.07MB/s][2023-11-02 10:33:48.195562]
Downloading pytorch_model.bin: 100%|██████████| 498M/498M [00:02<00:00, 202MB/s][2023-11-02 10:33:51.202642]
Downloading (…)neration_config.json: 100%|██████████| 119/119 [00:00<00:00, 27.7kB/s][2023-11-02 10:33:53.080528]
Downloading (…)okenizer_config.json: 100%|██████████| 26.0/26.0 [00:00<00:00, 29.1kB/s][2023-11-02 10:33:53.264995]
Downloading (…)lve/main/config.json: 100%|██████████| 641/641 [00:00<00:00, 714kB/s][2023-11-02 10:33:53.442568]
Downloading (…)olve/main/vocab.json: 100%|██████████| 1.04M/1.04M [00:00<00:00, 16.4MB/s][2023-11-02 10:33:53.693913]
Downloading (…)olve/main/merges.txt: 100%|██████████| 456k/456k [00:00<00:00, 56.1MB/s][2023-11-02 10:33:53.888566]
[10:06:02] ❗️ Streamlit server consistently failed status checks
[10:06:02] ❗️ Please fix the errors, push an update to the git repo, or reboot the app.
Link to my GitHub repo here. FYI, I am trying to deploy a chatbot, which requires downloading transformer models from Hugging Face.
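As an aside on the FutureWarning in the log above: it is harmless but easy to silence by replacing the deprecated AutoModelWithLMHead with the task-specific class the warning names. A minimal sketch, assuming an encoder-decoder (seq2seq) chatbot checkpoint; the checkpoint argument is a placeholder, not taken from the repo:

```python
# AutoModelWithLMHead is deprecated; the warning recommends the
# task-specific class AutoModelForSeq2SeqLM for encoder-decoder models.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

def load_chatbot(checkpoint: str):
    # checkpoint is hypothetical here: pass the model id the app already loads
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)
    return tokenizer, model
```

Use AutoModelForCausalLM instead if the checkpoint is a decoder-only model.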
Attached is the content of my requirements.txt:
streamlit
streamlit-chat
streamlit-extras
transformers
torch
nltk
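A pinned variant of this file can make local and Cloud installs identical; the versions below are illustrative examples only, not taken from the repo:
streamlit==1.28.0
streamlit-chat==0.1.1
streamlit-extras==0.3.4
transformers==4.35.0
torch==2.1.0
nltk==3.8.1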
Someone please enlighten me; I would appreciate your help. Thank you.
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.