transformers in requirements.txt, but error: No module named 'transformers'

In my case, this command helped to install the transformers package in Anaconda:

conda install -c huggingface transformers
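
If you are not using conda, the plain pip equivalent (which is also how Streamlit Cloud installs requirements.txt, as the log below shows) should be:

pip install transformers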

I faced an issue whereby I could run streamlit_app.py successfully on the local server; however, when I tried to run it on Streamlit Cloud, it could not be deployed, and I have been facing this problem even after rebooting the app several times:

[10:31:26] 🐍 Python dependencies were installed from /app/unimate-chatbot/requirements.txt using pip.

Check if streamlit is installed

Streamlit is already installed

[10:31:27] 📦 Processed dependencies!


Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.

Downloading (…)lve/main/config.json: 100%|██████████| 907/907 [00:00<00:00, 234kB/s] [2023-11-02 10:33:42.429564]

Downloading pytorch_model.bin: 100%|██████████| 498M/498M [00:01<00:00, 279MB/s] [2023-11-02 10:33:44.694558]

Downloading (…)neration_config.json: 100%|██████████| 119/119 [00:00<00:00, 19.9kB/s] [2023-11-02 10:33:46.741112]

Downloading (…)okenizer_config.json: 100%|██████████| 727/727 [00:00<00:00, 846kB/s] [2023-11-02 10:33:46.917972]

Downloading (…)olve/main/vocab.json: 100%|██████████| 999k/999k [00:00<00:00, 14.9MB/s] [2023-11-02 10:33:47.156517]

Downloading (…)olve/main/merges.txt: 100%|██████████| 456k/456k [00:00<00:00, 48.1MB/s] [2023-11-02 10:33:47.333449]

Downloading (…)cial_tokens_map.json: 100%|██████████| 438/438 [00:00<00:00, 525kB/s] [2023-11-02 10:33:47.844490]

/home/appuser/venv/lib/python3.9/site-packages/transformers/models/auto/modeling_auto.py:1499: FutureWarning: The class `AutoModelWithLMHead` is deprecated and will be removed in a future version. Please use `AutoModelForCausalLM` for causal language models, `AutoModelForMaskedLM` for masked language models and `AutoModelForSeq2SeqLM` for encoder-decoder models.

  warnings.warn(

Downloading (…)lve/main/config.json: 100%|██████████| 903/903 [00:00<00:00, 1.07MB/s] [2023-11-02 10:33:48.195562]

Downloading pytorch_model.bin: 100%|██████████| 498M/498M [00:02<00:00, 202MB/s] [2023-11-02 10:33:51.202642]

Downloading (…)neration_config.json: 100%|██████████| 119/119 [00:00<00:00, 27.7kB/s] [2023-11-02 10:33:53.080528]

Downloading (…)okenizer_config.json: 100%|██████████| 26.0/26.0 [00:00<00:00, 29.1kB/s] [2023-11-02 10:33:53.264995]

Downloading (…)lve/main/config.json: 100%|██████████| 641/641 [00:00<00:00, 714kB/s] [2023-11-02 10:33:53.442568]

Downloading (…)olve/main/vocab.json: 100%|██████████| 1.04M/1.04M [00:00<00:00, 16.4MB/s] [2023-11-02 10:33:53.693913]

Downloading (…)olve/main/merges.txt: 100%|██████████| 456k/456k [00:00<00:00, 56.1MB/s] [2023-11-02 10:33:53.888566]
[10:06:02] ❗️ Streamlit server consistently failed status checks
[10:06:02] ❗️ Please fix the errors, push an update to the git repo, or reboot the app.
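
The log above also shows a FutureWarning that AutoModelWithLMHead is deprecated. In case it is relevant, my understanding is that the non-deprecated way to load a causal language model looks roughly like the sketch below (the model id is only a placeholder, not necessarily the one my app uses):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "microsoft/DialoGPT-small"  # placeholder model id
tokenizer = AutoTokenizer.from_pretrained(model_name)
# AutoModelForCausalLM replaces the deprecated AutoModelWithLMHead for causal LMs
model = AutoModelForCausalLM.from_pretrained(model_name)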

Link to my GitHub repo here. FYI, I am trying to deploy a chatbot which needs to download its models from Hugging Face using transformers.
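
In case it helps anyone answering: I understand that this kind of model load can be wrapped in Streamlit's st.cache_resource so the roughly 500 MB of weights are downloaded and loaded only once per container rather than on every rerun. A minimal sketch, not my exact code (load_chatbot is a hypothetical helper name and the model id is again a placeholder):

import streamlit as st
from transformers import AutoModelForCausalLM, AutoTokenizer

@st.cache_resource  # keep the tokenizer/model in memory across reruns
def load_chatbot(model_name: str = "microsoft/DialoGPT-small"):  # placeholder id
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)
    return tokenizer, model

tokenizer, model = load_chatbot()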

Here is the content of my requirements.txt:

streamlit
streamlit-chat
streamlit-extras
transformers
torch
nltk 

Could someone please enlighten me? I would appreciate your help. Thank you.