App died suddenly on Streamlit Cloud, works fine locally.

It was working fine, then died on the cloud before I'd made any updates. I then updated it to a newer version and hit the same errors. It seems to be stuck in a loop now.

(I've since redeployed and renamed the main file to app.py, but the errors are the same.)

https://paulmontreal.streamlit.app/

https://github.com/paulmontreal/llamaindex1

Error logs from Streamlit:

PermissionError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you're on Streamlit Cloud, click on 'Manage app' in the lower right of your app).

Traceback:
  File "/mount/src/llamaindex1/Public_Chat.py", line 218, in <module>
    resources = initialize_resources()
                ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/streamlit/runtime/caching/cache_utils.py", line 219, in __call__
    return self._get_or_create_cached_value(args, kwargs, spinner_message)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/streamlit/runtime/caching/cache_utils.py", line 261, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/streamlit/runtime/caching/cache_utils.py", line 320, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mount/src/llamaindex1/Public_Chat.py", line 208, in initialize_resources
    index = backend.load_or_create_index()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/streamlit/runtime/caching/cache_utils.py", line 219, in __call__
    return self._get_or_create_cached_value(args, kwargs, spinner_message)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/streamlit/runtime/caching/cache_utils.py", line 261, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/streamlit/runtime/caching/cache_utils.py", line 320, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mount/src/llamaindex1/backend.py", line 89, in load_or_create_index
    index = VectorStoreIndex.from_documents(documents)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/llama_index/core/indices/base.py", line 106, in from_documents
    transformations = transformations or Settings.transformations
                                         ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/llama_index/core/settings.py", line 238, in transformations
    self._transformations = [self.node_parser]
                             ^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/llama_index/core/settings.py", line 141, in node_parser
    self._node_parser = SentenceSplitter()
                        ^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/llama_index/core/node_parser/text/sentence.py", line 103, in __init__
    self._tokenizer = tokenizer or get_tokenizer()
                                   ^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/llama_index/core/utils.py", line 162, in get_tokenizer
    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/tiktoken/model.py", line 110, in encoding_for_model
    return get_encoding(encoding_name_for_model(model_name))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/tiktoken/registry.py", line 86, in get_encoding
    enc = Encoding(**constructor())
                     ^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/tiktoken_ext/openai_public.py", line 76, in cl100k_base
    mergeable_ranks = load_tiktoken_bpe(
                      ^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/tiktoken/load.py", line 148, in load_tiktoken_bpe
    contents = read_file_cached(tiktoken_bpe_file, expected_hash)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/adminuser/venv/lib/python3.12/site-packages/tiktoken/load.py", line 75, in read_file_cached
    with open(tmp_filename, "wb") as f:
         ^^^^^^^^^^^^^^^^^^^^^^^^
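Reading the bottom frames: LlamaIndex's default SentenceSplitter asks tiktoken for the gpt-3.5-turbo encoding, and tiktoken raises PermissionError while writing the downloaded BPE file into its on-disk cache. A minimal workaround sketch, assuming the app's own directory is writable on Community Cloud; the tiktoken_cache folder name is my own invention, while TIKTOKEN_CACHE_DIR is the environment variable tiktoken consults before falling back to the system temp directory. This would go at the very top of the entry script, before anything triggers the first tokenizer load:

```python
import os

# Redirect tiktoken's on-disk cache to a folder this process can write to.
# tiktoken reads TIKTOKEN_CACHE_DIR when it first loads an encoding, so this
# must run before initialize_resources() builds the index.
cache_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "tiktoken_cache")
os.makedirs(cache_dir, exist_ok=True)
os.environ["TIKTOKEN_CACHE_DIR"] = cache_dir
```

If that makes the error go away, it might also explain the "died suddenly with no code changes" part: the default cache lives under the system temp directory, so a restarted or migrated container can lose the cached file and hit the failing re-download path.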

Is this a me thing, or a Streamlit thing?
Thanks for any help.

  1. Share the link to the public app (deployed on Community Cloud).
  2. Share the link to your app’s public GitHub repository (including a requirements file).
  3. Share the full text of the error message (not a screenshot).
  4. Share the Streamlit and Python versions (a quick way to read both is sketched below).
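For item 4, a minimal sketch that reads both versions from inside the deployed app (st.__version__ and sys.version are standard attributes):

```python
import sys

import streamlit as st

# Show the exact versions the deployed container is actually running.
st.write(f"Streamlit {st.__version__} on Python {sys.version}")
```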

I've had no luck debugging this on Streamlit; the app is still down and stuck in a loop.

I tried deploying the same code on Hugging Face and am not seeing any problems there.

So it seems like a Streamlit issue I can't fix?
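The Hugging Face result fits a permissions theory: the failing frame is tiktoken writing its cached BPE file, so the two hosts may simply differ in whether that cache location is writable. A diagnostic sketch that mirrors tiktoken's cache-directory fallback order (TIKTOKEN_CACHE_DIR, then DATA_GYM_CACHE_DIR, then a data-gym-cache folder under the system temp dir; the probe filename is made up):

```python
import os
import tempfile

# Resolve the cache directory the same way tiktoken does, then test
# whether this process can actually create a file there.
cache_dir = (
    os.environ.get("TIKTOKEN_CACHE_DIR")
    or os.environ.get("DATA_GYM_CACHE_DIR")
    or os.path.join(tempfile.gettempdir(), "data-gym-cache")
)
probe = os.path.join(cache_dir, "write-probe.tmp")
try:
    os.makedirs(cache_dir, exist_ok=True)
    with open(probe, "wb") as f:
        f.write(b"ok")
    os.remove(probe)
    print(f"writable: {cache_dir}")
except OSError as exc:  # PermissionError is a subclass of OSError
    print(f"NOT writable: {cache_dir} ({exc})")
```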