json.decoder.JSONDecodeError: This app has encountered an error

Summary

My app ArxivGPT was working perfectly, and suddenly I started getting the following error. I deleted the app
and redeployed it, but I still get the same error. Could you please clarify what is happening?

json.decoder.JSONDecodeError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you’re on Streamlit Cloud, click on ‘Manage app’ in the lower right of your app).

Traceback:
File "/home/appuser/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 565, in _run_script
exec(code, module.__dict__)
File "/app/arxivgpt-chatbot-_streamlitapp/Streamlit_ChatBot.py", line 16, in <module>
ArxivReader = download_loader("ArxivReader")
File "/home/appuser/venv/lib/python3.9/site-packages/llama_index/readers/download.py", line 133, in download_loader
library = json.loads(library_raw_content)
File "/usr/local/lib/python3.9/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.9/json/decoder.py", line 340, in decode
raise JSONDecodeError("Extra data", s, end)

I am using Python 3.9.
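For context, the "Extra data" variant of JSONDecodeError means json.loads was given text that continues after the first complete JSON value, which typically happens when download_loader fetches something that is not pure JSON (for example an HTML error page, or a changed registry format). A minimal stdlib reproduction of that failure mode:

```python
import json

# Valid JSON parses fine.
assert json.loads('{"ok": true}') == {"ok": True}

# Anything after the first complete JSON value triggers "Extra data" --
# the same failure mode as download_loader receiving a non-JSON response.
try:
    json.loads('{"ok": true}<html>error page</html>')
except json.JSONDecodeError as e:
    print(e.msg)  # Extra data
```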

Requirements file content

langchain
llama-index==0.5.20
openai
streamlit
fpdf

Links

* Link to your GitHub repo:
* Link to your deployed app:


Hello.

I ran into the same problem, but I upgraded the llama-index version to 0.6.21.post1 and the problem went away. It might be possible to use a lower version, but I haven’t checked.
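For reference, pinning the newer version directly in requirements.txt makes Streamlit Cloud install the known-good release on redeploy. A sketch, assuming the same dependency list as in the original post (0.6.21.post1 is simply the version that worked for me; adjust as needed):

```
langchain
llama-index==0.6.21.post1
openai
streamlit
fpdf
```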

I hope this helps you solve your problem.

Hi Tom,

Thank you for your reply! I managed to solve the problem. As you said, it was related to the llama-index version: I upgraded from 0.5.20 to 0.6.18 and the problem was fixed.

The catch is that when you upgrade, you also have to change part of the code, which is annoying :)

Ilias

I’m glad you were able to resolve the issue.

Yes, llama-index ships breaking changes quite fast, and it is hard just to keep up.

What makes this case worse is that download_loader fetches the loader registry at runtime, so an application that falls behind llama-index updates will naturally break on its own. LlamaHub should review the way it distributes loaders.


Yes, I absolutely agree with you, and it is very annoying.


This topic was automatically closed 180 days after the last reply. New replies are no longer allowed.