Hi everyone!
I have a scraping app deployed (https://seochecker.streamlit.app/) that has been in use for the last 8 months with no problems. It is used by many people to check a specific set of websites.
It has suddenly stopped working, giving the following error, for which I can't find a solution:
Traceback (most recent call last):
  File "/home/adminuser/venv/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 542, in _run_script
    exec(code, module.__dict__)
  File "/mount/src/fever/seo_checklist_app.py", line 356, in <module>
    main()
  File "/mount/src/fever/seo_checklist_app.py", line 196, in main
    seot, lenseot = get_seo_title_length(soup)
  File "/mount/src/fever/seo_checklist_app.py", line 20, in get_seo_title_length
    lenseot = len(seot.text)
AttributeError: 'NoneType' object has no attribute 'text'
This happens with all the functions I've defined. It looks like requests.get has stopped working in the deployed environment: the error is a "NoneType", as if the variables were empty. It works locally, both in Visual Studio Code (Jupyter notebook) and when running the app from the console (streamlit run appname.py), but I don't know why it fails in the cloud.
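For context, the failing function essentially does the following (a simplified sketch: the soup.find("title") call stands in for the real lookup). A None check at least makes the underlying problem visible instead of crashing:

```python
from bs4 import BeautifulSoup


def get_seo_title_length(soup: BeautifulSoup):
    # Simplified sketch; the real function may use a different selector.
    seot = soup.find("title")
    if seot is None:
        # This is what triggers the AttributeError: the page fetched by
        # requests.get has no <title> tag (or no usable HTML at all).
        return None, 0
    return seot, len(seot.text)
```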
requirements.txt is the same as before (I have since updated the libraries and the requirements file to match, so everything should be correct).
I've run many tests, and it seems like requests.get now returns a 202 response instead of a 200, but only in the cloud. I've added a time.sleep(15) just to see if waiting helps, but it doesn't.
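This is roughly the kind of test I've been running (a minimal sketch, with a placeholder URL rather than one of the real sites):

```python
import time

import requests

url = "https://example.com"  # placeholder for one of the ~150 real URLs

response = requests.get(url, timeout=30)
print(response.status_code)  # 200 locally, 202 on Streamlit Cloud

if response.status_code == 202:
    # 202 means "accepted but not yet processed", so I tried waiting and retrying.
    time.sleep(15)
    response = requests.get(url, timeout=30)
    print(response.status_code)  # still 202 in the cloud
```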
Could it be that something has changed on all of the websites (roughly 150 of them)? Should I maybe be sending a header? Is there any ongoing issue with Streamlit?
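If headers turn out to be the issue, I guess the fix would look something like this (the header values are only an example, not a confirmed solution):

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com"  # placeholder for one of the real sites

# Browser-like headers; values are illustrative, not a known fix.
headers = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
    ),
    "Accept-Language": "en-US,en;q=0.9",
}

response = requests.get(url, headers=headers, timeout=30)
soup = BeautifulSoup(response.text, "html.parser")
print(response.status_code, soup.title)
```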
Thanks!