How do I deploy an Ollama-based Streamlit UI to the cloud so that it can also pull my downloaded models there?

Following the debugging-post template, here is the relevant info:

  1. App: https://yashvoladoddi37-ollama-st-chat-lwyjbk.streamlit.app/
  2. Repo: GitHub - yashvoladoddi37/ollama-st
  3. Error: httpx.ConnectError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you’re on Streamlit Cloud, click on ‘Manage app’ in the lower right of your app).
  4. Versions: streamlit == 1.35, python == 3.11

[Screenshot attached above: the entire traceback of the error]
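For context on the error above: an httpx.ConnectError from an Ollama client usually means the app is trying to reach an Ollama daemon that isn't running, and Streamlit Cloud does not run one, so the default base URL (localhost:11434) is unreachable there. A minimal sketch of the likely fix, assuming the app queries Ollama's real GET /api/tags endpoint and can be pointed at a remote Ollama server via an environment variable (the `OLLAMA_HOST` variable name and helper functions here are illustrative assumptions, not code from the repo):

```python
import json
import os
import urllib.request

# Assumption: on Streamlit Cloud there is no local Ollama daemon, so the
# default http://localhost:11434 is unreachable -- that is what surfaces
# as httpx.ConnectError. Let the base URL be overridden via an env var
# (set it in the app's secrets/settings to a reachable Ollama server).
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")


def parse_models(payload: dict) -> list[str]:
    # Ollama's GET /api/tags responds with {"models": [{"name": ...}, ...]}
    return [m["name"] for m in payload.get("models", [])]


def list_models(host: str = OLLAMA_HOST) -> list[str]:
    # Fails with a connection error when no Ollama server listens at `host`.
    with urllib.request.urlopen(f"{host}/api/tags", timeout=10) as resp:
        return parse_models(json.load(resp))
```

In other words, the deployment question and the traceback are likely the same issue: the cloud app needs an Ollama server it can actually reach (self-hosted and exposed, or a tunnel), with the models pulled on that server.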