Is Streamlit Cloud compatible with llama3.2-vision?

My app runs fine locally but not when deployed to Streamlit's Community Cloud.

  1. Are you running your app locally or is it deployed? Runs fine locally but not when deployed to Streamlit's Community Cloud.
  2. If your app is deployed:
    a. Is it deployed on Community Cloud or another hosting platform?
    Community Cloud
    b. Share the link to the public deployed app.
    https://reztech.streamlit.app/
  3. Share the link to your app’s public GitHub repository
    https://github.com/balance/vision
    https://github.com/balance/vision/blob/master/requirements.txt
  4. Share the full text of the error message (not a screenshot).
```
ConnectError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

────────────────────── Traceback (most recent call last) ──────────────────────
  /home/adminuser/venv/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/exec_code.py:88 in exec_func_with_error_handling

  /home/adminuser/venv/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/script_runner.py:579 in code_to_exec

  /mount/src/vision/app.py:69 in <module>

    66 │   │   │   │   │   st.warning("No advice could be retrieved for this d…
    67 │   │   │   │   │   logging.warning("Empty response received from ollam…
    68 │   │   │
  ❱ 69 │   │   │   except ollama.exceptions.ConnectionError as conn_err:
    70 │   │   │   │   st.error("Failed to connect to the analysis service. Pl…
    71 │   │   │   │   logging.error(f"ConnectionError while accessing ollama …
    72
────────────────────────────────────────────────────────────────────────────────

AttributeError: module 'ollama' has no attribute 'exceptions'
```
  5. Share the Streamlit and Python versions.
    streamlit==1.40.0
    Python 3.12

Community Cloud does not run an Ollama server, so your app's client is trying to reach Ollama's default endpoint (`http://localhost:11434`) and getting `[Errno 111] Connection refused`. Ollama also runs models that need more computation than Community Cloud provides, so you will have to use a separate service for inference: either an Ollama server you host elsewhere or a third-party inference API. Sketches of both fixes follow.
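
If you do stand up Ollama on a separate machine, the Python client can be pointed at it with `ollama.Client(host=...)`. A minimal sketch, assuming a placeholder server URL (substitute your own; nothing about your actual hosting setup is known here):

```python
import ollama

# Point the client at a remote Ollama server instead of the default
# http://localhost:11434, which does not exist on Community Cloud.
# The host URL is a placeholder, not a real endpoint.
client = ollama.Client(host="https://my-ollama-host.example.com")

response = client.chat(
    model="llama3.2-vision",
    messages=[{"role": "user", "content": "Describe the attached image."}],
)
print(response["message"]["content"])
```

You could also read the host URL from `st.secrets` rather than hard-coding it, so the same code runs locally and on Community Cloud.

Separately, the `AttributeError` in your traceback is a bug in the except clause itself: the `ollama` package has no `exceptions` attribute, so `ollama.exceptions.ConnectionError` blows up before any handling happens. Connection failures from the client surface as `httpx.ConnectError` (the client is built on httpx), and API-level failures as `ollama.ResponseError`. A sketch of a corrected handler, assuming the surrounding call looks roughly like the truncated lines in your traceback:

```python
import logging

import httpx
import ollama
import streamlit as st

try:
    response = client.chat(  # `client` from the sketch above
        model="llama3.2-vision",
        messages=[{"role": "user", "content": "Describe the attached image."}],
    )
except httpx.ConnectError as conn_err:
    # Raised when the Ollama server is unreachable (e.g. [Errno 111]).
    st.error("Failed to connect to the analysis service. Please try again later.")
    logging.error(f"ConnectError while accessing ollama: {conn_err}")
except ollama.ResponseError as resp_err:
    # Raised when the server is reachable but the request itself fails.
    st.error("The analysis service returned an error.")
    logging.error(f"ResponseError from ollama: {resp_err}")
```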
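
With the handler fixed, a connection failure will show your friendly error message instead of crashing the script, but the underlying cause on Community Cloud remains the missing Ollama server, so the remote-host change above is the one that actually gets the app working.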