Connection Refused When Calling Ollama LLM from Streamlit Cloud

Hi everyone,

I’ve built a Streamlit app that records audio, transcribes it, and sends the transcript to an Ollama model (mistral) via langchain_ollama. Everything works locally, but once deployed to Streamlit Cloud, every call to ChatOllama.invoke() fails with:

```
ConnectionRefusedError: [Errno 111] Connection refused
```
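
For context, the failing call boils down to something like this (simplified; the base_url value is a placeholder for my actual server URL):

```python
from langchain_ollama import ChatOllama

# Placeholder for my real server URL; if base_url is omitted, the
# client defaults to http://localhost:11434, which doesn't exist
# on Streamlit Cloud.
llm = ChatOllama(
    model="mistral",
    base_url="https://my-ollama-server.example.com",
)

reply = llm.invoke("Summarize this transcript: ...")
print(reply.content)
```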

Environment:

  • Streamlit Cloud build: Python 3.12
  • Requirements: streamlit, SpeechRecognition, langchain-ollama, langchain-core (reproduced as requirements.txt below)
  • Ollama server: Docker container on DigitalOcean, publicly accessible, SSL-terminated
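
For completeness, requirements.txt is essentially just (unpinned):

```
streamlit
SpeechRecognition
langchain-ollama
langchain-core
```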

Repository:
https://github.com/zainkhan10/streamlit-voice-chat

Question:
Has anyone successfully integrated an external Ollama server with a Streamlit Cloud app? Or are there network restrictions on Streamlit Cloud that prevent outbound calls to custom ports? Any pointers on how to diagnose or work around this would be hugely appreciated!
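
In case it helps with diagnosis, here's the kind of minimal connectivity probe I can run from inside the deployed app (the URL is again a placeholder; GET /api/tags is just a cheap Ollama health check):

```python
import requests
import streamlit as st

OLLAMA_URL = "https://my-ollama-server.example.com"  # placeholder

try:
    # /api/tags lists the models installed on the server, so a 200
    # here confirms the server is reachable from Streamlit Cloud
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10)
    st.write(f"HTTP {resp.status_code}")
    st.json(resp.json())
except requests.RequestException as exc:
    st.error(f"Could not reach the Ollama server: {exc}")
```

Happy to run any variant of this from the deployed app and report back.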

Thanks in advance!
—Zain