Streamlit app in Docker container not connecting to Ollama

Hi all, I made a Streamlit gen-AI app that works well using a local Ollama LLM. But when I try to deploy it on a Linux server using Docker, the Streamlit app is not able to connect to Ollama.

Docker build

docker build -t pqchat .
docker run -p 8501:8501 --add-host=host.docker.internal:host-gateway pqchat

Dockerfile
FROM python:3.9-slim
WORKDIR /PQChatbot
COPY . .
RUN pip3 install -r requirements.txt
EXPOSE 8501
HEALTHCHECK CMD curl --fail http://localhost:8501/_stcore/health
ENTRYPOINT ["streamlit", "run", "RSJ_PQ_Chatbot.py", "--server.port=8501", "--server.address=0.0.0.0"]

Error Message:

ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0xffff00062a00>: Failed to establish a new connection: [Errno 111] Connection refused'))

The app works fine outside of Docker, but inside Docker it cannot connect to Ollama. Please provide pointers to solve this.
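The root cause is that `localhost` inside the container refers to the container itself, not the host where Ollama is listening on port 11434. Whatever the fix (host gateway or a second container), the app should not hardcode `localhost`. A minimal sketch of the client side, assuming the app calls Ollama's HTTP `/api/generate` endpoint; the `OLLAMA_HOST` environment variable name is a choice made for this sketch, passed in with `docker run -e OLLAMA_HOST=...`:

```python
import json
import os
import urllib.request

# Read the Ollama endpoint from an environment variable so the same code
# works locally (localhost) and inside Docker (host.docker.internal, or a
# compose service name). OLLAMA_HOST is a name chosen for this sketch.
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

def generate(prompt: str, model: str = "llama3") -> str:
    """Call Ollama's /api/generate endpoint and return the response text."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read())["response"]
```

With this in place, only the environment variable changes between local runs and Docker; the code stays the same.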

#docker #Ollama #Deploy #streamlit

Actually, you have to create two containers: one for the Streamlit app and one for Ollama, and establish a connection between them. Ollama will run on port 11434.
I found one solution at this link:
Ollama — Build a ChatBot with Langchain, Ollama & Deploy on Docker | by A B Vijay Kumar | Feb, 2024 | Medium
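The two-container setup described above is usually wired up with Docker Compose, where the containers share a network and the app reaches Ollama by its service name. A sketch, assuming the app reads its Ollama endpoint from an `OLLAMA_HOST` environment variable (a name chosen for this example, not something the app necessarily supports as-is):

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama   # persist downloaded models
  app:
    build: .                        # builds from the Dockerfile in the question
    ports:
      - "8501:8501"
    environment:
      # The app should read this instead of hardcoding localhost;
      # "ollama" resolves to the ollama service on the compose network.
      - OLLAMA_HOST=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama_data:
```

Note that you still need to pull a model into the `ollama` container once, e.g. `docker compose exec ollama ollama pull llama3`.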

Use the gateway address (e.g. host.docker.internal) instead of localhost.
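Concretely, the gateway approach keeps the single container but points the app at `host.docker.internal`, which the `--add-host=host.docker.internal:host-gateway` flag already in the question maps to the host. Ollama on the host must also listen on all interfaces, not just 127.0.0.1. A sketch of the host-side commands (the `-e OLLAMA_HOST=...` variable assumes the app reads its endpoint from the environment, which is not shown in the original code):

```shell
# On the Linux host: make Ollama listen on all interfaces, not just loopback
OLLAMA_HOST=0.0.0.0 ollama serve

# Run the app container; host.docker.internal resolves to the host gateway
docker run -p 8501:8501 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  pqchat

# Verify from inside the container that Ollama is reachable
# (<container_id> is a placeholder for your running container's ID)
docker exec -it <container_id> curl http://host.docker.internal:11434/api/tags
```

If the `curl` check fails with "connection refused", the usual culprits are Ollama still bound to 127.0.0.1 or a host firewall blocking port 11434 from the Docker bridge network.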