How to deploy an app using Ollama and Llama2 in Community Cloud?

My app uses Ollama with the Llama2 LLM. The deployment of the app succeeds, but at runtime, when it tries to connect to the local Ollama server, it fails. I also tried installing Ollama via a shell command from the Streamlit app, but that fails too. Is it possible to run Ollama in Community Cloud?

I don’t think you can run an Ollama server (or any other server process) on Streamlit Community Cloud. It only runs Streamlit apps and serves some static content.
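One common workaround, since Community Cloud can't host the server itself, is to run Ollama somewhere else (a VPS, a machine with a public tunnel, etc.) and have the Streamlit app call it over HTTP. A minimal sketch below, assuming a reachable Ollama host set via the `OLLAMA_HOST` environment variable (or a Streamlit secret) — the host name and model tag here are placeholders for whatever your setup actually uses:

```python
import json
import os
import urllib.request

# Assumption: OLLAMA_HOST points at a remote Ollama instance you host
# yourself; the default below is only Ollama's standard local address.
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")


def build_request(prompt: str, model: str = "llama2"):
    """Build the URL and JSON payload for Ollama's /api/generate endpoint."""
    url = f"{OLLAMA_HOST}/api/generate"
    payload = {"model": model, "prompt": prompt, "stream": False}
    return url, payload


def generate(prompt: str, model: str = "llama2") -> str:
    """Send a one-shot completion request to the remote Ollama server."""
    url, payload = build_request(prompt, model)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        # Ollama returns the completed text under the "response" key
        return json.loads(resp.read())["response"]
```

In a Streamlit app you would typically keep the host in `st.secrets` rather than a hard-coded environment variable, so the deployed app can be pointed at your server without committing the address to the repo.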

This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.