I recently built a Streamlit interface for chatting with any of the 125+ language models available through Ollama. It lets you download and select models, then chat with them in real time through an intuitive interface. If you want to try out different AI models for productivity without dealing with the command line, this might be interesting for you.
The chat isn’t super fancy in terms of features yet, but conversation storage and RAG integration could be added down the line.
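Under the hood, a chat turn is just a request to Ollama’s local REST API. Here is a minimal sketch of that exchange in plain Python (no Streamlit), assuming an Ollama server running on its default port 11434; the helper names are mine, not from the project:

```python
import json
import urllib.request

# Ollama's local REST API listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model, messages):
    """Build the JSON body for Ollama's /api/chat endpoint.

    `messages` is a list of {"role": ..., "content": ...} dicts,
    the same shape the endpoint uses in its responses.
    """
    return {"model": model, "messages": messages, "stream": False}

def chat(model, messages):
    """Send one chat turn to a locally running Ollama server."""
    payload = json.dumps(build_chat_payload(model, messages)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
    # The assistant's answer lives under reply["message"]["content"].
    return reply["message"]["content"]
```

A Streamlit front end mostly wraps a loop like this, keeping the `messages` history in session state so each new turn includes the full conversation.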
Note: To actually chat with the models, you’ll need to run the app locally. Check out the setup steps in the GitHub guide.
Feel free to explore it and share your feedback; it would be much appreciated.
Project Source: GitHub