We built a clean, minimal AI chat interface powered by Ollama Cloud Models, designed as a fast workspace for trying multiple frontier LLMs in one place.
You only need a single Ollama Cloud API key to chat with 19+ top models—no separate OpenAI/Gemini keys, no billing setup, and no credit card needed.
This project is developed by DataVyn Labs.
## What it does
- Talk to 19+ Ollama cloud models (OpenAI, DeepSeek, Qwen, Gemini, Mistral, Kimi, GLM, MiniMax, and more) from a single UI.
- Upload `.txt`, `.pdf`, `.json`, `.py`, and `.csv` files and send their content to the model.
- Voice input via mic with automatic transcription.
- Secure API key handling (session-only, never saved to disk).
- Dark, Claude-style interface built entirely with Streamlit.
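The file-upload step above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: plain-text formats are decoded directly, while PDF extraction is omitted because it needs a third-party library (e.g. pypdf).

```python
import json
from pathlib import Path

# Extensions that can be read as plain text with no extra dependencies.
TEXT_SUFFIXES = {".txt", ".json", ".py", ".csv"}

def extract_text(path: str) -> str:
    """Return a file's content as text so it can be prepended to the chat prompt."""
    p = Path(path)
    if p.suffix == ".json":
        # Pretty-print JSON so the model sees a readable structure.
        return json.dumps(json.loads(p.read_text(encoding="utf-8")), indent=2)
    if p.suffix in TEXT_SUFFIXES:
        return p.read_text(encoding="utf-8")
    # PDF handling would go here (e.g. pypdf); omitted in this sketch.
    raise ValueError(f"Unsupported file type: {p.suffix}")
```

In the app, the returned string would be appended to the user's message before it is sent to the selected model.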
## Links
- Deployed app
- GitHub repo: anshk1234/DataVyn-Labs-X-Ollama-agents
You just need an Ollama Cloud API key (Settings → API Keys on ollama.com) and you’re ready to go.
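A rough sketch of how a single key can drive every model: Ollama Cloud exposes a chat endpoint authenticated with a bearer token, so one key works for all listed models. The endpoint path and response shape below follow Ollama's documented `/api/chat` route and are assumptions about the cloud service, not the project's exact code; the key is kept in memory only, matching the session-only handling described above.

```python
import json
import urllib.request

# Assumed endpoint: Ollama Cloud mirrors the local /api/chat route.
OLLAMA_CLOUD_CHAT = "https://ollama.com/api/chat"

def build_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build an authenticated chat request; the key lives only in this process."""
    payload = json.dumps({"model": model, "messages": messages, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_CLOUD_CHAT,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

def chat(api_key: str, model: str, prompt: str) -> str:
    """Send one user prompt and return the assistant's reply text."""
    req = build_request(api_key, model, [{"role": "user", "content": prompt}])
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

Swapping models is then just changing the `model` string; no per-provider keys or clients are needed.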
