1. Go to `chrome://settings/browseros-ai` and add Ollama as a provider.
2. Enter the Ollama model ID (e.g. `gpt-oss:20b`).
3. Start Ollama with:

   ```
   OLLAMA_ORIGINS="*" ollama serve
   ```

   Unfortunately, Ollama by default doesn't allow requests from other apps without this.
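A quick way to sanity-check the setup before returning to BrowserOS is to query Ollama's local HTTP endpoint. This sketch assumes Ollama's defaults (port `11434` on `localhost`); if you've set `OLLAMA_HOST` to something else, adjust the URL accordingly:

```shell
# Start Ollama so apps from any origin (such as the browser) may call its API
OLLAMA_ORIGINS="*" ollama serve

# In a second terminal, confirm the server is reachable on its default port
curl http://localhost:11434
# A running server responds with "Ollama is running"
```

If the `curl` check fails, make sure no other `ollama serve` instance is already running without the `OLLAMA_ORIGINS` override.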
💥 If you don’t want to run from the CLI, we recommend using LM Studio. See the guide here - Setting-up LM Studio with BrowserOS

Select the model in the Agent drop-down and start using it 🚀