Integrate the "AI chatbot" feature with local LLMs and frameworks, e.g. Ollama, WebUI, etc.
04-02-2025 04:07 AM
Hello,
I noticed the "AI chatbot" feature in Firefox, which provides access to public cloud AI chatbots. However, it is now feasible to run a large language model locally, e.g. using Ollama or other frameworks (e.g. Open WebUI), either on the same computer or on another computer at home on the same LAN.
Wouldn't it be great if Firefox offered an integration with our own LLM framework?
Regards
04-24-2025 01:57 PM
Hello,
I agree, it would be super nice to be able to connect to an Ollama server 🙂
06-12-2025 03:55 PM
07-08-2025 01:10 AM
There is an awesome suggestion in that thread ^
If you have Open WebUI (or SillyTavern, or something similar) running, you can just put its URL into browser.ml.chat.provider in about:config.
The sidebar just loads and displays a webpage (ChatGPT or whatever) anyway.
You do need to run a web UI in front of Ollama for now, but that feels like a reasonable compromise. It works for me, at least.
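The tip above can also be captured as a user.js snippet in the Firefox profile directory, which persists the pref across profiles or reinstalls. This is a minimal sketch: the URL is an example (Open WebUI is often exposed on port 3000 when run via Docker), so adjust it to wherever your UI actually listens.

```
// user.js — point the Firefox AI chatbot sidebar at a self-hosted chat UI.
// Assumes Open WebUI is reachable at http://localhost:3000; change as needed.
user_pref("browser.ml.chat.provider", "http://localhost:3000");
```

After restarting Firefox, the sidebar should load your local UI instead of one of the built-in cloud providers; setting the same value directly in about:config works just as well.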

