02-04-2025 04:07 AM
Hello,
I noticed the "AI chatbot" feature in Firefox, which provides access to public cloud AI chatbots. However, it is now feasible to run a Large Language Model locally, for example with Ollama (optionally fronted by a UI such as Open WebUI), either on the same computer or on another machine on the same home LAN.
Wouldn't it be great if Firefox offered built-in integration with our own self-hosted LLM framework?
Regards
24-04-2025 01:57 PM
Hello,
I confirm, it would be super nice to be able to connect to an ollama server 🙂
08-07-2025 01:10 AM
There is an awesome suggestion in that thread ^
If you have Open WebUI (or SillyTavern or something) running, you can just put the URL into browser.ml.chat.provider in about:config.
The sidebar just loads and displays a webpage (ChatGPT or whatever) anyway.
You do need to run a UI in front of Ollama at the moment, but that feels like a reasonable compromise. It works for me, at least.
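For anyone who wants to try this, here is a rough sketch of the setup. It assumes Ollama and Open WebUI are already installed and uses their default ports (11434 and 8080); the model name is just an example, so adjust everything for your own install:

```shell
# Sketch only: assumes Ollama and Open WebUI are installed locally.
# Default ports: Ollama on 11434, Open WebUI on 8080.

# 1. Make sure Ollama is running and has a model available:
ollama serve &
ollama pull llama3          # example model; pick whichever you use

# 2. Start Open WebUI (it detects Ollama on localhost:11434 by default):
open-webui serve &

# 3. In Firefox, open about:config and set:
#      browser.ml.chat.provider = http://localhost:8080
#    The AI sidebar will then load your local Open WebUI page
#    instead of a cloud chatbot.
```

If Open WebUI runs on another machine on your LAN, point `browser.ml.chat.provider` at that machine's address instead of localhost.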