Integrate the "AI chatbot" feature with local LLMs and frameworks, e.g. Ollama, WebUI, etc.

ntsarb
Making moves

Hello,

I noticed the "AI chatbot" feature in Firefox, which provides access to public cloud AI chatbots. However, it is now feasible to run a Large Language Model locally, e.g. using Ollama or other frameworks such as Open WebUI, either on the same computer or on another computer at home (same LAN).

Wouldn't it be great if Firefox offered integration with our own locally hosted LLM framework?

Regards

3 REPLIES

Picpic
Making moves

Hello,
I confirm, it would be super nice to be able to connect to an ollama server 🙂

sf1tzp
Making moves

There is an awesome suggestion in this thread ^

If you have Open WebUI (or SillyTavern or something) running, you can just put the URL into browser.ml.chat.provider in about:config.

The sidebar just loads and displays a webpage (ChatGPT or whatever) anyway.


You need to run a UI for Ollama at the moment, but that feels like a reasonable compromise. It works for me at least.
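For anyone wanting to try this without clicking through about:config each time, the pref can also be set in a user.js file in the Firefox profile directory. This is a sketch, not an official setup: it assumes Open WebUI is reachable at http://localhost:3000, which is a common port mapping for its Docker deployment — adjust the host and port to your own instance (e.g. another machine on your LAN).

```javascript
// user.js — place in your Firefox profile directory.
// Points the AI chatbot sidebar at a locally hosted chat UI instead of a
// cloud provider. The sidebar simply loads this URL as a webpage.
// Assumption: Open WebUI is serving at http://localhost:3000 — change this
// to wherever your own instance runs.
user_pref("browser.ml.chat.provider", "http://localhost:3000");
```

After restarting Firefox, the sidebar should load your local UI in place of the cloud chatbots. The same pref can be edited live in about:config if you prefer not to use a user.js file.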
