Share your feedback on the AI services experiment in Nightly

asafko
Employee

Hi folks, 

In the next few days, we will start the Nightly experiment which provides easy access to AI services from the sidebar. This functionality is entirely optional, and it’s there to see if it’s a helpful addition to Firefox. It is not built into any core functionality and needs to be turned on by you to see it. 

If you want to try the experiment, activate it via Nightly Settings > Firefox Labs (please see full instructions here). 

We’d love to hear your feedback once you try out the feature, and we’re open to all your ideas and thoughts, whether it’s small tweaks to the current experience or big, creative suggestions that could boost your productivity and make accessing your favorite tools and services in Firefox even easier.

Thanks so much for helping us improve Firefox!

3,736 replies

Are there other AI services that you think should be in the list? You can try it out in the sidebar by setting your AI service's URL as the value of `browser.ml.chat.provider` in about:config.

Here's an example of setting Claude (which has since been added to the list). https://connect.mozilla.org/t5/discussions/share-your-feedback-on-the-ai-services-experiment-in-nigh...
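
If you'd rather keep this in a profile file than click through about:config, here's a minimal user.js sketch of the same setting (the URL is just a placeholder for whichever service you want to try):

```js
// user.js sketch: point the sidebar chatbot at a custom provider.
// "https://example.com/chat" is a placeholder; substitute your service's
// chat URL. The same pref can be set interactively from about:config.
user_pref("browser.ml.chat.provider", "https://example.com/chat");
```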

BiORNADE
Making moves

Thinking about it, that sounds like what a sidebar would do. Wouldn't the right-click functions work better as an extension? And instead of trying to make it ChatGPT-only, make a sidebar so users can plug in their own chatbots?

In the end, everybody wins

Have you tried setting a custom chat provider to be any page you want with `browser.ml.chat.provider`? It might be similar to what you're asking for, so if that matches up with what people want, we could try to figure out a more streamlined way to do this. E.g., "use current tab as chatbot"?

WoodwardIII
Making moves

Any plan to integrate with Apple Intelligence on Macs? Also, it would be great to see integration with local LLMs like Ollama (not sure how execution would work, but maybe by typing in a localhost address).

Making use of on-device local inference APIs provided by the OS is an interesting approach, as various platforms (macOS/iOS, Windows, Android) are adding capabilities that should be much faster for those with newer hardware. The current chatbot feature can make use of local LLMs like llamafile exposed on localhost and running on existing hardware, so we'll want to see how we can support both old and new devices with acceptable performance and quality.

wutongtaiwan
Familiar face

I suddenly had an idea: some people are afraid of AI violating privacy, but as long as AI is used in the right place, privacy can be protected. For example, AI could be used to identify website trackers. There are many trackers in the world, and they are not necessarily on Firefox's list of known trackers, so using AI to identify possible trackers may be a good way to protect privacy. Of course, it could also block things by mistake, so it's best to put this feature in the Strict mode of Enhanced Tracking Protection.

Thanks for the suggestion. Were you thinking this would be a custom model running locally, since we probably wouldn't want to send requests to check whether something is a tracker? Perhaps something similar to the existing Safe Browsing feature that detects phishing and malware sites. This is still AI in some form, but it might not be a good fit for generative AI.

lstep
Making moves

Support for an open-source chatbot (through an API) like Ollama (https://ollama.com/) would be greatly appreciated; it would also allow more privacy since local LLMs can be used. Ollama supports the standard OpenAI API, so it would "just" need the base_url as a parameter and the model...

Currently the chatbot feature supports any web chatbot including open-source https://llamafile.ai which runs LLMs locally on-device. llamafile also supports OpenAI API chat/completions, but the current Firefox implementation relies on a server responding with a webpage to show in the sidebar.

Would you want a chatbot or maybe a dedicated summarize feature (without followup chat/questions) that directly uses inference APIs potentially pointed at locally running ollama?
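
For the local case, the same pref can point at a self-hosted server's web UI. A sketch assuming a llamafile (or other llama.cpp-style server) listening on its usual default port; adjust the host and port to your setup:

```js
// user.js sketch: use a locally running llamafile web UI as the provider.
// http://localhost:8080 is an assumption (llamafile's typical default);
// change the port if your server is configured differently.
user_pref("browser.ml.chat.provider", "http://localhost:8080");
```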

Tom4
Making moves

You should provide an option for on-device models, or focus on providing access to privacy-respecting AI services. Proprietary AI services that have free-usage limits, require registration, and have privacy problems should not be integrated into the browser.

Are there particular prompts that you would find more useful? Building on what we've implemented for local PDF alt-text generation, we can use that for running other models such as something specialized for generating summaries. Because these models run on-device, there wouldn't be usage limits but the quality and response speed will depend on your hardware.

haingdc
Making moves

Hi, it takes a while for the chatbot sidebar to load any time I switch from another sidebar tool to the chatbot sidebar. It feels like the delay could hurt the user experience. Hope you guys make it faster. Thanks.

These chatbots from various providers are hosted webpages, so requests will depend on your location. Are the requests similarly slow if you open these chatbots in a regular tab? Maybe we can make it feel a bit more responsive by hiding the previous content when you're switching and showing a loading indicator?

reckless
Making moves

Hi, this feature is very interesting.

Could it be possible to connect self-hosted LLMs like Ollama? (on localhost, as well as remote over HTTPS)

I've seen ollama showing a chatbot webpage similar to self-hosted llamafile. Does the sidebar load the webpage when you set browser.ml.chat.provider to your own (local) server? It might not support passing in a prompt, but you'll at least have your self-hosted chatbot available in the sidebar.

We're currently exploring not requiring the server to respond with HTML; instead, Firefox would display responses from calling an inference API directly, and I believe llamafile, vllm, and ollama all support POST to /v1/chat/completions.
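
For reference, here's a sketch of that kind of request against a local OpenAI-compatible endpoint. The port (11434 is ollama's usual default) and model name are assumptions; llamafile and vllm expose the same route on their own ports:

```js
// Sketch: POST a chat prompt to a local OpenAI-compatible server.
// Assumes ollama is running locally with a model already pulled;
// the model name and port are placeholders for whatever you run.
const response = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3",
    messages: [{ role: "user", content: "Summarize this page in two sentences." }],
  }),
});
const data = await response.json();
console.log(data.choices[0].message.content);
```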

raidingshaman
Making moves

Need Keyboard shortcuts in order to maximize productivity with this new AI feature

Thanks, we've filed this bug for a keyboard shortcut. https://bugzilla.mozilla.org/show_bug.cgi?id=1905027

If you're not using the new Sidebar (also available in Firefox Labs), adding the sidebar icon by customizing your toolbar might help.

Lumither
Making moves

Why not try to make the `browser.ml.chat.shortcuts` config available in Settings (instead of requiring about:config)?

Did you find that pref looking around about:config? You should be able to toggle the shortcuts behavior directly from Firefox Labs (about:settings#experimental) now:

[Screenshot: labs select.png, the option shown in Firefox Labs settings]

hazing3991
Making moves

It would be nice to include DuckDuckGo AI Chat as an option to the chatbot sidebar. It provides anonymous access to popular AI models.

You can configure a custom provider by setting `browser.ml.chat.provider` to something like "https://duckduckgo.com/?q=chat&ia=chat" or now "https://duck.ai" (which redirects to a similar URL), but it seems like DuckDuckGo AI Chat currently requires a ?q= value and then ignores it, so passing in a prompt and selected text doesn't work right now.

This could still be useful to have anonymous chatbot access to AI models side-by-side with regular tabs.

gmeneguzzo
Making moves

If I am reading an Italian text (for example) and I select it, then click the icon to summarize it using AI, Firefox sends Claude this request:

"Please summarize the selection using precise and concise language. Use headers and bulleted lists in the summary, to make it scannable. Maintain the meaning and factual accuracy."

As a result I receive a summary in English, which is useless for me.
You should change the prompt as follows:

Please summarize the selection using precise and concise language. Use headers and bulleted lists in the summary, to make it scannable. Maintain the meaning and factual accuracy. Summarize using the same language as the original text.

Or you should add to the icon menu "Summarize in the default language set in the Labs section".

Are you using a localized Firefox or language pack? The prompt passed in should be in Italian: "Riassumi la selezione utilizzando un linguaggio preciso e conciso. Usa intestazioni ed elenchi puntati nel riassunto per renderlo facile da leggere. Mantieni il significato e l’accuratezza fattuale."

Even with English prompts, we've noticed chatbots tend to respond in the language of the selected text. But you can also modify the prompts and/or prefix to include "in Italian."

user8
Making moves

Can't choose duck.ai

You can set `browser.ml.chat.provider` to "https://duck.ai" to have it available in the sidebar, but it looks like passing in prompts and selected text does not work with this page right now.

[Screenshot: duck.ai.png, duck.ai loaded in the sidebar]

k2662
Making moves

You can't run a local LLM in the AI chatbot UI. Can this feature be added?

Are you talking about running an LLM directly in Firefox or using local chatbot providers? There's existing configuration with which others here have gotten llamafile or ollama working in the sidebar, including passing in prompts. Firefox alt-text generation for pdfjs supports running various models including LLMs, so there's a path to doing it within Firefox, but currently it's quite slow. If people do want to try this out on Nightly, we can look into exposing this at least for advanced users with sufficient hardware.

askabaz
Making moves

Can you also provide DuckDuckGo AI Chat (https://duck.ai/ or https://duckduckgo.com/?q=Duck+Assist&ia=chat)? I tried to implement it by modifying the string "browser.ml.chat.provider" but it doesn't work.

Edit: I tried to set "browser.ml.chat.provider" to "https://duckduckgo.com/?q=&ia=chat&bang=true": It works but it is very easy to get the error "Search query entered was too long. Please shorten and try again.".

DDG AI Chat weirdly doesn't work if the search query (the "q=" part of the URL) is empty. On a quick test, it seems to work for me when setting it to https://duck.ai or https://duckduckgo.com/aichat.
The layout is slightly broken (weird horizontal scrollbar), but that seems to be DDG's fault, not Firefox's.

EDIT: I see what you mean now. If I ask it to summarize info for example, I get that error message too. I don't know if there's a way to make it work by editing the config strings... 🤔

dotnsau
Making moves

Adding AI chatbots to browsers is a great idea, but you should do it the way the Brave browser did. It's probably the only browser that did it sensibly, by displaying an option like "Ask Leo" while typing in the search bar. Similarly, here there should be an option to choose a default chatbot in the settings, and then when typing in the search bar we should see options like "Ask ChatGPT", "Ask Google Gemini", etc., depending on the chatbot we chose as the default. Of course, I believe that eventually such an option should be built in by default, and then we can disable it in the settings, not the other way around.

Mozilla has made it very clear that they are going to stick to opt-in for the foreseeable future, and I respect that decision. AI should always be optional.

shelley
Making moves

So I guess it will just put the chatbot's page side by side with the page that I am currently on? lol

I can probably tweak the source code and change it to any website and it will do the job. 😂

shelley sells seashells by the seashore

The best part of this feature is the highlight menu, because of the auto-prompts and such.
And yes, you can do that, but you could also just make an extension that uses the sidebar API, which is publicly available, much easier to write, and much easier to distribute. https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/user_interface/Sidebars
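
A minimal sketch of what such an extension's manifest.json could look like (names are placeholders; sidebar.html would be your own page that loads or talks to your preferred chatbot):

```json
{
  "manifest_version": 2,
  "name": "My Chatbot Sidebar",
  "version": "1.0",
  "description": "Shows my preferred chatbot in the Firefox sidebar.",
  "sidebar_action": {
    "default_title": "Chatbot",
    "default_panel": "sidebar.html"
  }
}
```

Note that many chatbot sites refuse to be embedded in an iframe, so the panel page may need to call an API rather than framing the site directly.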

jahtnamas
Making moves

i figured it out, y'all— mozilla's trying to make the firefox icon real by using its flagship browser to set the planet on fire! i get it now!

but for real, do not do this. i absolutely despise when an organization backpedals on their core values and the disappointment in such acts compounds nigh-daily now. how does anyone in the tech sector sleep soundly at night anymore? (...oh, right. money.)


Indeed.  That is the issue that violates trust in the product:  "backpedals on their core values"

Castor
Making moves

What the hell??? How do you look at everything happening around AI —the environmental cost, the incredibly bad and dangerous results it gives, the public outcry from artists, creators, and basically anyone with enough common sense and knowledge of the situation— and still decide "yeah, I'll implement that"? Whoever came up with the idea is only second in terms of stupidity to whoever decided to actually go through with it.

No one wants that. Some people /may/ think they want it, but they have simply fallen for the lies of tech bros out there trying really hard to make it seem like AI is actually intelligent (it is not).

Mozilla Firefox is one of the only good alternatives to chromium, I love it, I love how you guys have been working to protect our privacy. Don't do this to us now, this is such a stupid decision, it will only make everything worse. Come on.

I can only guess that perhaps the way A.I. is being implemented brings in more advertising revenue and more data-collection revenue. What else has become as motivating as profit in these bizarre times?

EarlMardle
Making moves

It's not just about the AI, which "no"

It's a very deep concern about the standard of decision-making at Mozilla that you would even give house room to the idiocy around AI

What other decisions are you making about FF with processes as broken, and frankly stupid, as that one?

this. to me jumping on the "AI" bandwagon indicates a lack of good sense and customer focus

alisonborealis
Making moves

I have been using firefox since ~2006. This is my first time leaving a comment, but I am leaving a comment because my ongoing use of Firefox depends on this issue.

I have ABSOLUTELY ZERO INTEREST in supporting AI implementation on Firefox.

The things that make Firefox appealing are anathema to AI. You cannot have a "more secure" browser if you implement AI. You cannot have a browser that respects people's self-created content if you implement AI. You cannot have a browser that makes using the internet more streamlined and effective if you implement AI. You cannot have a browser that respects people's privacy if you implement AI. These are not challenges. These are realities.

AI, as we all know, was built on plagiarism, copyright infringement and outright theft. Without theft, AI wouldn't work. Without ongoing theft, AI will not improve - we've seen the data on models degrading when they train on model outputs instead of raw data; AI needs the raw data (which nobody is ethically sourcing). As we've seen Google Search and Google Scholar degrade to the point of uselessness because of the addition of AI, we know that adding AI to browsers increases false and misinformation via the LLMs outright lying, manufacturing facts, and "hallucinating" information.

It seems counterproductive for Firefox to implement AI (particularly if Mozilla wants to retain or grow its userbase). As we're witnessing the growing use of AI in deepfakes and revenge porn, and in replacing actual work by real human beings, if Mozilla wants to respect the safety, privacy, and employability of its users, it does not benefit Mozilla (reputationally or practically) to implement AI on Firefox or any of its other products.

If Mozilla values its users, Firefox will not implement AI.
