
Share your feedback on the AI services experiment in Nightly

asafko
Employee

Hi folks, 

In the next few days, we will start the Nightly experiment which provides easy access to AI services from the sidebar. This functionality is entirely optional, and it’s there to see if it’s a helpful addition to Firefox. It is not built into any core functionality and needs to be turned on by you to see it. 

If you want to try the experiment, activate it via Nightly Settings > Firefox Labs (please see full instructions here). 

We’d love to hear your feedback once you try out the feature, and we’re open to all your ideas and thoughts, whether it’s small tweaks to the current experience or big, creative suggestions that could boost your productivity and make accessing your favorite tools and services in Firefox even easier.

Thanks so much for helping us improve Firefox!

3,072 REPLIES

I believe that feature isn't necessarily limited to search engine result pages, as it can generally include the page contents in what is sent to the chatbot. Are you wanting the Firefox feature to automatically share the page with the chatbot instead of requiring a text selection?

Maybe you could provide a button to choose whether or not you want to share the page. I think Brave's AI is more helpful and useful to me.

I'm willing to share the page with the chatbot, but some privacy-conscious people may object, so a toggle could be provided in the browser settings for the user to decide whether to enable it, and it should not be enabled by default.

Would a different interaction work, e.g., a dedicated "share page" button separate from the current text-selection interaction? Instead of a toggle, someone could decide not to make use of a 1-click "summarize" or "help me understand" on a search result page.

A toggle could also be useful as part of another action, e.g., sharing the page with the chatbot when opening Reader View, so it's still one interaction after enabling?

I feel like it's a little better to separate the share-page button from the text-selection interaction.

ffffff
Making moves

If Mozilla commits to going ahead with this feature unchanged despite significant localized backlash in this tiny discussion thread, you'd better start planning and preparing both the technical and the average user targeted PR for the release wayyy ahead.

Privacy-Preserving Attribution is arguably good, yet tech media still made a circus out of it, with countless users denouncing "Mozilla's Evil Fall to Advertising Revenue", completely missing the point of the feature and rarely taking the time to read a single paragraph on what it actually does. It's unfortunate, but some things really didn't go well with PPA's release.

This feature is arguably bad. Straight-up, no arguing, simply bad for many people. I don't know how one could spin this well—maybe there isn't even a good strategy, just a least bad one—but Mozilla must lay the groundwork, somehow. Right now, I feel like I can see the writing on the wall, except instead of writing it's a stain and the wall is actually Mozilla's reputation.

Again, I hope you folks figure something out, and it all goes well, because I'm not terribly excited about going into another space with numerous angry users who may or may not have read past the title of an article, to try and add nuance to the conversation and explain what Mozilla is actually doing and why. I know nobody asked me to do this, but I feel like I have to, sometimes, because it really ought to be Mozilla's job, yet the ball keeps getting dropped.

Maybe the majority of users don't care, so it seems advantageous to frustrate a couple thousand to please a million, or something. I don't know if I agree with that choice, but it's your prerogative, I suppose.

We had multiple blog posts as part of landing the initial exploration of the feature:

This only landed in Nightly 129 less than a month ago, and we can continue to default the feature off as we make improvements. Do you have suggestions of what needs to be added to the feature before the release or the messaging?

I suggest deleting the feature and apologizing publicly.

Seconded.

I agree in the strongest possible terms.

Not only do I remember that first blog post, I also remember the overwhelmingly negative response to it. People were saying this was a bad idea way back in June. Do Mozilla not read feedback? Please read the room.

aminought
Making moves

@Mardak Hello! I have a suggestion. Can you add the page URL into the context here? Some AI chats can answer more accurately if you provide this information.

Sure, we can add more to the context. Are there any other values you think might be useful? Just checking: are you expecting the default prompts to include the URL, or do you want custom prompts to be able to include the URL and other values?

I think that providing the title, URL, and selection is enough for a correct answer. I've configured and actually use two custom prompts for my needs, in my language:
1. Summarize: only the title, but I would prefer the page URL instead.
2. Explain: only the selection.

IMO, custom prompts and custom providers are unnecessary.

The latest Nightly 130 (20240726152430) includes a url in the context for prompt inclusion or targeting. It's not used in any prompts by default yet, but we'll look for feedback on how it might be used for prompts.

wutongtaiwan
Familiar face

Please forgive me for saying something here that is not related to this topic. I hope Mozilla will drop the privacy-preserving attribution feature, or turn it off by default; otherwise Mozilla could be subject to a hefty fine if Europeans sue Mozilla for violating the GDPR.

Allan-L
Making moves

Hello, at the end of the default prompts you could add the language being used in the Firefox interface so that the chatbots reply in the user's language. E.g.: <prompt>... reply in "<current ff language>"

E.g. 2: with Portuguese (pt-BR) configured in settings:

<prompt>... reply in Portuguese-PT-BR

The current plan is to have localized builds use translated prompts, e.g., `Estou na página "%tabTitle%" com "%selection|12000%" selecionado. Por favor, resuma a seleção usando linguagem precisa e concisa. Use cabeçalhos e listas com marcadores no resumo, para torná-lo escaneável. Mantenha o significado e a precisão factual.` (Here I used Firefox Translate, but our usual community process will take care of it when we expose these prompts to localization.)

In the meantime, could you try changing the prompts from about:config `browser.ml.chat.prompt.prefix` and/or `browser.ml.chat.prompts.0` to see if translated prompts or "reply in Portuguese pt-BR" works better for you?
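For anyone experimenting along these lines, a minimal `user.js` sketch (assuming these prefs accept plain prompt strings; the same values can also be edited directly in about:config, and the placeholders come from the example above):

```js
// user.js sketch only; verify the pref value formats in your Nightly build.
// Add a language instruction via the shared prompt prefix:
user_pref("browser.ml.chat.prompt.prefix", "Reply in Portuguese (pt-BR).");

// Or override the first prompt with a translated version, reusing the
// %tabTitle% / %selection|12000% placeholders shown earlier:
user_pref(
  "browser.ml.chat.prompts.0",
  "Estou na página \"%tabTitle%\" com \"%selection|12000%\" selecionado. Por favor, resuma a seleção."
);
```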

pt-BR summarize.png

I understand the plan to have variants in each language; however, I believe language models work better in the native language of their training. For example, when prompts are made in English, chatbots interpret the content better. Therefore, in my humble opinion, I would leave the body of the prompt in English, making it unnecessary to create prompts in all languages. This accelerates development and makes chatbots understand the prompt better. If I’m not mistaken, Google’s chatbot, for instance, only creates images when English is used. I tested it here, and as you can see from my screenshots, it worked perfectly by adding an instruction at the end for the bot to respond in the desired language. Congratulations and thank you for your attention.

Additionally, I would like to suggest another idea: in this submenu, an option could be added to open the chatbots without any prompt, such as:

  • [Ask ChatGPT]
    • Open ChatGPT Panel
    • Summarize
    • Etc...

The other day, I was taking a test in the browser, and I had to keep switching to another tab to access the chat. If there was a side-by-side panel for asking questions, it would have helped me a lot.

One more question/idea: Have you considered having an option to send a screenshot of the open page to chatbots that support interpreting images?



Screenshot from 2024-07-27 20-33-53.png
Screenshot from 2024-07-27 20-34-52.png

Indeed, there might be some trouble with various LLMs supporting the 100+ Firefox locales. Those using localized Firefox builds might not like seeing English show up in the chatbot, but perhaps this is something we can test to see if the chatbot interprets the content with better quality.

If you want to pass in the selected text without a prompt, you could create a new `browser.ml.chat.prompts.empty` string pref and put in a space:

pt-BR blank.png
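In case it helps, a sketch of how that pref might look in a profile `user.js` (the same thing can be done by creating the string pref directly in about:config; the single-space value follows the suggestion above):

```js
// user.js sketch of the pref suggested above.
// A new string pref whose value is a single space, so the selection is
// passed to the chatbot without any extra prompt wording.
user_pref("browser.ml.chat.prompts.empty", " ");
```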

Or if you just want a fresh chatbot to ask general questions without passing in the selection, you can have the chatbot icon available with the new Sidebar experimental feature (also in Firefox Labs), or add the sidebar icon to your toolbar to toggle it open/closed.

...Why are you posting this? I'm sorry, but I don't see what this adds to the discussion, and it very much looks like it was written by an LLM.

<deleted>

williamwalker
Making moves

Hey asafko, thanks for the heads up! I'm always excited to try new Firefox features. Quick question - will this AI sidebar integration work with existing AI services we might already use, or is it limited to specific partners?

Are there other AI services that you think should be in the list? You can try it out in the sidebar by setting your AI service's url for `browser.ml.chat.provider` from about:config.

Here's an example of setting Claude (which has since been added to the list). https://connect.mozilla.org/t5/discussions/share-your-feedback-on-the-ai-services-experiment-in-nigh...
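As a rough illustration of that pref (sketch only; the URL below is a hypothetical placeholder, so substitute the chat page of whatever service you use):

```js
// user.js sketch; the provider URL is a made-up example, not a real endpoint.
// Point the sidebar chatbot at your preferred AI service's chat page:
user_pref("browser.ml.chat.provider", "https://chat.example.com/");
```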

gpiper
Making moves

AI is the opposite of "private" — if it is integrated, data WILL be used for it to learn, everywhere.

If Firefox had integrated local LLM inference to power this AI chatbot feature, all the data you pass in for prompting and the generated responses could stay on your computer and not be sent anywhere for learning. Would that be better for you?

You can get a similar behavior today by configuring this feature to use a local chatbot like https://llamafile.ai
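For anyone curious, a sketch of what that local setup might look like (assuming a llamafile server on its usual default port 8080; check the address your llamafile actually prints on startup):

```js
// user.js sketch only; adjust host/port to match your local llamafile server.
// With the chatbot provider pointed at a local server, prompts and responses
// stay on your machine:
user_pref("browser.ml.chat.provider", "http://localhost:8080");
```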

I also want Firefox to integrate a native LLM to make it more secure and private

Nope. I don't want any part of any AI on my browser. It is evil. It will get smarter while our brains get dumber. It is literally a Mind Flayer.


@Mardak wrote:

If Firefox had integrated local LLM inference to power this AI chatbot feature, all the data you pass in for prompting and the generated responses could stay on your computer and not be sent anywhere for learning. Would that be better for you?


Yes, it would be better. Firefox has a reputation for having very strong standards for its users' privacy. By implementing the feature with cloud LLMs, you are eroding its ethos. Local is the only way and you need to put a lot of thought into how it'll be privacy-conscious and security-conscious for your users, all while making sure it's not getting in the way for users who don't want it. Microsoft is being absolutely flamed for their integration of AI into Windows because they didn't do any of these things. This feature should never have been shipped in the state it's in now, and you need to take a long time to think about these things I mentioned and how it can be implemented in the users' favor.

BiORNADE
Making moves

Thinking about it, that sounds like what a sidebar would do. Wouldn't the right-click functions work better as an extension, and instead of trying to make it ChatGPT only, make a sidebar so users can plug in their own chatbots?

In the end, everybody wins

Have you tried setting a custom chat provider to be any page you want with `browser.ml.chat.provider`? It might be similar to what you're asking for, so potentially if that matches up with what people want, we could try to figure out a more streamlined way to do this. E.g., "use current tab as chatbot" ?

WoodwardIII
Making moves

Any plan to integrate with Apple Intelligence on Macs? Also, it would be great to see integration with local LLMs like Ollama (not sure how execution would work, but maybe typing in a localhost address).

Making use of on-device local inference APIs provided by the OS is an interesting approach, as various platforms are adding capabilities on macOS/iOS, Windows, and Android that should be much faster for those with newer hardware. The current chatbot feature can make use of local LLMs like llamafile exposed on localhost running on existing hardware, so we'll want to see how we can support both old and new devices with acceptable performance and quality.

wutongtaiwan
Familiar face

I suddenly had an idea: some people are afraid of AI violating privacy, but as long as AI is used in the right place, privacy can be protected. For example, AI could be used to identify website trackers. There are many trackers in the world, and they are not necessarily on Firefox's list of trackers, so using AI to identify possible trackers may be a good way to protect privacy. Of course, things can also be blocked by mistake, so it's best to put this feature in the Strict mode of Enhanced Tracking Protection.

Thanks for the suggestion. Were you thinking this would be a custom model running locally, as we probably wouldn't want to send requests to check if something is a tracker? Perhaps something similar to the existing Safe Browsing feature that detects phishing and malware sites. This is still AI in some form, but it might not be a good fit for generative AI.

Allan-L
Making moves

Hello, I enabled the "browser.ml.chat.shortcuts" option, which displays a floating menu when selecting text or performing a long click. Is this option a test for future versions? If so, corrections are needed to ensure this menu only appears when text is selected. Currently, I am using Ubuntu with the Nightly build, and my mouse's scroll wheel is faulty, so I use the scroll bars everywhere. The problem is that this menu keeps appearing constantly, whether I'm using the scroll bar or simply performing a long click without moving the mouse. I believe the best option would be for it to appear only when text is selected on the page. Thank you!
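(For reference, this appears to be the pref in question; presumably a boolean that can be flipped back off from about:config or a profile `user.js` if the floating shortcuts get in the way:)

```js
// user.js sketch; assumes browser.ml.chat.shortcuts is a boolean pref.
// Turn the floating selection shortcuts off again if they appear too often:
user_pref("browser.ml.chat.shortcuts", false);
```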

Thanks for reporting. Could you try the latest Nightly 131 (20240808093537) to see if the shortcuts stop showing up with the scrollbar click or scrollwheel click? The long-press behavior should be off for now, but we'll keep this in mind as we add selection-less behaviors.

Yeah, it's perfect now, you are very fast!

I believe the long-press option is a good idea for touchscreens to perform page actions, e.g., back, forward, send or share the page, send to printer, etc., because on mobile these options make more sense with a long press on the page or a blank area of the page; on desktop we already have the right-click menu with many more options.

haingdc
Making moves

I would also like shortcuts to increase/decrease the sidebar width. Sometimes I really feel that it's too narrow for the content. We do have the drag feature to adjust the width, but IMO it could be easier with shortcuts.