06-21-2024 11:55 AM - last edited on 10-18-2024 02:19 PM by Jon
Hi folks,
In the next few days, we will start the Nightly experiment which provides easy access to AI services from the sidebar. This functionality is entirely optional, and it's there to see if it's a helpful addition to Firefox. It is not tied to any core functionality, and you need to turn it on to see it.
If you want to try the experiment, activate it via Nightly Settings > Firefox Labs (please see full instructions here).
We’d love to hear your feedback once you try out the feature, and we’re open to all your ideas and thoughts, whether it’s small tweaks to the current experience or big, creative suggestions that could boost your productivity and make accessing your favorite tools and services in Firefox even easier.
Thanks so much for helping us improve Firefox!
07-20-2024 12:27 AM
We had multiple blog posts as part of landing the initial exploration of the feature:
This only landed in Nightly 129 less than a month ago, and we can continue to default the feature off as we make improvements. Do you have suggestions of what needs to be added to the feature before the release or the messaging?
09-04-2024 07:48 PM
I suggest deleting the feature and apologizing publicly.
09-08-2024 01:14 PM
Seconded.
09-09-2024 08:10 AM
I agree in the strongest possible terms.
09-09-2024 09:33 AM
Not only do I remember that first blog post, I also remember the overwhelmingly negative response to it. People were saying this was a bad idea way back in June. Do Mozilla not read feedback? Please read the room.
09-19-2024 02:43 PM
I suggest deleting the feature entirely and apologizing for disrespecting your userbase, as you can clearly see from the majority of the responses here, which you are ignoring or responding to disingenuously.
07-20-2024 12:30 AM
Sure, we can add more to the context. Are there any other values you think might be useful? Just checking: are you expecting the default prompts to include the url, or do you want custom prompts to be able to include the url and other values?
07-20-2024 07:11 PM
I think that providing title, url, and selection is enough for correct answers. I've configured and actually use two custom prompts for my needs and in my language:
1. Summarize: only title, but I would prefer page url instead.
2. Explain: only selection.
IMO, custom prompts and custom providers are unnecessary.
07-27-2024 10:32 AM
The latest Nightly 130 (20240726152430) includes a url in the context for prompt inclusion or targeting. It's not used in any prompts by default yet, but we'll look for feedback on how it might be used for prompts.
07-19-2024 11:43 PM
Please forgive me for saying something here that is not related to this topic. I hope Mozilla will drop the privacy-preserving attribution feature, or turn it off by default; otherwise, Mozilla could be subject to a hefty fine if Europeans sue Mozilla for violating the GDPR.
07-20-2024 12:28 PM
Hello, at the end of the default prompts you could add the language being used in the Firefox interface, so that the chatbots reply in the user's language. E.g.: <prompt>... reply in "<current ff language>"
E.g. 2: with Portuguese (pt-BR) configured in settings:
<prompt>... reply in Portuguese-PT-BR
07-27-2024 10:55 AM
The current plan is to have localized builds use translated prompts, e.g., `Estou na página "%tabTitle%" com "%selection|12000%" selecionado. Por favor, resuma a seleção usando linguagem precisa e concisa. Use cabeçalhos e listas com marcadores no resumo, para torná-lo escaneável. Mantenha o significado e a precisão factual.` (Here I used Firefox Translate, but our usual community process will take care of it when we expose these prompts to localization.)
In the meantime, could you try changing the prompts from about:config `browser.
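For reference, a hedged sketch of what such an override could look like in a `user.js` file. The exact pref name below is an assumption (based on the `browser.ml.chat.prompts.*` naming used in this thread); `%tabTitle%` and `%selection|12000%` are the placeholder variables shown in the localized prompt above.

```javascript
// Hypothetical sketch: overriding a chat prompt via a string pref.
// The pref name "browser.ml.chat.prompts.summarize" is an assumption;
// verify the actual pref names in about:config on your Nightly build.
user_pref("browser.ml.chat.prompts.summarize", 'I am on the page "%tabTitle%" with "%selection|12000%" selected. Please summarize the selection using precise, concise language.');
```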
07-27-2024 04:47 PM - edited 07-27-2024 04:56 PM
I understand the plan to have variants in each language; however, I believe language models work better in the native language of their training. For example, when prompts are made in English, chatbots interpret the content better. Therefore, in my humble opinion, I would leave the body of the prompt in English, making it unnecessary to create prompts in all languages. This accelerates development and makes chatbots understand the prompt better. If I’m not mistaken, Google’s chatbot, for instance, only creates images when English is used. I tested it here, and as you can see from my screenshots, it worked perfectly by adding at the end for the bot to respond in the desired language. Congratulations and thank you for your attention.
Additionally, I would like to suggest another idea: in this submenu, an option could be added to open the chatbots without any prompt.
The other day, I was taking a test in the browser, and I had to keep switching to another tab to access the chat. If there was a side-by-side panel for asking questions, it would have helped me a lot.
One more question/idea: Have you considered having an option to send a screenshot of the open page to chatbots that support interpreting images?
08-08-2024 10:38 AM
Indeed, there might be some trouble with various LLMs supporting the 100+ Firefox locales. Those using localized Firefox builds might not like seeing English show up in the chatbot, but perhaps this is something we can test, to see whether the chatbot interprets content better.
If you want to pass in the selected text without a prompt, you could create a new `browser.ml.chat.prompts.empty` string pref and put in a space:
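Concretely, that could look like the following in a `user.js` file (a sketch; the pref name is the one suggested above, and the value is a single space):

```javascript
// Sketch: an "empty" prompt pref so only the selected text is passed
// through to the chatbot, with no instructions added. The value is a
// single space, per the suggestion above.
user_pref("browser.ml.chat.prompts.empty", " ");
```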
Or if you just want a fresh chatbot to ask general questions without passing in the selection, you can make the chatbot icon available with the new Sidebar experimental feature (also in Firefox Labs), or add the sidebar icon to your toolbar to toggle it open/closed.
07-24-2024 06:54 AM
...Why are you posting this? I'm sorry, but I don't see what this adds to the discussion, and it very much looks like it was written by an LLM.
07-27-2024 04:21 AM
Hey asafko, thanks for the heads up! I'm always excited to try new Firefox features. Quick question - will this AI sidebar integration work with existing AI services we might already use, or is it limited to specific partners?
07-27-2024 10:58 AM
Are there other AI services that you think should be in the list? You can try it out in the sidebar by setting your AI service's url for `browser.
Here's an example of setting Claude (which has since been added to the list). https://connect.mozilla.org/t5/discussions/share-your-feedback-on-the-ai-services-experiment-in-nigh...
07-27-2024 08:27 AM
AI is the opposite of "private" — if it is integrated, data WILL be used for it to learn, everywhere.
07-27-2024 11:04 AM
If Firefox had integrated local LLM inference to power this AI chatbot feature, all the data you pass in for prompting and the generated responses could stay on your computer and never be sent anywhere for learning. Would that be better for you?
You can get a similar behavior today by configuring this feature to use a local chatbot like https://llamafile.ai
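As a sketch, pointing the feature at a locally running llamafile server could look like this. Both the pref name (`browser.ml.chat.provider`) and llamafile's default listen address (`http://127.0.0.1:8080`) are assumptions worth verifying against your Nightly build and llamafile version.

```javascript
// Hypothetical sketch: use a local llamafile server as the chat
// provider, so prompts and responses stay on-device.
// 127.0.0.1:8080 is llamafile's default listen address (an assumption).
user_pref("browser.ml.chat.provider", "http://127.0.0.1:8080/");
```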
07-27-2024 09:33 PM
I also want Firefox to integrate a native LLM to make it more secure and private.
07-28-2024 07:42 AM
Nope. I don't want any part of any AI on my browser. It is evil. It will get smarter while our brains get dumber. It is literally a Mind Flayer.
09-03-2024 06:16 PM
@Mardak wrote:If Firefox had integrated local LLM inference to power this AI chatbot feature, all the data you pass in for prompting and the generated responses could stay on your computer and not sent anywhere for learning. Would that be better for you?
Yes, it would be better. Firefox has a reputation for having very strong standards for its users' privacy. By implementing the feature with cloud LLMs, you are eroding its ethos. Local is the only way and you need to put a lot of thought into how it'll be privacy-conscious and security-conscious for your users, all while making sure it's not getting in the way for users who don't want it. Microsoft is being absolutely flamed for their integration of AI into Windows because they didn't do any of these things. This feature should never have been shipped in the state it's in now, and you need to take a long time to think about these things I mentioned and how it can be implemented in the users' favor.
07-27-2024 10:50 AM
Thinking about it, that sounds like what a sidebar would do. Wouldn't the right-click functions work better as an extension? And instead of trying to make it ChatGPT-only, make a sidebar so users can add their own chatbots?
In the end, everybody wins
08-08-2024 09:58 AM
Have you tried setting a custom chat provider to be any page you want with `browser.
07-30-2024 02:07 PM
Any plan to integrate with Apple Intelligence on Macs? Also, it would be great to see integration with local LLMs like Ollama (I'm not sure how execution would work, but maybe by typing in a localhost address).
08-08-2024 09:55 AM
Making use of on-device local inference APIs provided by the OS is an interesting approach as various platforms are adding capabilities on macOS/iOS, Windows, Android that should be much faster for those with newer hardware. The current chatbot feature can make use of local LLMs like llamafile exposed on localhost running on existing hardware, so we'll want to see how we can support both old and new devices with acceptable performance and quality.
07-31-2024 12:28 AM
I suddenly had an idea: some people are afraid of AI violating privacy, but as long as AI is used in the right place, privacy can be protected. For example, AI could be used to identify website trackers. There are many trackers in the world, and they are not necessarily on Firefox's list of trackers, so using AI to identify possible trackers may be a good way to protect privacy. Of course, it could also block things by mistake, so it's best to put this feature in the Strict mode of Enhanced Tracking Protection.
08-08-2024 09:48 AM
Thanks for the suggestion. Were you thinking this would be a custom model running locally, as we probably wouldn't want to send requests to check whether something is a tracker? Perhaps something similar to the existing Safe Browsing feature that detects phishing and malware sites. This is still AI in some form, but it might not be a good fit for generative AI.
08-06-2024 10:25 PM
Hello, I enabled the "browser.ml.chat.shortcuts" option, which displays a floating menu when selecting text or performing a long click. Is this option a test for future versions? If so, corrections are needed to ensure this menu only appears when text is selected. Currently, I am using Ubuntu with the Nightly build, and my mouse's scroll wheel is faulty, so I use the scroll bars everywhere. The problem is that this menu keeps appearing constantly, whether I'm using the scroll bar or simply performing a long click without moving the mouse. I believe the best option would be for it to appear only when text is selected on the page. Thank you!
08-08-2024 09:41 AM
Thanks for reporting. Could you try the latest Nightly 131 (20240808093537) to see if the shortcuts stop showing up with the scrollbar click or scrollwheel click? The long-press behavior should be off for now, but we'll keep this in mind as we add selection-less behaviors.
08-08-2024 06:46 PM
Yeah, it's perfect now. You are very fast!
08-08-2024 06:52 PM - edited 08-08-2024 06:53 PM
I believe the long-press option is a good idea on touchscreens for performing page actions, e.g., back, forward, send or share the page, send to printer, etc., because on mobile these options make more sense as a long press on the page or a blank area of the page. On desktop we already have the right-click menu with many more options.
08-07-2024 08:59 PM
I would also like shortcuts to increase/decrease the sidebar width. Sometimes I really feel that it's too narrow for the content. We have the drag feature to adjust the width, but IMO it could be easier with shortcuts.
08-08-2024 09:43 AM - edited 08-08-2024 09:43 AM
Are you referring to the chatbot sidebar being too narrow? It recently got wider with Nightly 130 (20240803095257), so if you switch between a narrow sidebar for say history and chatbot, it should automatically get wider and return to narrow without needing to drag.
08-08-2024 08:46 PM
Yes, the chatbot sidebar is narrow. I also think the automatic widening is an improvement.
08-08-2024 10:20 AM
Support for an open-source chatbot (through an API) like Ollama (https://ollama.com/) would be greatly appreciated; it would also allow more privacy, as local LLMs can be used. Ollama supports the standard OpenAI API, so it would "just" need the base_url as a parameter and the model...
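To illustrate the "base_url plus model" point, here is a minimal Python sketch of the OpenAI-style request shape such an integration would need to produce. The default Ollama port (11434), its `/v1` compatibility path, and the model name `llama3` are assumptions to verify against your Ollama install.

```python
import json

# Assumption: Ollama's OpenAI-compatible endpoint at its default port.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, user_text: str) -> tuple[str, str]:
    """Build (url, json_body) for an OpenAI-style chat/completions call."""
    url = f"{BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
    })
    return url, body

url, body = build_chat_request("llama3", "Summarize this page for me.")
```

The point of the sketch: the only provider-specific pieces are the base URL and the model name; the payload itself is the standard OpenAI chat shape, which is why swapping providers "just" needs those two parameters.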
08-08-2024 10:43 AM - edited 08-08-2024 10:52 AM
Currently the chatbot feature supports any web chatbot including open-source https://llamafile.ai which runs LLMs locally on-device. llamafile also supports OpenAI API chat/completions, but the current Firefox implementation relies on a server responding with a webpage to show in the sidebar.
Would you want a chatbot or maybe a dedicated summarize feature (without followup chat/questions) that directly uses inference APIs potentially pointed at locally running ollama?
08-08-2024 03:36 PM
You should provide an option for on-device models, or focus on providing access to privacy-respecting AI services. Proprietary AI services that impose free-usage limits, require registration, and have privacy problems should not be integrated into the browser.