Share your feedback on the AI services experiment in Nightly

asafko
Employee

Hi folks, 

In the next few days, we will start the Nightly experiment which provides easy access to AI services from the sidebar. This functionality is entirely optional, and it’s there to see if it’s a helpful addition to Firefox. It is not built into any core functionality and needs to be turned on by you to see it. 

If you want to try the experiment, activate it via Nightly Settings > Firefox Labs (please see full instructions here). 

We’d love to hear your feedback once you try out the feature, and we’re open to all your ideas and thoughts, whether it’s small tweaks to the current experience or big, creative suggestions that could boost your productivity and make accessing your favorite tools and services in Firefox even easier.

Thanks so much for helping us improve Firefox!

3,086 REPLIES

Are you referring to the chatbot sidebar being too narrow? It recently got wider with Nightly 130 (20240803095257), so if you switch between a narrow sidebar for say history and chatbot, it should automatically get wider and return to narrow without needing to drag.

Yes, the chatbot sidebar is narrow. I also think the automatic widening is an improvement.

lstep
Making moves

Support for an open-source chatbot (through an API) like Ollama (https://ollama.com/) would be greatly appreciated; it would also allow more privacy, since local LLMs can be used. Ollama supports the standard OpenAI API, so it would "just" need the base_url and the model as parameters...

Currently the chatbot feature supports any web chatbot including open-source https://llamafile.ai which runs LLMs locally on-device. llamafile also supports OpenAI API chat/completions, but the current Firefox implementation relies on a server responding with a webpage to show in the sidebar.

Would you want a chatbot or maybe a dedicated summarize feature (without followup chat/questions) that directly uses inference APIs potentially pointed at locally running ollama?

Tom4
Making moves

You should provide an option for on-device models or focus on providing access to privacy-respecting AI services. Proprietary AI services that have free-tier usage limits, require registration, and have privacy problems should not be integrated into the browser.

Are there particular prompts that you would find more useful? Building on what we've implemented for local PDF alt-text generation, we could run other models, such as one specialized for generating summaries. Because these models run on-device, there wouldn't be usage limits, but the quality and response speed will depend on your hardware.

haingdc
Making moves

Hi, it takes a while for the chatbot sidebar to load whenever I switch from another sidebar tool to the chatbot sidebar. The delay hurts the user experience. I hope you can make it faster. Thanks.

These chatbots from various providers are hosted webpages, so requests will depend on your location. Are the requests similarly slow if you open these chatbots in a regular tab? Maybe we can make it feel a bit more responsive by hiding the previous content when you're switching and showing a loading indicator?

reckless
Making moves

Hi, this feature is very interesting.

Could it be possible to connect self-hosted LLMs like Ollama (on localhost, as well as remote over HTTPS)?

I've seen ollama showing a chatbot webpage similar to self-hosted llamafile. Does the sidebar load the webpage when you set browser.ml.chat.provider to your own (local) server? It might not support passing in a prompt, but you'll at least have your self-hosted chatbot available in the sidebar.

We're currently exploring not requiring the server to respond with HTML; instead, Firefox would display responses from calling an inference API directly, and I believe llamafile, vLLM, and ollama all support POST to /v1/chat/completions.
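To make the inference-API idea concrete, here is a minimal sketch (not the actual Firefox implementation) of that kind of chat/completions request against a locally running ollama server; the port 11434, the model name "llama3", and the summarizeLocally helper are assumptions for illustration:

```typescript
// A minimal sketch (not Firefox code) of the OpenAI-compatible
// chat/completions call that llamafile, vLLM, and ollama expose.
// Assumptions: ollama is running locally on its default port 11434
// and a model named "llama3" has already been pulled.
async function summarizeLocally(selection: string): Promise<string> {
  const response = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      messages: [
        {
          role: "user",
          content: `Please summarize the selection:\n\n${selection}`,
        },
      ],
    }),
  });
  const data = await response.json();
  // OpenAI-style servers return the generated text at choices[0].message.content.
  return data.choices[0].message.content;
}
```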

gpiper
Making moves

You should be branding Mozilla products as "AI Free" not integrating the demon seed into your codebase.

ffffff
Making moves

Related: Vivaldi's stance on the current AI/LLM trend

If Vivaldi didn't use Google's Blink engine, this discussion thread is the kind of reason I'd consider leaving Firefox for. It's very disappointing to see Mozilla of all entities embracing AI like this.

The Mozilla Manifesto says "commercial involvement in the development of the internet is critical." Seems it's entirely possible to have commercial involvement without adopting extremely controversial and highly criticized market trends.

Maybe I'll be proven wrong and this will be good for the internet and Firefox users, but I'm not optimistic.

In unrelated news, I keep trying to unsubscribe from this discussion because it only serves to make me sad, yet I still receive e-mails notifications. I want out. Anyone know what I might be doing wrong?

Don't be too afraid of AI. After all, Firefox doesn't force AI on you, and Mozilla doesn't enable it by default, so you don't have to worry about it infringing on your privacy.

They could have just made it an extension then. As a Lab it is far more likely to be included as a feature in the future which is something I didn't ask for and do not want, even if it is opt-in by default.

Just humouring the product infringes not only the privacy of every web user, but their copyrights as well. And that's before we get into the environmental and ethical problems (generative 'ai' marketing is a transparent financial fraud by VCs and CEOs).

So yes, we should oppose all usage of, and cooperation with, LLM and generative 'ai' scams.

The current Firefox feature supports open models from providers like Hugging Face and allows us to guide users to fully local inference and truly open-source models like OLMo when those functionalities are ready. Mozilla is also working to improve AI, for example by democratizing access with llamafile, supporting open-source models with open training data and better privacy for everyone, and engaging with the broader community, including lawmakers, to make AI good for the internet.

Being more integrated with AI, such as through this Firefox feature, allows us to make a difference for both users and non-users of chatbots by magnifying our efforts across Mozilla.

AI doesn't make the Internet better. People do.

raidingshaman
Making moves

Need Keyboard shortcuts in order to maximize productivity with this new AI feature

Thanks, we've filed this bug for a keyboard shortcut. https://bugzilla.mozilla.org/show_bug.cgi?id=1905027

If you're not using the new Sidebar (also available in Firefox Labs), adding the sidebar icon by customizing your toolbar might help.

Lumither
Making moves

Why not make the `browser.ml.chat.shortcuts` config available in Settings (instead of requiring about:config)?

Did you find that pref looking around about:config? You should be able to toggle the shortcuts behavior directly from Firefox Labs (about:settings#experimental) now:

labs select.png

PatOr
Making moves

Can you add Copilot as an option to choose from?

MR-d3R
Making moves

It'd be great to add a shortcut for that.

DonutRush
Making moves

You should not be inserting lake-draining spam machines into a browser that claims to want to make the web a better place. This is a deeply immoral and short-sighted decision, and your comments in this thread make you seem shockingly uninformed about what people outside of your tech bubble think about this garbage.

Mozilla should not be encouraging the decline of the internet.

Rika
Making moves

I think a user-defined prompt would make it more convenient.
For example, when I select some text, there would be an option like "My prompt" (and the name of the option could be changed). To define the content of "My prompt", we would go to Settings and edit it,
e.g. "I'm on the page "selected text". Please translate the text above."

BeKingOrDie
Making moves

I love it 😍😍😍😍😍😍, and a shortcut to open it quickly would be great.

YPAP2205
Making moves

Will it be possible to add custom AI providers without needing to change the AI provider link in about:config? Just so users can add their own AI they wish to use.

MWIchigo
Making moves

This is a feature I would never use.

On the other hand, a feature I not only would use, but in fact have used numerous times per day for several years, is hitting enter in an empty search bar to go to the search engine. It's a novel feature that other browsers have and Firefox seems to be mysteriously missing.

Lighthouse
Making moves

Please add DuckDuckGo AI Chat.
No need for account and more privacy.
(https://duckduckgo.com/aichat)

ppm1337
Making moves

This is a nice addition (which should always remain optional, in my opinion), but without a keyboard shortcut it is completely useless. It's faster to open Edge next to Firefox or keep a chatbot in a separate tab than to click three times to open the AI chatbot sidebar.

Encrypter
Making moves

I appreciate that some users may have reservations about these features, but incorporating them into Firefox can help make the browser more accessible and appealing to a wider audience. Since these tools are entirely optional, users who have privacy concerns can easily disable them.

@asafko Regarding the implementation, I was wondering if it would be possible to enhance the chat-bot's Ask chatbot functionality. Specifically, when I click on 'Summarize' without selecting any text, could the chat-bot be provided with the URL of the page I'm on? Currently, it displays a message indicating that I'm on a page with no selection, which isn't very helpful when I'm trying to summarize the page's content. Thank you.

Somebody1
Making moves

Can we at least have something similar to this?

Somebody1_0-1724770572571.png

This is DuckDuckGo's policy on their AI features. It does not even require a login! While I can't say I have no doubts about whether they follow it, I am sure Mozilla could follow it.

Jeeakro
Making moves

The feature seems useful: essentially a free, and thus less taxing, version of those AI extensions that do a similar thing. One much-needed feature is editing the prompt templates; when studying a PDF and not understanding something, the Explain option is not perfectly reliable and sometimes makes the AI regurgitate the selection in a bloated way. Also, the "Quiz me" template is too specific to use most of the time; I'd like to select whether I want multiple choice, and how many questions or options should be generated.

phonics8226
Making moves

BUG REPORT:

When a custom model is added and then switched using the dropdown menu at the top of the sidebar, it becomes impossible to revert back to the custom model, even after closing and reopening the sidebar.

Additionally, is it possible to have multiple custom providers? I often compare and utilize different models for various tasks to achieve better results. Maybe with "browser.ml.chat.provider.1", "browser.ml.chat.provider.2", etc.

Screenshot 2024-09-03 001519.png
Screenshot 2024-09-03 001929.png

That's an interesting idea to support multiple custom providers for easy switching. Are these all local providers, or could we maybe add them to the list?

Well, one is duck.ai as you can see on the screenshots, and judging by another comment you made, there are no plans to include it in the list.

Anyway, any word on the bug? Or can we expect support for multiple custom providers in some future Nightly release?

hazing3991
Making moves

It would be nice to include DuckDuckGo AI Chat as an option to the chatbot sidebar. It provides anonymous access to popular AI models.

You can configure a custom provider by setting `browser.ml.chat.provider` to something like "https://duckduckgo.com/?q=chat&ia=chat" or now "https://duck.ai" (which redirects to a similar URL), but DuckDuckGo AI Chat currently requires a ?q= value and ignores it, so passing in a prompt and selected text doesn't work.

This could still be useful to have anonymous chatbot access to AI models side-by-side with regular tabs.
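For illustration, the prompt passing described above boils down to appending the prompt and selected text as a query parameter on the provider URL. Below is a hypothetical sketch of that idea (buildChatUrl is an illustrative name, not a Firefox API), shown only to make the DuckDuckGo limitation easier to picture:

```typescript
// Hypothetical sketch: hand a prompt to a web chatbot via its URL.
// Whether a provider actually reads the q parameter is up to that site;
// per the post above, DuckDuckGo AI Chat currently ignores it.
function buildChatUrl(provider: string, prompt: string, selection: string): string {
  const url = new URL(provider);
  url.searchParams.set("q", `${prompt}\n\n${selection}`);
  return url.toString();
}

// Example:
// buildChatUrl("https://duckduckgo.com/?ia=chat", "Please summarize the selection.", "Some selected text");
```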

gmeneguzzo
Making moves

If I am reading an Italian text, for example, and I select it, then click the emoji to summarize it using AI, Mozilla sends this request to Claude:

"Please summarize the selection using precise and concise language. Use headers and bulleted lists in the summary, to make it scannable. Maintain the meaning and factual accuracy."

As a result I receive a summary in English, which is useless for me.
You should change the prompt as follows:

Please summarize the selection using precise and concise language. Use headers and bulleted lists in the summary, to make it scannable. Maintain the meaning and factual accuracy. Summarize using the same language as the original text.

Or you should add an option in the emoji menu: "Summarize in the default language set in the Labs section."

Are you using a localized Firefox or language pack? The prompt passed in should be in Italian: "Riassumi la selezione utilizzando un linguaggio preciso e conciso. Usa intestazioni ed elenchi puntati nel riassunto per renderlo facile da leggere. Mantieni il significato e l’accuratezza fattuale."

Even with English prompts, we've noticed chatbots tend to respond in the language of the selected text. But you can also modify the prompts and/or prefix to include "in Italian."