
Share your feedback on the AI services experiment in Nightly

asafko
Employee

Hi folks, 

In the next few days, we will start the Nightly experiment which provides easy access to AI services from the sidebar. This functionality is entirely optional, and it’s there to see if it’s a helpful addition to Firefox. It is not built into any core functionality and needs to be turned on by you to see it. 

If you want to try the experiment, activate it via Nightly Settings > Firefox Labs (please see full instructions here). 

We’d love to hear your feedback once you try out the feature, and we’re open to all your ideas and thoughts, whether it’s small tweaks to the current experience or big, creative suggestions that could boost your productivity and make accessing your favorite tools and services in Firefox even easier.

Thanks so much for helping us improve Firefox!

3,166 REPLIES

Nope. I don't want any part of any AI in my browser. It is evil. It will get smarter while our brains get dumber. It is literally a Mind Flayer.


@Mardak wrote:

If Firefox had integrated local LLM inference to power this AI chatbot feature, all the data you pass in for prompting and the generated responses could stay on your computer and not be sent anywhere for learning. Would that be better for you?


Yes, it would be better. Firefox has a reputation for having very strong standards for its users' privacy. By implementing the feature with cloud LLMs, you are eroding its ethos. Local is the only way and you need to put a lot of thought into how it'll be privacy-conscious and security-conscious for your users, all while making sure it's not getting in the way for users who don't want it. Microsoft is being absolutely flamed for their integration of AI into Windows because they didn't do any of these things. This feature should never have been shipped in the state it's in now, and you need to take a long time to think about these things I mentioned and how it can be implemented in the users' favor.

BiORNADE
Making moves

Thinking about it, that sounds like what a sidebar would do. Wouldn't the right-click functions be an extension? And instead of trying to make it ChatGPT-only, make a sidebar so users can put in their own chatbots.

In the end, everybody wins

Have you tried setting a custom chat provider to be any page you want with `browser.ml.chat.provider`? It might be similar to what you're asking for, so potentially if that matches up with what people want, we could try to figure out a more streamlined way to do this. E.g., "use current tab as chatbot"?
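For reference, the same pref can also be declared in a `user.js` file in the profile directory rather than edited by hand in about:config; the localhost URL below is just a placeholder for whatever chat page you want loaded in the sidebar:

```js
// user.js in the Firefox profile directory.
// The URL is a placeholder; point it at any web chatbot,
// e.g. a llamafile server running locally.
user_pref("browser.ml.chat.provider", "http://localhost:8080/");
```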

WoodwardIII
Making moves

Any plan to integrate with Apple Intelligence on Macs? Also, it would be great to see integration with local LLMs like Ollama (not sure how execution would work, but maybe by typing in a localhost address).

Making use of on-device local inference APIs provided by the OS is an interesting approach as various platforms are adding capabilities on macOS/iOS, Windows, Android that should be much faster for those with newer hardware. The current chatbot feature can make use of local LLMs like llamafile exposed on localhost running on existing hardware, so we'll want to see how we can support both old and new devices with acceptable performance and quality.

wutongtaiwan
Familiar face

I suddenly had an idea: some people are afraid of AI violating privacy, but as long as AI is used in the right place, privacy can be protected. For example, AI could be used to identify website trackers. There are many trackers in the world, and they are not necessarily on Firefox's list of trackers, so using AI to identify possible trackers may be a good way to protect privacy. Of course, it could also block things by mistake, so it's best to put this feature in the Strict mode of Enhanced Tracking Protection.

Thanks for the suggestion. Were you thinking this would be a custom model running locally as we probably wouldn't want to send requests to check if something is a tracker, so perhaps something similar to the existing safe browsing that detects phishing and malware sites? This is still AI in some form, but it might not be a good fit for generative AI.
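To make the idea concrete, here is a toy sketch of what a local, heuristic "tracker score" for a URL might look like. This is purely illustrative: the tokens, weights, and threshold are invented for this example, and a real feature would use a trained model and curated lists (as the existing safe-browsing protections do), not hand-written rules.

```python
# Toy heuristic "tracker score" for a URL. Everything here
# (tokens, weights) is invented for illustration only.
from urllib.parse import urlparse, parse_qs

SUSPICIOUS_TOKENS = ("track", "pixel", "beacon", "analytics", "telemetry")

def tracker_score(url: str) -> float:
    """Return a score in [0, 1]; higher means more tracker-like."""
    parsed = urlparse(url)
    host_and_path = (parsed.netloc + parsed.path).lower()
    score = 0.0
    # Tracker-ish words in the host or path.
    score += 0.4 * any(tok in host_and_path for tok in SUSPICIOUS_TOKENS)
    # Requests carrying many query parameters often encode IDs.
    score += 0.3 * (len(parse_qs(parsed.query)) >= 3)
    # 1x1 "pixel"-style resources.
    score += 0.3 * host_and_path.endswith((".gif", "/pixel"))
    return min(score, 1.0)

print(tracker_score("https://analytics.example.com/pixel?uid=1&sid=2&ev=3"))
print(tracker_score("https://example.com/article"))
```

A real on-device model would of course learn these signals from labeled data rather than hard-code them, but the shape of the feature (score a request locally, block above a threshold, no data leaves the machine) would be the same.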

Allan-L
Making moves

Hello, I enabled the "browser.ml.chat.shortcuts" option, which displays a floating menu when selecting text or performing a long click. Is this option a test for future versions? If so, corrections are needed to ensure this menu only appears when text is selected. Currently, I am using Ubuntu with the Nightly build, and my mouse's scroll wheel is faulty, so I use the scroll bars everywhere. The problem is that this menu keeps appearing constantly, whether I'm using the scroll bar or simply performing a long click without moving the mouse. I believe the best option would be for it to appear only when text is selected on the page. Thank you!

Thanks for reporting. Could you try the latest Nightly 131 (20240808093537) to see if the shortcuts stop showing up with the scrollbar click or scrollwheel click? The long-press behavior should be off for now, but we'll keep this in mind as we add selection-less behaviors.

Yeah, it's perfect now. You are very fast!

I believe the long-press option is a good idea on touchscreens for performing page actions, e.g. back, forward, send or share the page, send to printer, etc. On mobile it makes more sense to long-press a page or a blank area of the page; on desktop we already have the right-click menu with many more options.

haingdc
Making moves

I'd also like shortcuts to increase/decrease the sidebar width. Sometimes I really feel it's too narrow for the content. We have the drag feature to adjust the width, but IMO it could be easier with shortcuts.

Are you referring to the chatbot sidebar being too narrow? It recently got wider with Nightly 130 (20240803095257), so if you switch between a narrow sidebar for say history and chatbot, it should automatically get wider and return to narrow without needing to drag.

Yes, the chatbot sidebar is narrow. I also think the automatic widening feature is an improvement.

lstep
Making moves

Support for an open source Chat bot (through an API) like ollama (https://ollama.com/) would be greatly appreciated, it would also allow more privacy as local LLMs can be used. Ollama supports the standard OpenAI API, so it would "just" need to get the base_url as a parameter and the model...

Currently the chatbot feature supports any web chatbot including open-source https://llamafile.ai which runs LLMs locally on-device. llamafile also supports OpenAI API chat/completions, but the current Firefox implementation relies on a server responding with a webpage to show in the sidebar.

Would you want a chatbot or maybe a dedicated summarize feature (without followup chat/questions) that directly uses inference APIs potentially pointed at locally running ollama?

Tom4
Making moves

You should provide an option for on device models or focus on providing access to privacy respecting AI-services, the integration of proprietary ai services that have free user usage limits, require registration, and have problems with privacy should not be integrated into the browser.

Are there particular prompts that you would find more useful? Building on what we've implemented for local PDF alt-text generation, we can use that for running other models such as something specialized for generating summaries. Because these models run on-device, there wouldn't be usage limits but the quality and response speed will depend on your hardware.

haingdc
Making moves

Hi, it takes a while for the chatbot sidebar to load any time I switch from another sidebar tool to the chatbot sidebar. The delay can affect the user experience. Hope you guys make it faster. Thanks!

These chatbots from various providers are hosted webpages, so requests will depend on your location. Are the requests similarly slow if you open these chatbots in a regular tab? Maybe we can make it feel a bit more responsive by hiding the previous content when you're switching and showing a loading indicator?

reckless
Making moves

Hi, this feature is very interesting.

Could it be possible to connect self-hosted LLMs like Ollama? (on localhost, as well as remote over https)

I've seen ollama showing a chatbot webpage similar to self-hosted llamafile. Does the sidebar load the webpage when you set browser.ml.chat.provider to your own (local) server? It might not support passing in a prompt, but you'll at least have your self-hosted chatbot available in the sidebar.

We're currently exploring not requiring the server to respond with html and instead Firefox displays responses from calling an inference API, and I believe all llamafile, vllm and ollama support POST to /v1/chat/completions.
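For anyone curious what that inference-API path looks like, below is a minimal sketch of building a request against the OpenAI-compatible `/v1/chat/completions` endpoint that llamafile, vllm, and ollama all expose. The base URL (ollama's default port) and model name are assumptions; adjust them for your own server.

```python
# Sketch of an OpenAI-compatible chat completion request.
# Base URL and model name below are assumptions for a local ollama;
# llamafile and vllm expose the same endpoint shape.
import json
from urllib.request import Request

def build_chat_request(base_url: str, model: str, prompt: str) -> Request:
    """Build a POST to /v1/chat/completions with one user message."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        base_url.rstrip("/") + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:11434", "llama3", "Summarize this page.")
print(req.full_url)
# With a server running, urllib.request.urlopen(req) would return JSON
# with the answer at reply["choices"][0]["message"]["content"].
```

The appeal of this design is that Firefox would only need the endpoint URL and a model name to talk to any of these local servers, rather than requiring each one to serve its own chat webpage.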

gpiper
Making moves

You should be branding Mozilla products as "AI Free" not integrating the demon seed into your codebase.

ffffff
Making moves

Related: Vivaldi's stance on the current AI/LLM trend

If Vivaldi didn't use Google's Blink engine, this discussion thread is the kind of reason I'd consider leaving Firefox for. It's very disappointing to see Mozilla of all entities embracing AI like this.

The Mozilla Manifesto says "commercial involvement in the development of the internet is critical." Seems it's entirely possible to have commercial involvement without adopting extremely controversial and highly criticized market trends.

Maybe I'll be proven wrong and this will be good for the internet and Firefox users, but I'm not optimistic.

In unrelated news, I keep trying to unsubscribe from this discussion because it only serves to make me sad, yet I still receive e-mail notifications. I want out. Anyone know what I might be doing wrong?

Don't be too afraid of AI. After all, Firefox doesn't force AI on you, and Mozilla doesn't enable it by default, so you don't have to worry about AI infringing on your privacy.

They could have just made it an extension then. As a Lab it is far more likely to be included as a feature in the future which is something I didn't ask for and do not want, even if it is opt-in by default.

Just humouring the product infringes not only the privacy of every web user, but their copyrights as well. And that's before we get into the environmental and ethical problems (generative 'ai' marketing is a transparent financial fraud by VCs and CEOs).

So yes, we should oppose all usage of, and cooperation with, LLM and generative 'ai' scams.

The current Firefox feature supports open models from providers like Hugging Face and allows us to guide users to fully local inference and truly open-source models like OLMo when those functionalities are ready. Mozilla is also improving AI such as democratizing access with llamafile, supporting open-source models that have open training data with better privacy for everyone, and generally engaging with the broader community including lawmakers to make AI good for the internet.

Being more integrated with AI such as this Firefox feature allows us to make a difference for users and non-users of chatbots by magnifying the efforts we have across Mozilla.

AI doesn't make the Internet better. People do.

raidingshaman
Making moves

Need Keyboard shortcuts in order to maximize productivity with this new AI feature

Thanks, we've filed this bug for a keyboard shortcut. https://bugzilla.mozilla.org/show_bug.cgi?id=1905027

If you're not using the new Sidebar (also available in Firefox Labs), adding the sidebar icon by customizing your toolbar might help.

Lumither
Making moves

Why not make the `browser.ml.chat.shortcuts` config available in Settings (instead of requiring about:config)?

Did you find that pref looking around about:config? You should be able to toggle the shortcuts behavior directly from Firefox Labs (about:settings#experimental) now:

[Screenshot: labs select.png, showing the shortcuts toggle in Firefox Labs]

PatOr
Making moves

Can you add Copilot as an option to choose?

MR-d3R
Making moves

It'd be great to add a shortcut for that.

DonutRush
Making moves

You should not be inserting lake-draining spam machines into a browser that claims to want to make the web a better place. This is a deeply immoral and short-sighted decision, and your comments in this thread make you seem shockingly uninformed about what people outside of your tech bubble think about this garbage.

Mozilla should not be encouraging the decline of the internet.

Rika
Making moves

I think a user-defined prompt would make it more convenient.
For example, when I select some text, there would be an option like "My prompt", and the name of the option could be changed. To define the content of "My prompt", we'd go to Settings and edit it.
Something like: "I'm on a page that says: <selected text>. Please translate the text above."
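The mechanics of such a feature would be simple template substitution; here is a tiny sketch, where the `{selection}` placeholder name is an invented convention for this example (the user would edit the template once in Settings, and Firefox would fill it in on each invocation):

```python
# Sketch of a user-defined prompt template. The "{selection}"
# placeholder name is an invented convention for illustration.
def apply_prompt_template(template: str, selection: str) -> str:
    """Substitute the selected page text into the user's saved template."""
    return template.replace("{selection}", selection)

my_prompt = 'I\'m on a page that says: "{selection}". Please translate the text above.'
print(apply_prompt_template(my_prompt, "Bonjour le monde"))
```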

BeKingOrDie
Making moves

I love it 😍😍😍😍😍😍, and a shortcut to open it very quickly would be nice.

YPAP2205
Making moves

Will it be possible to add custom AI providers without needing to change the AI provider link in about:config? Just so users can add their own AI they wish to use.