This post was submitted on 27 Nov 2024
205 points (93.2% liked)

Firefox

18006 readers
256 users here now

A place to discuss the news and latest developments on the open-source browser Firefox

founded 5 years ago

They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

[–] [email protected] 104 points 1 week ago
[–] that_leaflet 59 points 1 week ago

That was there before 133, don’t remember the exact release that added it.

[–] [email protected] 39 points 1 week ago (2 children)

I don't understand the hate. It's just a sidebar for the supported LLMs. Maybe I'm misunderstanding?

Yes, I would prefer Mozilla focus on the browser, but to me, this seems like it was done in an afternoon.

[–] PrefersAwkward 10 points 1 week ago* (last edited 1 week ago)

It seems like common cynicism. Mozilla adds this feature so as not to cede major features to other browsers, and it natively lets you pick from lots of different AI solutions.

Not every feature is for everyone. Not every feature is done being improved on at release.

And in spite of popular opinions, organizations don't do just one thing and then do just the next thing and the thing after that. Organizations can and do focus on and prioritize many things at the same time.

And for people who are naysaying AI at every mention, it has a lot of great and fascinating uses, and if you think otherwise, you really should try them more. I've used it plenty for work and life. It's not going away, might as well do some nice things with it.

[–] [email protected] 3 points 1 week ago (1 children)

I want my browser to be a browser. I don't want Pocket, I don't want AI, I don't want bullshit. There are plugins for that.

[–] [email protected] 4 points 1 week ago (4 children)

that's the great thing: you don't have to use it

[–] ocassionallyaduck 32 points 1 week ago (8 children)

Thing is, for your average user with no GPU and who never thinks about RAM, running a local LLM is intimidating. But it shouldn't be. Any system with an integrated GPU, and the more RAM the better, can run simple models locally.

The not-so-dirty secret is that ChatGPT 3 vs 4 isn't that big a difference, and neither is leaps and bounds ahead of the publicly available models for about 99% of tasks. For that 1%, people will ooh and aah over it, but 99% of use cases are only seeing marginal gains on 4o.

And the simplified models that run "only" 95% as well? They can use 90% fewer resources and give pretty much identical answers outside of hyperspecific use cases.

Running a "smol" model, as some are called, gets you all the bang for none of the buck, and your data stays on your system and never leaves.

I've been yelling from the rooftops to some stupid corporate types that once the model is trained, it's trained. Unless you are training models yourself, there is no need for the massive AI clusters, just for the model. Run it local on your hardware at a fraction of the cost.
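As a concrete illustration of that last point (not from the comment itself): assuming you have an Ollama server running locally on its default port with a small model already pulled, querying it takes only standard-library Python. The model name below is just an example.

```python
import json
import urllib.request

# Ollama's default local endpoint; nothing in this flow leaves your machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3.2:1b") -> bytes:
    """Encode a non-streaming generation request for Ollama's HTTP API."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask_local(prompt: str, model: str = "llama3.2:1b") -> str:
    """Send the prompt to the locally running model and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama serve` and `ollama pull llama3.2:1b` first):
#   print(ask_local("Fix the grammar in: 'me and him goes to store'"))
```

That's the whole "AI cluster": one HTTP call to your own hardware.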

[–] [email protected] 30 points 1 week ago (1 children)

There's the tragedy with this new feature: they fast-tracked this past more popular requests, sticking it into Release Firefox.

But they only rushed the part that connects to third parties. There was also a "localhost" option which was originally alongside the Big Five corporate offerings, but Mozilla ultimately decided to bury that one inside of the about:config settings.

[–] MrOtherGuy 10 points 1 week ago

I'm guessing that the reason (and a good one at that) is that simply having an option to connect to a local chatbot leads to confused users, because they also need the actual chatbot running on their system. If you can set that up, then you can certainly toggle a simple switch in about:config to show the option.
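For reference, the hidden switch lives in about:config. The pref names below match what has been reported for Firefox 133; they may change between versions, so treat this as a sketch rather than gospel:

```
browser.ml.chat.enabled         true                    ; toggle for the AI chatbot sidebar
browser.ml.chat.hideLocalhost   false                   ; false reveals the "localhost" provider option
browser.ml.chat.provider        http://localhost:8080   ; URL of your locally running chatbot (example)
```

As the comment says, you still need an actual chatbot serving that address for the option to do anything.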

[–] ilhamagh 4 points 1 week ago (2 children)

Can you point me to some resources for running a smol LLM?

My use case is probably just to help type up miscellaneous ideas I have, or to check my grammar, in English.

Thanks in advance.

[–] [email protected] 30 points 1 week ago (1 children)

They better not decide to enable it by default.

[–] [email protected] 23 points 1 week ago (2 children)

it's not enabled by default ... it's opt out by default

[–] [email protected] 34 points 1 week ago* (last edited 1 week ago)

I think that means that it's opt-in.

[–] [email protected] 9 points 1 week ago

if third-party accounts are needed, it'll have to stay that way.

[–] [email protected] 25 points 1 week ago (2 children)

Didn't want it in Opera, don't want it in Firefox. I mean they can keep trying and I'll just keep on ignoring this shit :/

[–] [email protected] 22 points 1 week ago (10 children)

I wish I had telemetry on such features.

I really doubt a significant number of people use AI chatbots often enough that having it in a dedicated sidebar is worth it.

[–] [email protected] 18 points 1 week ago (1 children)

Thanks for nothing, Mozilla.

[–] [email protected] 21 points 1 week ago (1 children)

They should raise the CEO's pay some more to celebrate.

[–] [email protected] 9 points 1 week ago* (last edited 1 week ago)

And fire a few employees just cause.

[–] ohwhatfollyisman 13 points 1 week ago (4 children)

as someone who's never dabbled with ai bots, what does this feature do? is it only to query for information like a web search?

[–] [email protected] 14 points 1 week ago (1 children)

It just adds ChatGPT or similar to your sidebar. Chatbots can do a lot of things; they are mostly good for information research and technical help, although they have serious flaws, like sometimes hallucinating false information.

[–] [email protected] 8 points 1 week ago (4 children)

It is a sidebar that sends a query from your browser directly to a server run by a giant corporation like Google or OpenAI, consumes an excessive amount of carbon/water, then sends a response back to you that may or may not be true (because AI is incapable of doing anything but generating what it thinks you want to see).

Not only is it unethical in my opinion, it's also ridiculously rudimentary...

[–] [email protected] 5 points 1 week ago

From the description in the UI, it does sound like it. Theoretically, a chatbot could be created where you can ask questions about the webpage you currently have open, for example if you don't want to read a long article. I guess you could probably just throw a link into an existing chatbot instead, but direct integration might be convenient either way.

Well, or a chatbot could be created which has access to your browser history, bookmarks, and tabs, so you can ask it when you last saw certain information. However, you'd need a locally running chatbot for that, which makes it more difficult to implement.

[–] [email protected] 13 points 1 week ago (5 children)

I mean, if you're going to do it, where's the Ollama love?

[–] [email protected] 4 points 1 week ago

I was disappointed there was no local option...

[–] [email protected] 12 points 1 week ago* (last edited 1 week ago)

Why a fucking chatbot? Translate a page better for me, you fucking losers; all the translation options suck for privacy outside of specifically trained local AIs. This is the BEST use case for a small local LLM, yet Mozilla, with all its brains and resources, couldn't rub two neurons together for this.

Or they could do character prediction on your typing to make typing faster. Just some legit examples. Why waste resources building a chat AI into my browser when I can just open a website???

[–] Treczoks 11 points 1 week ago

Luckily, it seems to be disabled by default. At the moment.

[–] [email protected] 11 points 1 week ago (2 children)

Are any of these open source or trustworthy?

[–] [email protected] 10 points 1 week ago

I think Mistral is model-available (i.e. I'm not sure if they release training data/code, but they do release model shape and weights); HuggingChat definitely is open source and model-available.

[–] ilinamorato 10 points 1 week ago (2 children)

This happened ages ago, didn't it? Am I missing something new?

[–] [email protected] 7 points 1 week ago (1 children)

Yeah, it did. That feature has been there at least since Mozilla enabled the "Firefox Labs" section in settings by default a few months ago, and maybe even earlier than that.

[–] victorz 6 points 1 week ago (1 children)
[–] ilinamorato 4 points 1 week ago (1 children)

Well, this month in particular....

[–] [email protected] 8 points 1 week ago

For a second I thought it said "experimental failure". Would be more accurate, I think.

[–] [email protected] 7 points 1 week ago* (last edited 1 week ago)

I will say, the Le Chat provider is pretty decent. You really can use more natural language with it: "rewrite it with a better rhyme scheme", "remove the last line", and it just got it.

Why no local option, though? Why no anonymising option?

Edit: There is a right-click option, which does officially make this actually useful for me now (summarize this!).

Other models do have RAG options, and Mistral supports making agents with specified documentation to at least fine-tune (not as good as full grounding, though, IMHO).

[–] [email protected] 7 points 1 week ago

Now add support for GPT4All and everyone is happy again.

[–] [email protected] 6 points 1 week ago (5 children)

If they do it in a privacy-preserving way, this could help them win back market share, which will generally benefit an open internet.

[–] [email protected] 5 points 1 week ago (2 children)

Wasn't this there for a while, or is it just me?

[–] [email protected] 5 points 1 week ago

Wow, great job Firefox. Thanks.

If I wanted unreliable bullshit like AI, I'd use Chrome.

[–] [email protected] 4 points 1 week ago

I wonder if this can be removed at compile time, like Pocket.

[–] [email protected] 4 points 1 week ago (1 children)

Sigh. I'm glad to have switched to LibreWolf.
