cll7793

joined 2 years ago
[–] cll7793 2 points 1 year ago

No problem! Glad you found it useful!

[–] cll7793 7 points 1 year ago

Alignment at its finest.

[–] cll7793 4 points 1 year ago

Of course. I know some open source devs who advise backing up raw training data, LoRAs, and essentially the original base models for fine-tuning.

Politicians sent out an open letter in protest when Meta released LLaMA 2. It is not unreasonable to assume they will intervene for the next one unless we speak out against this.

[–] cll7793 31 points 1 year ago

Yeah lol. Search engines have declined in quality so badly it really does feel like that. It has been so difficult finding things on Google/DuckDuckGo etc. that I have to rely on portals now.

[–] cll7793 13 points 1 year ago* (last edited 1 year ago) (1 children)

Thank you all for your answers!

I wanted to add one resource I found that has helped me find even more relevant search results:

A Lemmy Search Engine https://www.search-lemmy.com/

[–] cll7793 5 points 1 year ago

Thank you so much!

[–] cll7793 2 points 1 year ago* (last edited 1 year ago) (1 children)

Thank you so much! Comments like these are why I love Lemmy. A hidden gem in the sea of news spam.

If I can ask, how did you even find that place?

[–] cll7793 1 points 1 year ago

Also no problem! I feel like I had to share this one.

[–] cll7793 3 points 1 year ago* (last edited 1 year ago)

I hope so, but from what I can tell, we are going to have a repeat of the Patriot Act and the horrors it caused, as shown by Edward Snowden.

The politicians are only getting one side of the argument about AI, from CEOs and those in positions of power. It is important that the politicians recognize the good AI is doing as well. This is why I made this post: to try to get some voice out there.

[–] cll7793 2 points 1 year ago

It would be difficult indeed, but without a doubt they will still try, and cause massive damage to our basic freedoms. For example, imagine if one day all chips required DRM at the hardware level that cannot be disabled. That is just one example of the damage they could do. There isn't much any consumer can do about it, since developing your own GPU is nearly impossible.

[–] cll7793 3 points 1 year ago* (last edited 1 year ago) (8 children)

They are requesting something beyond watermarking. Yes, it is good to have a robot disclose when it is making a film. What is particularly concerning is that the witnesses want the government to keep track of every prompt and output ever made, to eventually be able to trace its origin. So all open source models would somehow have to encode some form of signature, much like the hidden yellow dots printers produce on every sheet.

There is a huge difference between a watermark stating "this is AI generated" and hidden encodings, much like a backdoor, that can trace any publicly released AI image, video, and perhaps even text output back to a specific model, or worse, DRM-enforced "yellow dot" injection.

I know researchers have already looked into encoding hidden, undetectable patterns in text output, so extending this to everything else is not unjustified.

Also, if the encodings are not detectable by humans, then they have failed the original purpose of making AI-generated content known.
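For anyone curious how those hidden text patterns work: the research schemes typically hash the previous token to split the vocabulary into a "green" and "red" half, then bias sampling toward green tokens. A detector who knows the scheme recomputes the split at each position and counts green hits. Here is a toy sketch under those assumptions (all names are mine, and it always picks green instead of merely biasing, to keep it simple; real models apply a soft logit bias):

```python
import hashlib
import random

def green_set(prev_token: str, vocab: list[str], fraction: float = 0.5) -> set[str]:
    # Seed a PRNG from the previous token so the "green" half of the
    # vocabulary is reproducible by anyone who knows the scheme.
    seed = int.from_bytes(hashlib.sha256(prev_token.encode()).digest()[:8], "big")
    rng = random.Random(seed)
    shuffled = vocab[:]
    rng.shuffle(shuffled)
    return set(shuffled[: int(len(shuffled) * fraction)])

def watermarked_sample(prev_token: str, vocab: list[str], rng: random.Random) -> str:
    # Toy simplification: always sample from the green set.
    return rng.choice(sorted(green_set(prev_token, vocab)))

def green_fraction(tokens: list[str], vocab: list[str]) -> float:
    # Detector: recompute each position's green set and count hits.
    hits = sum(1 for prev, tok in zip(tokens, tokens[1:])
               if tok in green_set(prev, vocab))
    return hits / max(len(tokens) - 1, 1)
```

Watermarked text scores a green fraction near 1.0 here, while ordinary text hovers around 0.5 — which is also why this is statistical, not certain, and why a human reading the text sees nothing unusual.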
