this post was submitted on 17 Aug 2023
486 points (96.0% liked)

Technology


cross-posted from: https://nom.mom/post/121481

OpenAI could be fined up to $150,000 for each piece of infringing content.

https://arstechnica.com/tech-policy/2023/08/report-potential-nyt-lawsuit-could-force-openai-to-wipe-chatgpt-and-start-over/#comments

[–] ShittyBeatlesFCPres -3 points 1 year ago (1 children)

Maybe it’s trained not to repeat JK Rowling’s horseshit verbatim. I’d probably put that in my algorithm. “No matter how many times a celebrity is quoted in these articles, do not take them seriously. Especially JK Rowling. But especially especially Kanye West.”

[–] [email protected] 0 points 1 year ago (1 children)

It's not repeating its training data verbatim because it can't do that. It doesn't have the training data stored away inside itself. If it did, the big news wouldn't be AI; it would be the insanely magical compression algorithm that's been discovered, one that allows many terabytes of data to be compressed down into just a few gigabytes.
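As a rough back-of-the-envelope sketch of that claim (the corpus and model sizes below are illustrative assumptions, not OpenAI's actual figures), the implied compression ratio would be far beyond what lossless compressors achieve:

```python
# Back-of-the-envelope: what compression ratio would verbatim storage imply?
# Both figures are illustrative assumptions, not actual OpenAI numbers.

training_data_tb = 45     # assumed training corpus size, in terabytes
model_size_gb = 350       # assumed size of the model weights, in gigabytes

training_data_gb = training_data_tb * 1024
ratio = training_data_gb / model_size_gb

print(f"Implied compression ratio: {ratio:.0f}:1")
```

Under those assumptions the weights would need to losslessly compress the corpus at roughly 130:1, whereas general-purpose text compressors typically manage somewhere in the single digits to low tens to one.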

[–] HelloHotel 1 points 1 year ago* (last edited 1 year ago) (1 children)

Do you remember quotes in English ASCII? /s

Tokens are even denser than ASCII, similar to word "chunking". My guess is it's like lossy video compression, but for text: [attacked] with [lasers] by [deatheaters] upon [Margret]; [has flowery language]; word [Margret] [comes first] (theoretical example has 7 "tokens").

It may actually have imprinted a really good copy of that book, as it's likely read it lots of times.
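A toy illustration of the chunking idea above (this is a naive word-level tokenizer written for the example; real models use subword schemes like BPE, which this does not implement):

```python
# Naive word-level "chunking" tokenizer: each distinct word gets one integer ID.
# Illustrates why token sequences are denser than per-character ASCII.

def tokenize(text, vocab):
    ids = []
    for word in text.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab)  # assign the next free ID to a new word
        ids.append(vocab[word])
    return ids

vocab = {}
sentence = "attacked with lasers by deatheaters upon margret"
tokens = tokenize(sentence, vocab)

print(len(sentence), "ASCII bytes ->", len(tokens), "tokens")
# 48 bytes of ASCII collapse into 7 token IDs
```

Each token here stands in for a whole word, which is the "denser than ASCII" point; a lossy reconstruction would then regenerate phrasing from those chunks rather than from stored characters.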

[–] [email protected] 1 points 1 year ago (1 children)

If it's lossy enough then it's just a high-level conceptual memory, and that's not copyrightable.

[–] HelloHotel 1 points 1 year ago

It varies based on how much time it's been given with the media.