[–] General_Effort 12 points 6 months ago (20 children)

I had a quick look at the text of the bill. It's not as immediately worrying as I feared, but it's still pretty bad.

https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202320240SB1047

Here's the thing: how would you react if this bill required all texts that could help someone "hack" to be removed from libraries? Outrageous, right? What if we only removed cybersecurity texts from libraries if they were written with the help of AI? Does it become OK now?

What if the bill "just" sought to prevent such texts from being written? Still outrageous? Well, that is what this bill is trying to do.

[–] Cosmicomical -5 points 6 months ago* (last edited 6 months ago) (10 children)

Seems a reasonable request. If you are creating a tool with the potential to be used as a weapon, you must be able to guarantee it won't be used as one. Power is nothing without control.

[–] TheGrandNagus 5 points 6 months ago (5 children)

How is that reasonable? Almost anything could potentially be used as a weapon or to aid in a crime.

[–] gbzm -1 points 6 months ago

I guess let's deregulate guns then. Oh wait.
