this post was submitted on 09 Jul 2023
518 points (97.1% liked)

Technology

Two authors sued OpenAI, accusing the company of violating copyright law. They say OpenAI used their work to train ChatGPT without their consent.

[–] [email protected] 4 points 1 year ago (1 children)

a firm line between what a random human can do versus an automated intelligent system with potential unlimited memory/storage and processing power.

I think we need a better definition here. Is the issue really the processing power? Do humans get a pass just because our memories are fuzzy? Your example assumes the AI retains massive amounts of detail, which is typically not the case. To be useful, the data is consumed and transformed into something the system can actually work with.

This is why I'm worried about legislation and legal precedent. Most people think these AI systems read a book and store the verbatim text somewhere to reference later, when that isn't really the case. There may be fragments scattered throughout, and the system may be able to reconstitute the text, but we don't seem to have the same issue when a human brain synthesizes data in a similar way.

[–] [email protected] 3 points 1 year ago (1 children)

A continuous record of location + time, or even something like "license plate at location plus time," is scary enough to me, and that's data a system could easily hold decades of.

[–] [email protected] 0 points 1 year ago (1 children)

Is that scary because it's a machine? Someone could tail you and follow you around and manually write it all down in a notebook.

Yes, the ease of data collection is an issue, and I'm very much for better privacy rights for us all. But given the issue you've stated, I'd be more afraid of what the 70-year-old politicians who don't understand any of this would write up in a bill.

[–] [email protected] 2 points 1 year ago (1 children)

Someone could tail you and follow you around and manually write it all down in a notebook.

They could, and then they could also be charged with stalking.

It's not just ease of collection. It's how the data is being retained, secured, and shared, among a great many other things. Laws just haven't kept up with technology, partly because of 70-year-old politicians who don't even understand email, but also because the corporations behind the technology lie and bribe to keep it that way, and face few consequences when they handle data improperly or mishandle it outright. For example:

https://www.cbc.ca/news/politics/cadillac-fairview-5-million-images-1.5781735

When the government does it, we seem to have even less recourse.

[–] [email protected] 1 points 1 year ago

Would it be stalking if you signed a legal agreement that allowed them to track you? That is the reason the California law exists. Most of us have accepted a license agreement to use an app or service, and in exchange we gave up privacy rights. And the agreement may not have even been with the company that ends up consuming the data.

Sadly, the law requires you to contact every company individually to demand your data be deleted. Passing a law that makes "never store my data" the default means most of social media goes away or goes behind a paywall. The same goes for any picture-hosting company that charges you nothing for hosting because it monetizes your images.

It would also most likely mean that very explicit declarations would have to be made before anyone could use your material, leading a lot of businesses to decide it's too big a risk and drop support for many features.

Right now we kind of work on good faith, which maybe doesn't work.