this post was submitted on 29 Sep 2024
469 points (97.2% liked)

People Twitter


Taylor & Francis and Wiley sold out their researchers in bulk, this should be a crime.

Researchers need to be able to consent or refuse to consent, and science needs to be respected more than this.

top 50 comments
[–] [email protected] 99 points 2 months ago (1 children)

Oh look! Socialism for the rich!

[–] [email protected] 16 points 2 months ago

🧑‍🚀🔫🧑‍🚀

[–] [email protected] 95 points 2 months ago

See - this is why I don't give a shit about copyright.

It doesn't protect creators - it just enriches rent-seeking corporate fuckwads.

[–] [email protected] 56 points 2 months ago (1 children)

Daily reminder that copyright isn’t the only conceivable weapon we can wield against AI.

Anticompetitive business practices, labor law, privacy, likeness rights. There are plenty of angles to attack from.

Most importantly, we need strong unions. However we model AI regulation, we will still want some ability to grant training rights. But it can’t be a boilerplate part of an employment/contracting agreement. That’s the kind of thing unions are made to handle.

[–] [email protected] 14 points 2 months ago (2 children)

Look, I'm not against AI and automation in general. I'm not against losing my job, either. We should use these as tools to overcome scarcity, for the better future of all of humanity. I don't mind losing my job if I could use my time to do things I love. But that won't happen as long as greedy-ass companies use it against us.

[–] krashmo 10 points 2 months ago

We conquered our resource scarcity problem years ago. Artificial scarcity still exists in society because we haven't conquered our greed problem.

[–] [email protected] 9 points 2 months ago (1 children)

Both of you argue from the flawed assumption that AI actually has the potential that marketing people trying to bullshit you say it has. It doesn't.

[–] [email protected] 3 points 2 months ago (2 children)

AI has its uses. Not the ones people cream their pants about, but to say it's useless is just wrong. But people tend to misunderstand what AI, ML, and whatever else actually are. Just like everyone was celebrating the cloud without knowing what the cloud was ten, twenty years ago.

[–] [email protected] 5 points 2 months ago (1 children)

It has its uses but none of them include anything even close to replacing entire jobs or even significant portions of jobs.

[–] [email protected] 2 points 2 months ago

I disagree, but I might have different experiences.

[–] [email protected] 1 points 2 months ago

It doesn't matter what it's good for. What matters is what the MBA parasites think it's good for.

They will impulsively replace jobs, and then when it fails, try to rehire at lower wages.

[–] [email protected] 54 points 2 months ago (2 children)

β€œit is in the public interest for these emerging technologies to be trained on high-quality, reliable information.”

Oh, well if you say so. Oh wait, no one has a say anyway because corporations ru(i)n everything.

[–] [email protected] 27 points 2 months ago (1 children)

"We need to train LLMs with your data in order to make you obsolete."

[–] krashmo 6 points 2 months ago (1 children)

If that's what it takes to get rid of CEOs then I'm on board.

Seriously though, that's the best application of AI. CEO is a simple, logic-based position, or so they tell us, that happens to consume more financial resources than many dozens of lower-level employees. If anyone is on the chopping block it should be them, in both senses of the phrase.

[–] [email protected] 1 points 2 months ago

For CEOs it might even bring down the percentage of nonsense they say, even given the high rate of nonsense AI produces.

[–] mumblerfish 8 points 2 months ago

It's nice to see them lowering the bar for "high-quality" at the same time. Really makes it seem like they mean it. /s

[–] [email protected] 51 points 2 months ago

"it's in the public interest" so all these articles will be freely available to the public. Right?... Riiight?!

[–] finitebanjo 49 points 2 months ago (4 children)

"How is nobody talking about this?"

The average person has science literacy at or below that of a fifth grader, and ranks academic study below a story about a wish-granting sky fairy who made Earth in his basement as a hobby with zero lighting (obviously, since light hadn't been invented at that point).

[–] kalkulat 1 points 2 months ago (4 children)

A musician friend of mine, when asked "Why are there no Mozarts or Beethovens any more?" replies "We went through your schools."

[–] Grimy 24 points 2 months ago (1 children)

It's for reasons like these that I think it's foolhardy to advocate for strengthening copyright when it comes to AI.

The windfall will not be shared, the data is already out of the hands of the individuals and any "pro-artist" law will only help kill the competition for companies like Google, Sony Music, Disney and Microsoft.

These companies will happily pay huge sums to lock anyone out of the scene. They are already splitting it between each other, they are anticipating a green light for regulatory capture.

[–] [email protected] 9 points 2 months ago

Copyright is not supposed to protect individuals' work from corporations, but the other way around.

[–] sunbrrnslapper 8 points 2 months ago (1 children)

I think this happens because the publisher owns the content and owes royalties to the authors under certain conditions (which may or may not be met in this situation). The reason I think this is that a PhD buddy of mine published a book (nonfiction history), and we all got a hearty chuckle at the part of the contract that said the publisher got the theme park rights. But what if there were other provisions in the contract that would allow for this situation without compensating the authors? Anywho, this is a good reminder to read the fine print on anything you sign.

[–] [email protected] 8 points 2 months ago (1 children)

I’d guess books are different, but researchers don’t get paid anything for publishing in academic journals

[–] sunbrrnslapper 4 points 2 months ago

Oh yeah, good point.

[–] TriflingToad 7 points 2 months ago (1 children)

"it is in the public interest for these emerging technologies to be trained on high quality information"

Ok but we have to pay hundreds of dollars for a single book in college because....?

[–] Bearlydave 1 points 2 months ago

If it is in the public interest then all of that information should be open sourced.

[–] [email protected] 7 points 2 months ago (1 children)

Guess it's time to poison the data

A couple dozen zero-width Unicode characters between every letter, white text on a white background filled with nonsense... any other ideas?
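A minimal sketch of the zero-width idea in Python, purely as an illustration (the `poison` helper and the filler count of 24 are made up for this example; U+200B is the zero-width space):

```python
# Purely illustrative sketch: interleave zero-width spaces (U+200B) between
# the visible characters so scraped text no longer matches what a human reads.
ZWSP = "\u200b"  # zero-width space

def poison(text: str, filler: str = ZWSP, count: int = 24) -> str:
    """Insert `count` zero-width characters between every visible character."""
    return (filler * count).join(text)

if __name__ == "__main__":
    sample = "Results were statistically significant."
    poisoned = poison(sample)
    print(len(sample), len(poisoned))  # the poisoned copy is many times longer
```

Visually the string is unchanged, but naive scrapers and tokenizers see a very different character sequence; whether that meaningfully degrades training data is another question.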

[–] [email protected] 10 points 2 months ago

Hilariously, the data is poisoning itself: as the criteria for decent review dwindle, more non-reproducible crap science is published, or it's straight-up fake. Journals don't care, and correcting the scientific record always takes months or years. Fuck the publishers.

[–] The2b 4 points 2 months ago* (last edited 2 months ago) (1 children)

How does cutting peer review time help get more content? The throughput will still be the same regardless of whether it takes 15 days or a year to complete a peer review.

[–] [email protected] 3 points 2 months ago

Isn't that because the peers also write their own papers? So it's not just a fixed delay on individual papers; the delay also cuts into the reviewers' own periods of working on papers.

[–] kalkulat 3 points 2 months ago* (last edited 2 months ago)

I believe that most of today's writers would advise you to run your publisher's contract past a competent lawyer before you sign.

Maybe academic authors are not as aware that publishing is full of ravening wolves that have been pulling these tricks since Dickens handed his Pickwick Papers to Punch. Poor babes in the woods.

[–] [email protected] 3 points 2 months ago

Reminds me of the song "Feed the Machine" by Poor Man's Poison:

https://youtu.be/BlnVP2_dIb4?si=t6CdslRzb7USrH1j

[–] phoneymouse 2 points 2 months ago

Good way to make authors negotiate differently with you in the future. I doubt they’ll just sign on to whatever they did previously without some guard rails around this

[–] [email protected] 1 points 2 months ago

The shitty chat bots do need high quality data. This is much better than scraping off reddit, since a glorified auto-complete cannot know that eating rocks is bad for you. You can't retroactively complain after having signed away your rights to something. But you can change things moving forward. If you are incorruptible and don't care about money, start an organization with those values and convince the researchers to join you. Good luck (seriously, I hope you succeed).

[–] jimmydoreisalefty 1 points 2 months ago

Thanks for sharing this with us!

Terms of Service and contracts...

It reminded me of this:

"Disney says man can't sue over wife's death because he agreed to Disney+ terms of service" [https://www.nbcnews.com/news/us-news/disney-says-man-cant-sue-wifes-death-agreed-disney-terms-service-rcna166594]

[–] Feathercrown 1 points 2 months ago

It's in the public interest for academic publishers to be wiped off the face of the planet
