Lemmy.World
Several big businesses have published source code that incorporates a software package previously hallucinated by generative AI.

Not only that, but someone, having spotted this recurring hallucination, turned the made-up dependency into a real one, which was subsequently downloaded and installed thousands of times by developers as a result of the AI's bad advice, we've learned. Had the package been laced with actual malware rather than being a benign test, the results could have been disastrous.
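One way to blunt this kind of attack is to refuse any dependency that isn't on a list you've actually vetted, so a hallucinated name fails loudly instead of getting installed. A minimal sketch in Python, assuming a hypothetical in-house allowlist (the package names below are illustrative, not the ones from the article):

```python
# Hedged sketch: check requirements-style entries against a local
# allowlist of packages your org has reviewed. Anything unknown is
# flagged for a human instead of being installed blindly.

KNOWN_GOOD = {"requests", "numpy", "flask"}  # hypothetical reviewed set

def vet_requirements(lines):
    """Return the package names that are NOT on the allowlist."""
    unknown = []
    for line in lines:
        # Take the name portion of a "name==version" pin.
        name = line.split("==")[0].strip().lower()
        if name and name not in KNOWN_GOOD:
            unknown.append(name)
    return unknown

# A made-up package slipped in alongside a real one:
reqs = ["requests==2.31.0", "totally-made-up-pkg==0.1"]
print(vet_requirements(reqs))  # -> ['totally-made-up-pkg']
```

This only catches names you haven't reviewed; it says nothing about whether a listed package's contents are safe, so it complements rather than replaces lockfiles and hash pinning.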

submitted 3 months ago* (last edited 3 months ago) by [email protected] to c/[email protected]

Lemmy did not warn that this was already posted days ago. Apologies. Here's another take: https://pluralistic.net/2024/04/01/human-in-the-loop/#monkey-in-the-middle

Simply look out for libraries imagined by ML and make them real, with actual malicious code. No wait, don't do that.
