this post was submitted on 11 Oct 2023
506 points (92.6% liked)

(page 2) 50 comments
[–] [email protected] 6 points 1 year ago (1 children)

People are literally paying monthly subscriptions for access to a bunch of these things.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago)

Did you read the article? The problem hasn't been getting some people to pay for some things; it's that the things available so far are losing loads of money. Or at least, that's the premise.

[–] [email protected] 6 points 1 year ago (2 children)

Yeah, so far. It's still super early for LLMs, the modern incarnation of AI that actually has a chance to pay off.

This isn't like Bitcoin, where there's huge hype around a pretty small market opportunity. We all recognize the promise; we're just still figuring out how to get rid of hallucinations and make the output consistent and tuned to a specific business use case.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (5 children)

Well, and also navigating the minefield that LLMs absolutely have copyrighted material in them that wasn't paid for or licensed. E.g., DALL-E can produce a full image of Fresh Cut Grass, a character owned by Critical Role.

And that the stuff they produce isn't copyrightable.

[–] xantoxis 6 points 1 year ago

Goood. Gooooooooooood.

[–] kromem 4 points 1 year ago

Great, now factor in what data collection would cost if you weren't subsidizing usage that you're effectively getting free RLHF from...

The one thing that's been pretty much a guarantee over the last 6 months is that if there's a mainstream article with 'AI' in the title, idiocy will abound in the text of it.

[–] [email protected] 3 points 1 year ago

This is the best summary I could come up with:


A new report from the Wall Street Journal claims that, despite the endless hype around large language models and the automated platforms they power, tech companies are struggling to turn a profit when it comes to AI.

GitHub Copilot, which launched in 2021, was designed to automate some parts of a coder’s workflow and, while immensely popular with its user base, has been a huge “money loser,” the Journal reports.

OpenAI’s ChatGPT, for instance, has seen an ever-declining user base while its operating costs remain incredibly high.

A report from the Washington Post in June claimed that chatbots like ChatGPT lose money pretty much every time a customer uses them.

Platforms like ChatGPT and DALL-E burn through an enormous amount of computing power and companies are struggling to figure out how to reduce that footprint.

To get around the fact that they’re hemorrhaging money, many tech platforms are experimenting with different strategies to cut down on costs and computing power while still delivering the kinds of services they’ve promised to customers.


The original article contains 432 words, the summary contains 172 words. Saved 60%. I'm a bot and I'm open source!
