this post was submitted on 08 Dec 2024
457 points (94.5% liked)

The GPT Era Is Already Ending (www.theatlantic.com)
submitted 4 days ago* (last edited 4 days ago) by [email protected] to c/technology
 

If this is the way to superintelligence, it remains a bizarre one. “This is back to a million monkeys typing for a million years generating the works of Shakespeare,” Emily Bender told me. But OpenAI’s technology effectively crunches those years down to seconds. A company blog boasts that an o1 model scored better than most humans on a recent coding test that allowed participants to submit 50 possible solutions to each problem—but only when o1 was allowed 10,000 submissions instead. No human could come up with that many possibilities in a reasonable length of time, which is exactly the point. To OpenAI, unlimited time and resources are an advantage that its hardware-grounded models have over biology. Not even two weeks after the launch of the o1 preview, the start-up presented plans to build data centers that would each require the power generated by approximately five large nuclear reactors, enough for almost 3 million homes.

https://archive.is/xUJMG

[–] [email protected] 1 points 3 days ago

Showing the most played songs and artists is not really a difficult analysis task, and it doesn't require any machine learning. It's plain aggregation: count the plays and sort.
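To illustrate, here's a minimal sketch with made-up play data (a real service would run the same aggregation in SQL over its play-history tables, not in Python):

```python
from collections import Counter

# Hypothetical play history: one (user_id, artist, track) tuple per stream.
plays = [
    ("u1", "Radiohead", "Karma Police"),
    ("u1", "Radiohead", "Karma Police"),
    ("u1", "Daft Punk", "One More Time"),
    ("u2", "Radiohead", "No Surprises"),
]

# "Wrapped"-style stats are plain counting and sorting, nothing more.
top_tracks = Counter((artist, track) for _, artist, track in plays).most_common(3)
top_artists = Counter(artist for _, artist, _ in plays).most_common(3)

print(top_tracks)   # [(('Radiohead', 'Karma Police'), 2), ...]
print(top_artists)  # [('Radiohead', 3), ('Daft Punk', 1)]
```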

You want dimensionality reduction to get that "people who listen to stuff like you also listen to" recommendation. And to have an idea whom to play a new song to, you ideally want to analyse the song itself and not just people's reactions to it, and there we're deep in the weeds of classifiers.
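Roughly the shape of the first half, as a toy sketch: the play-count matrix and the number of latent dimensions below are entirely made up, and real systems factorise matrices with millions of listeners and tracks, but the idea is the same.

```python
import numpy as np

# Hypothetical play-count matrix: rows = listeners, columns = tracks.
plays = np.array([
    [5, 3, 0, 0],   # listener A
    [4, 0, 0, 1],   # listener B
    [0, 0, 4, 5],   # listener C
    [0, 1, 5, 4],   # listener D
], dtype=float)

# Truncated SVD: keep k latent "taste" dimensions instead of one per track.
k = 2
U, s, Vt = np.linalg.svd(plays, full_matrices=False)
listener_taste = U[:, :k] * s[:k]   # each listener as a k-dimensional taste vector

# Listeners whose taste vectors point the same way like similar stuff.
def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(listener_taste[0], listener_taste[1]))  # high: A and B overlap
print(cosine(listener_taste[0], listener_taste[2]))  # low: A and C don't
```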

Using LLMs in particular, though, is probably suit-driven development, because when you're trying to figure out whether a song sounds like pop or rock or classical, LLMs are, at best, overkill. Analysing lyrics might warrant LLMs, but I don't think it'd gain you much. If you re-trained them on music instead of language you might also get something interesting, classifying music by phrasal structure and whatnot (don't look at me, I may own a guitar but I'm no musician). And, of course, "interesting" doesn't necessarily mean "business case", unless you're in the business of giving Adam Neely video ideas. "Spotify, play me all pop songs that sing 'caught in the middle' in the same way"... not a search that's going to make Spotify money, or that anyone asked for.
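For the genre question specifically, the boring version is a small classifier over pre-extracted audio features, with nothing LLM-shaped anywhere in it. A sketch with fabricated numbers (the feature values and genre centres below are invented stand-ins for things like tempo, spectral centroid and zero-crossing rate; a real pipeline would pull them out of the audio, e.g. with something like librosa):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Fake per-track feature vectors: [tempo, spectral centroid, zero-crossing rate].
n = 200
pop       = rng.normal([118, 2200, 0.10], [10, 300, 0.02], size=(n, 3))
rock      = rng.normal([140, 3000, 0.15], [12, 400, 0.03], size=(n, 3))
classical = rng.normal([ 90, 1500, 0.05], [15, 250, 0.02], size=(n, 3))

X = np.vstack([pop, rock, classical])
y = np.array(["pop"] * n + ["rock"] * n + ["classical"] * n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An off-the-shelf classifier is plenty for "does this sound like pop or rock".
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```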