this post was submitted on 31 Jan 2024
502 points (97.0% liked)

Technology

AMD’s new CPU hits 132fps in Fortnite without a graphics card

Also gets 49fps in BG3, 119fps in CS2, and 41fps in Cyberpunk 2077 using the new AMD Ryzen 8700G, all without the need for an extra CPU cooler.

[–] aluminium 30 points 9 months ago (1 children)

Oh, ok, I thought one of the new Threadrippers was so powerful that the CPU could do all those graphics in software.

[–] sardaukar 24 points 9 months ago (1 children)

It's gonna take decades to be able to render 1080p CP2077 at an acceptable frame rate with just software rendering.

[–] modeler 0 points 9 months ago (1 children)

It's all software, even the stuff on the graphics cards: the rasterisers, shaders, and so on. Graphics cards are just extremely good at running these (relatively) simple programs in an absolutely staggering number of threads at the same time, which is exactly what both Bitcoin mining and neural-net workloads like GPT and Llama have taken advantage of.
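(To make that concrete, here's a toy per-pixel "shader" sketched in Python with numpy. The shader function, image size, and values are purely illustrative; the vectorized evaluation over every pixel at once is the CPU analogue of a GPU launching one thread per pixel.)

```python
import numpy as np

# A "fragment shader" is just a small program run once per pixel.
# This toy shader fills a 64x64 image with a radial gradient,
# evaluated for all pixels simultaneously via numpy vectorization.
h, w = 64, 64
ys, xs = np.mgrid[0:h, 0:w]

# Normalized coordinates in [0, 1], like gl_FragCoord / resolution.
u = xs / (w - 1)
v = ys / (h - 1)

# Per-pixel program: brightness falls off with distance from center.
dist = np.sqrt((u - 0.5) ** 2 + (v - 0.5) ** 2)
color = np.clip(1.0 - dist / 0.5, 0.0, 1.0)

print(color.shape)  # (64, 64): one output per pixel "thread"
```

A GPU runs thousands of these per-pixel invocations truly in parallel; numpy only simulates that data-parallel model on a handful of CPU cores.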

[–] pivot_root 6 points 9 months ago

It's a shame you're being downvoted; you're not wrong. Fixed-function pipelines haven't been a thing for a long time, and shaders are software.

I still wouldn't expect a Threadripper to pull off software-rendering a modern game like Cyberpunk, though. Graphics cards have a ton of dedicated hardware for things like texture decoding and ray tracing, and a CPU would have to burn even more cycles doing those in software.
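(Some back-of-envelope arithmetic, with an assumed shader cost rather than any real benchmark, shows the scale of the problem.)

```python
# Rough numbers for shading alone at 1080p60, before overdraw,
# geometry, texture filtering, or ray tracing -- the work GPUs
# offload to dedicated hardware.
width, height, fps = 1920, 1080, 60
pixels_per_second = width * height * fps       # 124,416,000 pixels/s
ops_per_pixel = 500                            # assumed modest shader cost
ops_per_second = pixels_per_second * ops_per_pixel

print(f"{pixels_per_second:,} pixels/s")           # 124,416,000 pixels/s
print(f"{ops_per_second / 1e9:.1f} billion ops/s") # 62.2 billion ops/s
```

Tens of billions of operations per second just for pixel shading is already a big ask for general-purpose cores, and emulating texture units or ray-tracing hardware in software multiplies that cost further.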