this post was submitted on 20 Sep 2023
42 points (92.0% liked)

top 16 comments
[–] [email protected] 14 points 1 year ago* (last edited 1 year ago) (4 children)

Oh boy, can't wait to have CPUs that burn a hole right through their coolers.

I'd really love it if we'd just have a generation or two where they focused on making CPUs more efficient and cooler rather than ramping up power every generation. Same with GPUs.

[–] [email protected] 7 points 1 year ago (3 children)

This only got bad with the most recent generation of CPUs. The AMD 5xxx series is very efficient, as demonstrated by Gamers Nexus. The Intel CPUs from the 2500K up to, idk, the 8xxx series? were efficient until they started slapping on more cores and then cranking up the power.

[–] [email protected] 2 points 1 year ago (1 children)

Yes, the second thing about cranking up power and cores is what I'm talking about.

Also, as far as GPUs go, the 2000 series was ridiculously power hungry at the time, and it looks downright reasonable now. It's like the Overton window of power consumption lol.


[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

I dunno, I ran a 2080 on the same PSU that I used on a 2013 build, a 650W Seasonic. Got some graphs? Power consumption didn't seem to jump that badly until the latest gen.

My current 3090 is a power hog though; that's when I'd say it started for Nvidia (the 3000 series). For AMD, the 7000 series CPUs, and I'm not really sure for Intel. The 9900K was the last Intel CPU I ran, and it seemed fine. I was running a 9900K/2080 on the same PSU as the 2500K/570 build.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

As far as the 2080 goes, like I said, it was big FOR THE TIME and power hungry FOR THE TIME. It's still reasonable, especially by today's standards.

As for the last two gens, the 3000 and 4000 series, they are known to draw more than their rated power requirements. In terms of minimum recommended PSU wattage, the 3080 asks for 50 watts more than the 2080 (750W), and the 4080 for 100W more than that (850W).

To add to that, both of these gens of cards, when doing graphics-intensive things like gaming, can spike well above their rated power draw and have been known to cause hard shutdowns in PCs with PSUs rated even slightly above the minimum recommendation. Before these last two gens you could get away with a slightly lower-than-recommended-wattage PSU and sacrifice a little performance, but that is definitely no longer the case. (A rough sketch of the headroom math is below.)
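Just to illustrate the point, here's a minimal sketch of that headroom math. All the numbers (board power, spike multiplier, rest-of-system draw) are made-up assumptions for the example, not measured values:

```python
# Illustrative only: rough PSU headroom check for transient GPU power spikes.
# Every number here is an assumption for the sake of example.

def psu_headroom(psu_watts, gpu_board_power, rest_of_system, spike_multiplier=1.5):
    """Return remaining headroom (in watts) during a worst-case GPU transient."""
    peak_draw = gpu_board_power * spike_multiplier + rest_of_system
    return psu_watts - peak_draw

# Hypothetical 320 W card that briefly spikes ~1.5x, plus ~200 W for CPU/board/drives.
print(psu_headroom(750, 320, 200))   # 750 - (480 + 200) = 70 W of headroom
print(psu_headroom(850, 320, 200))   # 850 - (480 + 200) = 170 W of headroom
```

With only tens of watts of headroom left, a brief spike is enough to trip a PSU's overcurrent protection and hard-shut the machine, which is why "slightly above minimum" stopped being safe.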

And sure, the performance per watt is better in the 3080, but they also run 10+ degrees hotter, and the 4000 series even more so.

I just hope the 5000 series goes the way of power-consumption refinement rather than smashing more chips onto a board or VRAM fuckery like with the 4060. I'd be happy with similar performance on the 5000 series if it was less power hungry.

[–] [email protected] 2 points 1 year ago

The 7000 series is more efficient than the 5000 series. It's just programmed to go as fast as thermals allow, so reviewers who put really powerful coolers on the CPUs saw really high power draw. If you instead set a power cap, you get higher performance per watt than the previous generations.

Having the clocks scale to a thermal limit is a nice feature to have, but I don't think it should have been the default mode.
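To make that concrete, here's a tiny sketch with hypothetical benchmark scores and package power figures (not real measurements) showing why a capped chip can come out well ahead on efficiency while barely losing performance:

```python
# Hypothetical numbers only: performance per watt with and without a power cap.

configs = {
    "stock (boost to thermal limit)": {"score": 100, "package_watts": 230},
    "power-capped (e.g. 105 W PPT)":  {"score": 95,  "package_watts": 105},
}

for name, c in configs.items():
    print(f"{name}: {c['score'] / c['package_watts']:.2f} points per watt")

# stock:  100 / 230 ≈ 0.43 points/W
# capped:  95 / 105 ≈ 0.90 points/W  -> ~5% performance loss, roughly double the efficiency
```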

[–] [email protected] 1 points 1 year ago

Intel became less efficient because of how long they were stuck on 14nm. To compensate and beat AMD in performance mindshare, they needed to push the clocks hard.

Over time, CPUs have been shipping closer to their max clocks, which defeats the purpose of overclocking for many; adding 1GHz used to not be out of the ordinary, and now getting 0.5GHz is an achievement.

[–] [email protected] 6 points 1 year ago (1 children)

AMD uses the 290/390 to compete with Nvidia's 970, people buy Nvidia, and the "shoulda bought a 390" meme is born after the 3.5 GB VRAM controversy happens. AMD gets mocked for high power consumption.

AMD releases the 6000 series GPUs to compete with Nvidia's Ampere line, with notably lower power draw, and people still buy Nvidia.

Power draw was never part of the equation.

[–] [email protected] 0 points 1 year ago (1 children)

That's because Nvidia still has a leg up on RTX, but that doesn't mean Nvidia shouldn't be thinking about it. I'm not talking about what the market directs them to do; I'm talking about what I hate personally.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I mean, they technically did this generation. All of the RTX 4000 cards sans the 4090 are fairly efficient... only because Nvidia shifted the names down a tier for every GPU that's not the halo card.

Point is, you can't have everything, and people generally prioritize performance first, because efficiency has rarely given either GPU company more profit GPU-wise.

If you cared about efficiency, Nvidia's answer would be to buy their RTX 4000 SFF Ada (75W, ~3060 Ti performance) or RTX 6000 Ada... if you can afford it.

[–] [email protected] 2 points 1 year ago (1 children)

I felt the same when the current-gen CPUs were announced, but when I looked closer at AMD's chips, I learned that they come with controls for greatly reducing the power use with very little performance loss. Some people even report a performance gain from using these controls, because their custom power limits avoid thermal throttling.

It seems like the extreme heat and power draw shown in the marketing materials are more like competitive grandstanding than a requirement, and those same chips can instead be tuned for pretty good efficiency.

[–] [email protected] 0 points 1 year ago

Yeah, I'm talking about Nvidia and Intel here, but tbh Ryzen 4000 CPUs run pretty hot. They also optimized Ryzen quite a bit before they changed to this new chipset, which makes sense to me. It seems like Nvidia and Intel sometimes worry about what looks good power-wise on paper rather than about optimization.

[–] [email protected] 0 points 1 year ago (1 children)

I know someone who works at Nvidia, and he said the problem is that Moore's law is dead. Apparently the only way we can generate more performance right now is to input more energy and/or increase size.

Obviously that doesn't scale forever, and the 40 series are already fucking massive. So where does that leave us with the 50 series? We need some breakthrough.

[–] [email protected] 0 points 1 year ago

The real answer is ARM-based systems and a new PCIe slot standard that puts the traces closer to the CPU, similar to that Dell RAM standard.

Also, I genuinely doubt an architecture as recent as Lovelace is as optimized as it could be.

[–] notaviking 10 points 1 year ago

Imagine a ~6GHz CPU with 3D cache. Now we just have to wait for LTT to fuck up the graphs.

[–] KalabiYau 6 points 1 year ago

I think so. The AMD 3D cache CPUs are impressive in terms of gaming performance (though the inability to overclock them still leaves some benefit to non-3D-cache CPUs, which are also great for everything other than gaming).