this post was submitted on 13 Dec 2024

Hardware

[–] [email protected] 4 points 4 days ago

10 years ago was about when 1440p became attainable, not just for the highest-end systems, and at actual playable frame rates. At the time I had a 7970, which was a generation old but ran games beautifully, and had tanked in price by the time I bought it. When I eventually got my GTX 970, that just solidified the deal.

But that was also when the split between high resolution and high refresh rate happened. 1440p at high refresh rate became the cream of the crop, while 1440p/60 or 1080p/120 took over as the high-end but still attainable tier.

Also, lmao at this old review of the 290X. I guess that was when 4K took over as the "you can't do this even if you wanted to, but you can try" resolution. Even the 512-bit memory bus was no match for 4K; you needed CrossFire just to get 60 fps. https://www.anandtech.com/show/7457/the-radeon-r9-290x-review/11
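For scale, here's a quick back-of-envelope sketch of why 4K was such a wall: raw pixels per frame, assuming GPU workload scales roughly linearly with pixel count (a simplification, but it captures the gist):

```python
# Pixels per frame at common resolutions, relative to 1080p.
# The linear-scaling assumption is mine; real games don't scale perfectly with pixels.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels, {px / base:.2f}x the work of 1080p")
# 1440p is ~1.78x a 1080p frame, but 4K is a full 4.00x --
# which is why even a 290X needed a CrossFire partner to hold 60 fps there.
```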