Posted to PC Gaming on 19 Jul 2023, 34 points (88.6% liked)
The previous link was broken, so I've reposted a more reliable archive.org copy.

[–] nivenkos 17 points 1 year ago (4 children)

Do that many people upgrade every generation?

I still use a 1070, so the GPU comparisons here aren't relevant to me.

The main issue I hit was deciding between DDR4 and DDR5 RAM, since we're in an awkward transition phase, and that choice constrains the motherboard and therefore the CPU as well.

[–] RightHandOfIkaros 9 points 1 year ago

Upgrading every generation is stupid. I try to upgrade every 5 years if I can afford it.

My 1080 Ti says the performance gain versus the cost of upgrading just isn't worth it right now. So I gotta keep waiting.
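
That math is easy to sketch out. Here's a minimal back-of-envelope example; every number in it is a hypothetical placeholder, not real benchmark or price data:

    # Back-of-envelope upgrade math. All numbers are hypothetical
    # placeholders, not real benchmark results or prices.
    def uplift_per_dollar(old_fps: float, new_fps: float, price: float) -> float:
        """Percent performance gained per dollar spent on the upgrade."""
        return (new_fps / old_fps - 1) * 100 / price

    # e.g. a card roughly 50% faster than a 1080 Ti at an $800 asking price
    print(f"{uplift_per_dollar(100, 150, 800):.3f}% faster per dollar")  # ~0.063

When that number comes out tiny, "keep waiting" is the obvious call.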

[–] TheHighRoad 5 points 1 year ago (2 children)

Well, I've had the same CPU/mobo/RAM for over ten years and only upgraded my GPU once, from a GTX 660 to a 5700 XT at the start of the pandemic. I'm finally seeing issues with some modern AAA content; Hogwarts Legacy won't really run at all, for example.

I also haven't wiped my OS in all that time, so the stale install may be more the culprit than the hardware itself. Still going strong!

[–] [email protected] 4 points 1 year ago (1 children)

FYI, it probably isn't the 5700 XT causing issues in Hogwarts; mine works fine.

[–] TheHighRoad 2 points 1 year ago

I think it's a memory issue, most likely due to the sorry state of my Windows installation. Need to knock off the lazy and wipe it, but it's pretty remarkable that it works as well as it does. I started with a fresh Win7 install and have survived in-place upgrades to Win8 and Win10, on top of the major feature updates that come now and again. I thought it was totally borked a few years back, but some obscure automated tool managed to fix it.

IT BELONGS IN A MUSEUM!

[–] nivenkos 3 points 1 year ago (2 children)

The CPU becomes the real issue though: a new CPU means a new motherboard, which means new RAM, and at that point you might as well add an NVMe drive too.
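
To make that cascade concrete, here's a rough tally; every price below is a hypothetical placeholder, not real market data:

    # Rough tally of the platform-swap cascade. All prices are
    # hypothetical placeholders, not real market data.
    cascade = {
        "CPU": 300.0,
        "motherboard": 180.0,  # new socket forced by the new CPU
        "DDR5 RAM": 120.0,     # the old DIMMs don't fit the new board
        "NVMe SSD": 90.0,      # the "might as well" item
    }

    for part, price in cascade.items():
        print(f"{part:<12} ${price:>6.2f}")
    print(f"{'total':<12} ${sum(cascade.values()):>6.2f}")

Once the forced items dominate the bill, it stops being a CPU upgrade and becomes a new build.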

[–] TheHighRoad 2 points 1 year ago

I've come to realize that I don't really "upgrade" anything but the GPU, plus adding storage. I've never so much as dropped in a new CPU without going through the whole rigamarole you just described. Build them to last, folks.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Sometimes you can put that off for years by upgrading to the best configuration the platform supports, often cheap second-hand.

I recently replaced my 2017 Ryzen 1800X with a Ryzen 5800X3D, which my X370 motherboard supports after a BIOS update. Huge upgrade, no platform change required. I think I can hold off on DDR5 and a new motherboard for years to come.

[–] [email protected] 3 points 1 year ago (1 children)

I used to upgrade every generation, and yeah, it was stupidly expensive. But it was my only hobby, and you could actually see a performance increase each time.

But for the last 10 years or so, there's been much less point. Sometimes a major advance (CUDA, RTX) makes a single-generation upgrade worthwhile, but mostly it's just a few extra FPS at the highest settings. So now I just upgrade every few years.

[–] TheHighRoad 2 points 1 year ago

Back in the 90s and early 00s, frequent upgrades were kind of required to stay up to date with new games. The last 10-15 years have been muuuuch slower in that regard, thanks to consoles I guess. I'm not complaining, but I miss the sense of developers really pushing boundaries like they did in the old days.

[–] [email protected] 1 points 1 year ago

The only reason I upgraded my 10-series to a 30-series is that I'm a dummy and bought a monitor with only HDMI 2.1 and no DisplayPort, so I needed a new GPU or I'd have no G-Sync. Otherwise, I probably would have waited at least two more generations to upgrade.