If you don't upgrade to Windows 11, you can't use Recall, which is a great reason not to upgrade to Windows 11.
Greentext
This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.
Be warned:
- Anon is often crazy.
- Anon is often depressed.
- Anon frequently shares thoughts that are immature, offensive, or incomprehensible.
If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.
I upgraded to Linux. It worked out well for me since I mostly play retro games and games from yesteryear.
I upgraded a Chromebook to Linux recently. That was a huge bump in performance that I wasn't expecting, and not just for gaming.
People want shiny new things. I've had relatives say stuff like "I bought this computer 2 years ago and it's getting slower, it's awful how you have to buy a new one so quickly." I suggest things to improve it, most of which are free or very cheap and which I'd happily do for them. But they just go out and buy a brand new one, because that's secretly what they wanted to do in the first place; they just don't want to admit they're that materialistic.
Can I have some tips too?
Appreciate the meme, but yeah, that is one way to probably improve performance. Or upgrade the RAM, clean the fans, reapply thermal compound, clear out temporary files, disable unused services, or reinstall Windows if they really need it just to run Chrome and Zoom, which is all they do.
Even just blowing all the dust out of the heatsink (the passive cooler under the CPU fan) can make your system run a good 10°C cooler.
Clean the fans.
Reinstall the OS clean. That's usually why a new computer feels snappy: it's just fresh.
Free:
- clean fans and heatsink - as others mentioned; better cooling means it doesn't throttle
- kill unnecessary services - that's why reinstalling works
- install Linux - not reasonable for everyone, but Linux uses far fewer resources
- delete old files - as disks get full, it takes longer to find somewhere for files to go; try to leave 10-20% free
- try a small overclock - many older CPUs can give a little more without upgrading cooling; only do it if temps look good
Relatively cheap (<$200 each):
- upgrade drive to NVMe - huge difference if running an HDD, still noticeable if running a SATA SSD
- add more RAM (only if you're constantly running out)
- upgrade CPU - esp if AMD since they release lots of CPUs for the same socket
It really depends on what's making it slow though; a quick check like the sketch below can tell you which of these is the bottleneck.
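To actually see which problem you have, here's a rough diagnostic sketch in Python. It assumes the third-party psutil package is installed (pip install psutil); the thresholds are just my rules of thumb, not gospel, and the temperature readout only works on Linux.

```python
# Rough "why is my PC slow?" check. Assumes the third-party psutil
# package (pip install psutil); thresholds are rules of thumb.
import psutil

# Full disk: as it fills up, finding space for writes gets slower.
disk = psutil.disk_usage("/")
print(f"Disk: {disk.percent:.0f}% used, {disk.free / 1e9:.0f} GB free")
if disk.percent > 85:
    print("  -> under 15% free; delete old files")

# Maxed-out RAM: the OS starts swapping, which feels like a dying disk.
mem = psutil.virtual_memory()
print(f"RAM:  {mem.percent:.0f}% used, {mem.available / 1e9:.1f} GB available")
if mem.percent > 90:
    print("  -> constantly full; kill services or add RAM")

# Thermal throttling: sustained high temps mean clean the fans/heatsink.
# sensors_temperatures() is Linux-only; fall back to {} elsewhere.
for chip, readings in getattr(psutil, "sensors_temperatures", dict)().items():
    for r in readings:
        print(f"Temp: {chip}/{r.label or 'core'}: {r.current:.0f}°C")
```

Run it before and after cleaning or deleting things and you can see whether the tinkering actually helped, instead of guessing.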
They're invested in PC gaming as social capital where the performance of your rig contributes to your social value. They're mad because you're not invested in the same way. People often get defensive when others don't care about the hobbies they care about because there's a false perception that the not caring implies what they care about is somehow less than, which feels insulting.
Don't yuck others' yum, but also don't expect everyone to yum the same thing.
I use a gaming laptop from 2018, a ROG Zephyrus.
The fan started making a grating noise even after thorough cleaning, so I found a replacement on eBay and boom, back in business playing Hitman and Stardew.
Will I get 120 fps or dominate multiplayer? Nah. But it works fine. Might even be a hand-me-down later on.
Absolutely, it totally depends on what you got originally. If you only got an okay-ish PC in 2018 then it definitely won't still be fit for purpose in 2025, but if you got a good gaming PC in 2018 it'll probably still work in another 5 years, although at that point you'll probably be on minimum settings for most new releases.
I would say 5 to 10 years is probably the lifespan of a gaming PC without an upgrade.
However my crappy work laptop needs replacing after just 3 years because it was rubbish to start with.
It depends on what gaming you do. My 10 year old PC with 6 year old GPU plays Minecraft fine.
My other "new PC" is a mini PC with Nvidia 1080-level graphics, and it plays Half-Life: Alyx fine.
We replaced my mom's Warcraft machine 3 years ago. It replaced an Athlon II from 2007 that was 14 years old by then. Your tank may be a 74-year-old grandmother, so be nice.
If you're not playing competitively, there's very little reason to go latest and greatest. Just buy something with software support, or use Linux, where support is practically guaranteed for at least a decade.
The computer I built in 2011 lasted until last summer. I was smiling widely when I came to tell my wife and my friend, and my friend asked why I was smiling when my computer no longer worked.
"Because now he can buy a new one" my wife quickly replied 😁
One upside of AAA games turning into unimaginative shameless cash-grabs is that the biggest reason to upgrade is now gone. My computer is around 8 years old now. I still play games, including new games - but not the latest fancy massively marketed online rubbish games. (I bet there's a funner backronym, but this is good enough for now.)
I'm still pushing a ten year old PC with an FX-8350 and a 1060. Works fine.
I didn't think of my computer as old until I saw your comment with ten years and its GPU in the same sentence. When did that happen??
If people are pushing you to buy stuff, they are not friends. Do not listen to them.
They're mad they spent $1k on a GPU and still can't do 4K without upscaling on the newest crapware games.
Yeah, I'm with you anon. Here's my rough upgrade path (dates are approximate):
- 2009 - built PC w/o GPU for $500, only onboard graphics; worked fine for Minecraft and Factorio
- 2014 - added GPU to play newer games (~$250)
- 2017 - built new PC (~$800; kept old GPU) because I needed to compile stuff (WFH gig); old PC becomes NAS
- 2023 - new CPU, mobo, and GPU (~$600) because NAS uses way too much power since I'm now running it 24/7, and it's just as expensive to upgrade the NAS as to upgrade the PC and downcycle
So for ~$2200, I got a PC for ~15 years and a NAS (drive costs excluded) for ~7 years. That's less than most prebuilts, and similar to buying a console each gen. If I didn't have a NAS, the 2023 upgrade wouldn't have had a mobo, so it would've been $400 (just CPU and GPU), and the CPU would've been an extreme luxury (1700 -> 5600 is nice for sim games, but hardly necessary). I'm not planning any upgrades for a few years.
Yeah it's not top of the line, but I can play every game I want to on medium or high. Current specs: Ryzen 5600, RX 6650 XT, 16GB RAM.
People say PC gaming is expensive. I say hobbies are expensive; PC gaming can be inexpensive. This is ~$150/year, which is pretty affordable... And honestly, I could be running that OG PC from 2009 with just a second GPU upgrade, for a grand total of $800 over 15 years, if all I wanted was to play games.
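For anyone checking my maths, here's the tally as a trivial sketch (years and prices approximate, same numbers as above):

```python
# Tally of the (approximate) upgrade path above.
upgrades = {
    2009: 500,  # initial build, onboard graphics only
    2014: 250,  # first GPU
    2017: 800,  # new PC for compiling (kept the old GPU)
    2023: 600,  # new CPU, mobo, and GPU
}
total = sum(upgrades.values())  # $2150, i.e. ~$2200 all in
years = 2024 - 2009             # ~15 years of service
print(f"${total} over {years} years = ~${total / years:.0f}/year")
```

That prints roughly $143/year, which is where the ~$150/year figure comes from.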
put linux on that beast and it'll keep running new games til 2030
I thought anon was the normie? The average person doesn't upgrade their PC every two years. The average person buys a PC and replaces it when nothing works anymore. Anon is the normie; they are the enthusiasts. Anon is just not hanging out with a group of people with matching ideologies.
I upgraded last year from i7-4700k to i7-12700k and from GTX 750Ti to RTX 3060Ti, because 8 threads and 2GB of vram was finally not enough for modern games. And my old machine still runs as a home server.
The jump was huge and I hope I'll have money to upgrade sooner this time, but if needed I can totally see that my current machine will work just fine in 6-8 years.
For me the most important reason to upgrade things is security updates. E.g. if you have an old smartphone it might not get security updates anymore.
Some people don't seem to care, but I get paranoid about hackers breaking into my phone in some way.
Phones suffer a lot from forced obsolescence. More often than not, the hardware is fine, but the OEM abandons it because "lol fuck you, buy new shit". Anyone that says that a Samsung S7 "can't handle current apps" is out of their mind
Other than camera and software, there's hardly any reason to buy new phones over flagships from some years ago.
My current gaming PC is a self-built one from 2014. I have upgraded a few things over the years, most notably the GPU and memory, and it has done an excellent job for over a decade. Recently it started to show its age with various weird glitches and some performance issues in several newer games, so I've just ordered a new one. But I'm pretty proud of my sustainable computing achievement.
I feel this.
I went AM4 in 2017 when AMD made a leap forward at a reasonable price and efficiency.
Then I added a 3060 when one became available.
They're both undervolted, and ticking along nicely.
I don't plan to change anything until probably 2027. Heck, I'm still catching up to 2020 in my games backlog.
I originally built my current PC back in 2016 and only just "upgraded" it last year. I put upgrade in quotes because it was literally a free motherboard and GPU my buddy no longer needed. I went from a Core i5 6600K to a Ryzen 5 5500GT, and from a GTX 960 4GB to a GTX 1070. It still plays all the games I want it to, so I have no desire to upgrade it further right now. I think part of it is that I'm still using 1080p 60Hz monitors.
I also have a 2014-ish desktop. Over the years I've added an SSD, and I replaced the graphics card around 5 years ago.
I can still run most games on medium settings at 1080p, even some new ones if they're properly optimized, but nothing crazy.
I've just started to feel that my rig is getting slower, and even AA games are becoming more demanding.
I fully support using hardware as long as possible to minimise e-waste and see no reason to upgrade a PC every 2-3 years.
Still on a 1060 here. Sure, it's too slow for anything from the PS5 era, but that's what my PS5 is for.
It does have a 1 in 4 chance of bluescreening when I quit FFXIV, but I don't know what's causing that. Running it at 100% doesn't seem to crash it, possibly something about the drivers not freeing shit properly, I dunno.
If you had a top-of-the-line PC in 2014 you'd be talking about a 290X/970/980, which would probably work really well for most games now. For CPU that'd be like a 4th-gen Intel or AMD Bulldozer, which despite its terrible reputation probably runs better nowadays thanks to better multi-threading.
A lot of the trending tech inflating minimum requirements nowadays is stuff like raytracing (99% of games don't even need it) and higher-FPS/resolution monitors that aren't that relevant if you're still pushing 1080p/60. Let's not even begin with Windows playing forced-obsolescence games every few years.
Hell, most games that push the envelope of minimum specs, like Indiana Jones, are IMO just unoptimised messes built on UE5 rather than legitimately out of scope for hardware from the last decade. Stuff like Nanite hasn't delivered on enabling photorealistic asset optimisation, but it HAS enabled studios to cut back on artist labour in favour of throwing money at marketing.
I showed this to my penultimate daughter, who co-opted my (literally 2014) Dell PC; the only thing I'd ever done to it was add memory, and it's still a beast. I said "look, your 4chan twin" and she cracked up. But if she doesn't steal it when she moves out, I'll probably be able to get ten more years out of it.
I will drive the 1660 Super until the wheels fall off
I buy old electronics for 1/10 of what new stuff costs, install Linux or a FOSS OS, and keep it for years without problems until the hard drive goes.
I don't game on PC but neither do a lot of people who pay $2500 for a laptop, people who inevitably call me for tech help for basic shit.
What's the point? I'd rather have the commons than like a mountain of consumer goods that all suck and are getting worse.
Maybe it's just my CPU or something wrong with my setup, but I feel like new games (especially ones that run on Unreal Engine 5) really kick my computer's ass at 1440p. I just got a 7900 XTX last year and I'm using a Ryzen 9 3900XT from 2020, for reference. I remember getting new cards like 10 years ago and being able to crank the settings up to max with no worries, but nowadays I feel like I've got to worry about lowering settings or resorting to upscaling or frame generation.
Games don't feel very optimized anymore, so I can see why people might be upgrading more frequently, thinking it's just their PC being weak. I miss the days when we could just play games at native resolution.
I still have my 2014 machine. I've upgraded it with an M.2 drive and more RAM. Everything else is perfectly fine, and I wouldn't see the difference with a newer machine. I'll keep it for as long as I can, because the longer I wait, the better the machine I replace it with will be.
Also, I just wouldn't know what to do with it after. I can't bring myself to throw away a perfectly good machine, but keeping it would be hoarding.
I built a PC in 2011 with an AMD Phenom II. Can't remember which one, it may have been a 740. And I'm pretty sure a Radeon HD 5450 until FO4 came out in 2015 and I needed a new graphics card. Upgraded to a Radeon R7 240, and some other AM3 socketed CPU I found for like, $40 on eBay. By no means was I high end gaming over here. And it stayed that way until 2020, when I finally gutted the whole thing and started over. It ran everything I wanted to play. So I got like, 9 years out of about $600 in parts. That's including disc drives, power supply, case, and RAM. And I'm still using the case. I got my money's worth out of it, for sure. The whole time we were in our apartment, it was hooked up to our dumb TV. So, it was our only source of Netflix, YouTube, DVDs, and Blu-rays. It was running all the time. Then, I gave all the innards to my buddy to make his dad a PC for web browsing. It could still be going in some form, as far as I know.
I want to say I upgrade every 6 years. Get mid-to-upper specs and a mid-range video card and it'll last you for a long time.
I'm the one person who people go to for PC part advice, but I actually try to talk them down. Like, do you need more RAM because your experience is negatively impacted by not having enough, or do you just think you should have more just because?