It's funny, because I have never experienced that, and I've always OCed and overvolted my CPUs. My current CPU is a 7-year-old Ryzen 5 1600 that I've been running overclocked and at higher voltage too. Every CPU I've had for 40 years now has been replaced because it became obsolete. I worked as an IT consultant for 10+ years in the '90s and '00s, and NEVER experienced silicon degradation. All sorts of other problems, sure: faulty solder joints when lead was banned from solder, and failing capacitors when counterfeit, poor-quality Chinese capacitors found their way into production. There is no way silicon degradation should become an issue within a short time span of a couple of years.
Anyway, as things look now, it doesn't seem like a good idea to buy Intel.
I've never noticed silicon degradation on a CPU either. But I can say that I've had a GPU with a stable overclock for years that started getting a bit flaky, and I had to go back to stock settings. Of course, GPUs also get more frequent driver updates. Maybe that effect was due to the drivers and games trying to squeeze more out of the hardware.
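If you want to sanity-check a suspect overclock without a full benchmark suite, one crude consistency test is to run the same deterministic computation on every core and compare results: any mismatch means the CPU silently produced a wrong answer. Here's a minimal sketch in Python (the workload, duration, and thread count are arbitrary assumptions, not any standard stress tool like Prime95 or OCCT):

```python
import hashlib
import multiprocessing as mp
import time

def worker(seconds: float) -> int:
    """Repeatedly hash a fixed buffer and count digests that differ from the first one."""
    data = bytes(range(256)) * 4096          # fixed ~1 MiB input buffer
    reference = hashlib.sha256(data).digest()
    errors = 0
    deadline = time.time() + seconds
    while time.time() < deadline:
        if hashlib.sha256(data).digest() != reference:
            errors += 1                      # silent corruption: same input, wrong result
    return errors

if __name__ == "__main__":
    duration = 60.0                          # seconds per worker; raise this for a longer soak
    workers = mp.cpu_count()                 # load every logical core at once
    with mp.Pool(workers) as pool:
        results = pool.map(worker, [duration] * workers)
    print(f"hash mismatches across all cores: {sum(results)}")
    # Any nonzero count means the chip computed a wrong answer at least once,
    # which a degraded or marginal overclock can cause long before outright crashes.
```

This only hammers the ALUs and caches, so it won't catch every kind of instability (memory-controller issues in particular), but a previously stable OC that starts throwing mismatches here is a decent hint that it's time to back off to stock.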