But boy did it change the price you have to pay for it.
Technology
This is a most excellent place for technology news and articles.
Hands up if you, or someone you know, purchased a Steam Deck or other handheld PC instead of upgrading their GPU 🙋
To be honest I stopped following PC hardware altogether because things were so stagnant outside of Intel's Alder Lake and the new x86 P/E cores. GPUs that would give me a noticeable performance uplift from my 1060 aren't really at appealing prices outside the US either, IMO.
It's diminishing returns.
We need a giant leap forward to show a noticeable effect now.
Like, if a car's top speed was 10mph, a 5mph increase is fucking huge.
But getting a supercar to top off at 255 instead of 250, just isn't a huge deal. And you wouldn't notice unless you were testing it.
So even if they keep increasing power at a steady rate, the end user is going to notice it less and less every time.
We had hardware making massive leaps for years. Problem is, devs got used to hardware having enough grunt to overcome a lack of optimization. Now we've got shit coming out barely holding 60+ on 4080s and requiring FSR or DLSS as a band-aid to get back to playable framerates.
If you've got a 30-series card, or a 7000-series from AMD, you don't need to look for a more performant card; you need devs to put in time for polish and optimization before launch, not 6 months down the line IF the game is a commercial success.
Hell, Cyberpunk 2077 dropped 10-20fps with the last patch on my 4090, and the devs don't care enough to fix it.
Cities: Skylines 2 aims for only 30fps, and it can't even hit that on my pretty good gaming PC.
The money is in AI chips for datacenters; I think regular consumers will more and more be getting only the leftovers.
From 2020 I planned on building a new gaming PC. Bought an ITX case and followed hardware releases closely... And then got disillusioned with it all.
Picked up a Steam Deck in August of 2022 and couldn't be happier with it. The ITX case is collecting dust.
I game exclusively on my Steam deck these days.
I absolutely love it. I dock it and use the desktop as my standard pc too. It does everything I need it to do.
That's exactly what I did!
You guys think I should upgrade my Voodoo 3 card? No one is joining my quake server anymore anyway
Come play Unreal with us then hehehe
We've all moved over to The Specialists!
Nah you need that Fireball upgrade man
Given technological progress and efficiency improvements I would argue that 2023 is the year the gpu ran backwards. We've been in a rut since 2020... and arguably since the 2018 crypto explosion.
Nah 2022 it was running backwards far more. 2023 was a slight recovery but still worse than 2021.
I wanted to upgrade my 1060 for the longest time to something like the 3080. But due to demand and price hikes, I waited... the 40 series got released and the prices stayed high.
So I just gave up, I got a steam deck and PS5 instead.
A lot of people did this. The GPU market for gaming might have actually shrunk. You would think Nvidia would panic, but thanks to AI chip demand their stock is at an ATH, and no company changes course or reevaluates what it's doing when shareholders are lining up to suck its dick, so... no end in sight. Meanwhile AMD doesn't seem to want to even try to make a play for market share.
Technically AMD does have more market share when you count all the devices that have AMD in them, like PlayStation, Xbox, Steam Deck and other handhelds.
But yeah, Nvidia doesn't care about gaming anymore. If I had to pick a GPU today, I would pick AMD, because 6-8GB of VRAM on Nvidia's cards isn't enough and AMD is better on Linux.
Still rocking a 1080. I don't see a big enough reason to upgrade yet. I mostly play PC games on my Steam Deck anyways. I thought Starfield was going to give me a reason. Cyberpunk before that. I'm finally playing Cyberpunk, but the advanced haptics on PS5 sold me on going that route over my PC.
I just "upgraded" from a GTX 1080 to an RTX 4060 Ti 16GB, but only because I was building a PC for my boyfriend and gave him the 1080. I'm really not seeing a noticeable difference in frame rate at 1440p.
Yeah I keep waiting for a good deal to retire my 1080ti.
Guess I could go for a 3060 or something but 4 series will probably leave my old CPU behind.
1080 gang rise up.
But seriously, my 1080 does fine for most things, and I have a 1440p 144Hz monitor. It's JUST starting to show its age, as I can't blast everything on high/ultra anymore and have to turn down the biggest fps-guzzling settings.
I just upgraded from a 1070 to a 3060 Ti. The numbers definitely did not justify a 4060 Ti.
How was that change? I'm thinking of doing the same, but it requires a power supply upgrade too, so I'm on the fence.
As someone who upgraded from a 2016 GPU to a 2023 one I was completely fine with this. Prices finally came down and I got the best card 2023 offered me, which may not have been impressive for this generation but was incredible from what I came from.
And how much did you pay for the 2016 card, what range was it in, and what is the new card's cost and range?
Overall, GPUs have been a major ripoff, despite these upgrades giving good performance boosts.
I believe about $300 for an AMD RX 480 (great card and still going strong). This time I had a bit more money and wanted something more powerful. I went with the AMD 7800 XT Nitro ($550), which I got on release day. Sure, it's not top of the line, but it has played pretty much everything I throw at it with all settings maxed while still maintaining 60fps or above. I have an ultrawide monitor with a max resolution of 5120x1440, which is what most games will play at, and everything still plays fine. It's almost crazy to me that this card would be considered mid range.
Intel GPUs definitely won out for what you get for the money.
That's not a sentence I'm used to seeing
I've been very happy with my Arc A770, it works great on Linux and performs well for what I paid for it.
I'm so glad that Intel has stepped into the GPU space, even if their cards are weaker. More competition will hopefully light a fire under NVidia to get their shit together.
I finally upgraded my GTX 970 to a used RTX 3080 for €300. For the same €300, the difference, at least for me, was insane.
I just don't see the point in upgrading every new release anyway, or even buying the most expensive one. I've had my Gigabyte RX 570 for several years and I can play Baldur's Gate 3 on full settings with no issues. Maybe I haven't tasted 120fps, but I'm just happy I can play modern games. When it comes time to get a new graphics card, which may be soon since I am planning to build my wife's PC, maybe then I'll see what's going on with the higher end ones. Maybe I'm just a broke ass though.
Ya, the problem I landed in was not anticipating how hard it would be to push my new monitor: ultrawide 2.5K resolution at 144Hz. I can't run Cyberpunk at full res above 60fps, and that's with DLSS enabled and not all settings at max.
2070s
I had to buy a 3070 Ti at a scalped price. Ended up paying £700 for it. I hate myself for it, but the prices didn't shift for months after and my GTX 1080 kicked the bucket. No way in hell am I buying anything this gen. My wife's 1080 is going for now; maybe we'll get a 5080 if it's not a rip-off.
It's Nvidia, it's always a ripoff :p
Especially now when gaming GPUs are an afterthought for them.
That's only Nvidia though. AMD seems to still be trying to compete with Nvidia in some way or another.
Nvidia fucking sucks. But I do a lot of modeling in Blender, and holy damn do I want that RTX.
So how about the 2½ years from 2016 to 2018 between the Nvidia GTX 1080 Ti and the RTX 2080?
I think the headline should say A Year not THE year.
What's everyone's recommendation for a cheap AMD GPU to use with Linux? I was looking recently at a Radeon RX 580. I know there are much better cards out there, but the prices are about double (£350-400 instead of £180). I'd mostly be using it to play games like the remastered Rome: Total War.
There are some used options, e.g. 5700 XTs are really cheap because many of them were mining cards. For new cards there aren't many options; the RX 6600 has relatively good value, but it's only worth it if efficiency or features like hardware video codecs are important to you.
6600 XTs seem to be going for around £200, often even £180 (used, eBay).
If you'd prefer new, you can get a 6650 XT for £240. A 6650 XT will be about 6% faster than a 6600 XT.
It's double the performance of a 580, uses less power, will be supported longer, etc.
I upgraded from an RX 480 to an RTX 3060 a few days ago. Crazy difference, especially in compute.
This is the best summary I could come up with:
The performance gains were small, and a drop from 12GB to 8GB of VRAM isn't the direction we prefer to see things move, but it was still a slightly faster and more efficient card at around the same price.
In all, 2023 wasn't the worst time to buy a $300 GPU; that dubious honor belongs to the depths of 2021, when you'd be lucky to snag a GTX 1650 for that price.
But these numbers were only possible in games that supported these GPUs' newest software gimmick, DLSS Frame Generation (FG).
The technology is impressive when it works, and it's been successful enough to spawn hardware-agnostic imitators like the AMD-backed FSR 3 and an alternate implementation from Intel that's still in early stages.
And DLSS FG also adds a bit of latency, though this can be offset with latency-reducing technologies like Nvidia Reflex.
But to put it front-and-center in comparisons with previous-generation graphics cards is, at best, painting an overly rosy picture of what upgraders can actually expect.
The original article contains 787 words, the summary contains 168 words. Saved 79%. I'm a bot and I'm open source!