2023 was the year that GPUs stood still: A new GPU generation did very little to change the speed you get for your money.
But boy did it change the price you have to pay for it.
Hands up if you, or someone you know, purchased a Steam Deck or another handheld gaming PC instead of upgrading their GPU 🙋♂️
To be honest, I stopped following PC hardware altogether because things were so stagnant outside of Intel's Alder Lake and the new x86 P/E cores. GPUs that would give me a noticeable performance uplift over my 1060 aren't really at appealing prices outside the US either, IMO.
It’s diminishing returns.
We need a giant leap forward to show a noticeable effect now.
Like, if a car's top speed was 10 mph, a 5 mph increase is fucking huge.
But getting a supercar to top off at 255 instead of 250, just isn’t a huge deal. And you wouldn’t notice unless you were testing it.
So even if they keep increasing power at a steady rate, the end user is going to notice it less and less every time.
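The supercar analogy maps directly onto frame times: what you actually perceive is the time each frame takes, and each additional fps saves less frame time the higher you already are. A quick back-of-the-envelope sketch in plain Python (the fps numbers are picked purely for illustration):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering one frame at a given framerate."""
    return 1000.0 / fps

# Going from 30 to 60 fps halves the frame time, saving about 16.7 ms per frame.
low_end_gain = frame_time_ms(30) - frame_time_ms(60)

# Going from 240 to 270 fps saves well under half a millisecond per frame.
high_end_gain = frame_time_ms(240) - frame_time_ms(270)

print(f"30 -> 60 fps saves {low_end_gain:.1f} ms per frame")
print(f"240 -> 270 fps saves {high_end_gain:.2f} ms per frame")
```

Same "steady rate" of fps improvement, vanishing perceptual payoff, which is exactly the 250-vs-255 supercar situation.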
We had hardware making massive leaps for years. Problem is, devs got used to hardware having enough grunt to overcome a lack of optimization. Now we've got shit coming out that barely holds 60+ on 4080s and requires FSR or DLSS as a band-aid to get the game back to playable framerates.
If you've got a 30-series card, or a 7000-series from AMD, you don't need to look for a more performant card; you need devs to put in time for polish and optimization before launch, not 6 months down the line IF the game is a commercial success.
Hell, Cyberpunk 2077 dropped 10-20fps with the last patch on my 4090, and the devs don’t care enough to fix it.
Cities Skylines 2 aims for only 30fps, and it can’t even hit that on my pretty good gaming PC.
A fix that worked for me for Cyberpunk's performance drop after that patch: turn everything to low, restart the game, then change the settings back to what they were.
Yeah, with that trick it went from 50fps to 90fps with everything turned to max. Thank you so much!
Cities Skylines 2 is really bad because, given how poorly it runs on your 4090, you'd expect a meager 1060 not to run it at all; but on the contrary, I'll probably get about the same performance as you. It's like the game just… isn't capable of taking advantage of your better card.
One thing that's very apparent is that with more traffic the simulation slows down while the framerate doesn't (so all cars go in slow motion, even though I'm at 3x speed). That means it's severely CPU-limited.
I don't know how multithreaded their simulation is; I have a 5950X with 32 hardware threads. Maybe an upgrade to the new generation of Ryzen CPUs coming out around February could help.
Generally speaking, the simulation running behind the scenes in these games is single-threaded. You're better off with a higher clock speed; those extra threads just won't be utilized.
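If the simulation really is mostly serial, Amdahl's law puts numbers on why extra threads barely help. A quick sketch (the 90% serial fraction is a made-up illustration, not a measurement of Cities: Skylines 2 or any real game):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup on `cores` cores when only `parallel_fraction`
    of the work can run in parallel (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical simulation tick where 90% of the work is a serial dependency chain:
print(amdahl_speedup(0.10, 2))    # 2 cores:  ~1.05x
print(amdahl_speedup(0.10, 32))   # 32 cores: ~1.11x -- 16x the cores, almost nothing gained
```

A clock-speed bump, by contrast, speeds up the serial part too, which is why chasing frequency over core count makes sense whenever the workload is mostly serial.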
Well, that will get harder and harder to achieve: CPUs are getting more cores, but they aren't getting much faster these days.
The money is in AI chips for datacenters; I think regular consumers will more and more only be getting dinner's leftovers.
I’m not entirely sure about AMD but NVIDIA certainly seems keen on the AI market to the point that they don’t really care about the consumer gaming market anymore.
From 2020 I planned on building a new gaming PC. Bought an ITX case and followed hardware releases closely… And then got disillusioned with it all.
Picked up a Steam Deck in August of 2022 and couldn’t be happier with it. The ITX case is collecting dust.
I game exclusively on my Steam deck these days.
I absolutely love it. I dock it and use the desktop as my standard pc too. It does everything I need it to do.
Same here! I was worried I wouldn’t use it (I haven’t been gaming on PC much) but I actually game on it much more than on PC
That’s exactly what I did!
Yep. In fact, I bought a second Deck (OLED) instead of upgrading GPUs. Prices are nuts; I'll wait.
To be honest, I stopped following PC hardware altogether because things were so stagnant
That’s exactly what happened to me as well.
It's not exciting at all to pay attention to mediocre launches of expensive products. The GPU in my gaming PC is several generations old at this point, but I don't really care. There are plenty of good games, including ones I have yet to purchase, that will run fine on my hardware, so I'm just going to hold tight. I'm not going to give my money to terribly optimized games or games that require high-end hardware.
The more expensive PC gaming becomes, the less high-end hardware really matters. I think developers and publishers know they need to target the average consumer because they need to sell these games in volume. If the average gamer is playing on older and/or lower-end hardware, then they need to serve that market. There aren't enough 4090 buyers to sell the volume they need to make money. Hell, at these prices I'm not sure there are enough 4070 or even 4060 level buyers to do that. Tons of people lost interest and aren't buying into this, even if you still see posts online of people purchasing new GPUs.
I waited out the crypto market and I don’t have problems waiting longer.
I'm surprised so many people are cross-shopping, tbh. I briefly considered a Steam Deck, but the specs are barely enough to play at 1080p, so it's completely useless when docked, and a purely portable device with a tiny screen and gamepad carries very little value to me personally.
I ended up getting an eGPU enclosure for my laptop and grabbing a 1080 Ti from a friend who didn't need it anymore. I'm able to play D4 at 4K on medium settings.
Even if I had to buy a GPU like I was originally planning, ~$800 total to play in 4K on a 43" screen with a mouse and keyboard is a completely different experience from anything an Xbox or Steam Deck offers.
Given technological progress and efficiency improvements, I would argue that 2023 is the year the GPU ran backwards. We've been in a rut since 2020… and arguably since the 2018 crypto explosion.
Nah, 2022 is when it was running backwards far more. 2023 was a slight recovery, but still worse than 2021.
I feel the same way. I don’t have the data to prove it.
Anecdotal evidence is still data
No, it’s a datum - about how people feel
Performance numbers are easy to find. The prices have not been great and the 4060 is held back by its reduced memory speed, but it’s a performance increase nevertheless. The flagship product, the one that shows what is currently possible in terms of GPU power, did show remarkable improvement in top performance.
I'm more salty about AMD not supporting AI workloads on their consumer GPUs. Yes, ROCm exists and will work on quite a few cards, but officially it's not supported. This is a major reason why Nvidia is still the only serious player in town.
Yeah, AMD just seems like it doesn't want to market AI on consumer hardware to devs. They have a Ryzen chip line with built-in dedicated "NPU"s now, but honestly, the disconnect between AI for the GPUs and a focus on Windows, even for development, just makes it feel clunky.
♪♪ I say data, you say datum ♪♪
OK, I thought it was common knowledge, but maybe I should specify.
Datum is the singular form of data; data is a collection of many individual datums. If you have ten thousand anecdotes, they do in fact become statistically significant.
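Purely in terms of sampling error, the sample-size point can be put in numbers: the standard error of an observed proportion shrinks with the square root of the sample count (though this says nothing about selection bias in which anecdotes actually get posted). A quick sketch:

```python
import math

def standard_error(p: float, n: int) -> float:
    """Standard error of an observed proportion p over n independent samples."""
    return math.sqrt(p * (1 - p) / n)

# An observed 50/50 split of anecdotes:
print(standard_error(0.5, 10))      # huge uncertainty with only 10 anecdotes
print(standard_error(0.5, 10_000))  # 0.005, i.e. roughly a +/-1% 95% confidence interval
```

So ten thousand independent anecdotes would indeed pin down a proportion quite tightly; the catch, as the thread notes, is whether they measure anything beyond how people feel.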
It is just my impression of things.
You guys think I should upgrade my Voodoo 3 card? No one is joining my quake server anymore anyway
Come play Unreal with us then hehehe
We’ve all moved over to The Specialists!
Nah you need that Fireball upgrade man
I just upgraded from a 1070 to a 3060ti. The numbers definitely did not justify a 4060ti.
How was that change? I’m thinking of doing the same, but it requires a power supply update too, so I’m on the fence.
Fwiw, I’ve been running a 3080FE for nearly 3 years now and it’s still more than enough to run basically anything I care to on max settings (or close to it) @2.5k. Got it through Best Buy, so I paid list price (but it was a massive pain in the ass to actually snag one through their queueing system). It was pricey, but it was a HUGE perf uplift, since I was coming from a GTX 1070 as well.
Still rocking a 1080. I don’t see a big enough reason to upgrade yet. I mostly play PC games on my steam deck anyways. I thought starfield was going to give me a reason. Cyberpunk before that. I’m finally playing cyberpunk but the advanced haptics on PS5 sold me on going that route over my PC.
I just “upgraded” from a GTX 1080 to an RTX 4060 Ti 16GB, but only because I was building a PC for my boyfriend and gave him the 1080. I'm really not seeing a noticeable difference in frame rate at 1440p.
Yeah I keep waiting for a good deal to retire my 1080ti.
Guess I could go for a 3060 or something but 4 series will probably leave my old CPU behind.
CP77, at least before the patch (haven't checked since), ran perfectly… acceptably on my 4G 5500 XT. Back when I bought it (just before the price hikes) it was the “RX 590 performance but fewer watts and RDNA” option, and the RX 590 hit the market in 2018. I'm quite sure that people still rocking that card are, well, still rocking it. Developers might be using newer and fancier features, but I expect they'll continue to support that class of cards for quite a while; you don't want to lose out on millions of sales because millions don't want to pay for overpriced GPUs. All the while you can get perfectly fine graphics with those cards: if you look back, pretty much all 201x titles hold up well nowadays.
Due to ML workloads I've been eyeing the Arc (the cheapest way to get 16G, and it's got some oomph), but honestly, so far I couldn't get myself to buy an Intel product that isn't a NIC; it would break a life-long streak. A system RAM upgrade is definitely in the pipeline, though; DDR4 has gotten quite cheap. It's gotten to the point where I'd recommend 64G simply because 32G sticks are the cheapest per GB (and you probably have two memory channels).
As someone who upgraded from a 2016 GPU to a 2023 one I was completely fine with this. Prices finally came down and I got the best card 2023 offered me, which may not have been impressive for this generation but was incredible from what I came from.
And how much did you pay for the 2016 card, what range was it in, and what is the new card’s cost and range?
Overall, GPUs have been a major ripoff, despite these upgrades giving good performance boosts.
I believe about $300 for an AMD RX 480 (a great card, and still going strong). This time I had a bit more money and wanted something more powerful, so I went with the AMD 7800 XT Nitro ($550), which I got on release day. Sure, it's not top of the line, but it has played pretty much everything I throw at it with all settings maxed while still maintaining 60fps or above. I have an ultrawide monitor with a max resolution of 5120x1440, which is what most games will play at, and everything still plays fine. It's almost crazy to me that this card would be considered mid-range.
7800XT Nitro gang rise up! Come from a 1080 and it’s been a leap!
That's about equal to a 3070 Ti. What are you playing at max settings and 60fps at 32:9 1440p on that? Because either you're straight up lying or you're being intentionally misleading by selecting a very narrow range of games.
I finally upgraded my GTX970 to a used RTX 3080 for 300€. The difference at least for me for the same 300€ was insane.
I had to buy a 3070 Ti at a scalped price. Ended up paying £700 for it. I hate myself for it, but prices didn't shift for months afterward, and my GTX 1080 had kicked the bucket. No way in hell am I buying anything this gen. My wife's 1080 is hanging on for now; maybe we'll get a 5080 if it's not a rip-off.
It's Nvidia, it's always a ripoff :p
Especially now when gaming GPUs are an afterthought for them.
That's only Nvidia, though. AMD still seems to be trying to compete with Nvidia one way or another.
I wouldn't say so; they also seem to have abandoned the gaming segment and nowadays are more or less playing ball with NVIDIA while trying to improve their AI stack so they can get a bigger chunk of the data centre business.
I don’t think that’s true at all. Let’s go back a while.
We had Polaris, a mid-range 2016 architecture that was sold for years, first as a mid-range and then as a low-end card.
They also had the Vega cards, which were compute-focused and not particularly great at gaming.
Following that, they had the 5700 series. Decent gaming cards.
After that, the 6000 series. Right up there with Nvidia, and taking die size, performance, and the comparatively generous VRAM into consideration, you could argue they were the better gaming cards, despite losing in RT.
The 7000 series is pretty much like the 6000, except slightly further behind the 4090, albeit for half the real-world price, due to AI demand pushing the already crazy 4090 prices even higher.
Idk, to me it seems AMD is more competitive in gaming now than they have been in a long time.
Especially now when gaming GPUs are an afterthought for them.
I just don't see the point in upgrading every new release anyway, or even buying the most expensive one. I've had my Gigabyte RX 570 for several years and I can play Baldur's Gate 3 on full settings with no issues. Maybe I haven't tasted 120fps, but I'm just happy I can play modern games. When it comes time to get a new graphics card, which may be soon since I'm planning to build my wife's PC, maybe then I'll see what's going on with the higher-end ones. Maybe I'm just a broke ass, though.
Intel GPUs definitely won out for what you get for the money
That’s not a sentence I’m used to seeing
I'm so glad that Intel has stepped into the GPU space, even if their cards are weaker. More competition will hopefully light a fire under Nvidia to get their shit together.
I’ve been very happy with my Arc A770, it works great on Linux and performs well for what I paid for it.
Have you tried ML workloads? Put differently: how is compatibility with stuff that expects CUDA/ROCm? Because the A770 is certainly the cheapest way to get 16G nowadays.
No, I don’t use any ML stuff or really anything that uses GPU compute at all. I just use it for gaming and other 3D applications.
What’s everyone’s recommendation for a cheap AMD GPU to use with Linux? I was looking recently at a Radeon RX 580, I know there are much better cards out there but the prices are about double (£350-400 instead of £180). I’d mostly be using it to play games like the remastered Rome Total War.
There are some good used options, e.g. 5700 XTs are really cheap because many of them were mining cards. For new cards there aren't many options; the RX 6600 has relatively good value, but it's only worth it if efficiency or features like hardware video codecs are important to you.
Is there any issue with buying a card that was previously used for mining?
When you say RX 6600 do you mean that one specifically or the range including 6600XT etc? I don’t have a good handle on what the real world differences between the variants are.
Is there any issue with buying a card that was previously used for mining?
If it was used by a home user who didn't know what they were doing, it might have run hotter for much longer than a typical gamer's card, so the thermal paste might need a redo.
If it was used by someone mining even quasi-professionally or as a side gig, I'd much prefer it over a second-hand card from a typical gamer: most miners keep voltage and temps low and take care of the card far better than a gamer, who might be power cycling regularly and definitely thermal cycling even more often.
Been waiting for a good deal to replace the RX 480 in my sister's rig. I think they announced that RX 400/500/Vega GPUs will only get security driver updates now, and only for a while; I assume that applies to Linux too. The RX 580 will play many games at 1080p 60fps, but not the modern demanding ones (maybe not even at low settings).
Rumors say next-gen AMD isn't targeting the high end; maybe we'll get another 480-style price-to-performance king 🤞. Then again, with AI as the new crypto, who can say.
Same. I've been eyeing an AMD upgrade for my Linux machine, specifically the 6700 XT, which is about £330 for a 12GB GPU. If anyone can think of anything better, I'd like to know.
I upgraded from an RX 480 to an RTX 3060 a few days ago. Crazy difference, especially in compute
This is the best summary I could come up with:
The performance gains were small, and a drop from 12GB to 8GB of RAM isn’t the direction we prefer to see things move, but it was still a slightly faster and more efficient card at around the same price.
In all, 2023 wasn’t the worst time to buy a $300 GPU; that dubious honor belongs to the depths of 2021, when you’d be lucky to snag a GTX 1650 for that price.
But these numbers were only possible in games that supported these GPUs’ newest software gimmick, DLSS Frame Generation (FG).
The technology is impressive when it works, and it’s been successful enough to spawn hardware-agnostic imitators like the AMD-backed FSR 3 and an alternate implementation from Intel that’s still in early stages.
And DLSS FG also adds a bit of latency, though this can be offset with latency-reducing technologies like Nvidia Reflex.
But to put it front-and-center in comparisons with previous-generation graphics cards is, at best, painting an overly rosy picture of what upgraders can actually expect.
The original article contains 787 words, the summary contains 168 words. Saved 79%. I’m a bot and I’m open source!
The thing is, it's working so damn well. 4090s are selling in huge numbers.
To be honest, I think it's just AI developers gobbling them all up, because Nvidia's dedicated workload and professional GPUs are always sold out. Plus, spending $1400 on games is ridiculous, and that's coming from somebody with a Ryzen 7800X3D and a 7900 XTX. I regret it so much; such a waste of money.
Having a 7900 XTX and a 5800X… I don't really get the waste-of-money part. I can throw everything at it and it runs exceptionally well at 5120x1440. Most, if not all of it, runs well inside the FreeSync 2 range… I couldn't be happier, and since I'm getting old now, I'd compare it to the Athlon 64 X2 days with a Radeon X850 XT… between then and now, I never had a system that did so well with the games of its time.
Edit: Oh you mean spending 1400 on games…well, yeah, games are ridiculously priced…considering you don’t really own a copy either…