It'll be interesting to see just how much these are basically AI accelerators in disguise, which accidentally do gaming too.
Expecting a 5090 that’s even more expensive than the 4090 and still sold out for 6 months.
As a laptop user it would be nice to go from 8GB of VRAM in my 3070 Ti to something like 16GB, but I wonder what insane power requirements even the mobile parts will have.
A 16GB dGPU on a laptop is going to annihilate the laptop battery. Efficiency-focused GPUs are the realm of iGPUs.
I’m already using a mobile workstation type thing with 1h battery at most. That’s the unfortunate reality of mobile CGI / CAD work. I’d still be interested, as I have no alternatives.
I feel like the 4000 series just came out. I'm wrong, aren't I?
You're not. Nvidia's trying to invoke FOMO now that the crypto-driven demand is over.
You’re just getting old.
No offence intended. I am in the same spot.
At this point, what is the point? Honestly. Why so much processing power? What is the average user going to do to require so much brawn on a graphics card?
4K videogames.
I might switch from a 3080 to a Radeon 8xxx next year, depending on price and performance. The main reason, despite me running Linux, isn't the drivers but the VRAM: I've had several games now where 10GB at 3840x1600 isn't sufficient, and it really bugs me. Why AMD instead of Nvidia? Well, I haven't had that many issues with Nvidia's drivers, but AMD's drivers on Linux are still better. Also, Nvidia's pricing is absurd; I don't want to take part in that.
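For context, here's a rough back-of-the-envelope on where resolution-dependent VRAM goes at 3840x1600. The buffer layout is a made-up deferred-rendering setup, purely illustrative, not from any specific game or engine:

```python
# Back-of-the-envelope VRAM estimate for render targets at 3840x1600.
# The target list below is a hypothetical deferred-rendering setup;
# real engines vary a lot.

WIDTH, HEIGHT = 3840, 1600
PIXELS = WIDTH * HEIGHT

# (name, bytes per pixel) -- assumed formats
targets = [
    ("albedo (RGBA8)",         4),
    ("normals (RGBA16F)",      8),
    ("material (RGBA8)",       4),
    ("depth/stencil (D32S8)",  5),
    ("HDR color (RGBA16F)",    8),
    ("motion vectors (RG16F)", 4),
]

total = 0.0
for name, bpp in targets:
    mib = PIXELS * bpp / 2**20
    total += mib
    print(f"{name:24s} {mib:7.1f} MiB")

print(f"{'render targets total':24s} {total:7.1f} MiB")
```

Even at that resolution the render targets only come to a couple hundred MiB; the bulk of a 10GB budget goes to textures, meshes, and streaming pools, which scale with asset quality rather than resolution. That's why high-texture settings blow past 10GB regardless of how efficient the renderer is.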
It'll be crucial to run Monster Hunter Wilds at 240p upscaled from a Tamagotchi screen at 30 frames per minute.
Nice to see the VRAM increasing, but the bigger question for me is what the VRAM of the “mainstream” cards (if they even exist in Nvidia's pricing model) is gonna be.