RE: NVIDIA's next-gen Turing GPUs
I don't understand how they're going to cram 4K textures into 8 GB of GDDR6. It'll be quicker, sure, to swap textures in and out of memory, but I feel like it will struggle at that resolution, unless they go full steam ahead on procedurally generated textures and do that with the extra shader bandwidth they'll have at their disposal.
8 GB has been ample for the highest-quality textures at 4K in most games thus far. The only one I can remember that actually exceeds that is Shadow of War, with its optional Ultra texture pack, at 9 GB. Pretty much all developers have been happy to follow the consoles as the baseline. Some may offer something better for Xbox One X (which has ~5 GB for the GPU; ~9 GB available to games for GPU+CPU combined) and PC, but even so, most games only use around 5 GB on average at 4K. TechPowerUp covers VRAM usage frequently; you'll see that their last tested game only uses 3.3 GB at 4K maxed out with a GTX 1080 Ti. Of course, this will start to change in 2020 with next-gen consoles.
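To put some rough numbers on why 8 GB goes further than it sounds, here's a back-of-envelope sketch (my own illustrative assumptions, not measurements from any particular game; the "6 GiB usable" figure is just a guess at what's left after framebuffers and geometry):

```python
# Back-of-envelope VRAM math for texture budgets.
# Assumptions (illustrative, not from any specific title):
#   - BC7/DXT5 block compression ~= 1 byte per texel; uncompressed RGBA8 = 4 bytes.
#   - A full mip chain adds roughly 1/3 on top of the base level.

def texture_size_mib(width, height, bytes_per_texel, mip_chain=True):
    """Approximate GPU memory for one 2D texture, in MiB."""
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 ** 2)

bc7 = texture_size_mib(4096, 4096, 1)     # ~21 MiB
rgba8 = texture_size_mib(4096, 4096, 4)   # ~85 MiB

print(f"4096x4096 BC7 with mips:   {bc7:6.1f} MiB")
print(f"4096x4096 RGBA8 with mips: {rgba8:6.1f} MiB")
# Hypothetical budget: ~6 GiB of an 8 GiB card left for textures.
print(f"BC7 4K textures fitting in 6 GiB: {6 * 1024 // bc7:.0f}")
```

So a few hundred unique block-compressed 4K textures can be resident at once, and engines stream in only what the current scene needs anyway, which is why 8 GB has held up so far.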
The reason they're skimping on memory is quite simple: GDDR6 is very expensive right now. The Quadro RTX 8000 carries an incredible $3,700 markup for its extra 24 GB of VRAM. (Granted, that includes the "Quadro tax", but never before have we seen such relative differences between memory configurations.)
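To put that markup in per-gigabyte terms (simple arithmetic on the figures above, nothing more):

```python
# Rough per-GB cost implied by the Quadro RTX 8000 markup (list-price arithmetic only).
markup_usd = 3700
extra_vram_gb = 24
print(f"~${markup_usd / extra_vram_gb:.0f} per additional GB")  # ~$154/GB
```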
The main tech content I consume is the stuff put out by GamersNexus - it tends to get a bit esoteric and highly technical, but doesn't really go into VRAM usage - I'll check out that TechPowerUp content. Thanks for pointing out its existence. :)
GDDR6 is going to be very expensive early on, as the economies of scale aren't there yet. It will be interesting to see whether we get Samsung, Hynix, or Micron memory, given the slight differences in spec between them (in terms of clock speed).
I owned a Quadro back in the GeForce 2 days, and I used that thing as a gaming card! I was very interested in 3D modelling, and it had a whopping 64 MB of VRAM back then! It was about 3x more expensive than the 32 MB GeForce 2 GTS at the time.
I was a child, my parents bought it for me. I'm rambling, and now going off topic. :)
Depending on the outcome of early tests, and on the eventual (non-paper) performance improvements as drivers mature, I think this will be a compelling card.
We haven't yet seen anything of ray tracing in games (outside of Remedy's tech demo), but I think E3 2019 might be when real-time ray-traced games really arrive.
I'd predict we'll see something from Crystal Dynamics / Epic Games and Remedy with ray-tracing support. Perhaps even something on the Unity engine.
Some brief predictions of my own!
Good to see you active on chain again. Your posts are always well written and thought out. :)
Ah, I support Steve & crew on Patreon. Yes, GN stopped doing their game analysis videos a while ago, so TPU is the best source for finding out what games really demand out of hardware. Digital Foundry's new PC guy Alex is doing a decent job too, but seems to test a very narrow selection of hardware so far.
Quadro RTX is confirmed to be Samsung, though of course the GeForce cards will likely be multiple-sourced as always.
Remedy's Control and Metro Exodus seem like the only games with partial ray-tracing support. But really, they'll be 99% rasterized; maybe some reflection or lighting effects here or there will have an optional ray-tracing Ultra setting. I don't expect anyone to invest much into RT until the next-gen consoles, assuming Navi supports RT. Unreal Engine and Unity have both announced support for DXR, but that doesn't mean much for game developer adoption. Things take time in this industry - it's been over 3 years since DX12 launched, and there are precious few games using it. It almost seems like there were more games using DX12/Vulkan (at least partially) in 2015/16 than in 2018.