NVIDIA's next-gen Turing GPUs


It's been a long wait, but next-gen GPUs are finally incoming. Here are my predictions, based on all leaks and information available.

What? GeForce 20 Series.

When? Announced on 20th August 2018.

Where? Gamescom 2018.

Product stack:

Rip-off fanboys - GeForce Titan RTX. $3000. Same die as the recently announced Quadro RTX 8000. Between 4352 and 4608 SP. 12 GB GDDR6 w/ ~650 GB/s memory bandwidth. ~15 TFlops. ~30% faster than the GTX 1080 Ti.

Enthusiast - GeForce RTX 2080. $700. Same die as the recently announced Quadro RTX 5000. 2944 SP. 8 GB GDDR6 w/ 448 GB/s memory bandwidth. ~11 TFlops (see the quick sanity-check math after the product stack). Similar performance to the GTX 1080 Ti in DX11, 5%-10% better in DX12/Vulkan.

High-end - GeForce RTX 2070. $500. 2304 SP. 7 GB GDDR6 w/ ~350 GB/s memory bandwidth. Performance in between the GTX 1080 / Vega 64 and the RTX 2080 (though closer to the 1080).

The shtick for the RTX series - ray-tracing, ray-tracing, ray-tracing, and a bit of machine learning. None of it is relevant to any games coming before 2020. If you have a 10 Series or a Vega / RX 500 series card, you might as well wait for 7nm, which will bring the big leap forward. Expect the RTX 20 Series to roll out over the next 3 months.

Mid-range - GeForce GTX 2060 / 2050. Yeah, GTX. No ray-tracing or machine learning stuff here. Incremental updates over the 1060 / 1050. I'd say 1536 SP for the 2060; 896/1024 SP for the 2050. $300 for the 2060, approaching $200 for the 2050. Similar price/performance to the old series, just offered at a higher performance tier. Not expecting these till the end of the year, or early 2019.
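As a quick sanity check, the TFlops and bandwidth figures above follow directly from shader count, clock speed, and bus width. A minimal sketch of the math (the ~1.9 GHz boost clock and 256-bit bus are my assumptions, in line with the leaks):

```python
# Back-of-the-envelope GPU spec math; clock and bus width are assumptions.

def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    # Peak FP32 = shaders x 2 ops per clock (fused multiply-add) x clock
    return shaders * 2 * boost_ghz / 1000

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    # Bandwidth = bus width in bytes x per-pin data rate
    return bus_bits / 8 * gbps_per_pin

# RTX 2080 prediction: 2944 SP at ~1.9 GHz, 256-bit GDDR6 at 14 Gbps
print(f"{fp32_tflops(2944, 1.9):.1f} TFlops")  # ~11.2
print(f"{bandwidth_gb_s(256, 14):.0f} GB/s")   # 448
```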

The Competition - AMD Vega 20 will be first to 7nm, and could be an amazing product for workstations / machine learning. No idea if it'll ever release for gaming, but if it does, expect performance just shy of the RTX 2080, but well ahead of the RTX 2070. Apart from that, the real competition arrives in 2019 with AMD Navi. Since AMD will be first to 7nm, it could be very competitive with the 20 Series. Navi will also be the graphics architecture that makes its way into the PlayStation 5 and the next Xbox. However, I don't see any competition for the Titan RTX; AMD simply can't afford to play that game. What about the 2080 Ti? Expect it some time in 2019 - basically just a rebranded Titan RTX for less money.

Note: These are just my predictions, and they could be completely wrong. We shall see on the 20th and over the coming months.


I don't understand how they're going to cram 4K textures into 8 GB of GDDR6. It'll be quicker, sure, to swap textures in and out of memory, but I feel like it will struggle at that resolution, unless they go full steam ahead on procedurally generated textures and do that with the extra shader throughput they'll have at their disposal.

8 GB has been ample for the highest quality textures at 4K in most games thus far. The only one I can remember that actually exceeds that is Shadow of War, with its optional Ultra texture pack, at 9 GB. Pretty much all developers have been happy to follow the consoles as the baseline. Some may offer something better for the Xbox One X (which has ~5 GB for the GPU; ~9 GB available to games for GPU+CPU combined) and PC, but even so, most games only utilize 5 GB on average at 4K. TechPowerUp covers VRAM usage frequently; you'll see that their last tested game only uses 3.3 GB at 4K maxed out with a GTX 1080 Ti. Of course, this will start to change in 2020 with the next-gen consoles.
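For a sense of scale, a single 4K texture costs less VRAM than it might sound once block compression is applied. A rough sketch (the RGBA8 baseline, mip overhead, and ~4:1 BC compression ratio are my assumptions):

```python
# Rough VRAM cost of a single 4096x4096 texture; formats are assumptions.
width = height = 4096
rgba8 = width * height * 4      # 4 bytes per pixel, uncompressed
with_mips = rgba8 * 4 / 3       # a full mip chain adds roughly one third
bc_compressed = with_mips / 4   # BC7-style block compression, ~4:1 vs RGBA8

print(f"uncompressed + mips: {with_mips / 2**20:.0f} MiB")     # ~85
print(f"block compressed:    {bc_compressed / 2**20:.0f} MiB")  # ~21
```

At ~21 MiB per compressed 4K texture, even a few hundred unique textures resident at once fit comfortably in 8 GB, which is why actual usage stays well below the headline numbers.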

The reason they are skimping on memory is quite simple - it's very expensive right now. The Quadro RTX 8000 carries an incredible $3,700 markup over the RTX 6000 for the extra 24 GB of VRAM - roughly $154 per GB. (Granted, that includes the "Quadro tax", but never before have we seen such relative differences between memory configurations.)

The main tech content I consume is the stuff put out by GamersNexus - it tends to get a bit esoteric and highly technical, but doesn't really go into VRAM usage. I'll check out that TechPowerUp content. Thanks for pointing out its existence. :)

GDDR6 is going to be very expensive early on, as the economies of scale aren't there yet. It will be interesting to see whether we get Samsung, Hynix, or Micron memory, with the slight differences in spec (in terms of clock speed) between each.

I owned a Quadro back in the GeForce 2 days, and I used that thing as a gaming card! I was very interested in 3D modelling, and it had a whopping 64 MB of VRAM back then! It was about 3x more expensive than the 32 MB GeForce 2 GTS at the time.

I was a child, my parents bought it for me. I'm rambling, and now going off topic. :)

Depending on the outcome of early tests, and the eventual (non-paper) performance gains as drivers mature, I think this will be a compelling card.

We haven't yet seen anything of ray-tracing (outside of Remedy's tech demo), but I think E3 2019 might be when the real-time ray-traced games show up.

I'd predict we'll see something with ray-tracing support from Crystal Dynamics / Epic Games and Remedy. Perhaps even from the Unity engine.

Some brief predictions of my own!

Good to see you active on chain again. Your posts are always well written and thought out. :)

Ah, I support Steve & crew on Patreon. Yes, GN stopped doing their game analysis videos a while ago, so TPU is the best source for finding out what games really demand out of hardware. Digital Foundry's new PC guy Alex is doing a decent job too, but seems to test a very narrow selection of hardware so far.
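If you'd rather measure it yourself than rely on review sites, nvidia-smi can poll VRAM usage while a game runs. A minimal sketch, assuming an NVIDIA card with a current driver:

```python
import subprocess
import time

# Poll used/total VRAM once per second via nvidia-smi (ships with the driver)
while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())
    time.sleep(1)
```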

Quadro RTX is confirmed to be Samsung, though of course the GeForce cards will likely be multiple-sourced as always.

Remedy's Control and Metro Exodus seem like the only games with partial ray-tracing support. But really, they'll be 99% rasterized; maybe some reflection or lighting effects here and there will have an optional ray-tracing Ultra setting. I don't expect anyone to invest much into RT till the next-gen consoles, assuming Navi supports RT. Unreal Engine and Unity have both announced support for DXR, but that doesn't mean anything for game developer adoption. Things take time in this industry - I mean, it's been over 3 years since DX12 launched, and there are precious few games using it. It almost seems like there were more games using DX12/Vulkan (at least partially) in 2015/16 than in 2018.

Good to hear about the new GPUs, but I think from now on the mining craze is going to decrease, and hence GPUs will be easily available and at cheaper prices too...

Prices are back to normal. I mean, they should be lower still, but after nearly a year of ridiculous prices, this is most welcome.

This information is very valuable to us.


Poor GeForce Titan buyers, I feel bad for them...

I think the prices of these upcoming GPUs will stay normal only if the mining craze stays at a low level.

Excellent article. I really liked it. Good luck to you and Love.


Exciting! Those prices are crazy...
