Sunday, December 3

The NVIDIA RTX 3090 Ti is the most powerful (and most expensive) graphics card for end users. The question is whether such a product is worth it.

NVIDIA yesterday announced the launch of its NVIDIA GeForce RTX 3090 Ti, a souped-up version of the RTX 3090 that was presented a year and a half ago.

We are looking at what is billed as the swan song of the Ampere architecture. Everything here is pushed to the limit, including a $1,999 price tag that makes us wonder to what extent such a product is worth it. And not only because of the price or the improved performance.

Ampere wants to say goodbye in a big way

The card is, to some extent, 'boring', because it is little more than the natural successor to the RTX 3090. It keeps the GA102 GPU with 8 nm photolithography present in the rest of the family's models, but takes it a little further.


This graphics card is unrivaled in the end-user GPU segment: an absolute behemoth that is, yes, also a behemoth in price. Source: AnandTech.

Thus, NVIDIA raises the clock frequency of both the cores and the memory (made up of new, improved GDDR6X modules). That, coupled with the largest CUDA core count in the family, makes it possible to reach 40 TFLOPS in single-precision calculations.
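That 40 TFLOPS figure follows from the usual peak-throughput formula, in which each CUDA core performs one fused multiply-add (two floating-point operations) per clock. A quick sanity check, assuming the announced figure of roughly 1.86 GHz for the boost clock:

```python
# Peak FP32 throughput = CUDA cores x boost clock (GHz) x 2,
# since one fused multiply-add counts as two operations per cycle.
def peak_fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    return cuda_cores * boost_clock_ghz * 2 / 1000  # GFLOPS -> TFLOPS

# RTX 3090 Ti: 10,752 CUDA cores at ~1.86 GHz boost.
print(round(peak_fp32_tflops(10_752, 1.86), 1))  # ~40.0 TFLOPS
```

This is a theoretical peak, not sustained real-world throughput, which is why game performance does not scale linearly with it.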

The use of the new memory modules is also relevant because it is no longer necessary to place modules on the back of the board, which in turn avoids having to worry about cooling those chips. Everything is "on top", and the cooling system only has to maintain good temperatures on the front of the card, not on the back.


That power doesn't come for free, at least in terms of consumption: this card alone draws 450 W, when the RTX 3090 had a TDP of 350 W. It is in fact the first graphics card to use the new 12VHPWR connector of the ATX 3.0 specification, which has 16 pins instead of the 12-pin connector used until now on these high-end, high-consumption models.


The NVIDIA GeForce RTX 3090 Ti is aimed at gamers and content creators, but those responsible for it admit that "today the most graphically intensive games don't use all the power that the RTX 3090 Ti offers. And that's OK."

We are therefore looking at a card that in most scenarios cannot even be fully exploited in gaming because it is too powerful: 8K gaming may be the only major challenge for this model, which makes it clear that this latest edition is a product for a very limited niche of gamers.

The idea is also to attract content creators, who get an interesting alternative to the Titan RTX: according to NVIDIA, rendering performance grows between 42% and 102% compared to that card.

A card that is (very) difficult to recommend

In games the improvement over the RTX 3090 is marginal: according to NVIDIA the average uplift is 9%, while the price has grown by 33%, going from $1,499 to $1,999 in this "Ti" edition.
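The mismatch between that 9% performance gain and the 33% price increase can be made explicit as a price-per-performance comparison; a minimal sketch using the article's own figures:

```python
# Take the RTX 3090 as the 1.00x performance baseline and apply
# NVIDIA's quoted ~9% average gaming uplift for the 3090 Ti.
price_3090, price_3090_ti = 1_499, 1_999
perf_3090, perf_3090_ti = 1.00, 1.09

dollars_per_perf_3090 = price_3090 / perf_3090           # 1499.0
dollars_per_perf_3090_ti = price_3090_ti / perf_3090_ti  # ~1834

print(f"Price increase: {price_3090_ti / price_3090 - 1:.0%}")
print(f"Cost-per-performance penalty: "
      f"{dollars_per_perf_3090_ti / dollars_per_perf_3090 - 1:.0%}")
```

In other words, at launch prices each unit of gaming performance costs roughly a fifth more on the Ti than on the plain RTX 3090.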


That is one important reason to consider whether such an investment is really worth it. The other is even more striking: NVIDIA is expected to introduce its new RTX 4000 family this year, with the AD102 GPU, Ada architecture and 5 nm photolithography.

In fact, the flagship model could be an absolute monster with 18,432 CUDA cores (remember, this other monster "only" has 10,752), and rumored consumption could top out at 850 W. That would give AD102 GPUs performance that could theoretically be up to twice what current GA102 GPUs offer.
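The "up to twice" estimate roughly squares with those rumored core counts once a clock or architectural gain is factored in; a back-of-the-envelope sketch (rumored figures, not confirmed specifications):

```python
# Raw CUDA-core ratio between the rumored AD102 and the GA102
# used in the RTX 3090 Ti.
ga102_cores, ad102_cores = 10_752, 18_432
core_ratio = ad102_cores / ga102_cores
print(f"Core-count ratio: {core_ratio:.2f}x")  # ~1.71x

# Reaching ~2x overall would additionally require roughly a 17%
# gain from clocks and/or per-core improvements on top of that.
required_extra_gain = 2 / core_ratio
print(f"Extra clock/IPC gain needed for 2x: {required_extra_gain:.2f}x")
```

So the doubling claim implies Ada would need noticeably higher clocks or per-core efficiency, not just more cores, which the 5 nm process shrink could plausibly enable.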

We will have to wait to confirm those rumors, something we may be able to do in September, when that new family of NVIDIA graphics cards is expected to be introduced. While we wait for those details and for prices, this latest edition of the RTX 3000 family only seems to make sense in very, very specific niches.
