
Rumor: Nvidia’s upcoming Lovelace GPUs will get up to 18,432 CUDA cores


RT exists because rasterization became too easy for the GPUs, shareholders didn’t like that, and hey, Nvidia still had some special-purpose cores on the shelf.

Yes, because RT is really something new and hasn’t existed for decades…

Wasn’t it ‘Ten years in the making’? Why then is there still almost no RT content that is worthwhile?

That is nothing more than an opinion. Control, Cyberpunk, Metro Exodus, etc. are all beautiful titles, and more and more are coming out.

They are definitely a generation or two too early. The added value doesn’t rise above solid SSR. At most it’s ‘different’… but better? If you like a slideshow, maybe. Meanwhile, every car looks like it just came off the car wash, just like the streets in Cyberpunk. Apparently that’s how it has to be, but realism, is this really it? It is just as much a trick with a set of parameters and a lot of calculations to approximate ‘real’. We now brute-force it instead of tackling it efficiently, for minimal gain. Very, very special.

Funny that you’re talking about brute force, when in my memory DLSS is constantly being slated…

The gain is enormous if you just look critically at lighting and reflections. In professional rendering you see nothing but ray tracing, and the current accuracy of these algorithms has ensured that Ikea no longer even photographs its furniture but renders its entire catalogue in 3D. Not understanding the importance of RT shows how short-sighted you are about the steps required for true photorealism. Or would you rather keep looking at disappearing reflections and artifacts in SSR, etc.?
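To make the SSR complaint concrete, here is a minimal toy sketch (my own simplified model in Python, not any engine’s actual code; the scene, the crude frustum test and the object names are all made up for illustration): a screen-space reflection lookup can only return geometry the camera already rasterized, so an off-screen object simply yields no hit, while a ray traced against the actual scene still finds it.

```python
# Toy illustration (not real engine code): why SSR "loses" reflections.
from dataclasses import dataclass

@dataclass
class Point:
    x: float   # horizontal world position
    z: float   # depth from the camera
    name: str

scene = [
    Point(0.5, 5.0, "car in view"),
    Point(3.0, 5.0, "car outside the frame"),  # only ray tracing can see this
]

def on_screen(p: Point) -> bool:
    # Screen-space techniques only know what the camera rasterized;
    # this crude perspective test stands in for "is it in the depth buffer?".
    return -1.0 <= (p.x / p.z) * 4.0 <= 1.0

def ssr_reflection(target: Point) -> str:
    # SSR can only reflect geometry present in the current frame; anything
    # else falls back to nothing (the disappearing-reflection artifact).
    return target.name if on_screen(target) else "no hit (artifact / fallback)"

def rt_reflection(target: Point) -> str:
    # A ray traced against the actual scene finds the object regardless of
    # whether the camera happens to see it this frame.
    return target.name

for p in scene:
    print(f"{p.name:24s} SSR: {ssr_reflection(p):32s} RT: {rt_reflection(p)}")
```

That gap is exactly the ‘disappearing reflections’ artifact: the data SSR needs was never in the frame to begin with.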

DLSS, yes, that’s something, but still: proprietary, and I think you want to be able to use this universally, without vendor support. So, nice idea; now I’d like a versatile alternative for which you don’t need Huang’s approval. Otherwise it’s PhysX all over again – just like the RTX implementation, which is little more than DXR. Nvidia now desperately needs DLSS to ensure that RTX isn’t a slideshow. That won’t stay that way, and then they can do whatever they want with it.

Crazy that it’s proprietary? They developed separate tensor cores to speed it up, so any other solution lacks the hardware for it. By the way, RTX runs fine at resolutions like 1440p without DLSS. Don’t exaggerate.

As an Nvidia customer, I’m not going to spend a penny on this nonsense, because the perf/dollar is still dreadful and the VRAM that now comes standard is a gigantic step back compared to Pascal. Turing was already less, but this is simply crazy. The resale value of Ampere is therefore nil – games are already reaching the limit. And you’re welcome to believe it will be enough anyway… we’ll talk again in two or three years :)

Games are not at the limit at all right now.

Cyberpunk 7 GB at 4K, AC Valhalla 6.5 GB at 4K, RDR2 5.5 GB at 4K, etc. Godfall, where AMD claimed it would consume 12 GB, is in practice 6 GB at 4K. Pooh, some limit. By the time 10 GB is actually reached, and where DirectStorage offers no solution, the GPUs themselves won’t even be fast enough anymore. Especially now that the market is moving its engines toward direct streaming from an NVMe drive.
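For what it’s worth, a hedged toy sketch of what that streaming argument amounts to (a simplified model in Python, not the real DirectStorage API; the budget, tile size and access pattern are made-up numbers): if the engine keeps only a working set of tiles resident and pulls the rest from the SSD on demand, the resident VRAM footprint stays bounded by the budget rather than by the total size of the assets on disk.

```python
# Toy model (not the real DirectStorage API): on-demand streaming keeps the
# resident VRAM footprint far below the total asset size on disk.
from collections import OrderedDict

VRAM_BUDGET_MB = 8_000   # made-up usable asset budget on a 10 GB card
TILE_MB = 64             # made-up streaming granularity

class StreamingCache:
    """Toy LRU working set standing in for on-demand asset streaming."""
    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()   # tile_id -> size in MB, LRU order

    def request(self, tile_id: int) -> None:
        if tile_id in self.resident:             # already resident in VRAM
            self.resident.move_to_end(tile_id)
            return
        # Evict least-recently-used tiles until the new one fits the budget.
        while sum(self.resident.values()) + TILE_MB > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[tile_id] = TILE_MB         # "upload" the tile from NVMe

cache = StreamingCache(VRAM_BUDGET_MB)
total_assets_mb = 40_000                          # 40 GB of assets on disk
for frame in range(1_000):
    # The camera only ever needs a sliding window of nearby tiles per frame.
    for tile in range(frame, frame + 50):
        cache.request(tile)

print(f"assets on disk: {total_assets_mb} MB, "
      f"resident working set: {sum(cache.resident.values())} MB")
```

In this toy run the resident set tops out under the budget even though the assets on disk are several times larger, which is the whole point of streaming instead of keeping everything in VRAM.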

It’s about time they changed their tune, a bit like they had to do with G-Sync. I didn’t spend a penny on that either, because it was proprietary while it should have been a standard. That worked out well, because I think there were more people who thought that way and Nvidia felt compelled to go along.

To this day, the FreeSync experience is still not as consistent and good as G-Sync was five years ago. I have a 1300-euro FreeSync Pro monitor on my desk, but it just isn’t there. If there had been an ultrawide 1000-nit HDR1000 gaming monitor with a G-Sync module on the market, I would have bought it. I tested three FreeSync monitors last summer and they all suffer from flickering in certain refresh-rate ranges, still tearing in some situations, etc. But how terrible, paying more for a better experience!

No… I was always in favor of pushing the cutting edge the way Nvidia did, but what they’ve been doing since Turing is very flashy, very rushed, and very much a reaction to the mining profits at the time of Pascal – one of the best generations Nvidia has ever made, if you ask me.

It was really AMD that was especially popular with miners. That’s why the Polaris cards became so pricey over time.

[Comment edited by ZeroNine on 28 December 2020, 18:16]
