According to Jensen Huang, people are "crazy" to buy a GTX instead of an RTX

Robert Maynard

Volunteer Moderator
I just want to be able to use all of my RTX card, let's get the tensor cores doing something. CUDA cores all busy busy, tensor cores having holidays.

And if it can boost VR performance then many people will be happy, as they probably have the new GPU already.
I expect that Frontier are aware of what proportion of players have an RTX-capable card - and, given the next-gen console release next year, what the capabilities of the AMD GPUs will be.

It makes little sense to undertake significant development for what may be a rather small number of players.
 
Am I the only one who doesn't care about raytracing?
Applications other than gaming aside (none that I can think of right now), it seems like a lot of added overhead for little return.

Not only am I not interested in ray tracing, I couldn't give a rat's crack about 4K either. 1080p is more than enough IMO (I played the original Doom at 320x240, scaled down on my 15" monitor to about a 3-inch square, LOL. Them were the days!). I want higher frame rates, more detailed geometry and no pop-in in games before 4K and ray tracing.
 
Yeah, I remember trying to give smartphones to managers back in 2002; nobody wanted one. I was classed as mad for having one... Now look.
 
I'm not saying it doesn't have a future. But the bottom line is - it's just a lighting effect. It's resource-heavy and harder to implement than the previous tech. That's not how you push technological advancement.

Ray tracing is more than just lighting effects... lighting is simply the best place to start. The only reason it's harder to implement is that there are more than 25 years of rasterization precedent to overcome. A paradigm shift in rendering has to begin somewhere.

I just want to be able to use all of my RTX card, let's get the tensor cores doing something.

The hardware is too specialized. It was originally intended for AI inferencing and the only reason it's even in any consumer cards is so that NVIDIA wouldn't have to order entirely different silicon from their professional/HPC parts. They've tried to make use of it as best they can, so it's not just dead weight, but they are limited in what can be done with it, especially since games cannot be built around it without sacrificing support for every non-Turing part.
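For a concrete sense of how specialized that silicon is, here is a minimal sketch (not from anyone's actual engine, and assuming a Volta/Turing-class GPU) of the only way CUDA exposes the tensor cores: the WMMA API, which does fixed-size mixed-precision matrix multiply-accumulates. That maps nicely onto neural-net inferencing, much less nicely onto ordinary game code.

[CODE]
// Minimal CUDA WMMA sketch: one warp multiplies a 16x16 FP16 tile pair and
// accumulates into FP32 on the tensor cores. Compile with: nvcc -arch=sm_70
// Launch with a single warp, e.g. tensor_core_tile<<<1, 32>>>(A, B, C);
#include <mma.h>
#include <cuda_fp16.h>

using namespace nvcuda;

// D = A * B + C for a single 16x16x16 tile, executed cooperatively by one warp.
__global__ void tensor_core_tile(const half *A, const half *B, float *C)
{
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc_frag;

    wmma::fill_fragment(acc_frag, 0.0f);            // start the accumulator at zero
    wmma::load_matrix_sync(a_frag, A, 16);          // leading dimension = 16
    wmma::load_matrix_sync(b_frag, B, 16);
    wmma::mma_sync(acc_frag, a_frag, b_frag, acc_frag);   // the tensor-core op
    wmma::store_matrix_sync(C, acc_frag, 16, wmma::mem_row_major);
}
[/CODE]

Anything a game wants these units to do has to be phrased as batches of small matrix products like this, which is roughly why DLSS-style inferencing is about the only consumer use found for them so far.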

It's like 4K TVs.

They're only interesting if TV shows are broadcast in 4K.

I have a 4K TV; it's hooked up to my HTPC and is mostly used for gaming.
 
Just want to opine that Elite Dangerous in particular would be an excellent candidate for ray tracing: it has procedurally generated environments, free and fast object movement, and lighting and shadows are a fundamentally important element. The developers can't prebake lighting and other things under the assumption that the camera will only ever be in a scant few places, in a string of static rooms making up a predefined path along which the player is corralled.

Only just now I lifted off from a planetside spaceport at night, and for the nth time, because my ship happened to be bathed in floodlight on the landing pad, the engine decided this was cause to render the entire planet and its moons as if their occluded sides were in daylight; as soon as I left the cones of influence of the floodlights, the celestial light switch predictably went off.

It is tragic to see those jagged, incorrect shadows crawl across stations and terrain, always cast by a single star no matter how many should be influencing and tinting the scene - jumping, shifting and changing entirely as their shadow maps are updated at intervals (and as LOD switches on the occluding geometry). The same goes for reflection cubemaps.
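As a hedged illustration of what's being asked for (the names here are hypothetical and have nothing to do with Frontier's actual renderer): instead of fetching one shadow map baked around a single dominant star, direct light could be accumulated per star behind a visibility query - a shadow-map fetch today, a traced shadow ray on RTX hardware - so every star casts and tints its own shadow.

[CODE]
// Hypothetical sketch: accumulate direct light from every star in the system.
// visibility() stands in for whatever answers "can this star see this point?" -
// a shadow-map lookup now, a hardware-traced shadow ray later.
struct Vec3 { float x, y, z; };
struct Star {
    Vec3  dir;        // unit direction from the surface point towards the star
    Vec3  color;      // star tint
    float intensity;
};

__host__ __device__ static float visibility(Vec3 /*point*/, Vec3 /*dir*/)
{
    return 1.0f;      // placeholder: assume fully visible
}

__host__ __device__ static Vec3 shade(Vec3 p, Vec3 n, const Star *stars, int count)
{
    Vec3 out = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < count; ++i) {          // every star, not just the brightest
        float ndotl = n.x * stars[i].dir.x + n.y * stars[i].dir.y + n.z * stars[i].dir.z;
        if (ndotl <= 0.0f) continue;           // star is below this surface's horizon
        float w = ndotl * stars[i].intensity * visibility(p, stars[i].dir);
        out.x += stars[i].color.x * w;         // each star tints its own contribution
        out.y += stars[i].color.y * w;
        out.z += stars[i].color.z * w;
    }
    return out;
}
[/CODE]

The loop itself is trivial; the expensive part is the visibility() call, which is exactly where per-pixel ray tracing would replace the interval-updated shadow maps and cubemaps complained about above.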

As for DLSS... I am dubious about that one in general, but especially in the ED context - precomputing ground truth for procedural content across a whole galaxy... Hmm... Even if viable: if we thought planets and constructs can feel same-y as it is now :p
 