NVidia's SMP rendering tech - The boost we need.

NVidia has shown off its new SMP (Simultaneous Multi-Projection) rendering tech, built into its new Pascal graphics cards. It allows rendering up to 16 different views of a scene in just one geometry pass... It will come to Unreal and Unity SOON™. Will we see it in the COBRA engine as well?

http://www.roadtovr.com/unreal-engine-unity-vr-simultaneous-multi-projection-smp-nvidia/

Thanks for the article! I'm glad to see the Unreal/Unity engine devs are working on this.

I am a bit skeptical as to whether this will arrive within the relevant lifetime of these Pascal cards. Nvidia promised VR SLI to boost performance at the launch of the GTX 970, and almost two years later there's still no VR SLI. I stupidly ended up with 970 SLI because of that promise (later went to a single 980 Ti and haven't looked back).
 
The Road to VR site is pretty good; I found them a few months back too.

This SMP rendering tech will almost certainly be on all nVidia cards from here on in. It is up to nVidia (and game fans) to press the developers into using the technology so we see better frame rates. Until now, the game/engine developers have had to work out how the viewpoint and perspective etc. all render out into the final frame. Now nVidia has made it available in hardware (or at least in the driver).

But *sigh* it's early days yet - most ED players will not have 10-series cards, so FD will probably not spend any time on it, given the benefit would be seen by only a few.
 
Until now, the game/engine developers have had to work out how the viewpoint and perspective etc. all render out into the final frame. Now nVidia has made it available in hardware (or at least in the driver).

Let me correct you on that. With SMP, developers still have "to work out how the viewpoint and perspective etc."

What SMP helps with is:

a) Rendering both eye images at once without duplicated effort in the Vertex/Tessellation/Geometry shader stages. On GL this is exposed via NV_stereo_view_rendering (see the first sketch below).

b) Reducing unnecessary pixel shader work by rendering something much closer to what the extra distortion pass produces nowadays (i.e. less pixel shader work is wasted on the outskirts of the image). On GL this is exposed via NV_clip_space_w_scaling and NV_viewport_array2 (see the second sketch below).
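
To make a) concrete, here's a rough sketch of what the single-pass stereo path looks like in GL. This is my own illustration, not code from NVidia or any engine; the uniform names and the layer layout are assumptions:

```
/* Rough single-pass stereo sketch via GL_NV_stereo_view_rendering.
 * Assumes a GL 4.5 context on a Pascal GPU and a 2-layer texture-array
 * framebuffer (layer 0 = left eye, layer 1 = right eye). Uniform names
 * are made up for illustration. */
static const char *kStereoVS =
    "#version 450\n"
    "#extension GL_NV_viewport_array2 : require\n"
    "#extension GL_NV_stereo_view_rendering : require\n"
    "layout(location = 0) in vec3 aPos;\n"
    "uniform mat4 uModel;\n"
    "uniform mat4 uViewProj[2];  // [0] = left eye, [1] = right eye\n"
    "// The secondary position is routed to gl_Layer + 1 (the right eye).\n"
    "layout(secondary_view_offset = 1) out highp int gl_Layer;\n"
    "void main() {\n"
    "    vec4 world = uModel * vec4(aPos, 1.0);\n"
    "    gl_Position            = uViewProj[0] * world;  // left eye, layer 0\n"
    "    gl_SecondaryPositionNV = uViewProj[1] * world;  // right eye, layer 1\n"
    "    gl_Layer = 0;\n"
    "}\n";

GLuint compile_stereo_vs(void)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &kStereoVS, NULL);
    glCompileShader(vs);  /* check GL_COMPILE_STATUS in real code */
    return vs;
}
```

The key point is that the vertex (and tessellation/geometry) stages run once and emit two clip-space positions, instead of the whole scene being submitted twice.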
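And a rough sketch of b), the pixel-saving side (what NVidia markets as Lens Matched Shading). Again my own illustration, not NVidia's code: the coefficient value and the quadrant layout are placeholders you'd tune per HMD, and a real renderer would also broadcast each primitive to all four viewports via gl_ViewportMask[] from NV_viewport_array2:

```
/* Lens-matched-shading-style sketch via GL_NV_clip_space_w_scaling.
 * Each eye's image is split into four viewports (one per quadrant),
 * each with its own w-scaling, so post-projection w grows towards the
 * outer edges: fewer pixels get shaded where the distortion pass would
 * squash them anyway. COEFF is a placeholder tuning value. */
void setup_w_scaling(int eye_w, int eye_h)
{
    const float COEFF = 0.2f;  /* placeholder; tuned per headset/lens */
    const float hw = eye_w * 0.5f, hh = eye_h * 0.5f;

    glEnable(GL_VIEWPORT_POSITION_W_SCALE_NV);

    /* One viewport per quadrant of the eye image. */
    glViewportIndexedf(0, 0.0f, hh,   hw, hh);  /* top-left     */
    glViewportIndexedf(1, hw,   hh,   hw, hh);  /* top-right    */
    glViewportIndexedf(2, 0.0f, 0.0f, hw, hh);  /* bottom-left  */
    glViewportIndexedf(3, hw,   0.0f, hw, hh);  /* bottom-right */

    /* Signs chosen so w' = w + A*x + B*y increases towards the edges. */
    glViewportPositionWScaleNV(0, -COEFF,  COEFF);
    glViewportPositionWScaleNV(1,  COEFF,  COEFF);
    glViewportPositionWScaleNV(2, -COEFF, -COEFF);
    glViewportPositionWScaleNV(3,  COEFF, -COEFF);
}
```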
 
I planned to go for a 1080 next month. Yet I don't know if it is too early, as we might see a VR-optimized card next year. But I guess I will do it anyway, as I really want that 1080...
 
I planned to go for a 1080 next month. Yet I don't know if it is too early, as we might see a VR-optimized card next year. But I guess I will do it anyway, as I really want that 1080...

I kind of thought the 1080 was the card optimized for VR for now, and they took their time with it. I doubt we'll see more than a 1080 Ti next year.
 
The only optimization a 10-series has "right now" is that it's 40% faster than a 9-series. All these new features will require adoption by game developers and HMD makers, as well as DX12 support in games. Since VR is only now starting to make a commercial appearance, don't expect much for a couple of years or more. Another caveat is that NVidia's features don't apply to AMD, so this will also slow implementation. It is kind of ironic that so many have been outraged at Oculus for what they saw as hardware DRM, yet when it's practiced by their favorite GPU manufacturer, they can't wait to get their hands on it. LMAO
 
It is kind of ironic that so many have been outraged at Oculus for what they saw as hardware DRM, yet when it's practiced by their favorite GPU manufacturer, they can't wait to get their hands on it. LMAO

Explain how simultaneous multiprojection equates to hardware DRM? Will it prevent software from running on AMD cards?
 
Just like coding shaders: use fallbacks to switch to if a specific technology isn't supported.

Most likely, NVidia will do all the work, so as a developer you will just have to flip the switch, much like Oculus does with its easy-to-use SDK. The question is how quickly and easily it can be integrated into existing projects and different engines.
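
Something like this, in GL terms - a purely illustrative sketch, where the two render paths are hypothetical placeholders for whatever an engine's single-pass and per-eye code would be:

```
#include <stdbool.h>
#include <string.h>
/* Assumes a core-profile GL context is current and a loader (GLEW,
 * glad, ...) has been initialized. */

void render_both_eyes_single_pass(void);  /* hypothetical SMP fast path */
void render_one_eye(int eye);             /* hypothetical classic path  */

static bool has_gl_extension(const char *name)
{
    GLint n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (GLint i = 0; i < n; ++i)
        if (strcmp((const char *)glGetStringi(GL_EXTENSIONS, i), name) == 0)
            return true;
    return false;
}

void render_vr_frame(void)
{
    if (has_gl_extension("GL_NV_stereo_view_rendering")) {
        render_both_eyes_single_pass();  /* one geometry pass, both eyes */
    } else {
        render_one_eye(0);               /* left eye                     */
        render_one_eye(1);               /* right eye                    */
    }
}
```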
 
Explain how simultaneous multiprojection equates to hardware DRM? Will it prevent software from running on AMD cards?

You are correct. I should not have said "the same". The point I was trying to make (so badly) was that hardware manufacturers will write software to take advantage of their own hardware. Oculus does it, Vive does it, the consoles do it, and Nvidia and AMD do it at a driver and feature level. I agree that Nvidia and AMD not sharing and collaborating on features does not negate both running the same game, but they do protect their own driver and feature software at an exclusive level to enhance sales of their devices, so I should maybe have said "when GPU companies invoke similar practices." 3dfx was a form of DRM, as is SLI vs. CrossFire, in the sense that they protect the exclusivity of hardware and driver software. Sure, game companies can write for both and mobo makers can support both; however, it is my opinion that these are the same types of practices that Oculus was accused of.

Don't get me wrong, I have no problem with DRM at any level. Companies should be able to protect their hardware and software at whatever level they see fit. It is their investment. Having exclusive hardware, software, or features may cost more, but it drives competition, and competition drives down prices. It's all good.
Perhaps a better title for the thread might have been "NVidia's SMP rendering tech - the boost HALF of us need".
 