Deferred rendering performance issues
-
- OGRE Expert User
- Posts: 1227
- Joined: Thu Dec 11, 2008 7:56 pm
- Location: Bristol, UK
- x 157
Deferred rendering performance issues
I have recently been looking at deferred rendering techniques, and I have also been following this year's GSoC project on the subject. However, Ogre's performance does not seem that great with deferred techniques. Could someone explain why this might be? Is it compositor framework overheads, issues with instancing, or is it just that you only see performance benefits with lots of lights, i.e. 50+?
-
- Google Summer of Code Student
- Posts: 55
- Joined: Fri Mar 18, 2011 8:37 pm
- x 8
Re: Deferred rendering performance issues
Deferred lighting does have a fairly big overhead from transforming the geometry twice, but that's not huge.
Deferred techniques in general eat up a lot of bandwidth (in Ogre that's 16 B read per pixel affected by each light, and that can go up a lot in a real game), not to mention the write bandwidth needed to generate the G-buffer (again, 16 B per pixel _on the screen_). So yes, that's a big overhead for the GPU.
BUT, it does mean you can use as many lights as you want without re-rendering the geometry, and it also means less computation per light (in forward rendering a light affects every fragment that comes out of the rasterizer, as opposed to a small subset of pixels in deferred), so even for a moderate number of lights it's mostly a win.
My Google summer of code 2011 topic: Modern Illumination Techniques
My Google summer of code thread
My Google summer of code wiki page
-
- OGRE Expert User
- Posts: 1227
- Joined: Thu Dec 11, 2008 7:56 pm
- Location: Bristol, UK
- x 157
Re: Deferred rendering performance issues
@andrei_radu
Thank you for replying, I was hoping you would!
So, in short, the Ogre architecture does not have any real impact on deferred techniques; it's just that deferred techniques have an upfront performance cost but allow any number of dynamic lights, as well as benefiting other post-processing effects.
I just want to clear things up, as there is some talk about how Ogre is not suited to deferred techniques, which I think is misleading.
- Lee04
- Minaton
- Posts: 945
- Joined: Mon Jul 05, 2004 4:06 pm
- Location: Sweden
- x 1
Re: Deferred rendering performance issues
I suggested a long time ago that the demos (and now the demo framework) should have a deferred setting for each demo, just as Ogre offers the choice of DirectX/OpenGL for each demo. That would once and for all get rid of the speculation that Ogre doesn't support deferred rendering, and would also better gather both camps of people around Ogre as a project.
Ph.D. student in game development
-
- Goblin
- Posts: 287
- Joined: Mon Dec 08, 2008 4:49 pm
- x 10
Re: Deferred rendering performance issues
I agree about the demos. I've previously created a patch that moves deferred rendering into a plugin (instead of a sample). However, a lot of things are not supported well yet: directional lights and point lights only work well without shadows, more specialized materials don't work (so deferred shading won't work well for the demos that use shader-based animation), and shader-based animation in general is not supported.
So, even though deferred shading can be done now, it is not yet as much of an Ogre feature as the RTSS. (Comparing it with the OpenGL/DirectX choice is a bit unfair, as that's at a different level.)
-
- Google Summer of Code Student
- Posts: 55
- Joined: Fri Mar 18, 2011 8:37 pm
- x 8
Re: Deferred rendering performance issues
The problem I see is that Ogre's DirectX and OpenGL renderers are forward by nature. The rendering algorithm (reading batches and materials and rendering them as they are to the framebuffer) is coupled with the API abstraction. Ogre's demo assets are also created with a forward renderer in mind. The way I'm doing things right now in my framework is that I have an API abstraction that wraps API calls (and does some error checking, makes sure everything is consistent, etc.), and a renderer abstraction that can use any API. The renderer takes various inputs (basically batches and materials) and renders them however it wants. This does add a bit of complexity (I'm using a policy-based design to avoid virtual calls, which makes things uglier still), but the benefit is being able to use whatever rendering algorithm you want, in a consistent fashion.
Let's take the skinning example: the renderer receives a mesh and a material. The deferred renderer will see whether the material uses alpha or not, keep the vertex shader, and use a G-buffer fragment shader.
I mainly use this for testing ideas and for homework I get at school (I'm not really allowed to use third-party code for that kind of thing), so I haven't tried anything complex, but so far it seems to work nicely. I do see some degree of code duplication (I still maintain a forward path alongside the deferred renderer) and a lack of flexibility (the deferred rendering algorithm is coded, not scripted, so changing the G-buffer layout, for example, means recompiling the whole project).
My Google summer of code 2011 topic: Modern Illumination Techniques
My Google summer of code thread
My Google summer of code wiki page